Dec 08 08:58:42 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 08 08:58:42 crc restorecon[4698]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 08 08:58:42 crc restorecon[4698]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 08 08:58:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:42 crc 
restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 08 08:58:42 crc restorecon[4698]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 08 08:58:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 08 08:58:43 crc 
restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 08 
08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to
system_u:object_r:container_file_t:s0:c14,c22 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 
08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 08 08:58:43 crc 
restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc 
restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc 
restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 08:58:43 crc restorecon[4698]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 
08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc 
restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc 
restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 08 08:58:43 crc 
restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 08 08:58:43 crc restorecon[4698]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 08 08:58:43 crc restorecon[4698]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 08 08:58:43 crc restorecon[4698]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 08 08:58:43 crc kubenswrapper[4776]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 08 08:58:43 crc kubenswrapper[4776]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 08 08:58:43 crc kubenswrapper[4776]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 08 08:58:43 crc kubenswrapper[4776]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 08 08:58:43 crc kubenswrapper[4776]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 08 08:58:43 crc kubenswrapper[4776]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.845186 4776 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849676 4776 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849706 4776 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849712 4776 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849717 4776 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849722 4776 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849728 4776 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849733 4776 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849738 4776 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849743 4776 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 08 08:58:43 crc 
kubenswrapper[4776]: W1208 08:58:43.849747 4776 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849751 4776 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849754 4776 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849759 4776 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849763 4776 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849767 4776 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849771 4776 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849775 4776 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849779 4776 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849785 4776 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849789 4776 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849793 4776 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849798 4776 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849802 4776 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849806 4776 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849809 4776 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849813 4776 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849818 4776 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849822 4776 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849826 4776 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849829 4776 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849834 4776 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849840 4776 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849845 4776 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849850 4776 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849854 4776 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849859 4776 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849863 4776 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849867 4776 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849871 4776 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849875 4776 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849879 4776 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849882 4776 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849885 4776 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849889 4776 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849893 4776 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849896 
4776 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849900 4776 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849903 4776 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849907 4776 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849910 4776 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849914 4776 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849918 4776 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849923 4776 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849927 4776 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849932 4776 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849936 4776 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849941 4776 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849946 4776 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849951 4776 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849955 4776 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849961 4776 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849965 4776 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849971 4776 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849975 4776 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849980 4776 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849983 4776 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849987 4776 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849992 4776 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.849996 4776 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.850000 4776 feature_gate.go:330] unrecognized 
feature gate: PinnedImages Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.850004 4776 feature_gate.go:330] unrecognized feature gate: Example Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850090 4776 flags.go:64] FLAG: --address="0.0.0.0" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850100 4776 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850108 4776 flags.go:64] FLAG: --anonymous-auth="true" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850114 4776 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850120 4776 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850125 4776 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850130 4776 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850137 4776 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850142 4776 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850146 4776 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850151 4776 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850155 4776 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850160 4776 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850164 4776 flags.go:64] FLAG: --cgroup-root="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850187 4776 flags.go:64] FLAG: 
--cgroups-per-qos="true" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850194 4776 flags.go:64] FLAG: --client-ca-file="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850198 4776 flags.go:64] FLAG: --cloud-config="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850202 4776 flags.go:64] FLAG: --cloud-provider="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850207 4776 flags.go:64] FLAG: --cluster-dns="[]" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850212 4776 flags.go:64] FLAG: --cluster-domain="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850216 4776 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850220 4776 flags.go:64] FLAG: --config-dir="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850224 4776 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850229 4776 flags.go:64] FLAG: --container-log-max-files="5" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850235 4776 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850240 4776 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850244 4776 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850249 4776 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850253 4776 flags.go:64] FLAG: --contention-profiling="false" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850257 4776 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850261 4776 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850266 4776 flags.go:64] FLAG: 
--cpu-manager-policy="none" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850271 4776 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850278 4776 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850282 4776 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850287 4776 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850291 4776 flags.go:64] FLAG: --enable-load-reader="false" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850295 4776 flags.go:64] FLAG: --enable-server="true" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850299 4776 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850306 4776 flags.go:64] FLAG: --event-burst="100" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850311 4776 flags.go:64] FLAG: --event-qps="50" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850315 4776 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850319 4776 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850323 4776 flags.go:64] FLAG: --eviction-hard="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850329 4776 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850334 4776 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850339 4776 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850344 4776 flags.go:64] FLAG: --eviction-soft="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850348 4776 flags.go:64] 
FLAG: --eviction-soft-grace-period="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850352 4776 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850356 4776 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850361 4776 flags.go:64] FLAG: --experimental-mounter-path="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850365 4776 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850369 4776 flags.go:64] FLAG: --fail-swap-on="true" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850373 4776 flags.go:64] FLAG: --feature-gates="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850378 4776 flags.go:64] FLAG: --file-check-frequency="20s" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850382 4776 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850387 4776 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850392 4776 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850398 4776 flags.go:64] FLAG: --healthz-port="10248" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850403 4776 flags.go:64] FLAG: --help="false" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850408 4776 flags.go:64] FLAG: --hostname-override="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850413 4776 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850417 4776 flags.go:64] FLAG: --http-check-frequency="20s" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850423 4776 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850430 4776 flags.go:64] FLAG: 
--image-credential-provider-config="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850435 4776 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850442 4776 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850446 4776 flags.go:64] FLAG: --image-service-endpoint="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850451 4776 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850455 4776 flags.go:64] FLAG: --kube-api-burst="100" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850460 4776 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850465 4776 flags.go:64] FLAG: --kube-api-qps="50" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850470 4776 flags.go:64] FLAG: --kube-reserved="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850475 4776 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850480 4776 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850486 4776 flags.go:64] FLAG: --kubelet-cgroups="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850491 4776 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850496 4776 flags.go:64] FLAG: --lock-file="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850500 4776 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850504 4776 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850509 4776 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850516 4776 flags.go:64] 
FLAG: --log-json-split-stream="false" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850522 4776 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850526 4776 flags.go:64] FLAG: --log-text-split-stream="false" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850530 4776 flags.go:64] FLAG: --logging-format="text" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850534 4776 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850539 4776 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850543 4776 flags.go:64] FLAG: --manifest-url="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850547 4776 flags.go:64] FLAG: --manifest-url-header="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850554 4776 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850558 4776 flags.go:64] FLAG: --max-open-files="1000000" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850564 4776 flags.go:64] FLAG: --max-pods="110" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850568 4776 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850573 4776 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850577 4776 flags.go:64] FLAG: --memory-manager-policy="None" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850581 4776 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850586 4776 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850590 4776 flags.go:64] FLAG: --node-ip="192.168.126.11" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 
08:58:43.850594 4776 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850604 4776 flags.go:64] FLAG: --node-status-max-images="50" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850608 4776 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850613 4776 flags.go:64] FLAG: --oom-score-adj="-999" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850617 4776 flags.go:64] FLAG: --pod-cidr="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850621 4776 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850628 4776 flags.go:64] FLAG: --pod-manifest-path="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850632 4776 flags.go:64] FLAG: --pod-max-pids="-1" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850636 4776 flags.go:64] FLAG: --pods-per-core="0" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850640 4776 flags.go:64] FLAG: --port="10250" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850644 4776 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850648 4776 flags.go:64] FLAG: --provider-id="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850652 4776 flags.go:64] FLAG: --qos-reserved="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850656 4776 flags.go:64] FLAG: --read-only-port="10255" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850661 4776 flags.go:64] FLAG: --register-node="true" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850665 4776 flags.go:64] FLAG: --register-schedulable="true" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 
08:58:43.850669 4776 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850677 4776 flags.go:64] FLAG: --registry-burst="10" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850681 4776 flags.go:64] FLAG: --registry-qps="5" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850685 4776 flags.go:64] FLAG: --reserved-cpus="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850690 4776 flags.go:64] FLAG: --reserved-memory="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850695 4776 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850700 4776 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850705 4776 flags.go:64] FLAG: --rotate-certificates="false" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850710 4776 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850716 4776 flags.go:64] FLAG: --runonce="false" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850721 4776 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850725 4776 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850729 4776 flags.go:64] FLAG: --seccomp-default="false" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850733 4776 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850738 4776 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850746 4776 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850756 4776 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 08 08:58:43 crc 
kubenswrapper[4776]: I1208 08:58:43.850761 4776 flags.go:64] FLAG: --storage-driver-password="root" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850767 4776 flags.go:64] FLAG: --storage-driver-secure="false" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850772 4776 flags.go:64] FLAG: --storage-driver-table="stats" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850777 4776 flags.go:64] FLAG: --storage-driver-user="root" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850782 4776 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850787 4776 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850792 4776 flags.go:64] FLAG: --system-cgroups="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850797 4776 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850806 4776 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850811 4776 flags.go:64] FLAG: --tls-cert-file="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850816 4776 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850823 4776 flags.go:64] FLAG: --tls-min-version="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850827 4776 flags.go:64] FLAG: --tls-private-key-file="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850831 4776 flags.go:64] FLAG: --topology-manager-policy="none" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850835 4776 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850840 4776 flags.go:64] FLAG: --topology-manager-scope="container" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850844 4776 flags.go:64] FLAG: --v="2" Dec 08 08:58:43 crc 
kubenswrapper[4776]: I1208 08:58:43.850852 4776 flags.go:64] FLAG: --version="false" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850858 4776 flags.go:64] FLAG: --vmodule="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850864 4776 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.850868 4776 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.850975 4776 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.850981 4776 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.850986 4776 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.850990 4776 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.850994 4776 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.850999 4776 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851004 4776 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851008 4776 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851014 4776 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851018 4776 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851022 4776 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851026 4776 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851029 4776 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851033 4776 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851037 4776 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851040 4776 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851043 4776 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851047 4776 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851050 4776 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851055 4776 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851060 4776 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851063 4776 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851067 4776 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851071 4776 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851074 4776 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851078 4776 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851082 4776 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851085 4776 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851089 4776 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851092 4776 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851096 4776 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851099 4776 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851103 4776 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851106 4776 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851109 4776 feature_gate.go:330] 
unrecognized feature gate: GCPLabelsTags Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851113 4776 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851116 4776 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851120 4776 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851124 4776 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851127 4776 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851131 4776 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851144 4776 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851149 4776 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851153 4776 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851157 4776 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851161 4776 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851165 4776 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851186 4776 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851191 4776 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851195 4776 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851198 4776 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851202 4776 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851205 4776 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851209 4776 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851212 4776 feature_gate.go:330] unrecognized feature gate: Example
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851216 4776 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851219 4776 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851223 4776 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851226 4776 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851230 4776 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851233 4776 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851237 4776 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851240 4776 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851244 4776 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851247 4776 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851251 4776 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851254 4776 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851258 4776 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851262 4776 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851265 4776 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.851269 4776 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.851283 4776 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.859724 4776 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.859771 4776 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.859844 4776 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.859857 4776 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.859861 4776 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.859867 4776 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.859872 4776 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.859876 4776 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.859880 4776 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.859883 4776 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.859887 4776 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.859890 4776 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.859894 4776 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.859900 4776 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.859906 4776 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.859910 4776 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.859913 4776 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.859917 4776 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.859921 4776 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.859925 4776 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.859929 4776 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.859933 4776 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.859936 4776 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.859940 4776 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.859944 4776 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.859949 4776 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.859954 4776 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.859958 4776 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.859963 4776 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.859967 4776 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.859972 4776 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.859977 4776 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.859980 4776 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.859985 4776 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.859989 4776 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.859994 4776 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860000 4776 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860004 4776 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860008 4776 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860012 4776 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860017 4776 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860021 4776 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860031 4776 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860038 4776 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860043 4776 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860048 4776 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860053 4776 feature_gate.go:330] unrecognized feature gate: Example
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860057 4776 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860060 4776 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860064 4776 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860068 4776 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860071 4776 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860075 4776 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860078 4776 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860082 4776 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860086 4776 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860091 4776 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860096 4776 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860101 4776 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860105 4776 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860109 4776 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860113 4776 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860116 4776 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860119 4776 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860124 4776 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860129 4776 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860134 4776 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860137 4776 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860141 4776 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860145 4776 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860149 4776 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860153 4776 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860158 4776 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.860166 4776 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860309 4776 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860317 4776 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860321 4776 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860327 4776 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860331 4776 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860335 4776 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860338 4776 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860342 4776 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860347 4776 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860351 4776 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860355 4776 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860359 4776 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860363 4776 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860367 4776 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860371 4776 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860376 4776 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860380 4776 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860385 4776 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860389 4776 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860392 4776 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860396 4776 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860399 4776 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860403 4776 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860406 4776 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860410 4776 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860413 4776 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860417 4776 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860420 4776 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860425 4776 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860429 4776 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860433 4776 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860437 4776 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860440 4776 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860444 4776 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860448 4776 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860452 4776 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860455 4776 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860459 4776 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860462 4776 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860466 4776 feature_gate.go:330] unrecognized feature gate: Example
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860470 4776 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860473 4776 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860477 4776 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860480 4776 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860492 4776 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860496 4776 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860500 4776 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860506 4776 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860511 4776 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860516 4776 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860522 4776 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860526 4776 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860530 4776 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860534 4776 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860538 4776 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860542 4776 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860546 4776 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860551 4776 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860560 4776 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860568 4776 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860574 4776 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860579 4776 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860586 4776 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860591 4776 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860597 4776 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860602 4776 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860608 4776 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860614 4776 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860619 4776 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860625 4776 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.860642 4776 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.860651 4776 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.861019 4776 server.go:940] "Client rotation is on, will bootstrap in background"
Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.863791 4776 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.863889 4776 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.864353 4776 server.go:997] "Starting client certificate rotation"
Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.864376 4776 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.864534 4776 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-05 08:37:23.231730716 +0000 UTC
Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.864608 4776 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.868136 4776 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.869509 4776 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 08 08:58:43 crc kubenswrapper[4776]: E1208 08:58:43.870692 4776 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.82:6443: connect: connection refused" logger="UnhandledError"
Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.876674 4776 log.go:25] "Validated CRI v1 runtime API"
Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.890644 4776 log.go:25] "Validated CRI v1 image API"
Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.891871 4776 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.893719 4776 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-08-08-54-24-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.893757 4776 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.909696 4776 manager.go:217] Machine: {Timestamp:2025-12-08 08:58:43.908622191 +0000 UTC m=+0.171847243 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:c2909369-742b-49a0-ae37-af59748afd08 BootID:2ebf5967-b40e-4612-8f34-c965ce3a7e5b Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:9c:1e:1d Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:9c:1e:1d Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:a4:69:7c Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:d7:7f:51 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:d7:44:73 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:15:06:b1 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:a6:bf:44:59:30:ce Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:3a:73:cb:c5:28:7b Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.909916 4776 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.910071 4776 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.910382 4776 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.910536 4776 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.910572 4776 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.910756 4776 topology_manager.go:138] "Creating topology manager with none policy"
Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.910765 4776 container_manager_linux.go:303] "Creating device plugin manager"
Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.910984 4776 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.911015 4776 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.911530 4776 state_mem.go:36] "Initialized new in-memory state store"
Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.911608 4776 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.912123 4776 kubelet.go:418] "Attempting to sync node with API server"
Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.912144 4776 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.912185 4776 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.912202 4776 kubelet.go:324] "Adding apiserver pod source"
Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.912216 4776 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.913624 4776 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.913689 4776 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused
Dec 08 08:58:43 crc kubenswrapper[4776]: E1208 08:58:43.913764 4776 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.82:6443: connect: connection refused" logger="UnhandledError"
Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.913947 4776 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.914122 4776 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused
Dec 08 08:58:43 crc kubenswrapper[4776]: E1208 08:58:43.914185 4776 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.82:6443: connect: connection refused" logger="UnhandledError"
Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.914769 4776 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.915257 4776 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.915279 4776 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.915288 4776 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.915295 4776 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.915305 4776 plugins.go:603] "Loaded volume plugin"
pluginName="kubernetes.io/nfs" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.915312 4776 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.915319 4776 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.915329 4776 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.915382 4776 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.915392 4776 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.915415 4776 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.915422 4776 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.915563 4776 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.916115 4776 server.go:1280] "Started kubelet" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.916441 4776 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.916878 4776 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.917779 4776 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.918345 4776 server.go:236] "Starting to serve the podresources API" 
endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 08 08:58:43 crc systemd[1]: Started Kubernetes Kubelet. Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.920704 4776 server.go:460] "Adding debug handlers to kubelet server" Dec 08 08:58:43 crc kubenswrapper[4776]: E1208 08:58:43.919367 4776 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.82:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187f31ca4e1b18d5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-08 08:58:43.916077269 +0000 UTC m=+0.179302281,LastTimestamp:2025-12-08 08:58:43.916077269 +0000 UTC m=+0.179302281,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.922537 4776 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.922639 4776 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.922650 4776 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 19:40:52.342755204 +0000 UTC Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.923403 4776 volume_manager.go:287] "The desired_state_of_world populator starts" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.923482 4776 volume_manager.go:289] "Starting Kubelet Volume Manager" Dec 08 08:58:43 crc kubenswrapper[4776]: E1208 08:58:43.924469 4776 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Dec 08 08:58:43 crc kubenswrapper[4776]: W1208 08:58:43.925418 4776 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.925519 4776 factory.go:55] Registering systemd factory Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.925536 4776 factory.go:221] Registration of the systemd container factory successfully Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.925558 4776 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 08 08:58:43 crc kubenswrapper[4776]: E1208 08:58:43.925519 4776 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.82:6443: connect: connection refused" logger="UnhandledError" Dec 08 08:58:43 crc kubenswrapper[4776]: E1208 08:58:43.926341 4776 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.82:6443: connect: connection refused" interval="200ms" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.926600 4776 factory.go:153] Registering CRI-O factory Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.926625 4776 factory.go:221] Registration of the crio container factory successfully Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.926740 4776 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api 
service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.927077 4776 factory.go:103] Registering Raw factory Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.927099 4776 manager.go:1196] Started watching for new ooms in manager Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.932123 4776 manager.go:319] Starting recovery of all containers Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.937164 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.937331 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.937342 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.937355 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.937371 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.937379 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.937389 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.937400 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.937411 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.937421 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.937433 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" 
seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.937463 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.937473 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.937487 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.937496 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.937506 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.937537 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 
08:58:43.937548 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.937557 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.937567 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.937576 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.937605 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.937616 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.937626 4776 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.937635 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.937644 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.937656 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.937667 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.937676 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.937704 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.937714 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.937755 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.937766 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.937774 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.937782 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.937792 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" 
volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.937800 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.937810 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.937821 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.937831 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.937860 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.937872 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.937882 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.937892 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.937902 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.937913 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.937924 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.937934 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.937945 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.937956 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.937966 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.937976 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.937993 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938004 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938014 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938025 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938037 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938047 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938056 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938066 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938076 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938088 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938100 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938111 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938124 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938135 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938146 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938157 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938183 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938198 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938211 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938228 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938242 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938255 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938265 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938275 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938286 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938296 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" 
volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938307 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938317 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938328 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938338 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938348 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938357 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938367 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938377 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938387 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938397 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938408 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938418 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938427 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938437 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938447 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938458 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938469 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938481 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938501 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938512 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938522 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938533 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938543 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938557 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938567 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938577 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938592 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938603 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938616 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938627 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" 
volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938637 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938648 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938659 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938670 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938681 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938693 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" 
seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938704 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938715 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938726 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938737 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938749 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938759 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 
08:58:43.938769 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938779 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938789 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938800 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938812 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938823 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938833 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938843 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938854 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938865 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938875 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938885 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938895 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938910 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938921 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938932 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938946 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938957 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938968 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" 
seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938979 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.938990 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.939002 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.939012 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.939026 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.939039 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.939050 4776 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.939060 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.939070 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.939081 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.939091 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.939102 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.939112 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.939124 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.939136 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.939152 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.939163 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.939192 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.939208 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.939221 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.940261 4776 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.940291 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.940309 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.940324 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.940340 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.940354 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.940368 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.940384 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.940401 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.940417 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.940432 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.940446 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.940459 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.940472 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.940485 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.940499 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.940512 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" 
seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.940527 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.940540 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.940554 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.940567 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.940580 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.940593 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Dec 08 08:58:43 crc 
kubenswrapper[4776]: I1208 08:58:43.940606 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.940619 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.940632 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.940645 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.940658 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.940672 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.940687 4776 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.940700 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.940714 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.940730 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.940754 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.940772 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.940788 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.940809 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.940824 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.940840 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.940858 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.940874 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.940889 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" 
seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.940904 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.940919 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.940933 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.940946 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.940961 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.940975 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 
08:58:43.940990 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.941004 4776 reconstruct.go:97] "Volume reconstruction finished" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.941013 4776 reconciler.go:26] "Reconciler: start to sync state" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.959771 4776 manager.go:324] Recovery completed Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.969958 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.972491 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.972542 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.972553 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.974621 4776 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.974644 4776 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 08 08:58:43 crc kubenswrapper[4776]: I1208 08:58:43.974666 4776 state_mem.go:36] "Initialized new in-memory state store" Dec 08 08:58:44 crc kubenswrapper[4776]: E1208 08:58:44.025387 4776 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 08 08:58:44 crc kubenswrapper[4776]: E1208 08:58:44.126422 4776 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not 
found" Dec 08 08:58:44 crc kubenswrapper[4776]: E1208 08:58:44.127201 4776 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.82:6443: connect: connection refused" interval="400ms" Dec 08 08:58:44 crc kubenswrapper[4776]: E1208 08:58:44.226837 4776 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 08 08:58:44 crc kubenswrapper[4776]: E1208 08:58:44.327855 4776 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.338019 4776 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.342071 4776 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.342242 4776 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.342361 4776 kubelet.go:2335] "Starting kubelet main sync loop" Dec 08 08:58:44 crc kubenswrapper[4776]: E1208 08:58:44.342475 4776 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.342690 4776 policy_none.go:49] "None policy: Start" Dec 08 08:58:44 crc kubenswrapper[4776]: W1208 08:58:44.344330 4776 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused Dec 08 08:58:44 crc kubenswrapper[4776]: E1208 08:58:44.344410 4776 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.82:6443: connect: connection refused" logger="UnhandledError" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.344527 4776 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.344593 4776 state_mem.go:35] "Initializing new in-memory state store" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.416786 4776 manager.go:334] "Starting Device Plugin manager" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.416859 4776 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.416879 4776 server.go:79] "Starting device plugin registration server" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.417550 4776 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.417596 4776 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.418003 4776 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.418122 4776 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.418135 4776 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 08 08:58:44 crc kubenswrapper[4776]: E1208 08:58:44.433696 4776 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.443641 4776 
kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.443712 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.444709 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.444763 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.444783 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.444960 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.445539 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.445735 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.446254 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.446280 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.446289 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.446376 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.446551 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.446609 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.447532 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.447547 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.447554 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.447929 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.448078 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.448232 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.447947 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.448527 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.448545 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.448912 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 08:58:44 crc 
kubenswrapper[4776]: I1208 08:58:44.449051 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.449110 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.450319 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.450357 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.450403 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.450323 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.450736 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.450898 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.451218 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.451540 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.451671 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.452509 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.452535 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.452543 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.452695 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.452717 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.453535 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.453553 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.453555 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.453587 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.453605 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 
08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.453564 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.517733 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.519406 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.519436 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.519445 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.519467 4776 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 08 08:58:44 crc kubenswrapper[4776]: E1208 08:58:44.519932 4776 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.82:6443: connect: connection refused" node="crc" Dec 08 08:58:44 crc kubenswrapper[4776]: E1208 08:58:44.528363 4776 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.82:6443: connect: connection refused" interval="800ms" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.547555 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 08:58:44 crc 
kubenswrapper[4776]: I1208 08:58:44.547617 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.547656 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.547676 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.547697 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.547733 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.547754 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.547769 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.547800 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.547843 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.547856 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.547871 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.547885 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.547932 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.547946 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.648843 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.648902 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.648934 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.649008 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.649049 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.649067 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.649009 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: 
\"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.649094 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.649155 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.649217 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.649260 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.649294 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.649317 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.649326 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.649339 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.649361 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.649380 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.649392 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.649408 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.649413 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.649431 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.649434 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.649461 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.649467 4776 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.649476 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.649541 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.649574 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.649603 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.649631 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 08 
08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.649660 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.720038 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.721900 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.721957 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.721970 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.722002 4776 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 08 08:58:44 crc kubenswrapper[4776]: E1208 08:58:44.722529 4776 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.82:6443: connect: connection refused" node="crc" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.781874 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: W1208 08:58:44.805852 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-22de763de2b3535f9f850fa60484b4dbc76aac0603974c150d5d74885bd2259f WatchSource:0}: Error finding container 22de763de2b3535f9f850fa60484b4dbc76aac0603974c150d5d74885bd2259f: Status 404 returned error can't find the container with id 22de763de2b3535f9f850fa60484b4dbc76aac0603974c150d5d74885bd2259f Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.811062 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.822821 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: W1208 08:58:44.831499 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-23aa9cb1b6a51d2fb59877985a0375b03aef12d11eb8fb5b937042328ec20605 WatchSource:0}: Error finding container 23aa9cb1b6a51d2fb59877985a0375b03aef12d11eb8fb5b937042328ec20605: Status 404 returned error can't find the container with id 23aa9cb1b6a51d2fb59877985a0375b03aef12d11eb8fb5b937042328ec20605 Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.843647 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: W1208 08:58:44.847416 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-b13e8b334405f9e0aba5e62476f61a5d1677ea3d2fe208009cb9277df92e8970 WatchSource:0}: Error finding container b13e8b334405f9e0aba5e62476f61a5d1677ea3d2fe208009cb9277df92e8970: Status 404 returned error can't find the container with id b13e8b334405f9e0aba5e62476f61a5d1677ea3d2fe208009cb9277df92e8970 Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.853413 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 08:58:44 crc kubenswrapper[4776]: W1208 08:58:44.867631 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-3e3409b970d68b0a7be4ec96f37c6453efee86b9b5d8b4dd18eb8d4f7aebab8b WatchSource:0}: Error finding container 3e3409b970d68b0a7be4ec96f37c6453efee86b9b5d8b4dd18eb8d4f7aebab8b: Status 404 returned error can't find the container with id 3e3409b970d68b0a7be4ec96f37c6453efee86b9b5d8b4dd18eb8d4f7aebab8b Dec 08 08:58:44 crc kubenswrapper[4776]: W1208 08:58:44.876872 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-e9bc01119aa86b7c6cfb1991aa8bc4043cceb3085949504d6d3099c835e24874 WatchSource:0}: Error finding container e9bc01119aa86b7c6cfb1991aa8bc4043cceb3085949504d6d3099c835e24874: Status 404 returned error can't find the container with id e9bc01119aa86b7c6cfb1991aa8bc4043cceb3085949504d6d3099c835e24874 Dec 08 08:58:44 crc kubenswrapper[4776]: W1208 08:58:44.899221 4776 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused Dec 08 08:58:44 crc kubenswrapper[4776]: E1208 08:58:44.899303 4776 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.82:6443: connect: connection refused" logger="UnhandledError" Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.918709 4776 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.923866 4776 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 23:16:44.831168189 +0000 UTC Dec 08 08:58:44 crc kubenswrapper[4776]: I1208 08:58:44.923922 4776 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 878h17m59.907248958s for next certificate rotation Dec 08 08:58:44 crc kubenswrapper[4776]: W1208 08:58:44.996737 4776 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused Dec 08 08:58:44 crc kubenswrapper[4776]: E1208 08:58:44.996874 4776 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get 
\"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.82:6443: connect: connection refused" logger="UnhandledError" Dec 08 08:58:45 crc kubenswrapper[4776]: I1208 08:58:45.123684 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 08:58:45 crc kubenswrapper[4776]: I1208 08:58:45.125686 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:58:45 crc kubenswrapper[4776]: I1208 08:58:45.125734 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:58:45 crc kubenswrapper[4776]: I1208 08:58:45.125747 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:58:45 crc kubenswrapper[4776]: I1208 08:58:45.125774 4776 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 08 08:58:45 crc kubenswrapper[4776]: E1208 08:58:45.126261 4776 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.82:6443: connect: connection refused" node="crc" Dec 08 08:58:45 crc kubenswrapper[4776]: E1208 08:58:45.329625 4776 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.82:6443: connect: connection refused" interval="1.6s" Dec 08 08:58:45 crc kubenswrapper[4776]: I1208 08:58:45.348128 4776 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b" exitCode=0 Dec 08 08:58:45 crc kubenswrapper[4776]: I1208 08:58:45.348207 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b"} Dec 08 08:58:45 crc kubenswrapper[4776]: I1208 08:58:45.348279 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e9bc01119aa86b7c6cfb1991aa8bc4043cceb3085949504d6d3099c835e24874"} Dec 08 08:58:45 crc kubenswrapper[4776]: I1208 08:58:45.348367 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 08:58:45 crc kubenswrapper[4776]: I1208 08:58:45.349579 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:58:45 crc kubenswrapper[4776]: I1208 08:58:45.349766 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:58:45 crc kubenswrapper[4776]: I1208 08:58:45.349785 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:58:45 crc kubenswrapper[4776]: I1208 08:58:45.350978 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"fb3d5082006dae792f107dc632bc00d5004be69f95f6d76150d816fb542ce9a5"} Dec 08 08:58:45 crc kubenswrapper[4776]: I1208 08:58:45.351014 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3e3409b970d68b0a7be4ec96f37c6453efee86b9b5d8b4dd18eb8d4f7aebab8b"} Dec 08 08:58:45 crc kubenswrapper[4776]: I1208 08:58:45.354444 4776 generic.go:334] "Generic (PLEG): container finished" 
podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4" exitCode=0 Dec 08 08:58:45 crc kubenswrapper[4776]: I1208 08:58:45.354506 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4"} Dec 08 08:58:45 crc kubenswrapper[4776]: I1208 08:58:45.354522 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b13e8b334405f9e0aba5e62476f61a5d1677ea3d2fe208009cb9277df92e8970"} Dec 08 08:58:45 crc kubenswrapper[4776]: I1208 08:58:45.354605 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 08:58:45 crc kubenswrapper[4776]: I1208 08:58:45.355420 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:58:45 crc kubenswrapper[4776]: I1208 08:58:45.355447 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:58:45 crc kubenswrapper[4776]: I1208 08:58:45.355455 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:58:45 crc kubenswrapper[4776]: I1208 08:58:45.355696 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 08:58:45 crc kubenswrapper[4776]: I1208 08:58:45.356741 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:58:45 crc kubenswrapper[4776]: I1208 08:58:45.356772 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:58:45 crc kubenswrapper[4776]: I1208 08:58:45.356782 4776 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:58:45 crc kubenswrapper[4776]: I1208 08:58:45.357400 4776 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="53120dd7e9433a30e26bce760e1424a89d586cebf2f627af5885d8e43f9c731b" exitCode=0 Dec 08 08:58:45 crc kubenswrapper[4776]: I1208 08:58:45.357472 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"53120dd7e9433a30e26bce760e1424a89d586cebf2f627af5885d8e43f9c731b"} Dec 08 08:58:45 crc kubenswrapper[4776]: I1208 08:58:45.357499 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"23aa9cb1b6a51d2fb59877985a0375b03aef12d11eb8fb5b937042328ec20605"} Dec 08 08:58:45 crc kubenswrapper[4776]: I1208 08:58:45.357598 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 08:58:45 crc kubenswrapper[4776]: I1208 08:58:45.359060 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:58:45 crc kubenswrapper[4776]: I1208 08:58:45.359080 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:58:45 crc kubenswrapper[4776]: I1208 08:58:45.359091 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:58:45 crc kubenswrapper[4776]: I1208 08:58:45.360764 4776 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="bc26419aaafbbe056695868d3b76642b62774f620306fc60a3c0a07788ca8b7b" exitCode=0 Dec 08 08:58:45 crc kubenswrapper[4776]: I1208 
08:58:45.360808 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"bc26419aaafbbe056695868d3b76642b62774f620306fc60a3c0a07788ca8b7b"} Dec 08 08:58:45 crc kubenswrapper[4776]: I1208 08:58:45.360829 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"22de763de2b3535f9f850fa60484b4dbc76aac0603974c150d5d74885bd2259f"} Dec 08 08:58:45 crc kubenswrapper[4776]: I1208 08:58:45.360904 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 08:58:45 crc kubenswrapper[4776]: I1208 08:58:45.363731 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:58:45 crc kubenswrapper[4776]: I1208 08:58:45.363755 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:58:45 crc kubenswrapper[4776]: I1208 08:58:45.363768 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:58:45 crc kubenswrapper[4776]: W1208 08:58:45.364200 4776 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused Dec 08 08:58:45 crc kubenswrapper[4776]: E1208 08:58:45.364283 4776 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.82:6443: connect: connection refused" logger="UnhandledError" 
Dec 08 08:58:45 crc kubenswrapper[4776]: W1208 08:58:45.814009 4776 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused Dec 08 08:58:45 crc kubenswrapper[4776]: E1208 08:58:45.814107 4776 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.82:6443: connect: connection refused" logger="UnhandledError" Dec 08 08:58:45 crc kubenswrapper[4776]: I1208 08:58:45.918940 4776 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused Dec 08 08:58:45 crc kubenswrapper[4776]: I1208 08:58:45.926575 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 08:58:45 crc kubenswrapper[4776]: I1208 08:58:45.927867 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:58:45 crc kubenswrapper[4776]: I1208 08:58:45.927911 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:58:45 crc kubenswrapper[4776]: I1208 08:58:45.927924 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:58:45 crc kubenswrapper[4776]: I1208 08:58:45.927958 4776 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 08 08:58:45 crc kubenswrapper[4776]: E1208 08:58:45.928571 4776 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.82:6443: connect: connection refused" node="crc" Dec 08 08:58:46 crc kubenswrapper[4776]: I1208 08:58:46.071827 4776 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 08 08:58:46 crc kubenswrapper[4776]: I1208 08:58:46.363976 4776 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670" exitCode=0 Dec 08 08:58:46 crc kubenswrapper[4776]: I1208 08:58:46.364069 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670"} Dec 08 08:58:46 crc kubenswrapper[4776]: I1208 08:58:46.364304 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 08:58:46 crc kubenswrapper[4776]: I1208 08:58:46.366084 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:58:46 crc kubenswrapper[4776]: I1208 08:58:46.366120 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:58:46 crc kubenswrapper[4776]: I1208 08:58:46.366135 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:58:46 crc kubenswrapper[4776]: I1208 08:58:46.369452 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"f9e6618291bd02472481cb1d5469287732dd869be2767bd9209c9f5b846b6b2c"} Dec 08 08:58:46 crc kubenswrapper[4776]: I1208 08:58:46.369550 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Dec 08 08:58:46 crc kubenswrapper[4776]: I1208 08:58:46.370381 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:58:46 crc kubenswrapper[4776]: I1208 08:58:46.370405 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:58:46 crc kubenswrapper[4776]: I1208 08:58:46.370414 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:58:46 crc kubenswrapper[4776]: I1208 08:58:46.384462 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"146374df9edb9e0092cf2e4cac4a5955d7d0980be93df8188f4b55ad12901572"} Dec 08 08:58:46 crc kubenswrapper[4776]: I1208 08:58:46.384532 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"3c567d34bcaecb124f79504fee8f22c148f78bb039741a7b52883ab3188edaa4"} Dec 08 08:58:46 crc kubenswrapper[4776]: I1208 08:58:46.384547 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5d32a10a86fe749d233a68a8e7583294e21c634dc47febe04e56220b591d505e"} Dec 08 08:58:46 crc kubenswrapper[4776]: I1208 08:58:46.384580 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 08:58:46 crc kubenswrapper[4776]: I1208 08:58:46.386049 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:58:46 crc kubenswrapper[4776]: I1208 08:58:46.386089 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 08 08:58:46 crc kubenswrapper[4776]: I1208 08:58:46.386105 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:58:46 crc kubenswrapper[4776]: I1208 08:58:46.394602 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ee00c7d6720b72fb9ecc27cc1ad2f762db9ac30bc3e108e37e2d4e2a96639cf3"} Dec 08 08:58:46 crc kubenswrapper[4776]: I1208 08:58:46.394661 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe"} Dec 08 08:58:46 crc kubenswrapper[4776]: I1208 08:58:46.394680 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608"} Dec 08 08:58:46 crc kubenswrapper[4776]: I1208 08:58:46.394696 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144"} Dec 08 08:58:46 crc kubenswrapper[4776]: I1208 08:58:46.397929 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5bf08db453e1dd51d190b2bd855af295af04cb4d3a8a42e0020136c50546aef2"} Dec 08 08:58:46 crc kubenswrapper[4776]: I1208 08:58:46.397993 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"be261308f50cfa4db35cf58e829c102496de88ed0336685562f39e46e498a460"} Dec 08 08:58:46 crc kubenswrapper[4776]: I1208 08:58:46.398008 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d02311f25656790164bb155ddb4e4d32fb9e7919033dd47c38c9fb321c185743"} Dec 08 08:58:46 crc kubenswrapper[4776]: I1208 08:58:46.398126 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 08:58:46 crc kubenswrapper[4776]: I1208 08:58:46.399325 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:58:46 crc kubenswrapper[4776]: I1208 08:58:46.399360 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:58:46 crc kubenswrapper[4776]: I1208 08:58:46.399374 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:58:47 crc kubenswrapper[4776]: I1208 08:58:47.405770 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008"} Dec 08 08:58:47 crc kubenswrapper[4776]: I1208 08:58:47.405891 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 08:58:47 crc kubenswrapper[4776]: I1208 08:58:47.407699 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:58:47 crc kubenswrapper[4776]: I1208 08:58:47.407812 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 
08:58:47 crc kubenswrapper[4776]: I1208 08:58:47.407841 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:58:47 crc kubenswrapper[4776]: I1208 08:58:47.409235 4776 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388" exitCode=0 Dec 08 08:58:47 crc kubenswrapper[4776]: I1208 08:58:47.409339 4776 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 08 08:58:47 crc kubenswrapper[4776]: I1208 08:58:47.409365 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 08:58:47 crc kubenswrapper[4776]: I1208 08:58:47.409354 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388"} Dec 08 08:58:47 crc kubenswrapper[4776]: I1208 08:58:47.409512 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 08:58:47 crc kubenswrapper[4776]: I1208 08:58:47.409715 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 08:58:47 crc kubenswrapper[4776]: I1208 08:58:47.410428 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:58:47 crc kubenswrapper[4776]: I1208 08:58:47.410468 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:58:47 crc kubenswrapper[4776]: I1208 08:58:47.410481 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:58:47 crc kubenswrapper[4776]: I1208 08:58:47.411224 4776 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Dec 08 08:58:47 crc kubenswrapper[4776]: I1208 08:58:47.411271 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:58:47 crc kubenswrapper[4776]: I1208 08:58:47.411282 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:58:47 crc kubenswrapper[4776]: I1208 08:58:47.411305 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:58:47 crc kubenswrapper[4776]: I1208 08:58:47.411350 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:58:47 crc kubenswrapper[4776]: I1208 08:58:47.411563 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:58:47 crc kubenswrapper[4776]: I1208 08:58:47.473703 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 08 08:58:47 crc kubenswrapper[4776]: I1208 08:58:47.482000 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 08 08:58:47 crc kubenswrapper[4776]: I1208 08:58:47.528968 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 08:58:47 crc kubenswrapper[4776]: I1208 08:58:47.530801 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:58:47 crc kubenswrapper[4776]: I1208 08:58:47.530892 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:58:47 crc kubenswrapper[4776]: I1208 08:58:47.530924 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 
08 08:58:47 crc kubenswrapper[4776]: I1208 08:58:47.530981 4776 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 08 08:58:48 crc kubenswrapper[4776]: I1208 08:58:48.416955 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"86e9bc666e17ca52777d0275d4a50cbb70aad0c77e3413328fa6bd7b42d8a339"} Dec 08 08:58:48 crc kubenswrapper[4776]: I1208 08:58:48.417031 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 08 08:58:48 crc kubenswrapper[4776]: I1208 08:58:48.416989 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 08:58:48 crc kubenswrapper[4776]: I1208 08:58:48.417058 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6f3089eb6893beac1862850a94ef99af25f4ce9c0f020cc795611d9a620966e7"} Dec 08 08:58:48 crc kubenswrapper[4776]: I1208 08:58:48.417234 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 08:58:48 crc kubenswrapper[4776]: I1208 08:58:48.417242 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3e7ab99c039bd5c0c0ab52768821797dd9e303afb9c6065ef021e8c2be372df9"} Dec 08 08:58:48 crc kubenswrapper[4776]: I1208 08:58:48.417284 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"49e7656c1155b52195abce09bca5ecd214a957ae7009293b99f16b3390c671fe"} Dec 08 08:58:48 crc kubenswrapper[4776]: I1208 08:58:48.417333 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 08:58:48 crc kubenswrapper[4776]: I1208 08:58:48.418867 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:58:48 crc kubenswrapper[4776]: I1208 08:58:48.418905 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:58:48 crc kubenswrapper[4776]: I1208 08:58:48.418919 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:58:48 crc kubenswrapper[4776]: I1208 08:58:48.420401 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:58:48 crc kubenswrapper[4776]: I1208 08:58:48.420438 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:58:48 crc kubenswrapper[4776]: I1208 08:58:48.420450 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:58:49 crc kubenswrapper[4776]: I1208 08:58:49.424955 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e7cac944c0f3bfb4cc7d00471deb5d44d498a7131b7d950305a78bd8a5d2ea7e"} Dec 08 08:58:49 crc kubenswrapper[4776]: I1208 08:58:49.425004 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 08:58:49 crc kubenswrapper[4776]: I1208 08:58:49.425240 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 08:58:49 crc kubenswrapper[4776]: I1208 08:58:49.425641 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 08:58:49 crc kubenswrapper[4776]: I1208 08:58:49.426877 4776 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:58:49 crc kubenswrapper[4776]: I1208 08:58:49.426916 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:58:49 crc kubenswrapper[4776]: I1208 08:58:49.426926 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:58:49 crc kubenswrapper[4776]: I1208 08:58:49.426997 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:58:49 crc kubenswrapper[4776]: I1208 08:58:49.427098 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:58:49 crc kubenswrapper[4776]: I1208 08:58:49.427119 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:58:49 crc kubenswrapper[4776]: I1208 08:58:49.427698 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:58:49 crc kubenswrapper[4776]: I1208 08:58:49.427753 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:58:49 crc kubenswrapper[4776]: I1208 08:58:49.427767 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:58:49 crc kubenswrapper[4776]: I1208 08:58:49.574991 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 08:58:50 crc kubenswrapper[4776]: I1208 08:58:50.428094 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 08:58:50 crc kubenswrapper[4776]: I1208 08:58:50.428323 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 08:58:50 crc kubenswrapper[4776]: I1208 
08:58:50.430029 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:58:50 crc kubenswrapper[4776]: I1208 08:58:50.430053 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:58:50 crc kubenswrapper[4776]: I1208 08:58:50.430091 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:58:50 crc kubenswrapper[4776]: I1208 08:58:50.430104 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:58:50 crc kubenswrapper[4776]: I1208 08:58:50.430115 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:58:50 crc kubenswrapper[4776]: I1208 08:58:50.430129 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:58:51 crc kubenswrapper[4776]: I1208 08:58:51.039866 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 08 08:58:51 crc kubenswrapper[4776]: I1208 08:58:51.431389 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 08:58:51 crc kubenswrapper[4776]: I1208 08:58:51.433506 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:58:51 crc kubenswrapper[4776]: I1208 08:58:51.433556 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:58:51 crc kubenswrapper[4776]: I1208 08:58:51.433568 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:58:51 crc kubenswrapper[4776]: I1208 08:58:51.456882 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 08 08:58:51 crc kubenswrapper[4776]: I1208 08:58:51.457255 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 08:58:51 crc kubenswrapper[4776]: I1208 08:58:51.459069 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:58:51 crc kubenswrapper[4776]: I1208 08:58:51.459126 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:58:51 crc kubenswrapper[4776]: I1208 08:58:51.459142 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:58:51 crc kubenswrapper[4776]: I1208 08:58:51.760249 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 08:58:51 crc kubenswrapper[4776]: I1208 08:58:51.760654 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 08:58:51 crc kubenswrapper[4776]: I1208 08:58:51.762650 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:58:51 crc kubenswrapper[4776]: I1208 08:58:51.762715 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:58:51 crc kubenswrapper[4776]: I1208 08:58:51.762732 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:58:52 crc kubenswrapper[4776]: I1208 08:58:52.820104 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 08 08:58:52 crc kubenswrapper[4776]: I1208 08:58:52.820525 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 08:58:52 crc 
kubenswrapper[4776]: I1208 08:58:52.822407 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:58:52 crc kubenswrapper[4776]: I1208 08:58:52.822483 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:58:52 crc kubenswrapper[4776]: I1208 08:58:52.822505 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:58:54 crc kubenswrapper[4776]: E1208 08:58:54.434144 4776 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 08 08:58:54 crc kubenswrapper[4776]: I1208 08:58:54.449935 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 08 08:58:54 crc kubenswrapper[4776]: I1208 08:58:54.450571 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 08:58:54 crc kubenswrapper[4776]: I1208 08:58:54.452320 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:58:54 crc kubenswrapper[4776]: I1208 08:58:54.452388 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:58:54 crc kubenswrapper[4776]: I1208 08:58:54.452414 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:58:55 crc kubenswrapper[4776]: I1208 08:58:55.278932 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 08 08:58:55 crc kubenswrapper[4776]: I1208 08:58:55.444604 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 08:58:55 crc kubenswrapper[4776]: I1208 
08:58:55.445632 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:58:55 crc kubenswrapper[4776]: I1208 08:58:55.445685 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:58:55 crc kubenswrapper[4776]: I1208 08:58:55.445702 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:58:56 crc kubenswrapper[4776]: E1208 08:58:56.073513 4776 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 08 08:58:56 crc kubenswrapper[4776]: I1208 08:58:56.475316 4776 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 08 08:58:56 crc kubenswrapper[4776]: I1208 08:58:56.475371 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 08 08:58:56 crc kubenswrapper[4776]: I1208 08:58:56.703303 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 08 08:58:56 crc kubenswrapper[4776]: I1208 08:58:56.703502 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 
08:58:56 crc kubenswrapper[4776]: I1208 08:58:56.704519 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:58:56 crc kubenswrapper[4776]: I1208 08:58:56.704575 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:58:56 crc kubenswrapper[4776]: I1208 08:58:56.704585 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:58:56 crc kubenswrapper[4776]: W1208 08:58:56.744577 4776 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 08 08:58:56 crc kubenswrapper[4776]: I1208 08:58:56.744658 4776 trace.go:236] Trace[2026825130]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (08-Dec-2025 08:58:46.742) (total time: 10001ms): Dec 08 08:58:56 crc kubenswrapper[4776]: Trace[2026825130]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (08:58:56.744) Dec 08 08:58:56 crc kubenswrapper[4776]: Trace[2026825130]: [10.001725233s] [10.001725233s] END Dec 08 08:58:56 crc kubenswrapper[4776]: E1208 08:58:56.744680 4776 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 08 08:58:56 crc kubenswrapper[4776]: I1208 08:58:56.919875 4776 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 08 08:58:56 crc kubenswrapper[4776]: E1208 08:58:56.931358 4776 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Dec 08 08:58:56 crc kubenswrapper[4776]: I1208 08:58:56.973351 4776 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 08 08:58:56 crc kubenswrapper[4776]: I1208 08:58:56.973409 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 08 08:58:56 crc kubenswrapper[4776]: I1208 08:58:56.981710 4776 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 08 08:58:56 crc kubenswrapper[4776]: I1208 08:58:56.981771 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" 
probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 08 08:58:57 crc kubenswrapper[4776]: I1208 08:58:57.450740 4776 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 08 08:58:57 crc kubenswrapper[4776]: I1208 08:58:57.450903 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 08 08:58:59 crc kubenswrapper[4776]: I1208 08:58:59.581088 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 08:58:59 crc kubenswrapper[4776]: I1208 08:58:59.581406 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 08:58:59 crc kubenswrapper[4776]: I1208 08:58:59.581895 4776 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 08 08:58:59 crc kubenswrapper[4776]: I1208 08:58:59.581957 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 
192.168.126.11:17697: connect: connection refused" Dec 08 08:58:59 crc kubenswrapper[4776]: I1208 08:58:59.582741 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:58:59 crc kubenswrapper[4776]: I1208 08:58:59.582804 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:58:59 crc kubenswrapper[4776]: I1208 08:58:59.582824 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:58:59 crc kubenswrapper[4776]: I1208 08:58:59.586116 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 08:59:00 crc kubenswrapper[4776]: I1208 08:59:00.077801 4776 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 08 08:59:00 crc kubenswrapper[4776]: I1208 08:59:00.182251 4776 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 08 08:59:00 crc kubenswrapper[4776]: I1208 08:59:00.200108 4776 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Dec 08 08:59:00 crc kubenswrapper[4776]: I1208 08:59:00.456497 4776 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 08 08:59:00 crc kubenswrapper[4776]: I1208 08:59:00.456811 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 
192.168.126.11:17697: connect: connection refused" Dec 08 08:59:00 crc kubenswrapper[4776]: I1208 08:59:00.919753 4776 apiserver.go:52] "Watching apiserver" Dec 08 08:59:00 crc kubenswrapper[4776]: I1208 08:59:00.922952 4776 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 08 08:59:00 crc kubenswrapper[4776]: I1208 08:59:00.923462 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-kube-apiserver/kube-apiserver-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"] Dec 08 08:59:00 crc kubenswrapper[4776]: I1208 08:59:00.924142 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 08 08:59:00 crc kubenswrapper[4776]: I1208 08:59:00.924223 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 08:59:00 crc kubenswrapper[4776]: I1208 08:59:00.924411 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 08 08:59:00 crc kubenswrapper[4776]: I1208 08:59:00.924659 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 08:59:00 crc kubenswrapper[4776]: E1208 08:59:00.924750 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 08:59:00 crc kubenswrapper[4776]: E1208 08:59:00.924832 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 08:59:00 crc kubenswrapper[4776]: I1208 08:59:00.925064 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 08 08:59:00 crc kubenswrapper[4776]: I1208 08:59:00.925105 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 08:59:00 crc kubenswrapper[4776]: E1208 08:59:00.925385 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 08:59:00 crc kubenswrapper[4776]: I1208 08:59:00.926262 4776 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 08 08:59:00 crc kubenswrapper[4776]: I1208 08:59:00.926512 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 08 08:59:00 crc kubenswrapper[4776]: I1208 08:59:00.928366 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 08 08:59:00 crc kubenswrapper[4776]: I1208 08:59:00.928710 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 08 08:59:00 crc kubenswrapper[4776]: I1208 08:59:00.928941 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 08 08:59:00 crc kubenswrapper[4776]: I1208 08:59:00.929113 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 08 08:59:00 crc kubenswrapper[4776]: I1208 08:59:00.929187 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 08 08:59:00 crc kubenswrapper[4776]: I1208 08:59:00.929217 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 08 08:59:00 crc kubenswrapper[4776]: I1208 08:59:00.929316 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 08 08:59:00 crc kubenswrapper[4776]: I1208 08:59:00.929326 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 08 08:59:00 crc kubenswrapper[4776]: 
I1208 08:59:00.960053 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 08 08:59:00 crc kubenswrapper[4776]: I1208 08:59:00.976366 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 08 08:59:00 crc kubenswrapper[4776]: I1208 08:59:00.985256 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 08 08:59:00 crc kubenswrapper[4776]: I1208 08:59:00.994727 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 08 08:59:01 crc kubenswrapper[4776]: I1208 08:59:01.010783 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 08 08:59:01 crc kubenswrapper[4776]: I1208 08:59:01.024757 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"343d6e00-54f7-4228-a8da-a43041894b26\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee00c7d6720b72fb9ecc27cc1ad2f762db9ac30bc3e108e37e2d4e2a96639cf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"
2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 08 08:59:01 crc kubenswrapper[4776]: I1208 08:59:01.036369 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 08 08:59:01 crc kubenswrapper[4776]: I1208 08:59:01.044963 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 08 08:59:01 crc kubenswrapper[4776]: I1208 08:59:01.056090 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 08 08:59:01 crc kubenswrapper[4776]: I1208 08:59:01.069221 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 08 08:59:01 crc kubenswrapper[4776]: I1208 08:59:01.079130 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 08 08:59:01 crc kubenswrapper[4776]: I1208 08:59:01.091012 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 08 08:59:01 crc kubenswrapper[4776]: I1208 08:59:01.459482 4776 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 08 08:59:01 crc kubenswrapper[4776]: I1208 08:59:01.459585 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 08 08:59:01 crc kubenswrapper[4776]: I1208 08:59:01.972483 4776 trace.go:236] Trace[1868976527]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (08-Dec-2025 08:58:47.546) (total time: 
14425ms): Dec 08 08:59:01 crc kubenswrapper[4776]: Trace[1868976527]: ---"Objects listed" error: 14425ms (08:59:01.972) Dec 08 08:59:01 crc kubenswrapper[4776]: Trace[1868976527]: [14.425468947s] [14.425468947s] END Dec 08 08:59:01 crc kubenswrapper[4776]: I1208 08:59:01.972517 4776 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 08 08:59:01 crc kubenswrapper[4776]: I1208 08:59:01.973568 4776 trace.go:236] Trace[1682222778]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (08-Dec-2025 08:58:48.269) (total time: 13703ms): Dec 08 08:59:01 crc kubenswrapper[4776]: Trace[1682222778]: ---"Objects listed" error: 13703ms (08:59:01.973) Dec 08 08:59:01 crc kubenswrapper[4776]: Trace[1682222778]: [13.703641804s] [13.703641804s] END Dec 08 08:59:01 crc kubenswrapper[4776]: I1208 08:59:01.973602 4776 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 08 08:59:01 crc kubenswrapper[4776]: I1208 08:59:01.974722 4776 trace.go:236] Trace[1271511925]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (08-Dec-2025 08:58:47.253) (total time: 14721ms): Dec 08 08:59:01 crc kubenswrapper[4776]: Trace[1271511925]: ---"Objects listed" error: 14721ms (08:59:01.974) Dec 08 08:59:01 crc kubenswrapper[4776]: Trace[1271511925]: [14.721535556s] [14.721535556s] END Dec 08 08:59:01 crc kubenswrapper[4776]: I1208 08:59:01.974744 4776 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 08 08:59:01 crc kubenswrapper[4776]: E1208 08:59:01.975024 4776 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Dec 08 08:59:01 crc kubenswrapper[4776]: I1208 08:59:01.976274 4776 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 08 08:59:02 crc 
kubenswrapper[4776]: I1208 08:59:02.077549 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.077618 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.077650 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.077679 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.077706 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.077740 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod 
\"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.077766 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.077795 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.077821 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.077844 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.077884 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.077911 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.077935 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.077959 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.077968 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.078019 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.078055 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.078112 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.078140 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.078147 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.078193 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.078189 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.078201 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.078308 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.078309 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.078362 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.078400 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.078475 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.078508 4776 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.078546 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.078581 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.078616 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.078653 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.078686 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.078730 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.078763 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.078840 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.078880 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.078578 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.078585 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.078606 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.078895 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.078889 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.079152 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.079216 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.079276 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.079349 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.079506 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.079548 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.079680 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.079708 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.079720 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.079893 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.078914 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.079994 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.080029 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: 
\"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.080056 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.080054 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.080138 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.079964 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.080305 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.080381 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.080415 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.080440 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.080600 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.080627 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.080774 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.080824 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.080824 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.080857 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.080886 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.080910 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.080936 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.080970 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.080999 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.081011 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.081024 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.081051 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.081077 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.081103 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.081125 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.081148 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.081155 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.081190 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.081231 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.081262 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.081356 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.081373 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.081407 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.081424 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.081436 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.081471 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.081502 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.081529 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.081555 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.081580 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.081592 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.081607 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.081636 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.081665 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.081696 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.081725 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.081726 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.081756 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.081783 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.081808 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.081811 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.081833 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.081841 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.081856 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.081883 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.081910 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.081913 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.081949 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.081999 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.082011 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.082023 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.082083 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.082110 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.082124 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.082159 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.082217 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.082251 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.082286 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.082321 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.082355 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.082388 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.082438 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.082471 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.082503 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.082536 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.082567 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.082603 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.082641 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.082675 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.082726 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.082760 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.082797 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.082830 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.082865 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.082901 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.082935 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.082968 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.083005 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.083037 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.083068 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.083102 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.083136 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.083170 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.083826 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.083874 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.083927 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.083967 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.084005 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.084041 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.084075 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.084113 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.084150 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.084205 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.084245 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.084283 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.084321 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.084353 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.084387 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.084429 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.084467 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.084500 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.084536 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.084576 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName:
\"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.084612 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.084680 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.084714 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.084751 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.084784 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 08 08:59:02 crc 
kubenswrapper[4776]: I1208 08:59:02.084820 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.084852 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.084887 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.084929 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.084992 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.085027 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.086074 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.086126 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.086200 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.086262 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.086299 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.086335 
4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.086372 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.086407 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.086465 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.086501 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.086535 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod 
\"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.086570 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.086606 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.086648 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.086683 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.086721 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.086758 4776 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.086795 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.086833 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.086868 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.086903 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.086939 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: 
\"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.087935 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.087986 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.088035 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.088087 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.088130 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.088240 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.088426 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.088461 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.088497 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.088566 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.082218 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: 
"01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.082282 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.091062 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.082633 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.082675 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.082710 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.082873 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.083158 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.083196 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.083451 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.083860 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.086022 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.086968 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.087459 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.087506 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.089066 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.089447 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.089487 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.089693 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.089814 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.089884 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.089950 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.090390 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.091342 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.091588 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.091975 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.092096 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.092320 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.092306 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.092370 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.092375 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.090727 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.090761 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.091009 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.091019 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.092524 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.091024 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.091326 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.091350 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.092534 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.092700 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.092875 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.093124 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.093212 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.093825 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.093837 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.094123 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.094391 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.094717 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.094742 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.094798 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.094839 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.095475 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.095708 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.095795 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.095822 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.096084 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.096118 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.096274 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.096537 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.096786 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.097084 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.097325 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.097535 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.097723 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.097793 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.097807 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.097822 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.098372 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.098435 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.098580 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.098907 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.099311 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.099315 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.099446 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.099590 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.100242 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.100268 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.100436 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.100517 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.100550 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.100995 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.090486 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.101873 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.102126 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.102218 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.102856 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.103060 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.103083 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.103279 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.103433 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.102719 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.103420 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.103634 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.103704 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.103773 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.103859 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.103904 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 
08:59:02.103961 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.104023 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.104055 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.103972 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.104136 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.104187 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.104221 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.104254 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.104288 4776 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.104470 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.104500 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.104530 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.104556 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.104587 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod 
\"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.104614 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.104646 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.104671 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.104694 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.104720 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.104744 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.104771 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.104795 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.104900 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.104078 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.104094 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.104165 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.104261 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.104508 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.104565 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.105675 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.104863 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.104944 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.105027 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.105233 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.105405 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.105491 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.105504 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.105565 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.105537 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.106060 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.106309 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.106390 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.106898 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.107569 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.107682 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.107760 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.107859 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.107936 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: 
\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.107985 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.108015 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.108085 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.108150 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.108219 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" 
(UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.107761 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.109075 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.109586 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.109677 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.109726 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.109764 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.109797 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod 
\"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.109968 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.109975 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.110038 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.110062 4776 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.110075 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.110085 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.110095 4776 reconciler_common.go:293] "Volume 
detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.110105 4776 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.110115 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.110126 4776 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.110144 4776 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.110161 4776 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.110185 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.110196 4776 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.110206 4776 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.110219 4776 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.110287 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.110301 4776 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.110332 4776 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.110357 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.110371 4776 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.110381 4776 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.110390 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.110401 4776 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.110412 4776 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.110442 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.110455 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.110469 4776 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.110482 4776 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.110494 4776 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.110507 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.110519 4776 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.110533 4776 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.110547 4776 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.110560 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 08 
08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.110572 4776 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.110572 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.110585 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: E1208 08:59:02.110657 4776 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.110665 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.110682 4776 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.110698 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 
08:59:02.110034 4776 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 08 08:59:02 crc kubenswrapper[4776]: E1208 08:59:02.110738 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-08 08:59:02.610697167 +0000 UTC m=+18.873922279 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.110733 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.110760 4776 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.110776 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.110788 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.110798 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.110800 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.110814 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.110836 4776 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.110850 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: E1208 08:59:02.110894 4776 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.110862 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.110926 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.110949 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: E1208 08:59:02.110981 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-08 08:59:02.610955843 +0000 UTC m=+18.874180915 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.111021 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.111044 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.111067 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.111088 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.111164 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.111206 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.111239 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.111286 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.111424 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.111465 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.113530 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.113573 4776 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.113603 4776 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.113625 4776 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.113648 4776 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.113666 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.113685 4776 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 
08:59:02.113703 4776 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.113721 4776 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.113737 4776 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.113754 4776 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.113777 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.113796 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.113817 4776 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 
08:59:02.113836 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.113854 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.113874 4776 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.113893 4776 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.113911 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.113928 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.113945 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.113964 4776 
reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: E1208 08:59:02.118105 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 08:59:02.618076138 +0000 UTC m=+18.881301170 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.118338 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.119700 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.120482 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.120505 4776 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.120521 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.120535 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.120548 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.120561 4776 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.120573 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.120587 4776 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.120599 4776 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.120612 4776 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.120625 4776 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.120638 4776 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.120651 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc 
kubenswrapper[4776]: I1208 08:59:02.120663 4776 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.120675 4776 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.120688 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.120702 4776 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.120714 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.120727 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.120740 4776 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.120753 4776 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.120766 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.120779 4776 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.120791 4776 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.120806 4776 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.120818 4776 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.120831 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.120844 4776 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.120857 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.120869 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.120882 4776 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.120894 4776 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.120906 4776 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.120918 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.120930 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") 
on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.120943 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.120957 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.120972 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.120986 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.120998 4776 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.121010 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.121023 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" 
DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.121035 4776 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.121048 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.121062 4776 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.121091 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.121104 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.121116 4776 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.121129 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.121142 4776 reconciler_common.go:293] "Volume 
detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.121154 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.121165 4776 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.121198 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.121212 4776 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.121224 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.121237 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.121250 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: 
\"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.121262 4776 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.121274 4776 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.121287 4776 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.121300 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.121312 4776 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.121325 4776 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.121337 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.121350 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.121363 4776 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.121375 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.121387 4776 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.121399 4776 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.121410 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.121423 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath 
\"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.121434 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.121446 4776 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.121463 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.121476 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.121489 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.121502 4776 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.121515 4776 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.121527 
4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.121538 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.121552 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.121564 4776 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.121575 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.121588 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.122301 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.123814 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.125422 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.125489 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.125710 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: E1208 08:59:02.127577 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 08 08:59:02 crc kubenswrapper[4776]: E1208 08:59:02.127599 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 08 08:59:02 crc kubenswrapper[4776]: E1208 08:59:02.127625 4776 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 08:59:02 crc kubenswrapper[4776]: E1208 08:59:02.127718 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-08 08:59:02.627687917 +0000 UTC m=+18.890912949 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.128708 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.129867 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.132398 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.132952 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.133369 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.133521 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.133721 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 08 08:59:02 crc kubenswrapper[4776]: E1208 08:59:02.134123 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 08 
08:59:02 crc kubenswrapper[4776]: E1208 08:59:02.134143 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 08 08:59:02 crc kubenswrapper[4776]: E1208 08:59:02.134158 4776 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 08:59:02 crc kubenswrapper[4776]: E1208 08:59:02.134226 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-08 08:59:02.634212167 +0000 UTC m=+18.897437199 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.140446 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.144427 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.144563 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.146360 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.146490 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.148590 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.151490 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.151611 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.151147 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.152327 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.152758 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.152805 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.152823 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.152858 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.160152 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.160785 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.164016 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.183003 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.203825 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.204309 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.222482 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.222541 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.222593 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.222609 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.222623 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.222634 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 
08:59:02.222646 4776 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.222656 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.222666 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.222678 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.222689 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.222702 4776 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.222714 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.222726 4776 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.222738 4776 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.222749 4776 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.222761 4776 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.222772 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.222783 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.222793 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.222804 4776 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" 
DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.222815 4776 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.222826 4776 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.222839 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.222851 4776 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.222862 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.222892 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.222905 4776 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 
08:59:02.222916 4776 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.222926 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.222937 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.222948 4776 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.222959 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.222970 4776 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.222981 4776 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.223041 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.223216 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.343231 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 08:59:02 crc kubenswrapper[4776]: E1208 08:59:02.343364 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.343234 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 08:59:02 crc kubenswrapper[4776]: E1208 08:59:02.343567 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.347141 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.347637 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.348800 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.349484 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.350489 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.351002 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.351591 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.352506 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.353066 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.354027 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.354609 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.355779 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.356446 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.356956 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.357907 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.358631 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.359589 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.360095 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.360941 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.362058 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.362758 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.364159 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.364712 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.365863 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.366543 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.367315 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.368647 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.369356 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.370630 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.371433 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.372405 4776 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.372558 4776 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.375292 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.376083 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.376770 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.378974 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.380051 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.381008 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.382679 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.383778 4776 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.385019 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.386014 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.387564 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.389021 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.389698 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.390955 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.391564 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.392824 4776 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.393273 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.393738 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.396116 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.396787 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.397512 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.398682 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.438103 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.450257 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.461822 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"98d3aaa724f1760545433baf497bf8177831b805358b581d77a5488a3c7dd812"} Dec 08 08:59:02 crc kubenswrapper[4776]: W1208 08:59:02.464885 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-ea0ab528885aaa7da98647b68c2563631085bed918b83149ced27d00d1f986a7 WatchSource:0}: Error finding container ea0ab528885aaa7da98647b68c2563631085bed918b83149ced27d00d1f986a7: Status 404 returned error can't find the container with id ea0ab528885aaa7da98647b68c2563631085bed918b83149ced27d00d1f986a7 Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.465378 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"6b411334ce6d2d6fc026625d538807bc1dce879f0053babe7d97f302819a4e66"} Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.465405 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e498b38932d3fd2da572e5ca5d560b7725a394f13fd9a30543244b1f3c0dca61"} Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.465438 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"6eb375b39f41b7069ee8768a8c2d70551c004f78fc8e0431a420b45474fbcf2f"} Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.475334 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.484505 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b411334ce6d2d6fc026625d538807bc1dce879f0053babe7d97f302819a4e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e498b38932d3fd2da572e5ca5d560b7725a394f13fd9a30543244b1f3c0dca61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.498667 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.509442 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.522247 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343d6e00-54f7-4228-a8da-a43041894b26\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee00c7d6720b72fb9ecc27cc1ad2f762db9a
c30bc3e108e37e2d4e2a96639cf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.533040 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.542100 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.629817 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.629963 4776 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 08:59:02 crc kubenswrapper[4776]: E1208 08:59:02.630020 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 08:59:03.629994194 +0000 UTC m=+19.893219216 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.630077 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.630113 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 08:59:02 crc kubenswrapper[4776]: E1208 08:59:02.630198 4776 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 08 08:59:02 crc kubenswrapper[4776]: E1208 08:59:02.630224 4776 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 08 08:59:02 crc kubenswrapper[4776]: E1208 08:59:02.630274 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-08 08:59:03.630258111 +0000 UTC m=+19.893483133 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 08 08:59:02 crc kubenswrapper[4776]: E1208 08:59:02.630318 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-08 08:59:03.630298212 +0000 UTC m=+19.893523304 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 08 08:59:02 crc kubenswrapper[4776]: E1208 08:59:02.630130 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 08 08:59:02 crc kubenswrapper[4776]: E1208 08:59:02.630358 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 08 08:59:02 crc kubenswrapper[4776]: E1208 08:59:02.630372 4776 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 08:59:02 crc kubenswrapper[4776]: E1208 08:59:02.630410 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-08 08:59:03.630401755 +0000 UTC m=+19.893626877 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 08:59:02 crc kubenswrapper[4776]: I1208 08:59:02.731310 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 08:59:02 crc kubenswrapper[4776]: E1208 08:59:02.731488 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 08 08:59:02 crc kubenswrapper[4776]: E1208 08:59:02.731530 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 08 08:59:02 crc kubenswrapper[4776]: E1208 08:59:02.731548 4776 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 08:59:02 crc kubenswrapper[4776]: E1208 08:59:02.731610 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2025-12-08 08:59:03.731590199 +0000 UTC m=+19.994815291 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 08:59:03 crc kubenswrapper[4776]: I1208 08:59:03.217429 4776 csr.go:261] certificate signing request csr-qp2db is approved, waiting to be issued Dec 08 08:59:03 crc kubenswrapper[4776]: I1208 08:59:03.225940 4776 csr.go:257] certificate signing request csr-qp2db is issued Dec 08 08:59:03 crc kubenswrapper[4776]: I1208 08:59:03.343085 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 08:59:03 crc kubenswrapper[4776]: E1208 08:59:03.343659 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 08:59:03 crc kubenswrapper[4776]: I1208 08:59:03.469814 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"ed339964d431767a3616559947e832417f1052991849f141c380c6e64c3c03f5"} Dec 08 08:59:03 crc kubenswrapper[4776]: I1208 08:59:03.470892 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"ea0ab528885aaa7da98647b68c2563631085bed918b83149ced27d00d1f986a7"} Dec 08 08:59:03 crc kubenswrapper[4776]: I1208 08:59:03.484429 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343d6e00-54f7-4228-a8da-a43041894b26\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee00c7d6720b72fb9ecc27cc1ad2f762db9ac30bc3e108e37e2d4e2a96639cf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha
256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:03Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:03 crc kubenswrapper[4776]: I1208 08:59:03.496583 4776 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:03Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:03 crc kubenswrapper[4776]: I1208 08:59:03.507064 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:03Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:03 crc kubenswrapper[4776]: I1208 08:59:03.526709 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed339964d431767a3616559947e832417f1052991849f141c380c6e64c3c03f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:03Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:03 crc kubenswrapper[4776]: I1208 08:59:03.538635 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:03Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:03 crc kubenswrapper[4776]: I1208 08:59:03.551729 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b411334ce6d2d6fc026625d538807bc1dce879f0053babe7d97f302819a4e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e498b38932d3fd2da572e5ca5d560b7725a394f13fd9a30543244b1f3c0dca61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:03Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:03 crc kubenswrapper[4776]: I1208 08:59:03.562836 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:03Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:03 crc kubenswrapper[4776]: I1208 08:59:03.639105 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 08:59:03 crc kubenswrapper[4776]: I1208 08:59:03.639211 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 08:59:03 crc kubenswrapper[4776]: I1208 08:59:03.639244 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 08:59:03 crc kubenswrapper[4776]: E1208 08:59:03.639301 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 08:59:05.639270768 +0000 UTC m=+21.902495790 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 08:59:03 crc kubenswrapper[4776]: I1208 08:59:03.639355 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 08:59:03 crc kubenswrapper[4776]: E1208 08:59:03.639414 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 08 08:59:03 crc kubenswrapper[4776]: E1208 08:59:03.639448 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 08 08:59:03 crc kubenswrapper[4776]: E1208 08:59:03.639464 4776 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 08:59:03 crc kubenswrapper[4776]: E1208 08:59:03.639480 4776 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 08 08:59:03 crc 
kubenswrapper[4776]: E1208 08:59:03.639530 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-08 08:59:05.639506014 +0000 UTC m=+21.902731096 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 08:59:03 crc kubenswrapper[4776]: E1208 08:59:03.639560 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-08 08:59:05.639548115 +0000 UTC m=+21.902773257 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 08 08:59:03 crc kubenswrapper[4776]: E1208 08:59:03.639587 4776 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 08 08:59:03 crc kubenswrapper[4776]: E1208 08:59:03.639622 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-08 08:59:05.639608297 +0000 UTC m=+21.902833429 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 08 08:59:03 crc kubenswrapper[4776]: I1208 08:59:03.740466 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 08:59:03 crc kubenswrapper[4776]: E1208 08:59:03.740593 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 08 08:59:03 crc kubenswrapper[4776]: E1208 08:59:03.740608 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 08 08:59:03 crc kubenswrapper[4776]: E1208 08:59:03.740622 4776 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 08:59:03 crc kubenswrapper[4776]: E1208 08:59:03.740668 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl 
podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-08 08:59:05.740655767 +0000 UTC m=+22.003880789 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 08:59:03 crc kubenswrapper[4776]: I1208 08:59:03.865647 4776 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.112346 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-jkmbn"] Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.112759 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.118352 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-fdg6t"] Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.118508 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.118728 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-fdg6t" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.118807 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-5x9ft"] Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.119455 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5x9ft" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.121254 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.122323 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.122738 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.123407 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.123484 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.123521 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.123568 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.125679 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.125736 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.125785 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.125951 4776 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.134748 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.138708 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:04Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.189555 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:04Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.227251 4776 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-12-08 08:54:03 +0000 UTC, rotation deadline is 2026-10-08 17:43:23.198899396 +0000 UTC Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.227935 4776 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7304h44m18.970969702s 
for next certificate rotation Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.232825 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343d6e00-54f7-4228-a8da-a43041894b26\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee00c7d6720b72fb9ecc27cc1ad2f762db9ac30bc3e108e37e2d4e2a96639cf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha
256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:04Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.246167 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/58507405-6bea-4859-a4e8-6ed046b50323-system-cni-dir\") pod \"multus-additional-cni-plugins-5x9ft\" (UID: \"58507405-6bea-4859-a4e8-6ed046b50323\") " pod="openshift-multus/multus-additional-cni-plugins-5x9ft" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.246360 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z44lf\" (UniqueName: \"kubernetes.io/projected/56dfa7df-2ee8-4408-a283-5a8521175a0c-kube-api-access-z44lf\") pod \"node-resolver-fdg6t\" (UID: \"56dfa7df-2ee8-4408-a283-5a8521175a0c\") " pod="openshift-dns/node-resolver-fdg6t" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.246437 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42lpc\" (UniqueName: \"kubernetes.io/projected/c9788ab1-1031-4103-a769-a4b3177c7268-kube-api-access-42lpc\") pod \"machine-config-daemon-jkmbn\" (UID: \"c9788ab1-1031-4103-a769-a4b3177c7268\") " pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.246551 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c9788ab1-1031-4103-a769-a4b3177c7268-mcd-auth-proxy-config\") pod \"machine-config-daemon-jkmbn\" (UID: \"c9788ab1-1031-4103-a769-a4b3177c7268\") " pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.246637 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c9788ab1-1031-4103-a769-a4b3177c7268-proxy-tls\") pod \"machine-config-daemon-jkmbn\" (UID: \"c9788ab1-1031-4103-a769-a4b3177c7268\") " pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" 
Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.246716 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/58507405-6bea-4859-a4e8-6ed046b50323-cnibin\") pod \"multus-additional-cni-plugins-5x9ft\" (UID: \"58507405-6bea-4859-a4e8-6ed046b50323\") " pod="openshift-multus/multus-additional-cni-plugins-5x9ft" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.246791 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/58507405-6bea-4859-a4e8-6ed046b50323-cni-binary-copy\") pod \"multus-additional-cni-plugins-5x9ft\" (UID: \"58507405-6bea-4859-a4e8-6ed046b50323\") " pod="openshift-multus/multus-additional-cni-plugins-5x9ft" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.246863 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlvpw\" (UniqueName: \"kubernetes.io/projected/58507405-6bea-4859-a4e8-6ed046b50323-kube-api-access-hlvpw\") pod \"multus-additional-cni-plugins-5x9ft\" (UID: \"58507405-6bea-4859-a4e8-6ed046b50323\") " pod="openshift-multus/multus-additional-cni-plugins-5x9ft" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.246951 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/c9788ab1-1031-4103-a769-a4b3177c7268-rootfs\") pod \"machine-config-daemon-jkmbn\" (UID: \"c9788ab1-1031-4103-a769-a4b3177c7268\") " pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.247024 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/58507405-6bea-4859-a4e8-6ed046b50323-os-release\") pod 
\"multus-additional-cni-plugins-5x9ft\" (UID: \"58507405-6bea-4859-a4e8-6ed046b50323\") " pod="openshift-multus/multus-additional-cni-plugins-5x9ft" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.247099 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/58507405-6bea-4859-a4e8-6ed046b50323-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5x9ft\" (UID: \"58507405-6bea-4859-a4e8-6ed046b50323\") " pod="openshift-multus/multus-additional-cni-plugins-5x9ft" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.247196 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/56dfa7df-2ee8-4408-a283-5a8521175a0c-hosts-file\") pod \"node-resolver-fdg6t\" (UID: \"56dfa7df-2ee8-4408-a283-5a8521175a0c\") " pod="openshift-dns/node-resolver-fdg6t" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.247272 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/58507405-6bea-4859-a4e8-6ed046b50323-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5x9ft\" (UID: \"58507405-6bea-4859-a4e8-6ed046b50323\") " pod="openshift-multus/multus-additional-cni-plugins-5x9ft" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.268091 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:04Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.285420 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9788ab1-1031-4103-a769-a4b3177c7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jkmbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:04Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.300783 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed339964d431767a3616559947e832417f1052991849f141c380c6e64c3c03f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:04Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.314428 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:04Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.330881 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b411334ce6d2d6fc026625d538807bc1dce879f0053babe7d97f302819a4e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e498b38932d3fd2da572e5ca5d560b7725a394f13fd9a30543244b1f3c0dca61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:04Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.343284 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.343312 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 08:59:04 crc kubenswrapper[4776]: E1208 08:59:04.343434 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 08:59:04 crc kubenswrapper[4776]: E1208 08:59:04.343587 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.348470 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c9788ab1-1031-4103-a769-a4b3177c7268-mcd-auth-proxy-config\") pod \"machine-config-daemon-jkmbn\" (UID: \"c9788ab1-1031-4103-a769-a4b3177c7268\") " pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.348506 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42lpc\" (UniqueName: \"kubernetes.io/projected/c9788ab1-1031-4103-a769-a4b3177c7268-kube-api-access-42lpc\") pod \"machine-config-daemon-jkmbn\" (UID: \"c9788ab1-1031-4103-a769-a4b3177c7268\") " pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.348534 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c9788ab1-1031-4103-a769-a4b3177c7268-proxy-tls\") pod \"machine-config-daemon-jkmbn\" (UID: \"c9788ab1-1031-4103-a769-a4b3177c7268\") " pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.348559 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cnibin\" (UniqueName: \"kubernetes.io/host-path/58507405-6bea-4859-a4e8-6ed046b50323-cnibin\") pod \"multus-additional-cni-plugins-5x9ft\" (UID: \"58507405-6bea-4859-a4e8-6ed046b50323\") " pod="openshift-multus/multus-additional-cni-plugins-5x9ft" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.348579 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/58507405-6bea-4859-a4e8-6ed046b50323-cni-binary-copy\") pod \"multus-additional-cni-plugins-5x9ft\" (UID: \"58507405-6bea-4859-a4e8-6ed046b50323\") " pod="openshift-multus/multus-additional-cni-plugins-5x9ft" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.348600 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlvpw\" (UniqueName: \"kubernetes.io/projected/58507405-6bea-4859-a4e8-6ed046b50323-kube-api-access-hlvpw\") pod \"multus-additional-cni-plugins-5x9ft\" (UID: \"58507405-6bea-4859-a4e8-6ed046b50323\") " pod="openshift-multus/multus-additional-cni-plugins-5x9ft" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.348632 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/c9788ab1-1031-4103-a769-a4b3177c7268-rootfs\") pod \"machine-config-daemon-jkmbn\" (UID: \"c9788ab1-1031-4103-a769-a4b3177c7268\") " pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.348667 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/58507405-6bea-4859-a4e8-6ed046b50323-os-release\") pod \"multus-additional-cni-plugins-5x9ft\" (UID: \"58507405-6bea-4859-a4e8-6ed046b50323\") " pod="openshift-multus/multus-additional-cni-plugins-5x9ft" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.348693 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/58507405-6bea-4859-a4e8-6ed046b50323-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5x9ft\" (UID: \"58507405-6bea-4859-a4e8-6ed046b50323\") " pod="openshift-multus/multus-additional-cni-plugins-5x9ft" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.348716 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/56dfa7df-2ee8-4408-a283-5a8521175a0c-hosts-file\") pod \"node-resolver-fdg6t\" (UID: \"56dfa7df-2ee8-4408-a283-5a8521175a0c\") " pod="openshift-dns/node-resolver-fdg6t" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.348740 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/58507405-6bea-4859-a4e8-6ed046b50323-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5x9ft\" (UID: \"58507405-6bea-4859-a4e8-6ed046b50323\") " pod="openshift-multus/multus-additional-cni-plugins-5x9ft" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.348776 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/58507405-6bea-4859-a4e8-6ed046b50323-system-cni-dir\") pod \"multus-additional-cni-plugins-5x9ft\" (UID: \"58507405-6bea-4859-a4e8-6ed046b50323\") " pod="openshift-multus/multus-additional-cni-plugins-5x9ft" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.348803 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z44lf\" (UniqueName: \"kubernetes.io/projected/56dfa7df-2ee8-4408-a283-5a8521175a0c-kube-api-access-z44lf\") pod \"node-resolver-fdg6t\" (UID: \"56dfa7df-2ee8-4408-a283-5a8521175a0c\") " pod="openshift-dns/node-resolver-fdg6t" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.349105 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/58507405-6bea-4859-a4e8-6ed046b50323-cnibin\") pod \"multus-additional-cni-plugins-5x9ft\" (UID: \"58507405-6bea-4859-a4e8-6ed046b50323\") " pod="openshift-multus/multus-additional-cni-plugins-5x9ft" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.349224 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/c9788ab1-1031-4103-a769-a4b3177c7268-rootfs\") pod \"machine-config-daemon-jkmbn\" (UID: \"c9788ab1-1031-4103-a769-a4b3177c7268\") " pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.349344 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/58507405-6bea-4859-a4e8-6ed046b50323-system-cni-dir\") pod \"multus-additional-cni-plugins-5x9ft\" (UID: \"58507405-6bea-4859-a4e8-6ed046b50323\") " pod="openshift-multus/multus-additional-cni-plugins-5x9ft" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.349656 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/56dfa7df-2ee8-4408-a283-5a8521175a0c-hosts-file\") pod \"node-resolver-fdg6t\" (UID: \"56dfa7df-2ee8-4408-a283-5a8521175a0c\") " pod="openshift-dns/node-resolver-fdg6t" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.349677 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c9788ab1-1031-4103-a769-a4b3177c7268-mcd-auth-proxy-config\") pod \"machine-config-daemon-jkmbn\" (UID: \"c9788ab1-1031-4103-a769-a4b3177c7268\") " pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.349735 4776 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/58507405-6bea-4859-a4e8-6ed046b50323-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5x9ft\" (UID: \"58507405-6bea-4859-a4e8-6ed046b50323\") " pod="openshift-multus/multus-additional-cni-plugins-5x9ft" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.350027 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/58507405-6bea-4859-a4e8-6ed046b50323-os-release\") pod \"multus-additional-cni-plugins-5x9ft\" (UID: \"58507405-6bea-4859-a4e8-6ed046b50323\") " pod="openshift-multus/multus-additional-cni-plugins-5x9ft" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.350094 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/58507405-6bea-4859-a4e8-6ed046b50323-cni-binary-copy\") pod \"multus-additional-cni-plugins-5x9ft\" (UID: \"58507405-6bea-4859-a4e8-6ed046b50323\") " pod="openshift-multus/multus-additional-cni-plugins-5x9ft" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.350108 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/58507405-6bea-4859-a4e8-6ed046b50323-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5x9ft\" (UID: \"58507405-6bea-4859-a4e8-6ed046b50323\") " pod="openshift-multus/multus-additional-cni-plugins-5x9ft" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.355969 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c9788ab1-1031-4103-a769-a4b3177c7268-proxy-tls\") pod \"machine-config-daemon-jkmbn\" (UID: \"c9788ab1-1031-4103-a769-a4b3177c7268\") " pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.356544 4776 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b411334ce6d2d6fc026625d538807bc1dce879f0053babe7d97f302819a4e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e498b38932d3fd2da572e5ca5d560b7725a394f13fd9a30543244b1f3c0dca61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:04Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.366910 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlvpw\" (UniqueName: \"kubernetes.io/projected/58507405-6bea-4859-a4e8-6ed046b50323-kube-api-access-hlvpw\") pod \"multus-additional-cni-plugins-5x9ft\" (UID: \"58507405-6bea-4859-a4e8-6ed046b50323\") " pod="openshift-multus/multus-additional-cni-plugins-5x9ft" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.368657 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42lpc\" (UniqueName: \"kubernetes.io/projected/c9788ab1-1031-4103-a769-a4b3177c7268-kube-api-access-42lpc\") pod \"machine-config-daemon-jkmbn\" (UID: \"c9788ab1-1031-4103-a769-a4b3177c7268\") " 
pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.374131 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z44lf\" (UniqueName: \"kubernetes.io/projected/56dfa7df-2ee8-4408-a283-5a8521175a0c-kube-api-access-z44lf\") pod \"node-resolver-fdg6t\" (UID: \"56dfa7df-2ee8-4408-a283-5a8521175a0c\") " pod="openshift-dns/node-resolver-fdg6t" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.374349 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:04Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.389017 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9788ab1-1031-4103-a769-a4b3177c7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jkmbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:04Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.401897 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed339964d431767a3616559947e832417f1052991849f141c380c6e64c3c03f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:04Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.419657 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:04Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.429468 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.436591 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-fdg6t" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.439302 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5x9ft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58507405-6bea-4859-a4e8-6ed046b50323\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5x9ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:04Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.442020 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5x9ft" Dec 08 08:59:04 crc kubenswrapper[4776]: W1208 08:59:04.442350 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9788ab1_1031_4103_a769_a4b3177c7268.slice/crio-c23000b129e7df8a8c68dc1faf552801e573bd35d7fae5702d6049723f89c6fc WatchSource:0}: Error finding container c23000b129e7df8a8c68dc1faf552801e573bd35d7fae5702d6049723f89c6fc: Status 404 returned error can't find the container with id c23000b129e7df8a8c68dc1faf552801e573bd35d7fae5702d6049723f89c6fc Dec 08 08:59:04 crc kubenswrapper[4776]: W1208 08:59:04.452060 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56dfa7df_2ee8_4408_a283_5a8521175a0c.slice/crio-d9299001c11b297d71b1e52cea8436d0972b85f99a0a246b374736e35f000e48 WatchSource:0}: Error finding container d9299001c11b297d71b1e52cea8436d0972b85f99a0a246b374736e35f000e48: Status 404 returned error can't find the container with id d9299001c11b297d71b1e52cea8436d0972b85f99a0a246b374736e35f000e48 Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.454843 
4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 08 08:59:04 crc kubenswrapper[4776]: W1208 08:59:04.456136 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58507405_6bea_4859_a4e8_6ed046b50323.slice/crio-ff93873585c761e86a69054089dc19b94c3318d44dc02f9b209a5c569d066bda WatchSource:0}: Error finding container ff93873585c761e86a69054089dc19b94c3318d44dc02f9b209a5c569d066bda: Status 404 returned error can't find the container with id ff93873585c761e86a69054089dc19b94c3318d44dc02f9b209a5c569d066bda Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.457352 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343d6e00-54f7-4228-a8da-a43041894b26\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee00c7d6720b72fb9ecc27cc1ad2f762db9ac30bc3e108e37e2d4e2a96639cf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha
256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:04Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.460287 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.464753 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.474458 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:04Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.488479 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"3da360484aaa537076345b7a6830a83ac2a89ffb47f86d6865049c4f19688546"} Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.489791 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:04Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.491393 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5x9ft" event={"ID":"58507405-6bea-4859-a4e8-6ed046b50323","Type":"ContainerStarted","Data":"ff93873585c761e86a69054089dc19b94c3318d44dc02f9b209a5c569d066bda"} Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.492658 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-fdg6t" event={"ID":"56dfa7df-2ee8-4408-a283-5a8521175a0c","Type":"ContainerStarted","Data":"d9299001c11b297d71b1e52cea8436d0972b85f99a0a246b374736e35f000e48"} Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.493655 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" 
event={"ID":"c9788ab1-1031-4103-a769-a4b3177c7268","Type":"ContainerStarted","Data":"c23000b129e7df8a8c68dc1faf552801e573bd35d7fae5702d6049723f89c6fc"} Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.508503 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fdg6t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56dfa7df-2ee8-4408-a283-5a8521175a0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z44lf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fdg6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:04Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.530341 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d4304c3-81ab-4418-b5c2-017f74581e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d02311f25656790164bb155ddb4e4d32fb9e7919033dd47c38c9fb321c185743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb3d5082006dae792f107dc632bc00d5004be69f95f6d76150d816fb542ce9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be261308f50cfa4db35cf58e829c102496de88ed0336685562f39e46e498a460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bf08db453e1dd51d190b2bd855af295af04cb4d3a8a42e0020136c50546aef2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:04Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.533470 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-555j6"] Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.533985 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.537751 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.538356 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-swbsc"] Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.538959 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.539488 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.543362 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.544651 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.544836 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.544997 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.545100 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.545245 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.545405 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.552391 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"343d6e00-54f7-4228-a8da-a43041894b26\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee00c7d6720b72fb9ecc27cc1ad2f762db9ac30bc3e108e37e2d4e2a96639cf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"
2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:04Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.575529 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:04Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.591910 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:04Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.606514 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fdg6t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56dfa7df-2ee8-4408-a283-5a8521175a0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z44lf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fdg6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:04Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.626024 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed339964d431767a3616559947e832417f1052991849f141c380c6e64c3c03f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:04Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.642280 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b411334ce6d2d6fc026625d538807bc1dce879f0053babe7d97f302819a4e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://e498b38932d3fd2da572e5ca5d560b7725a394f13fd9a30543244b1f3c0dca61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:04Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.651767 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-run-systemd\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.651807 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-host-run-ovn-kubernetes\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.651823 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xks4z\" (UniqueName: \"kubernetes.io/projected/1e518469-5b3b-4055-a0f0-075dc48b1c79-kube-api-access-xks4z\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.651840 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/775b9e97-3ad5-4003-a2c2-fc8dd58b69cc-etc-kubernetes\") pod \"multus-555j6\" (UID: \"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\") " pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.651854 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1e518469-5b3b-4055-a0f0-075dc48b1c79-env-overrides\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.651870 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-run-openvswitch\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.651884 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1e518469-5b3b-4055-a0f0-075dc48b1c79-ovnkube-config\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.651901 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/775b9e97-3ad5-4003-a2c2-fc8dd58b69cc-os-release\") pod \"multus-555j6\" (UID: \"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\") " pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.651914 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/775b9e97-3ad5-4003-a2c2-fc8dd58b69cc-host-var-lib-cni-bin\") pod \"multus-555j6\" (UID: \"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\") " pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.651927 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-host-kubelet\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.651941 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-var-lib-openvswitch\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.651956 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/775b9e97-3ad5-4003-a2c2-fc8dd58b69cc-host-run-netns\") pod \"multus-555j6\" (UID: \"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\") " pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.652901 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.652959 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/775b9e97-3ad5-4003-a2c2-fc8dd58b69cc-multus-cni-dir\") pod \"multus-555j6\" (UID: \"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\") " pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.652983 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrhvg\" (UniqueName: \"kubernetes.io/projected/775b9e97-3ad5-4003-a2c2-fc8dd58b69cc-kube-api-access-hrhvg\") pod \"multus-555j6\" (UID: \"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\") " pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.653004 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-host-run-netns\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.653040 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/775b9e97-3ad5-4003-a2c2-fc8dd58b69cc-system-cni-dir\") pod \"multus-555j6\" (UID: \"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\") " pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.653065 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/775b9e97-3ad5-4003-a2c2-fc8dd58b69cc-host-run-multus-certs\") pod \"multus-555j6\" (UID: \"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\") " pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.653084 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-host-slash\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.653121 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1e518469-5b3b-4055-a0f0-075dc48b1c79-ovnkube-script-lib\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.653143 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-log-socket\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.653163 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/775b9e97-3ad5-4003-a2c2-fc8dd58b69cc-host-run-k8s-cni-cncf-io\") pod \"multus-555j6\" (UID: \"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\") " pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.653205 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-systemd-units\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.653224 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/775b9e97-3ad5-4003-a2c2-fc8dd58b69cc-cni-binary-copy\") pod \"multus-555j6\" (UID: \"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\") " pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.653242 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/775b9e97-3ad5-4003-a2c2-fc8dd58b69cc-cnibin\") pod \"multus-555j6\" (UID: \"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\") " pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.653288 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/775b9e97-3ad5-4003-a2c2-fc8dd58b69cc-host-var-lib-kubelet\") pod \"multus-555j6\" (UID: \"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\") " pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.653309 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/775b9e97-3ad5-4003-a2c2-fc8dd58b69cc-hostroot\") pod \"multus-555j6\" (UID: \"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\") " pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.653330 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/775b9e97-3ad5-4003-a2c2-fc8dd58b69cc-multus-conf-dir\") pod \"multus-555j6\" (UID: \"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\") " pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.653350 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-host-cni-bin\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.653371 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-host-cni-netd\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.653390 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/775b9e97-3ad5-4003-a2c2-fc8dd58b69cc-multus-socket-dir-parent\") pod \"multus-555j6\" (UID: \"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\") " pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.653412 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/775b9e97-3ad5-4003-a2c2-fc8dd58b69cc-multus-daemon-config\") pod \"multus-555j6\" (UID: \"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\") " pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.653431 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-etc-openvswitch\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.653466 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/775b9e97-3ad5-4003-a2c2-fc8dd58b69cc-host-var-lib-cni-multus\") pod \"multus-555j6\" (UID: \"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\") " pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.653487 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-node-log\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.653506 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1e518469-5b3b-4055-a0f0-075dc48b1c79-ovn-node-metrics-cert\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.653539 4776 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-run-ovn\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.659293 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:04Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.674626 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9788ab1-1031-4103-a769-a4b3177c7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jkmbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:04Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.690520 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:04Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.703708 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5x9ft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58507405-6bea-4859-a4e8-6ed046b50323\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5x9ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:04Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.720995 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343d6e00-54f7-4228-a8da-a43041894b26\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee00c7d6720b72fb9ecc27cc1ad2f762db9a
c30bc3e108e37e2d4e2a96639cf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:04Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.740195 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:04Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.754668 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/775b9e97-3ad5-4003-a2c2-fc8dd58b69cc-os-release\") pod \"multus-555j6\" (UID: \"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\") " pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.754710 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-run-openvswitch\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.754753 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1e518469-5b3b-4055-a0f0-075dc48b1c79-ovnkube-config\") pod \"ovnkube-node-swbsc\" (UID: 
\"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.754771 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/775b9e97-3ad5-4003-a2c2-fc8dd58b69cc-host-run-netns\") pod \"multus-555j6\" (UID: \"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\") " pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.754839 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-run-openvswitch\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.754847 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/775b9e97-3ad5-4003-a2c2-fc8dd58b69cc-os-release\") pod \"multus-555j6\" (UID: \"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\") " pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.754890 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/775b9e97-3ad5-4003-a2c2-fc8dd58b69cc-host-var-lib-cni-bin\") pod \"multus-555j6\" (UID: \"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\") " pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.754935 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-host-kubelet\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 
08:59:04.754961 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-host-kubelet\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.754966 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-var-lib-openvswitch\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.754964 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/775b9e97-3ad5-4003-a2c2-fc8dd58b69cc-host-run-netns\") pod \"multus-555j6\" (UID: \"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\") " pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.754996 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.754922 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/775b9e97-3ad5-4003-a2c2-fc8dd58b69cc-host-var-lib-cni-bin\") pod \"multus-555j6\" (UID: \"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\") " pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.755036 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/775b9e97-3ad5-4003-a2c2-fc8dd58b69cc-system-cni-dir\") pod \"multus-555j6\" (UID: \"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\") " pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.755065 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/775b9e97-3ad5-4003-a2c2-fc8dd58b69cc-multus-cni-dir\") pod \"multus-555j6\" (UID: \"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\") " pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.755076 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.755095 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrhvg\" (UniqueName: \"kubernetes.io/projected/775b9e97-3ad5-4003-a2c2-fc8dd58b69cc-kube-api-access-hrhvg\") pod \"multus-555j6\" (UID: \"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\") " pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.755096 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-var-lib-openvswitch\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.755150 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-host-run-netns\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.755183 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/775b9e97-3ad5-4003-a2c2-fc8dd58b69cc-multus-cni-dir\") pod \"multus-555j6\" (UID: \"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\") " pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.755122 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-host-run-netns\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.755315 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/775b9e97-3ad5-4003-a2c2-fc8dd58b69cc-host-run-multus-certs\") pod \"multus-555j6\" (UID: \"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\") " pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.755341 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-host-slash\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.755373 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1e518469-5b3b-4055-a0f0-075dc48b1c79-ovnkube-script-lib\") pod \"ovnkube-node-swbsc\" (UID: 
\"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.755396 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-log-socket\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.755416 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/775b9e97-3ad5-4003-a2c2-fc8dd58b69cc-host-run-k8s-cni-cncf-io\") pod \"multus-555j6\" (UID: \"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\") " pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.755439 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-systemd-units\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.755461 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/775b9e97-3ad5-4003-a2c2-fc8dd58b69cc-cni-binary-copy\") pod \"multus-555j6\" (UID: \"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\") " pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.755493 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/775b9e97-3ad5-4003-a2c2-fc8dd58b69cc-cnibin\") pod \"multus-555j6\" (UID: \"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\") " pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 
08:59:04.755518 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/775b9e97-3ad5-4003-a2c2-fc8dd58b69cc-host-var-lib-kubelet\") pod \"multus-555j6\" (UID: \"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\") " pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.755536 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-log-socket\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.755542 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/775b9e97-3ad5-4003-a2c2-fc8dd58b69cc-hostroot\") pod \"multus-555j6\" (UID: \"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\") " pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.755572 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/775b9e97-3ad5-4003-a2c2-fc8dd58b69cc-hostroot\") pod \"multus-555j6\" (UID: \"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\") " pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.755598 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/775b9e97-3ad5-4003-a2c2-fc8dd58b69cc-multus-conf-dir\") pod \"multus-555j6\" (UID: \"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\") " pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.755613 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/775b9e97-3ad5-4003-a2c2-fc8dd58b69cc-host-run-k8s-cni-cncf-io\") pod \"multus-555j6\" (UID: \"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\") " pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.755646 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-host-cni-bin\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.755654 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-systemd-units\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.755670 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-host-cni-netd\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.755697 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/775b9e97-3ad5-4003-a2c2-fc8dd58b69cc-multus-socket-dir-parent\") pod \"multus-555j6\" (UID: \"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\") " pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.755717 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1e518469-5b3b-4055-a0f0-075dc48b1c79-ovnkube-config\") pod \"ovnkube-node-swbsc\" (UID: 
\"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.755722 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/775b9e97-3ad5-4003-a2c2-fc8dd58b69cc-multus-daemon-config\") pod \"multus-555j6\" (UID: \"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\") " pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.755804 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-etc-openvswitch\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.755848 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/775b9e97-3ad5-4003-a2c2-fc8dd58b69cc-host-var-lib-cni-multus\") pod \"multus-555j6\" (UID: \"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\") " pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.755866 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-node-log\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.755910 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1e518469-5b3b-4055-a0f0-075dc48b1c79-ovn-node-metrics-cert\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.755956 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-run-ovn\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.755982 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/775b9e97-3ad5-4003-a2c2-fc8dd58b69cc-etc-kubernetes\") pod \"multus-555j6\" (UID: \"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\") " pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.756003 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-run-systemd\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.756028 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-host-run-ovn-kubernetes\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.756049 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xks4z\" (UniqueName: \"kubernetes.io/projected/1e518469-5b3b-4055-a0f0-075dc48b1c79-kube-api-access-xks4z\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc 
kubenswrapper[4776]: I1208 08:59:04.756072 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1e518469-5b3b-4055-a0f0-075dc48b1c79-env-overrides\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.756401 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/775b9e97-3ad5-4003-a2c2-fc8dd58b69cc-cni-binary-copy\") pod \"multus-555j6\" (UID: \"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\") " pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.756463 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/775b9e97-3ad5-4003-a2c2-fc8dd58b69cc-cnibin\") pod \"multus-555j6\" (UID: \"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\") " pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.756493 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/775b9e97-3ad5-4003-a2c2-fc8dd58b69cc-host-var-lib-kubelet\") pod \"multus-555j6\" (UID: \"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\") " pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.756549 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/775b9e97-3ad5-4003-a2c2-fc8dd58b69cc-multus-daemon-config\") pod \"multus-555j6\" (UID: \"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\") " pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.756586 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-etc-openvswitch\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.756559 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1e518469-5b3b-4055-a0f0-075dc48b1c79-env-overrides\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.756646 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/775b9e97-3ad5-4003-a2c2-fc8dd58b69cc-host-var-lib-cni-multus\") pod \"multus-555j6\" (UID: \"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\") " pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.756653 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/775b9e97-3ad5-4003-a2c2-fc8dd58b69cc-multus-conf-dir\") pod \"multus-555j6\" (UID: \"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\") " pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.756679 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-node-log\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.756692 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-host-cni-bin\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.756723 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-host-cni-netd\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.756766 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/775b9e97-3ad5-4003-a2c2-fc8dd58b69cc-multus-socket-dir-parent\") pod \"multus-555j6\" (UID: \"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\") " pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.756792 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/775b9e97-3ad5-4003-a2c2-fc8dd58b69cc-host-run-multus-certs\") pod \"multus-555j6\" (UID: \"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\") " pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.755247 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/775b9e97-3ad5-4003-a2c2-fc8dd58b69cc-system-cni-dir\") pod \"multus-555j6\" (UID: \"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\") " pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.756831 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-host-slash\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.756858 4776 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-run-systemd\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.756889 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-run-ovn\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.756916 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/775b9e97-3ad5-4003-a2c2-fc8dd58b69cc-etc-kubernetes\") pod \"multus-555j6\" (UID: \"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\") " pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.756945 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-host-run-ovn-kubernetes\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.757018 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1e518469-5b3b-4055-a0f0-075dc48b1c79-ovnkube-script-lib\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.762577 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:04Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.762720 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1e518469-5b3b-4055-a0f0-075dc48b1c79-ovn-node-metrics-cert\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.772983 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrhvg\" (UniqueName: \"kubernetes.io/projected/775b9e97-3ad5-4003-a2c2-fc8dd58b69cc-kube-api-access-hrhvg\") pod \"multus-555j6\" (UID: \"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\") " pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.774070 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fdg6t" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56dfa7df-2ee8-4408-a283-5a8521175a0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z44lf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fdg6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:04Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.776887 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xks4z\" (UniqueName: \"kubernetes.io/projected/1e518469-5b3b-4055-a0f0-075dc48b1c79-kube-api-access-xks4z\") pod \"ovnkube-node-swbsc\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.788690 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-555j6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-555j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:04Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.820996 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e518469-5b3b-4055-a0f0-075dc48b1c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-swbsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:04Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.851436 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d4304c3-81ab-4418-b5c2-017f74581e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d02311f25656790164bb155ddb4e4d32fb9e7919033dd47c38c9fb321c185743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb3d5082006dae792f107dc632bc00d5004be69f95f6d76150d816fb542ce9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be261308f50cfa4db35cf58e829c102496de88ed0336685562f39e46e498a460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\
\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bf08db453e1dd51d190b2bd855af295af04cb4d3a8a42e0020136c50546aef2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:04Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.859613 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-555j6" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.868650 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.869918 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:04Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.890358 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9788ab1-1031-4103-a769-a4b3177c7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jkmbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:04Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:04 crc kubenswrapper[4776]: W1208 08:59:04.896732 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod775b9e97_3ad5_4003_a2c2_fc8dd58b69cc.slice/crio-391a7b35f293012fd5ab14c84b077ab49d9ddbb8d896fcb9b4eae53c5717ee77 WatchSource:0}: Error finding container 391a7b35f293012fd5ab14c84b077ab49d9ddbb8d896fcb9b4eae53c5717ee77: Status 404 returned error can't find the container with id 391a7b35f293012fd5ab14c84b077ab49d9ddbb8d896fcb9b4eae53c5717ee77 Dec 08 08:59:04 crc kubenswrapper[4776]: W1208 08:59:04.904753 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e518469_5b3b_4055_a0f0_075dc48b1c79.slice/crio-500f23bea6efe33ec35e970e14b4348a4da597e5b10327f258814e19eb122b2b WatchSource:0}: Error finding container 500f23bea6efe33ec35e970e14b4348a4da597e5b10327f258814e19eb122b2b: Status 404 returned error can't find the container with id 500f23bea6efe33ec35e970e14b4348a4da597e5b10327f258814e19eb122b2b Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.909149 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed339964d431767a3616559947e832417f1052991849f141c380c6e64c3c03f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:04Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.926001 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b411334ce6d2d6fc026625d538807bc1dce879f0053babe7d97f302819a4e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://e498b38932d3fd2da572e5ca5d560b7725a394f13fd9a30543244b1f3c0dca61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:04Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.943777 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5x9ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58507405-6bea-4859-a4e8-6ed046b50323\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5x9ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:04Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:04 crc kubenswrapper[4776]: I1208 08:59:04.957554 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:04Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.175770 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.177517 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.177555 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.177564 
4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.177654 4776 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.185309 4776 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.185577 4776 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.186628 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.186657 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.186666 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.186682 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.186693 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:05Z","lastTransitionTime":"2025-12-08T08:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:05 crc kubenswrapper[4776]: E1208 08:59:05.204221 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ebf5967-b40e-4612-8f34-c965ce3a7e5b\\\",\\\"systemUUID\\\":\\\"c2909369-742b-49a0-ae37-af59748afd08\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:05Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.208220 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.208263 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.208278 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.208294 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.208308 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:05Z","lastTransitionTime":"2025-12-08T08:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:05 crc kubenswrapper[4776]: E1208 08:59:05.221234 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ebf5967-b40e-4612-8f34-c965ce3a7e5b\\\",\\\"systemUUID\\\":\\\"c2909369-742b-49a0-ae37-af59748afd08\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:05Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.226407 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.226467 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.226482 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.226505 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.226522 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:05Z","lastTransitionTime":"2025-12-08T08:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:05 crc kubenswrapper[4776]: E1208 08:59:05.239329 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ebf5967-b40e-4612-8f34-c965ce3a7e5b\\\",\\\"systemUUID\\\":\\\"c2909369-742b-49a0-ae37-af59748afd08\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:05Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.244370 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.244423 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.244443 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.244473 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.244486 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:05Z","lastTransitionTime":"2025-12-08T08:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:05 crc kubenswrapper[4776]: E1208 08:59:05.258680 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ebf5967-b40e-4612-8f34-c965ce3a7e5b\\\",\\\"systemUUID\\\":\\\"c2909369-742b-49a0-ae37-af59748afd08\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:05Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.263186 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.263230 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.263242 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.263259 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.263270 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:05Z","lastTransitionTime":"2025-12-08T08:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:05 crc kubenswrapper[4776]: E1208 08:59:05.278285 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ebf5967-b40e-4612-8f34-c965ce3a7e5b\\\",\\\"systemUUID\\\":\\\"c2909369-742b-49a0-ae37-af59748afd08\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:05Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:05 crc kubenswrapper[4776]: E1208 08:59:05.278452 4776 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.280084 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.280117 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.280142 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.280161 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.280195 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:05Z","lastTransitionTime":"2025-12-08T08:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.343259 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 08:59:05 crc kubenswrapper[4776]: E1208 08:59:05.343391 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.384663 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.385054 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.385064 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.385078 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.385087 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:05Z","lastTransitionTime":"2025-12-08T08:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.487418 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.487457 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.487466 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.487481 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.487492 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:05Z","lastTransitionTime":"2025-12-08T08:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.497570 4776 generic.go:334] "Generic (PLEG): container finished" podID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerID="1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e" exitCode=0 Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.497641 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" event={"ID":"1e518469-5b3b-4055-a0f0-075dc48b1c79","Type":"ContainerDied","Data":"1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e"} Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.497678 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" event={"ID":"1e518469-5b3b-4055-a0f0-075dc48b1c79","Type":"ContainerStarted","Data":"500f23bea6efe33ec35e970e14b4348a4da597e5b10327f258814e19eb122b2b"} Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.499586 4776 generic.go:334] "Generic (PLEG): container finished" podID="58507405-6bea-4859-a4e8-6ed046b50323" containerID="6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d" exitCode=0 Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.499766 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5x9ft" event={"ID":"58507405-6bea-4859-a4e8-6ed046b50323","Type":"ContainerDied","Data":"6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d"} Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.501089 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-fdg6t" event={"ID":"56dfa7df-2ee8-4408-a283-5a8521175a0c","Type":"ContainerStarted","Data":"5173ff239c6373649911372a6d6c1665958296afec2528b9e0a492a0722f5b29"} Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.504202 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" event={"ID":"c9788ab1-1031-4103-a769-a4b3177c7268","Type":"ContainerStarted","Data":"c4c2915bc116e8d911ac3c615649564174eb7e760c7e63fe06aee942d7cc2e31"} Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.504269 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" event={"ID":"c9788ab1-1031-4103-a769-a4b3177c7268","Type":"ContainerStarted","Data":"860aab951ce65f5efd9e17e4d93c383ea508088d26c2f758bf8d7176d2fb96f8"} Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.505985 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-555j6" event={"ID":"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc","Type":"ContainerStarted","Data":"04b00e982993068dd2f274ce749a8b994561657d124122969adbb79c53658ecf"} Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.506021 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-555j6" event={"ID":"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc","Type":"ContainerStarted","Data":"391a7b35f293012fd5ab14c84b077ab49d9ddbb8d896fcb9b4eae53c5717ee77"} Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.515980 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed339964d431767a3616559947e832417f1052991849f141c380c6e64c3c03f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:05Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.544779 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b411334ce6d2d6fc026625d538807bc1dce879f0053babe7d97f302819a4e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://e498b38932d3fd2da572e5ca5d560b7725a394f13fd9a30543244b1f3c0dca61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:05Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.558425 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:05Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.572425 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9788ab1-1031-4103-a769-a4b3177c7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jkmbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:05Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.586717 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:05Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.589957 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.590004 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.590014 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.590031 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.590082 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:05Z","lastTransitionTime":"2025-12-08T08:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.601139 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5x9ft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58507405-6bea-4859-a4e8-6ed046b50323\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5x9ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:05Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.613334 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d4304c3-81ab-4418-b5c2-017f74581e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d02311f25656790164bb155ddb4e4d32fb9e7919033dd47c38c9fb321c185743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed
21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb3d5082006dae792f107dc632bc00d5004be69f95f6d76150d816fb542ce9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be261308f50cfa4db35cf58e829c102496de88ed0336685562f39e46e498a460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bf08db453e1dd51d190b2bd855af295af04cb4d3a8a42e0020136c50546aef2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:05Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.625899 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"343d6e00-54f7-4228-a8da-a43041894b26\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee00c7d6720b72fb9ecc27cc1ad2f762db9ac30bc3e108e37e2d4e2a96639cf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"
2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:05Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.640081 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:05Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.651991 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:05Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.660565 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fdg6t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56dfa7df-2ee8-4408-a283-5a8521175a0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z44lf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fdg6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:05Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.668014 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 08:59:05 crc kubenswrapper[4776]: E1208 08:59:05.668119 4776 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 08:59:09.668102653 +0000 UTC m=+25.931327675 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.668148 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.668199 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.668230 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 08:59:05 crc kubenswrapper[4776]: E1208 08:59:05.668302 4776 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 08 08:59:05 crc kubenswrapper[4776]: E1208 08:59:05.668336 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-08 08:59:09.668329419 +0000 UTC m=+25.931554441 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 08 08:59:05 crc kubenswrapper[4776]: E1208 08:59:05.668346 4776 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 08 08:59:05 crc kubenswrapper[4776]: E1208 08:59:05.668418 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-08 08:59:09.668399111 +0000 UTC m=+25.931624213 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 08 08:59:05 crc kubenswrapper[4776]: E1208 08:59:05.668499 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 08 08:59:05 crc kubenswrapper[4776]: E1208 08:59:05.668531 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 08 08:59:05 crc kubenswrapper[4776]: E1208 08:59:05.668546 4776 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 08:59:05 crc kubenswrapper[4776]: E1208 08:59:05.668578 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-08 08:59:09.668568395 +0000 UTC m=+25.931793527 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.673788 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-555j6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-555j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:05Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.693249 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.693286 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.693296 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.693308 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.693317 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:05Z","lastTransitionTime":"2025-12-08T08:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.694700 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e518469-5b3b-4055-a0f0-075dc48b1c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-swbsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:05Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.707206 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:05Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.722773 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5x9ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58507405-6bea-4859-a4e8-6ed046b50323\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5x9ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:05Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.743743 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e518469-5b3b-4055-a0f0-075dc48b1c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-swbsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:05Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.754911 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d4304c3-81ab-4418-b5c2-017f74581e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d02311f25656790164bb155ddb4e4d32fb9e7919033dd47c38c9fb321c185743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb3d5082006dae792f107dc632bc00d5004be69f95f6d76150d816fb542ce9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be261308f50cfa4db35cf58e829c102496de88ed0336685562f39e46e498a460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bf08db453e1dd51d190b2bd855af295af04cb4d3a8a42e0020136c50546aef2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:05Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.769162 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" 
(UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 08:59:05 crc kubenswrapper[4776]: E1208 08:59:05.769311 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 08 08:59:05 crc kubenswrapper[4776]: E1208 08:59:05.769340 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 08 08:59:05 crc kubenswrapper[4776]: E1208 08:59:05.769352 4776 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 08:59:05 crc kubenswrapper[4776]: E1208 08:59:05.769407 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-08 08:59:09.76939116 +0000 UTC m=+26.032616172 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.772290 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343d6e00-54f7-4228-a8da-a43041894b26\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee00c7d6720b72fb9ecc27cc1ad2f762db9ac30bc3e108e37e2d4e2a96639cf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha
256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:05Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.785430 4776 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:05Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.795112 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.795148 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.795159 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.795194 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.795207 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:05Z","lastTransitionTime":"2025-12-08T08:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.796111 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:05Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.813988 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fdg6t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56dfa7df-2ee8-4408-a283-5a8521175a0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5173ff239c6373649911372a6d6c1665958296afec2528b9e0a492a0722f5b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z44lf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fdg6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:05Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.828494 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-555j6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04b00e982993068dd2f274ce749a8b994561657d124122969adbb79c53658ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-555j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:05Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.844934 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed339964d431767a3616559947e832417f1052991849f141c380c6e64c3c03f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:05Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.868878 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b411334ce6d2d6fc026625d538807bc1dce879f0053babe7d97f302819a4e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e498b38932d3fd2da572e5ca5d560b7725a394f13fd9a30543244b1f3c0dca61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:05Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.880543 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da360484aaa537076345b7a6830a83ac2a89ffb47f86d6865049c4f19688546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-08T08:59:05Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.892647 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9788ab1-1031-4103-a769-a4b3177c7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c2915bc116e8d911ac3c615649564174eb7e760c7e63fe06aee942d7cc2e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860aab951ce65f5efd9e17e4d93c383ea508088d26c2f758bf8d7176d2fb96f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jkmbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:05Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.897423 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 
08:59:05.897553 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.897567 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.897589 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:05 crc kubenswrapper[4776]: I1208 08:59:05.897603 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:05Z","lastTransitionTime":"2025-12-08T08:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.001034 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.001412 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.001424 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.001439 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.001448 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:06Z","lastTransitionTime":"2025-12-08T08:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.103789 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.104146 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.104208 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.104228 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.104245 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:06Z","lastTransitionTime":"2025-12-08T08:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.208114 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.208156 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.208167 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.208213 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.208224 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:06Z","lastTransitionTime":"2025-12-08T08:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.310141 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.310192 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.310201 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.310215 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.310225 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:06Z","lastTransitionTime":"2025-12-08T08:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.342995 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.343253 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 08:59:06 crc kubenswrapper[4776]: E1208 08:59:06.343323 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 08:59:06 crc kubenswrapper[4776]: E1208 08:59:06.343362 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.412951 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.412998 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.413010 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.413024 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.413034 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:06Z","lastTransitionTime":"2025-12-08T08:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.417389 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-8k6qx"] Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.417707 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-8k6qx" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.419007 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.419198 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.421635 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.421827 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.438212 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:06Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.451803 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5x9ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58507405-6bea-4859-a4e8-6ed046b50323\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5x9ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:06Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.461372 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:06Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.471088 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fdg6t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56dfa7df-2ee8-4408-a283-5a8521175a0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5173ff239c6373649911372a6d6c1665958296afec2528b9e0a492a0722f5b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z44lf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fdg6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:06Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.475602 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58b2w\" (UniqueName: \"kubernetes.io/projected/5628a2b5-b886-4883-93a3-fefc471f19e0-kube-api-access-58b2w\") pod \"node-ca-8k6qx\" (UID: \"5628a2b5-b886-4883-93a3-fefc471f19e0\") " pod="openshift-image-registry/node-ca-8k6qx" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.475656 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5628a2b5-b886-4883-93a3-fefc471f19e0-serviceca\") pod \"node-ca-8k6qx\" (UID: \"5628a2b5-b886-4883-93a3-fefc471f19e0\") " pod="openshift-image-registry/node-ca-8k6qx" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.475672 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5628a2b5-b886-4883-93a3-fefc471f19e0-host\") pod \"node-ca-8k6qx\" (UID: \"5628a2b5-b886-4883-93a3-fefc471f19e0\") " pod="openshift-image-registry/node-ca-8k6qx" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.479731 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.484089 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-555j6" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04b00e982993068dd2f274ce749a8b994561657d124122969adbb79c53658ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\
\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-555j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:06Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.499746 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e518469-5b3b-4055-a0f0-075dc48b1c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-swbsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:06Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.512103 4776 generic.go:334] "Generic (PLEG): container finished" podID="58507405-6bea-4859-a4e8-6ed046b50323" containerID="64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41" exitCode=0 Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.512156 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5x9ft" event={"ID":"58507405-6bea-4859-a4e8-6ed046b50323","Type":"ContainerDied","Data":"64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41"} Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.512454 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d4304c3-81ab-4418-b5c2-017f74581e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d02311f25656790164bb155ddb4e4d32fb9e7919033dd47c38c9fb321c185743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb3d5082006dae792f107dc632bc00d5004be69f95f6d76150d816fb542ce9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be261308f50cfa4db35cf58e829c102496de88ed0336685562f39e46e498a460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bf08db453e1dd51d190b2bd855af295af04cb4d3a8a42e0020136c50546aef2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:06Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.514749 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.514777 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.514789 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.514805 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.514818 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:06Z","lastTransitionTime":"2025-12-08T08:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.516561 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" event={"ID":"1e518469-5b3b-4055-a0f0-075dc48b1c79","Type":"ContainerStarted","Data":"a95beb9eadee827797268eb99d707a8782239962485a6f258adea6f3ee994c8b"} Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.516598 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" event={"ID":"1e518469-5b3b-4055-a0f0-075dc48b1c79","Type":"ContainerStarted","Data":"3cf78bd635e2922e1ade70a5fb24aebf3845f59d6aae829d8cba7c1c44da75ab"} Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.516612 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" event={"ID":"1e518469-5b3b-4055-a0f0-075dc48b1c79","Type":"ContainerStarted","Data":"712accfff1a4a51efc4d3d2a90e3e2d6c47eb04f5cd3d72b800e0778f0963b55"} Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.516620 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" event={"ID":"1e518469-5b3b-4055-a0f0-075dc48b1c79","Type":"ContainerStarted","Data":"0f688c89d388f271d5313cac8f933d6e43da463b2d6da546151a2bbff7350f8b"} Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.516628 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" event={"ID":"1e518469-5b3b-4055-a0f0-075dc48b1c79","Type":"ContainerStarted","Data":"9afb59149e1f7b1462fc1ae949150d5c137905504ba2366db083abd13b4db6fc"} Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.516636 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" 
event={"ID":"1e518469-5b3b-4055-a0f0-075dc48b1c79","Type":"ContainerStarted","Data":"9e06c373c06904c837e3c6b4b8d26d2b2e34aa17db8079de76e1e33594d284b2"} Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.525502 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343d6e00-54f7-4228-a8da-a43041894b26\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee00c7d6720b72fb9ecc27cc1ad2f762db9ac30bc3e108e37e2d4e2a96639cf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha
256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:06Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.536144 4776 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:06Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.549255 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed339964d431767a3616559947e832417f1052991849f141c380c6e64c3c03f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:06Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.561757 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b411334ce6d2d6fc026625d538807bc1dce879f0053babe7d97f302819a4e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://e498b38932d3fd2da572e5ca5d560b7725a394f13fd9a30543244b1f3c0dca61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:06Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.571693 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da360484aaa537076345b7a6830a83ac2a89ffb47f86d6865049c4f19688546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-08T08:59:06Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.576233 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58b2w\" (UniqueName: \"kubernetes.io/projected/5628a2b5-b886-4883-93a3-fefc471f19e0-kube-api-access-58b2w\") pod \"node-ca-8k6qx\" (UID: \"5628a2b5-b886-4883-93a3-fefc471f19e0\") " pod="openshift-image-registry/node-ca-8k6qx" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.576312 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5628a2b5-b886-4883-93a3-fefc471f19e0-serviceca\") pod \"node-ca-8k6qx\" (UID: \"5628a2b5-b886-4883-93a3-fefc471f19e0\") " pod="openshift-image-registry/node-ca-8k6qx" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.576329 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5628a2b5-b886-4883-93a3-fefc471f19e0-host\") pod \"node-ca-8k6qx\" (UID: \"5628a2b5-b886-4883-93a3-fefc471f19e0\") " pod="openshift-image-registry/node-ca-8k6qx" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.577042 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5628a2b5-b886-4883-93a3-fefc471f19e0-host\") pod \"node-ca-8k6qx\" (UID: \"5628a2b5-b886-4883-93a3-fefc471f19e0\") " pod="openshift-image-registry/node-ca-8k6qx" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.578471 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5628a2b5-b886-4883-93a3-fefc471f19e0-serviceca\") pod \"node-ca-8k6qx\" (UID: \"5628a2b5-b886-4883-93a3-fefc471f19e0\") " pod="openshift-image-registry/node-ca-8k6qx" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.582424 4776 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9788ab1-1031-4103-a769-a4b3177c7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c2915bc116e8d911ac3c615649564174eb7e760c7e63fe06aee942d7cc2e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnl
y\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860aab951ce65f5efd9e17e4d93c383ea508088d26c2f758bf8d7176d2fb96f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jkmbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:06Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.590777 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8k6qx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5628a2b5-b886-4883-93a3-fefc471f19e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58b2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8k6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:06Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.600340 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58b2w\" (UniqueName: \"kubernetes.io/projected/5628a2b5-b886-4883-93a3-fefc471f19e0-kube-api-access-58b2w\") pod \"node-ca-8k6qx\" (UID: \"5628a2b5-b886-4883-93a3-fefc471f19e0\") " pod="openshift-image-registry/node-ca-8k6qx" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.608755 4776 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed339964d431767a3616559947e832417f1052991849f141c380c6e64c3c03f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:06Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.617234 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.617274 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.617288 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.617313 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.617330 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:06Z","lastTransitionTime":"2025-12-08T08:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.622251 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b411334ce6d2d6fc026625d538807bc1dce879f0053babe7d97f302819a4e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e498b38932d3fd2da572e5ca5d560b7725a394f13fd9a30543244b1f3c0dca61\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:06Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.633867 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da360484aaa537076345b7a6830a83ac2a89ffb47f86d6865049c4f19688546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-08T08:59:06Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.645274 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9788ab1-1031-4103-a769-a4b3177c7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c2915bc116e8d911ac3c615649564174eb7e760c7e63fe06aee942d7cc2e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860aab951ce65f5efd9e17e4d93c383ea508088d26c2f758bf8d7176d2fb96f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jkmbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:06Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.658640 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8k6qx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5628a2b5-b886-4883-93a3-fefc471f19e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58b2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8k6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:06Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.673620 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:06Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.688031 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5x9ft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58507405-6bea-4859-a4e8-6ed046b50323\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5x9ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-12-08T08:59:06Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.699512 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fdg6t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56dfa7df-2ee8-4408-a283-5a8521175a0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5173ff239c6373649911372a6d6c1665958296afec2528b9e0a492a0722f5b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-z44lf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fdg6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:06Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.711814 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-555j6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04b00e982993068dd2f274ce749a8b994561657d124122969adbb79c53658ecf\\\",\\\"image\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-ap
i-access-hrhvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-555j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:06Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.721295 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.721342 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.721354 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.721376 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.721387 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:06Z","lastTransitionTime":"2025-12-08T08:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.729018 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-8k6qx" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.731261 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.733192 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e518469-5b3b-4055-a0f0-075dc48b1c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-swbsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:06Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.746838 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 08 08:59:06 crc kubenswrapper[4776]: W1208 08:59:06.747486 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5628a2b5_b886_4883_93a3_fefc471f19e0.slice/crio-d3c6affed2ad20e3aed0d8772b92d44ce87f7b4a1f4d1c22668221d2bfd88533 WatchSource:0}: Error finding container d3c6affed2ad20e3aed0d8772b92d44ce87f7b4a1f4d1c22668221d2bfd88533: Status 404 returned error can't find the container with id d3c6affed2ad20e3aed0d8772b92d44ce87f7b4a1f4d1c22668221d2bfd88533 Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.749209 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d4304c3-81ab-4418-b5c2-017f74581e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d02311f25656790164bb155ddb4e4d32fb9e7919033dd47c38c9fb321c185743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb3d5082006dae792f107dc632bc00d5004be69f95f6d76150d816fb542ce9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be261308f50cfa4db35cf58e829c102496de88ed0336685562f39e46e498a460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bf08db453e1dd51d190b2bd855af295af04cb4d3a8a42e0020136c50546aef2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:06Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.749440 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.770165 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"343d6e00-54f7-4228-a8da-a43041894b26\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:4
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee00c7d6720b72fb9ecc27cc1ad2f762db9ac30bc3e108e37e2d4e2a96639cf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:06Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.786343 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:06Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.799646 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:06Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.822813 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"977ead65-d7ad-477b-8656-2b55e70b918b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7ab99c039bd5c0c0ab52768821797dd9e303afb9c6065ef021e8c2be372df9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3089eb6893beac1862850a94ef99af25f4ce9c0f020cc795611d9a620966e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86e9bc666e17ca52777d0275d4a50cbb70aad0c77e3413328fa6bd7b42d8a339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7cac944c0f3bfb4cc7d00471deb5d44d498a7131b7d950305a78bd8a5d2ea7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e7656c1155b52195abce09bca5ecd214a957ae7009293b99f16b3390c671fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:06Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.824121 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.824193 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.824207 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.824227 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.824243 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:06Z","lastTransitionTime":"2025-12-08T08:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.836373 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:06Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.857077 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5x9ft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58507405-6bea-4859-a4e8-6ed046b50323\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5x9ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-12-08T08:59:06Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.869936 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d4304c3-81ab-4418-b5c2-017f74581e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d02311f25656790164bb155ddb4e4d32fb9e7919033dd47c38c9fb321c185743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID
\\\":\\\"cri-o://fb3d5082006dae792f107dc632bc00d5004be69f95f6d76150d816fb542ce9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be261308f50cfa4db35cf58e829c102496de88ed0336685562f39e46e498a460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bf08db453e1dd51d190b2bd855af295af04cb4d3a8a42e0020136c50546aef2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller
-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:06Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.883899 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"343d6e00-54f7-4228-a8da-a43041894b26\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:4
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee00c7d6720b72fb9ecc27cc1ad2f762db9ac30bc3e108e37e2d4e2a96639cf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:06Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.926485 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.926537 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.926546 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.926570 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.926581 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:06Z","lastTransitionTime":"2025-12-08T08:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.927213 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:06Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:06 crc kubenswrapper[4776]: I1208 08:59:06.965473 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:06Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.002689 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fdg6t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56dfa7df-2ee8-4408-a283-5a8521175a0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5173ff239c6373649911372a6d6c1665958296afec2528b9e0a492a0722f5b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z44lf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fdg6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:07Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.029416 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.029483 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.029493 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.029513 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.029525 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:07Z","lastTransitionTime":"2025-12-08T08:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.045625 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-555j6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04b00e982993068dd2f274ce749a8b994561657d124122969adbb79c53658ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-555j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:07Z 
is after 2025-08-24T17:21:41Z" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.092371 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e518469-5b3b-4055-a0f0-075dc48b1c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-swbsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:07Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.127286 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed339964d431767a3616559947e832417f1052991849f141c380c6e64c3c03f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:07Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.132758 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.132827 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.132843 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.132866 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.132883 4776 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:07Z","lastTransitionTime":"2025-12-08T08:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.167237 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b411334ce6d2d6fc026625d538807bc1dce879f0053babe7d97f302819a4e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e498b38932d3fd2da572e5ca5d560b7725a394f13fd9a30543244b1f3c0dca61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:07Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.203453 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da360484aaa537076345b7a6830a83ac2a89ffb47f86d6865049c4f19688546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-08T08:59:07Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.235935 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.235980 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.235994 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.236017 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.236035 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:07Z","lastTransitionTime":"2025-12-08T08:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.247165 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9788ab1-1031-4103-a769-a4b3177c7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c2915bc116e8d911ac3c615649564174eb7e760c7e63fe06aee942d7cc2e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860aab951ce65f5efd9e17e4d93c383ea508088d26c2f758bf8d7176d2fb96f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jkmbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:07Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.284357 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8k6qx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5628a2b5-b886-4883-93a3-fefc471f19e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58b2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8k6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:07Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.338756 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.338801 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.338809 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 
08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.338826 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.338836 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:07Z","lastTransitionTime":"2025-12-08T08:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.343340 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 08:59:07 crc kubenswrapper[4776]: E1208 08:59:07.343485 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.441014 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.441092 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.441110 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.441146 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.441165 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:07Z","lastTransitionTime":"2025-12-08T08:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.530065 4776 generic.go:334] "Generic (PLEG): container finished" podID="58507405-6bea-4859-a4e8-6ed046b50323" containerID="63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7" exitCode=0 Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.530140 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5x9ft" event={"ID":"58507405-6bea-4859-a4e8-6ed046b50323","Type":"ContainerDied","Data":"63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7"} Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.533347 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-8k6qx" event={"ID":"5628a2b5-b886-4883-93a3-fefc471f19e0","Type":"ContainerStarted","Data":"df73be62036047cccc38e737abc2818aa510b308934c0bfbcac348aba382f20d"} Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.533390 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-8k6qx" event={"ID":"5628a2b5-b886-4883-93a3-fefc471f19e0","Type":"ContainerStarted","Data":"d3c6affed2ad20e3aed0d8772b92d44ce87f7b4a1f4d1c22668221d2bfd88533"} Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.543759 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.543804 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.543820 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.543839 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.543857 4776 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:07Z","lastTransitionTime":"2025-12-08T08:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.566093 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977ead65-d7ad-477b-8656-2b55e70b918b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7ab99c039bd5c0c0ab52768821797dd9e303afb9c6065ef021e8c2be372df9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3089eb6893beac1862850a94ef99af25f4ce9c0f020cc795611d9a620966e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86e9bc666e17ca52777d0275d4a50cbb70aad0c77e3413328fa6bd7b42d8a339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-
dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7cac944c0f3bfb4cc7d00471deb5d44d498a7131b7d950305a78bd8a5d2ea7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e7656c1155b52195abce09bca5ecd214a957ae7009293b99f16b3390c671fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://905822f40e6914af
da4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6
7314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:07Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.581747 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:07Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.602854 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5x9ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58507405-6bea-4859-a4e8-6ed046b50323\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5x9ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:07Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 
08:59:07.620477 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e518469-5b3b-4055-a0f0-075dc48b1c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-swbsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:07Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.637579 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d4304c3-81ab-4418-b5c2-017f74581e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d02311f25656790164bb155ddb4e4d32fb9e7919033dd47c38c9fb321c185743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb3d5082006dae792f107dc632bc00d5004be69f95f6d76150d816fb542ce9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be261308f50cfa4db35cf58e829c102496de88ed0336685562f39e46e498a460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bf08db453e1dd51d190b2bd855af295af04cb4d3a8a42e0020136c50546aef2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:07Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.645694 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.645730 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.645742 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.645760 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.645773 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:07Z","lastTransitionTime":"2025-12-08T08:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.653256 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343d6e00-54f7-4228-a8da-a43041894b26\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee00c7d6720b72fb9ecc27cc1ad2f762db9ac30bc3e108e37e2d4e2a96639cf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:07Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.664376 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:07Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.677986 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:07Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.689606 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fdg6t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56dfa7df-2ee8-4408-a283-5a8521175a0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5173ff239c6373649911372a6d6c1665958296afec2528b9e0a492a0722f5b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z44lf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fdg6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:07Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.702926 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-555j6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04b00e982993068dd2f274ce749a8b994561657d124122969adbb79c53658ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-555j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:07Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.728577 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed339964d431767a3616559947e832417f1052991849f141c380c6e64c3c03f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:07Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.747859 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.747900 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.747911 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.747928 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.747937 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:07Z","lastTransitionTime":"2025-12-08T08:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.786893 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b411334ce6d2d6fc026625d538807bc1dce879f0053babe7d97f302819a4e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e498b38932d3fd2da572e5ca5d560b7725a394f13fd9a30543244b1f3c0dca61\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:07Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.809097 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da360484aaa537076345b7a6830a83ac2a89ffb47f86d6865049c4f19688546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-08T08:59:07Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.842543 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9788ab1-1031-4103-a769-a4b3177c7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c2915bc116e8d911ac3c615649564174eb7e760c7e63fe06aee942d7cc2e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860aab951ce65f5efd9e17e4d93c383ea508088d26c2f758bf8d7176d2fb96f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jkmbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:07Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.850295 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 
08:59:07.850341 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.850350 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.850364 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.850373 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:07Z","lastTransitionTime":"2025-12-08T08:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.881104 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8k6qx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5628a2b5-b886-4883-93a3-fefc471f19e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58b2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8k6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:07Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.935432 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"977ead65-d7ad-477b-8656-2b55e70b918b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7ab99c039bd5c0c0ab52768821797dd9e303afb9c6065ef021e8c2be372df9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3089eb6893beac1862850a94ef99af25f4ce9c0f020cc795611d9a620966e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86e9bc666e17ca52777d0275d4a50cbb70aad0c77e3413328fa6bd7b42d8a339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7cac944c0f3bfb4cc7d00471deb5d44d498a7131b7d950305a78bd8a5d2ea7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e7656c1155b52195abce09bca5ecd214a957ae7009293b99f16b3390c671fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:07Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.952648 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.952686 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.952694 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.952707 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.952716 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:07Z","lastTransitionTime":"2025-12-08T08:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:07 crc kubenswrapper[4776]: I1208 08:59:07.962828 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:07Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.008777 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5x9ft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58507405-6bea-4859-a4e8-6ed046b50323\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5x9ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:08Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.042135 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fdg6t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56dfa7df-2ee8-4408-a283-5a8521175a0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5173ff239c6373649911372a6d6c1665958296afec2528b9e0a492a0722f5b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f64
45c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z44lf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fdg6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:08Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.055775 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.055854 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.055876 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.055906 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.055927 4776 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:08Z","lastTransitionTime":"2025-12-08T08:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.085566 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-555j6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04b00e982993068dd2f274ce749a8b994561657d124122969adbb79c53658ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:
59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-555j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:08Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.131124 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e518469-5b3b-4055-a0f0-075dc48b1c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-swbsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:08Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.158591 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.158674 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.158693 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.158757 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.158783 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:08Z","lastTransitionTime":"2025-12-08T08:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.167485 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d4304c3-81ab-4418-b5c2-017f74581e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d02311f25656790164bb155ddb4e4d32fb9e7919033dd47c38c9fb321c185743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb3d5082006
dae792f107dc632bc00d5004be69f95f6d76150d816fb542ce9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be261308f50cfa4db35cf58e829c102496de88ed0336685562f39e46e498a460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bf08db453e1dd51d190b2bd855af295af04cb4d3a8a42e0020136c50546aef2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:08Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.203284 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"343d6e00-54f7-4228-a8da-a43041894b26\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:4
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee00c7d6720b72fb9ecc27cc1ad2f762db9ac30bc3e108e37e2d4e2a96639cf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:08Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.248382 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:08Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.261447 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.261481 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.261489 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.261503 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.261514 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:08Z","lastTransitionTime":"2025-12-08T08:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.290814 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:08Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.324923 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed339964d431767a3616559947e832417f1052991849f141c380c6e64c3c03f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:08Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.342994 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.343028 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 08:59:08 crc kubenswrapper[4776]: E1208 08:59:08.343139 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 08:59:08 crc kubenswrapper[4776]: E1208 08:59:08.343283 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.365442 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.365482 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.365492 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.365507 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.365523 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:08Z","lastTransitionTime":"2025-12-08T08:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.370049 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b411334ce6d2d6fc026625d538807bc1dce879f0053babe7d97f302819a4e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e498b38932d3fd2da572e5ca5d560b7725a394f13fd9a30543244b1f3c0dca61\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:08Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.404106 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da360484aaa537076345b7a6830a83ac2a89ffb47f86d6865049c4f19688546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-08T08:59:08Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.442101 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9788ab1-1031-4103-a769-a4b3177c7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c2915bc116e8d911ac3c615649564174eb7e760c7e63fe06aee942d7cc2e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860aab951ce65f5efd9e17e4d93c383ea508088d26c2f758bf8d7176d2fb96f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jkmbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:08Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.468092 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 
08:59:08.468264 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.468283 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.468304 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.468316 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:08Z","lastTransitionTime":"2025-12-08T08:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.481875 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8k6qx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5628a2b5-b886-4883-93a3-fefc471f19e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df73be62036047cccc38e737abc2818aa510b308934c0bfbcac348aba382f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58b2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8k6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:08Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.541405 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" event={"ID":"1e518469-5b3b-4055-a0f0-075dc48b1c79","Type":"ContainerStarted","Data":"e8d4d85ae7a43d2800c11a9bfdc7f1c827bdb6628891b9a0c29140bf869307ca"} Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.543902 4776 generic.go:334] "Generic (PLEG): container finished" podID="58507405-6bea-4859-a4e8-6ed046b50323" containerID="fa55285a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be" exitCode=0 Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.543935 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5x9ft" event={"ID":"58507405-6bea-4859-a4e8-6ed046b50323","Type":"ContainerDied","Data":"fa55285a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be"} Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.566225 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed339964d431767a3616559947e832417f1052991849f141c380c6e64c3c03f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:08Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.570946 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.570992 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.571005 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.571027 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.571040 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:08Z","lastTransitionTime":"2025-12-08T08:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.582279 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b411334ce6d2d6fc026625d538807bc1dce879f0053babe7d97f302819a4e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e498b38932d3fd2da572e5ca5d560b7725a394f13fd9a30543244b1f3c0dca61\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:08Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.604919 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da360484aaa537076345b7a6830a83ac2a89ffb47f86d6865049c4f19688546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-08T08:59:08Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.643429 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9788ab1-1031-4103-a769-a4b3177c7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c2915bc116e8d911ac3c615649564174eb7e760c7e63fe06aee942d7cc2e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860aab951ce65f5efd9e17e4d93c383ea508088d26c2f758bf8d7176d2fb96f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jkmbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:08Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.674362 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 
08:59:08.674394 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.674404 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.674423 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.674436 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:08Z","lastTransitionTime":"2025-12-08T08:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.681927 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8k6qx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5628a2b5-b886-4883-93a3-fefc471f19e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df73be62036047cccc38e737abc2818aa510b308934c0bfbcac348aba382f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58b2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8k6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:08Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.730157 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977ead65-d7ad-477b-8656-2b55e70b918b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7ab99c039bd5c0c0ab52768821797dd9e303afb9c6065ef021e8c2be372df9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3089eb6893beac1862850a94ef99af25f4ce9c0f020cc795611d9a620966e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86e9bc666e17ca52777d0275d4a50cbb70aad0c77e3413328fa6bd7b42d8a339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08
:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7cac944c0f3bfb4cc7d00471deb5d44d498a7131b7d950305a78bd8a5d2ea7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e7656c1155b52195abce09bca5ecd214a957ae7009293b99f16b3390c671fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:08Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.764811 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:08Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.776722 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.776765 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.776776 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.776793 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.776806 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:08Z","lastTransitionTime":"2025-12-08T08:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.806550 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5x9ft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58507405-6bea-4859-a4e8-6ed046b50323\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa55285a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa55285a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-5x9ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:08Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.846331 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e518469-5b3b-4055-a0f0-075dc48b1c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-swbsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:08Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.879913 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.879948 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.879958 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.879971 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.879981 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:08Z","lastTransitionTime":"2025-12-08T08:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.887445 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d4304c3-81ab-4418-b5c2-017f74581e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d02311f25656790164bb155ddb4e4d32fb9e7919033dd47c38c9fb321c185743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb3d5082006
dae792f107dc632bc00d5004be69f95f6d76150d816fb542ce9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be261308f50cfa4db35cf58e829c102496de88ed0336685562f39e46e498a460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bf08db453e1dd51d190b2bd855af295af04cb4d3a8a42e0020136c50546aef2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:08Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.926193 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"343d6e00-54f7-4228-a8da-a43041894b26\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:4
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee00c7d6720b72fb9ecc27cc1ad2f762db9ac30bc3e108e37e2d4e2a96639cf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:08Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.964975 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:08Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.982696 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.982736 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.982784 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.982806 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:08 crc kubenswrapper[4776]: I1208 08:59:08.982819 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:08Z","lastTransitionTime":"2025-12-08T08:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.003383 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:09Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.041676 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fdg6t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56dfa7df-2ee8-4408-a283-5a8521175a0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5173ff239c6373649911372a6d6c1665958296afec2528b9e0a492a0722f5b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z44lf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fdg6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:09Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.084379 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-555j6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04b00e982993068dd2f274ce749a8b994561657d124122969adbb79c53658ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-555j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:09Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.085110 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.085136 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.085146 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.085160 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.085191 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:09Z","lastTransitionTime":"2025-12-08T08:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.187696 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.187723 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.187733 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.187744 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.187754 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:09Z","lastTransitionTime":"2025-12-08T08:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.290303 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.290331 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.290341 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.290356 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.290367 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:09Z","lastTransitionTime":"2025-12-08T08:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.342609 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 08:59:09 crc kubenswrapper[4776]: E1208 08:59:09.342725 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.395531 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.395565 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.395583 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.395600 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.395611 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:09Z","lastTransitionTime":"2025-12-08T08:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.497694 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.497722 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.497730 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.497743 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.497754 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:09Z","lastTransitionTime":"2025-12-08T08:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.552997 4776 generic.go:334] "Generic (PLEG): container finished" podID="58507405-6bea-4859-a4e8-6ed046b50323" containerID="51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d" exitCode=0 Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.553045 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5x9ft" event={"ID":"58507405-6bea-4859-a4e8-6ed046b50323","Type":"ContainerDied","Data":"51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d"} Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.568629 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:09Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.587634 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:09Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.598234 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fdg6t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56dfa7df-2ee8-4408-a283-5a8521175a0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5173ff239c6373649911372a6d6c1665958296afec2528b9e0a492a0722f5b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z44lf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fdg6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:09Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.600207 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.600244 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.600256 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.600272 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.600283 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:09Z","lastTransitionTime":"2025-12-08T08:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.612774 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-555j6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04b00e982993068dd2f274ce749a8b994561657d124122969adbb79c53658ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-555j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:09Z 
is after 2025-08-24T17:21:41Z" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.629020 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e518469-5b3b-4055-a0f0-075dc48b1c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-swbsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:09Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.645083 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d4304c3-81ab-4418-b5c2-017f74581e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d02311f25656790164bb155ddb4e4d32fb9e7919033dd47c38c9fb321c185743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb3d5082006dae792f107dc632bc00d5004be69f95f6d76150d816fb542ce9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be261308f50cfa4db35cf58e829c102496de88ed0336685562f39e46e498a460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bf08db453e1dd51d190b2bd855af295af04cb4d3a8a42e0020136c50546aef2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:09Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.656442 4776 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343d6e00-54f7-4228-a8da-a43041894b26\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clu
ster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\"
:true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee00c7d6720b72fb9ecc27cc1ad2f762db9ac30bc3e108e37e2d4e2a96639cf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.
168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:09Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.667193 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9788ab1-1031-4103-a769-a4b3177c7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c2915bc116e8d911ac3c615649564174eb7e760c7e63fe06aee942d7cc2e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2
42b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860aab951ce65f5efd9e17e4d93c383ea508088d26c2f758bf8d7176d2fb96f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jkmbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:09Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.680533 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed339964d431767a3616559947e832417f1052991849f141c380c6e64c3c03f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:09Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.692905 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b411334ce6d2d6fc026625d538807bc1dce879f0053babe7d97f302819a4e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02
Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e498b38932d3fd2da572e5ca5d560b7725a394f13fd9a30543244b1f3c0dca61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:09Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.703136 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da360484aaa537076345b7a6830a83ac2a89ffb47f86d6865049c4f19688546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:09Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.703263 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.703293 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.703304 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.703320 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.703334 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:09Z","lastTransitionTime":"2025-12-08T08:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.706152 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.706268 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 08:59:09 crc kubenswrapper[4776]: E1208 08:59:09.706295 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 08:59:17.706270908 +0000 UTC m=+33.969495960 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 08:59:09 crc kubenswrapper[4776]: E1208 08:59:09.706346 4776 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.706378 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 08:59:09 crc kubenswrapper[4776]: E1208 08:59:09.706392 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-08 08:59:17.70637829 +0000 UTC m=+33.969603322 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.706420 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 08:59:09 crc kubenswrapper[4776]: E1208 08:59:09.706517 4776 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 08 08:59:09 crc kubenswrapper[4776]: E1208 08:59:09.706572 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-08 08:59:17.706564035 +0000 UTC m=+33.969789057 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 08 08:59:09 crc kubenswrapper[4776]: E1208 08:59:09.706526 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 08 08:59:09 crc kubenswrapper[4776]: E1208 08:59:09.706597 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 08 08:59:09 crc kubenswrapper[4776]: E1208 08:59:09.706608 4776 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 08:59:09 crc kubenswrapper[4776]: E1208 08:59:09.706633 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-08 08:59:17.706626897 +0000 UTC m=+33.969851909 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.712729 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8k6qx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5628a2b5-b886-4883-93a3-fefc471f19e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df73be62036047cccc38e737abc2818aa510b308934c0bfbcac348aba382f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58b2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8k6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:09Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.734386 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"977ead65-d7ad-477b-8656-2b55e70b918b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7ab99c039bd5c0c0ab52768821797dd9e303afb9c6065ef021e8c2be372df9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3089eb6893beac1862850a94ef99af25f4ce9c0f020cc795611d9a620966e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86e9bc666e17ca52777d0275d4a50cbb70aad0c77e3413328fa6bd7b42d8a339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7cac944c0f3bfb4cc7d00471deb5d44d498a7131b7d950305a78bd8a5d2ea7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e7656c1155b52195abce09bca5ecd214a957ae7009293b99f16b3390c671fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:09Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.749889 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:09Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.763581 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5x9ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58507405-6bea-4859-a4e8-6ed046b50323\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa55285a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa55285a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5x9ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:09Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.805624 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.805654 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.805663 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.805676 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.805685 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:09Z","lastTransitionTime":"2025-12-08T08:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.807570 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 08:59:09 crc kubenswrapper[4776]: E1208 08:59:09.807756 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 08 08:59:09 crc kubenswrapper[4776]: E1208 08:59:09.807797 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 08 08:59:09 crc kubenswrapper[4776]: E1208 08:59:09.807814 4776 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 08:59:09 crc kubenswrapper[4776]: E1208 08:59:09.807879 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-08 08:59:17.807859882 +0000 UTC m=+34.071084924 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.908315 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.908349 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.908357 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.908371 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:09 crc kubenswrapper[4776]: I1208 08:59:09.908382 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:09Z","lastTransitionTime":"2025-12-08T08:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.011680 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.011747 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.011764 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.011791 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.011811 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:10Z","lastTransitionTime":"2025-12-08T08:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.115166 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.115230 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.115246 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.115264 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.115278 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:10Z","lastTransitionTime":"2025-12-08T08:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.217660 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.217695 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.217703 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.217717 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.217726 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:10Z","lastTransitionTime":"2025-12-08T08:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.320615 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.320660 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.320668 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.320682 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.320690 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:10Z","lastTransitionTime":"2025-12-08T08:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.343419 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.343419 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 08:59:10 crc kubenswrapper[4776]: E1208 08:59:10.343589 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 08:59:10 crc kubenswrapper[4776]: E1208 08:59:10.343659 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.423583 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.423613 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.423622 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.423636 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.423644 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:10Z","lastTransitionTime":"2025-12-08T08:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.527757 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.527816 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.527825 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.527840 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.527849 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:10Z","lastTransitionTime":"2025-12-08T08:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.562776 4776 generic.go:334] "Generic (PLEG): container finished" podID="58507405-6bea-4859-a4e8-6ed046b50323" containerID="b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3" exitCode=0 Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.562857 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5x9ft" event={"ID":"58507405-6bea-4859-a4e8-6ed046b50323","Type":"ContainerDied","Data":"b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3"} Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.575035 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8k6qx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5628a2b5-b886-4883-93a3-fefc471f19e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df73be62036047cccc38e737abc2818aa510b308934c0bfbcac348aba382f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a4
5dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58b2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8k6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:10Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.595978 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"977ead65-d7ad-477b-8656-2b55e70b918b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7ab99c039bd5c0c0ab52768821797dd9e303afb9c6065ef021e8c2be372df9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3089eb6893beac1862850a94ef99af25f4ce9c0f020cc795611d9a620966e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86e9bc666e17ca52777d0275d4a50cbb70aad0c77e3413328fa6bd7b42d8a339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7cac944c0f3bfb4cc7d00471deb5d44d498a7131b7d950305a78bd8a5d2ea7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e7656c1155b52195abce09bca5ecd214a957ae7009293b99f16b3390c671fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:10Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.611009 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:10Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.629670 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.629708 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.629722 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 
08:59:10.629739 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.629751 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:10Z","lastTransitionTime":"2025-12-08T08:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.631465 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5x9ft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58507405-6bea-4859-a4e8-6ed046b50323\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa55285a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa55285a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5x9ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:10Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.654875 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:10Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.669667 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:10Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.680673 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fdg6t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56dfa7df-2ee8-4408-a283-5a8521175a0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5173ff239c6373649911372a6d6c1665958296afec2528b9e0a492a0722f5b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z44lf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fdg6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:10Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.692074 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-555j6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04b00e982993068dd2f274ce749a8b994561657d124122969adbb79c53658ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-555j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:10Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.714048 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e518469-5b3b-4055-a0f0-075dc48b1c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-swbsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:10Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.726449 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d4304c3-81ab-4418-b5c2-017f74581e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d02311f25656790164bb155ddb4e4d32fb9e7919033dd47c38c9fb321c185743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb3d5082006dae792f107dc632bc00d5004be69f95f6d76150d816fb542ce9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be261308f50cfa4db35cf58e829c102496de88ed0336685562f39e46e498a460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bf08db453e1dd51d190b2bd855af295af04cb4d3a8a42e0020136c50546aef2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:10Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.731959 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.731991 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.731999 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.732013 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.732025 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:10Z","lastTransitionTime":"2025-12-08T08:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.738583 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343d6e00-54f7-4228-a8da-a43041894b26\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee00c7d6720b72fb9ecc27cc1ad2f762db9ac30bc3e108e37e2d4e2a96639cf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:10Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.751009 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9788ab1-1031-4103-a769-a4b3177c7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c2915bc116e8d911ac3c615649564174eb7e760c7e63fe06aee942d7cc2e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860aab951ce65f5efd9e17e4d93c383ea508088d26c2f758bf8d7176d2fb96f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-jkmbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:10Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.763574 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed339964d431767a3616559947e832417f1052991849f141c380c6e64c3c03f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:10Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.779165 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b411334ce6d2d6fc026625d538807bc1dce879f0053babe7d97f302819a4e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e498b38932d3fd2da572e5ca5d560b7725a394f13fd9a30543244b1f3c0dca61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:10Z is after 
2025-08-24T17:21:41Z" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.791816 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da360484aaa537076345b7a6830a83ac2a89ffb47f86d6865049c4f19688546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:10Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.834578 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.834624 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.834637 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.834656 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.834671 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:10Z","lastTransitionTime":"2025-12-08T08:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.937162 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.937247 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.937269 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.937288 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:10 crc kubenswrapper[4776]: I1208 08:59:10.937300 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:10Z","lastTransitionTime":"2025-12-08T08:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.040563 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.040603 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.040611 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.040625 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.040634 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:11Z","lastTransitionTime":"2025-12-08T08:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.143510 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.143589 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.143607 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.143638 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.143656 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:11Z","lastTransitionTime":"2025-12-08T08:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.246425 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.246479 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.246495 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.246530 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.246552 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:11Z","lastTransitionTime":"2025-12-08T08:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.343260 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 08:59:11 crc kubenswrapper[4776]: E1208 08:59:11.343388 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.348863 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.348889 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.348900 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.348914 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.348927 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:11Z","lastTransitionTime":"2025-12-08T08:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.451025 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.451061 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.451070 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.451082 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.451091 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:11Z","lastTransitionTime":"2025-12-08T08:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.552750 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.552790 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.552799 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.552814 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.552824 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:11Z","lastTransitionTime":"2025-12-08T08:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.568764 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5x9ft" event={"ID":"58507405-6bea-4859-a4e8-6ed046b50323","Type":"ContainerStarted","Data":"06ef510d9c943d7c5d8a7a801dc1e7ca8cbfa7f6995ac8bedc835e0e8a762f91"} Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.572857 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" event={"ID":"1e518469-5b3b-4055-a0f0-075dc48b1c79","Type":"ContainerStarted","Data":"3fa4ef35cbccec29c2e64d71b738e1432df60785145aca17763c0e2581a92439"} Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.573463 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.573516 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.573527 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.586244 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed339964d431767a3616559947e832417f1052991849f141c380c6e64c3c03f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:11Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.598531 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.600377 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.600763 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b411334ce6d2d6fc026625d538807bc1dce879f0053babe7d97f302819a4e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e498b38932d3fd2da572e5ca5d560b7725a394f13fd9a30543244b1f3c0dca61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:11Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.611948 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da360484aaa537076345b7a6830a83ac2a89ffb47f86d6865049c4f19688546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-08T08:59:11Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.622546 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9788ab1-1031-4103-a769-a4b3177c7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c2915bc116e8d911ac3c615649564174eb7e760c7e63fe06aee942d7cc2e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860aab951ce65f5efd9e17e4d93c383ea508088d26c2f758bf8d7176d2fb96f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jkmbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:11Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.632798 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8k6qx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5628a2b5-b886-4883-93a3-fefc471f19e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df73be62036047cccc38e737abc2818aa510b308934c0bfbcac348aba382f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58b2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8k6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:11Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.654813 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.654873 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.654911 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.654927 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.654936 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:11Z","lastTransitionTime":"2025-12-08T08:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.687324 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977ead65-d7ad-477b-8656-2b55e70b918b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7ab99c039bd5c0c0ab52768821797dd9e303afb9c6065ef021e8c2be372df9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3089eb6893beac1862850a94ef99af25f4ce9c0f020cc795611d9a620966e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86e9bc666e17ca52777d0275d4a50cbb70aad0c77e3413328fa6bd7b42d8a339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7cac944c0f3bfb4cc7d00471deb5d44d498a7131b7d950305a78bd8a5d2ea7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e7656c1155b52195abce09bca5ecd214a957ae7009293b99f16b3390c671fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:11Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.702812 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:11Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.716691 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5x9ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58507405-6bea-4859-a4e8-6ed046b50323\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06ef510d9c943d7c5d8a7a801dc1e7ca8cbfa7f6995ac8bedc835e0e8a762f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa552
85a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa55285a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5x9ft\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:11Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.725158 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fdg6t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56dfa7df-2ee8-4408-a283-5a8521175a0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5173ff239c6373649911372a6d6c1665958296afec2528b9e0a492a0722f5b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z44lf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fdg6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:11Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.738569 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-555j6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04b00e982993068dd2f274ce749a8b994561657d124122969adbb79c53658ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-555j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:11Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.754450 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e518469-5b3b-4055-a0f0-075dc48b1c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-swbsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:11Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.756749 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.756783 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.756791 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.756806 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.756815 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:11Z","lastTransitionTime":"2025-12-08T08:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.765481 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d4304c3-81ab-4418-b5c2-017f74581e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d02311f25656790164bb155ddb4e4d32fb9e7919033dd47c38c9fb321c185743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb3d5082006
dae792f107dc632bc00d5004be69f95f6d76150d816fb542ce9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be261308f50cfa4db35cf58e829c102496de88ed0336685562f39e46e498a460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bf08db453e1dd51d190b2bd855af295af04cb4d3a8a42e0020136c50546aef2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:11Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.780788 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"343d6e00-54f7-4228-a8da-a43041894b26\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:4
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee00c7d6720b72fb9ecc27cc1ad2f762db9ac30bc3e108e37e2d4e2a96639cf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:11Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.792451 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:11Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.803848 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:11Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.815921 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed339964d431767a3616559947e832417f1052991849f141c380c6e64c3c03f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:11Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.827100 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b411334ce6d2d6fc026625d538807bc1dce879f0053babe7d97f302819a4e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://e498b38932d3fd2da572e5ca5d560b7725a394f13fd9a30543244b1f3c0dca61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:11Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.837971 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da360484aaa537076345b7a6830a83ac2a89ffb47f86d6865049c4f19688546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-08T08:59:11Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.848140 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9788ab1-1031-4103-a769-a4b3177c7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c2915bc116e8d911ac3c615649564174eb7e760c7e63fe06aee942d7cc2e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860aab951ce65f5efd9e17e4d93c383ea508088d26c2f758bf8d7176d2fb96f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jkmbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:11Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.857650 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8k6qx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5628a2b5-b886-4883-93a3-fefc471f19e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df73be62036047cccc38e737abc2818aa510b308934c0bfbcac348aba382f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58b2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8k6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:11Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.858530 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.858578 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.858589 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.858605 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.858614 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:11Z","lastTransitionTime":"2025-12-08T08:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.878911 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977ead65-d7ad-477b-8656-2b55e70b918b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7ab99c039bd5c0c0ab52768821797dd9e303afb9c6065ef021e8c2be372df9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3089eb6893beac1862850a94ef99af25f4ce9c0f020cc795611d9a620966e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86e9bc666e17ca52777d0275d4a50cbb70aad0c77e3413328fa6bd7b42d8a339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7cac944c0f3bfb4cc7d00471deb5d44d498a7131b7d950305a78bd8a5d2ea7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e7656c1155b52195abce09bca5ecd214a957ae7009293b99f16b3390c671fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:11Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.890654 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:11Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.903440 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5x9ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58507405-6bea-4859-a4e8-6ed046b50323\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06ef510d9c943d7c5d8a7a801dc1e7ca8cbfa7f6995ac8bedc835e0e8a762f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa552
85a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa55285a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5x9ft\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:11Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.913026 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:11Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.923418 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fdg6t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56dfa7df-2ee8-4408-a283-5a8521175a0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5173ff239c6373649911372a6d6c1665958296afec2528b9e0a492a0722f5b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z44lf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fdg6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:11Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.935742 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-555j6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04b00e982993068dd2f274ce749a8b994561657d124122969adbb79c53658ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-555j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:11Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.958717 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e518469-5b3b-4055-a0f0-075dc48b1c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f688c89d388f271d5313cac8f933d6e43da463b2d6da546151a2bbff7350f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://712accfff1a4a51efc4d3d2a90e3e2d6c47eb04f5cd3d72b800e0778f0963b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a95beb9eadee827797268eb99d707a8782239962485a6f258adea6f3ee994c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cf78bd635e2922e1ade70a5fb24aebf3845f59d6aae829d8cba7c1c44da75ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afb59149e1f7b1462fc1ae949150d5c137905504ba2366db083abd13b4db6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e06c373c06904c837e3c6b4b8d26d2b2e34aa17db8079de76e1e33594d284b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fa4ef35cbccec29c2e64d71b738e1432df60785145aca17763c0e2581a92439\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8d4d85ae7a43d2800c11a9bfdc7f1c827bdb6628891b9a0c29140bf869307ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-swbsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:11Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.960989 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.961023 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.961033 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.961047 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.961057 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:11Z","lastTransitionTime":"2025-12-08T08:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.970888 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d4304c3-81ab-4418-b5c2-017f74581e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d02311f25656790164bb155ddb4e4d32fb9e7919033dd47c38c9fb321c185743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb3d5082006
dae792f107dc632bc00d5004be69f95f6d76150d816fb542ce9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be261308f50cfa4db35cf58e829c102496de88ed0336685562f39e46e498a460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bf08db453e1dd51d190b2bd855af295af04cb4d3a8a42e0020136c50546aef2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:11Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.983778 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"343d6e00-54f7-4228-a8da-a43041894b26\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:4
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee00c7d6720b72fb9ecc27cc1ad2f762db9ac30bc3e108e37e2d4e2a96639cf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:11Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:11 crc kubenswrapper[4776]: I1208 08:59:11.994667 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:11Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:12 crc kubenswrapper[4776]: I1208 08:59:12.062697 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:12 crc kubenswrapper[4776]: I1208 08:59:12.062735 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:12 crc kubenswrapper[4776]: I1208 08:59:12.062747 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:12 crc kubenswrapper[4776]: I1208 08:59:12.062762 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:12 crc kubenswrapper[4776]: I1208 08:59:12.062774 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:12Z","lastTransitionTime":"2025-12-08T08:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:12 crc kubenswrapper[4776]: I1208 08:59:12.165371 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:12 crc kubenswrapper[4776]: I1208 08:59:12.165441 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:12 crc kubenswrapper[4776]: I1208 08:59:12.165463 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:12 crc kubenswrapper[4776]: I1208 08:59:12.165492 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:12 crc kubenswrapper[4776]: I1208 08:59:12.165518 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:12Z","lastTransitionTime":"2025-12-08T08:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:12 crc kubenswrapper[4776]: I1208 08:59:12.267671 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:12 crc kubenswrapper[4776]: I1208 08:59:12.267724 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:12 crc kubenswrapper[4776]: I1208 08:59:12.267742 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:12 crc kubenswrapper[4776]: I1208 08:59:12.267765 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:12 crc kubenswrapper[4776]: I1208 08:59:12.267784 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:12Z","lastTransitionTime":"2025-12-08T08:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:12 crc kubenswrapper[4776]: I1208 08:59:12.343728 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 08:59:12 crc kubenswrapper[4776]: E1208 08:59:12.343939 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 08:59:12 crc kubenswrapper[4776]: I1208 08:59:12.343738 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 08:59:12 crc kubenswrapper[4776]: E1208 08:59:12.344577 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 08:59:12 crc kubenswrapper[4776]: I1208 08:59:12.371587 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:12 crc kubenswrapper[4776]: I1208 08:59:12.371642 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:12 crc kubenswrapper[4776]: I1208 08:59:12.371662 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:12 crc kubenswrapper[4776]: I1208 08:59:12.371690 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:12 crc kubenswrapper[4776]: I1208 08:59:12.371710 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:12Z","lastTransitionTime":"2025-12-08T08:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:12 crc kubenswrapper[4776]: I1208 08:59:12.474772 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:12 crc kubenswrapper[4776]: I1208 08:59:12.474821 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:12 crc kubenswrapper[4776]: I1208 08:59:12.474837 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:12 crc kubenswrapper[4776]: I1208 08:59:12.474864 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:12 crc kubenswrapper[4776]: I1208 08:59:12.474884 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:12Z","lastTransitionTime":"2025-12-08T08:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:12 crc kubenswrapper[4776]: I1208 08:59:12.577219 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:12 crc kubenswrapper[4776]: I1208 08:59:12.577273 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:12 crc kubenswrapper[4776]: I1208 08:59:12.577285 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:12 crc kubenswrapper[4776]: I1208 08:59:12.577302 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:12 crc kubenswrapper[4776]: I1208 08:59:12.577315 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:12Z","lastTransitionTime":"2025-12-08T08:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:12 crc kubenswrapper[4776]: I1208 08:59:12.679789 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:12 crc kubenswrapper[4776]: I1208 08:59:12.679863 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:12 crc kubenswrapper[4776]: I1208 08:59:12.679888 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:12 crc kubenswrapper[4776]: I1208 08:59:12.679925 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:12 crc kubenswrapper[4776]: I1208 08:59:12.679950 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:12Z","lastTransitionTime":"2025-12-08T08:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:12 crc kubenswrapper[4776]: I1208 08:59:12.785685 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:12 crc kubenswrapper[4776]: I1208 08:59:12.786134 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:12 crc kubenswrapper[4776]: I1208 08:59:12.786243 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:12 crc kubenswrapper[4776]: I1208 08:59:12.786329 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:12 crc kubenswrapper[4776]: I1208 08:59:12.786443 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:12Z","lastTransitionTime":"2025-12-08T08:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:12 crc kubenswrapper[4776]: I1208 08:59:12.889708 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:12 crc kubenswrapper[4776]: I1208 08:59:12.889761 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:12 crc kubenswrapper[4776]: I1208 08:59:12.889770 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:12 crc kubenswrapper[4776]: I1208 08:59:12.889787 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:12 crc kubenswrapper[4776]: I1208 08:59:12.889800 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:12Z","lastTransitionTime":"2025-12-08T08:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:12 crc kubenswrapper[4776]: I1208 08:59:12.992244 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:12 crc kubenswrapper[4776]: I1208 08:59:12.992287 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:12 crc kubenswrapper[4776]: I1208 08:59:12.992299 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:12 crc kubenswrapper[4776]: I1208 08:59:12.992313 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:12 crc kubenswrapper[4776]: I1208 08:59:12.992323 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:12Z","lastTransitionTime":"2025-12-08T08:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:13 crc kubenswrapper[4776]: I1208 08:59:13.094423 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:13 crc kubenswrapper[4776]: I1208 08:59:13.094465 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:13 crc kubenswrapper[4776]: I1208 08:59:13.094473 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:13 crc kubenswrapper[4776]: I1208 08:59:13.094488 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:13 crc kubenswrapper[4776]: I1208 08:59:13.094499 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:13Z","lastTransitionTime":"2025-12-08T08:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:13 crc kubenswrapper[4776]: I1208 08:59:13.197235 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:13 crc kubenswrapper[4776]: I1208 08:59:13.197290 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:13 crc kubenswrapper[4776]: I1208 08:59:13.197307 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:13 crc kubenswrapper[4776]: I1208 08:59:13.197333 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:13 crc kubenswrapper[4776]: I1208 08:59:13.197352 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:13Z","lastTransitionTime":"2025-12-08T08:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:13 crc kubenswrapper[4776]: I1208 08:59:13.299940 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:13 crc kubenswrapper[4776]: I1208 08:59:13.299990 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:13 crc kubenswrapper[4776]: I1208 08:59:13.299999 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:13 crc kubenswrapper[4776]: I1208 08:59:13.300015 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:13 crc kubenswrapper[4776]: I1208 08:59:13.300026 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:13Z","lastTransitionTime":"2025-12-08T08:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:13 crc kubenswrapper[4776]: I1208 08:59:13.343497 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 08:59:13 crc kubenswrapper[4776]: E1208 08:59:13.343611 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 08:59:13 crc kubenswrapper[4776]: I1208 08:59:13.403036 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:13 crc kubenswrapper[4776]: I1208 08:59:13.403098 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:13 crc kubenswrapper[4776]: I1208 08:59:13.403116 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:13 crc kubenswrapper[4776]: I1208 08:59:13.403139 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:13 crc kubenswrapper[4776]: I1208 08:59:13.403158 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:13Z","lastTransitionTime":"2025-12-08T08:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:13 crc kubenswrapper[4776]: I1208 08:59:13.506017 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:13 crc kubenswrapper[4776]: I1208 08:59:13.506078 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:13 crc kubenswrapper[4776]: I1208 08:59:13.506095 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:13 crc kubenswrapper[4776]: I1208 08:59:13.506120 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:13 crc kubenswrapper[4776]: I1208 08:59:13.506138 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:13Z","lastTransitionTime":"2025-12-08T08:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:13 crc kubenswrapper[4776]: I1208 08:59:13.608683 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:13 crc kubenswrapper[4776]: I1208 08:59:13.608753 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:13 crc kubenswrapper[4776]: I1208 08:59:13.608777 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:13 crc kubenswrapper[4776]: I1208 08:59:13.608807 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:13 crc kubenswrapper[4776]: I1208 08:59:13.608828 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:13Z","lastTransitionTime":"2025-12-08T08:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:13 crc kubenswrapper[4776]: I1208 08:59:13.712294 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:13 crc kubenswrapper[4776]: I1208 08:59:13.712398 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:13 crc kubenswrapper[4776]: I1208 08:59:13.712424 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:13 crc kubenswrapper[4776]: I1208 08:59:13.712464 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:13 crc kubenswrapper[4776]: I1208 08:59:13.712493 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:13Z","lastTransitionTime":"2025-12-08T08:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:13 crc kubenswrapper[4776]: I1208 08:59:13.815263 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:13 crc kubenswrapper[4776]: I1208 08:59:13.815314 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:13 crc kubenswrapper[4776]: I1208 08:59:13.815326 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:13 crc kubenswrapper[4776]: I1208 08:59:13.815343 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:13 crc kubenswrapper[4776]: I1208 08:59:13.815354 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:13Z","lastTransitionTime":"2025-12-08T08:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:13 crc kubenswrapper[4776]: I1208 08:59:13.918365 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:13 crc kubenswrapper[4776]: I1208 08:59:13.918439 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:13 crc kubenswrapper[4776]: I1208 08:59:13.918457 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:13 crc kubenswrapper[4776]: I1208 08:59:13.918482 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:13 crc kubenswrapper[4776]: I1208 08:59:13.918499 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:13Z","lastTransitionTime":"2025-12-08T08:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.021382 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.021424 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.021437 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.021453 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.021465 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:14Z","lastTransitionTime":"2025-12-08T08:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.124631 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.124995 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.125016 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.125043 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.125062 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:14Z","lastTransitionTime":"2025-12-08T08:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.227573 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.227637 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.227648 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.227669 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.227683 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:14Z","lastTransitionTime":"2025-12-08T08:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.330565 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.330649 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.330667 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.330689 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.330706 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:14Z","lastTransitionTime":"2025-12-08T08:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.343325 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.343412 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 08:59:14 crc kubenswrapper[4776]: E1208 08:59:14.343552 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 08:59:14 crc kubenswrapper[4776]: E1208 08:59:14.343758 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.360097 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8k6qx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5628a2b5-b886-4883-93a3-fefc471f19e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df73be62036047cccc38e737abc2818aa510b308934c0bfbcac348aba382f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9
c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58b2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8k6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:14Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.384749 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"977ead65-d7ad-477b-8656-2b55e70b918b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7ab99c039bd5c0c0ab52768821797dd9e303afb9c6065ef021e8c2be372df9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3089eb6893beac1862850a94ef99af25f4ce9c0f020cc795611d9a620966e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86e9bc666e17ca52777d0275d4a50cbb70aad0c77e3413328fa6bd7b42d8a339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7cac944c0f3bfb4cc7d00471deb5d44d498a7131b7d950305a78bd8a5d2ea7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e7656c1155b52195abce09bca5ecd214a957ae7009293b99f16b3390c671fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:14Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.396002 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:14Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.408002 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5x9ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58507405-6bea-4859-a4e8-6ed046b50323\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06ef510d9c943d7c5d8a7a801dc1e7ca8cbfa7f6995ac8bedc835e0e8a762f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa552
85a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa55285a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5x9ft\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:14Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.418818 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fdg6t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56dfa7df-2ee8-4408-a283-5a8521175a0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5173ff239c6373649911372a6d6c1665958296afec2528b9e0a492a0722f5b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z44lf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fdg6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:14Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.432418 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.432467 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.432483 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.432504 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.432519 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:14Z","lastTransitionTime":"2025-12-08T08:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.442205 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-555j6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04b00e982993068dd2f274ce749a8b994561657d124122969adbb79c53658ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-555j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:14Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.463058 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e518469-5b3b-4055-a0f0-075dc48b1c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f688c89d388f271d5313cac8f933d6e43da463b2d6da546151a2bbff7350f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://712accfff1a4a51efc4d3d2a90e3e2d6c47eb04f5cd3d72b800e0778f0963b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a95beb9eadee827797268eb99d707a8782239962485a6f258adea6f3ee994c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cf78bd635e2922e1ade70a5fb24aebf3845f59d6aae829d8cba7c1c44da75ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afb59149e1f7b1462fc1ae949150d5c137905504ba2366db083abd13b4db6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e06c373c06904c837e3c6b4b8d26d2b2e34aa17db8079de76e1e33594d284b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fa4ef35cbccec29c2e64d71b738e1432df60785145aca17763c0e2581a92439\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8d4d85ae7a43d2800c11a9bfdc7f1c827bdb6628891b9a0c29140bf869307ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-swbsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:14Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.473970 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d4304c3-81ab-4418-b5c2-017f74581e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d02311f25656790164bb155ddb4e4d32fb9e7919033dd47c38c9fb321c185743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026
b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb3d5082006dae792f107dc632bc00d5004be69f95f6d76150d816fb542ce9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be261308f50cfa4db35cf58e829c102496de88ed0336685562f39e46e498a460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bf08db453e1dd51d190b2bd855af295af04cb4d3a8a42e0020136c50546aef2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:14Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.495411 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"343d6e00-54f7-4228-a8da-a43041894b26\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:4
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee00c7d6720b72fb9ecc27cc1ad2f762db9ac30bc3e108e37e2d4e2a96639cf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:14Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.508550 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:14Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.520666 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:14Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.535117 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed339964d431767a3616559947e832417f1052991849f141c380c6e64c3c03f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:14Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.535327 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.535359 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.535373 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.535392 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.535404 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:14Z","lastTransitionTime":"2025-12-08T08:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.549266 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b411334ce6d2d6fc026625d538807bc1dce879f0053babe7d97f302819a4e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e498b38932d3fd2da572e5ca5d560b7725a394f13fd9a30543244b1f3c0dca61\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:14Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.561097 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da360484aaa537076345b7a6830a83ac2a89ffb47f86d6865049c4f19688546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-08T08:59:14Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.572848 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9788ab1-1031-4103-a769-a4b3177c7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c2915bc116e8d911ac3c615649564174eb7e760c7e63fe06aee942d7cc2e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860aab951ce65f5efd9e17e4d93c383ea508088d26c2f758bf8d7176d2fb96f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jkmbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:14Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.586761 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swbsc_1e518469-5b3b-4055-a0f0-075dc48b1c79/ovnkube-controller/0.log" 
Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.593477 4776 generic.go:334] "Generic (PLEG): container finished" podID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerID="3fa4ef35cbccec29c2e64d71b738e1432df60785145aca17763c0e2581a92439" exitCode=1 Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.593524 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" event={"ID":"1e518469-5b3b-4055-a0f0-075dc48b1c79","Type":"ContainerDied","Data":"3fa4ef35cbccec29c2e64d71b738e1432df60785145aca17763c0e2581a92439"} Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.594297 4776 scope.go:117] "RemoveContainer" containerID="3fa4ef35cbccec29c2e64d71b738e1432df60785145aca17763c0e2581a92439" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.614479 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:14Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.634630 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5x9ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58507405-6bea-4859-a4e8-6ed046b50323\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06ef510d9c943d7c5d8a7a801dc1e7ca8cbfa7f6995ac8bedc835e0e8a762f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa552
85a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa55285a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5x9ft\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:14Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.638893 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.638945 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.638959 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.638987 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.639007 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:14Z","lastTransitionTime":"2025-12-08T08:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.656896 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977ead65-d7ad-477b-8656-2b55e70b918b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7ab99c039bd5c0c0ab52768821797dd9e303afb9c6065ef021e8c2be372df9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3089eb6893beac1862850a94ef99af25f4ce9c0f020cc795611d9a620966e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86e9bc666e17ca52777d0275d4a50cbb70aad0c77e3413328fa6bd7b42d8a339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7cac944c0f3bfb4cc7d00471deb5d44d498a7131b7d950305a78bd8a5d2ea7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e7656c1155b52195abce09bca5ecd214a957ae7009293b99f16b3390c671fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:14Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.673486 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d4304c3-81ab-4418-b5c2-017f74581e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d02311f25656790164bb155ddb4e4d32fb9e7919033dd47c38c9fb321c185743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb3d5082006dae792f107dc632bc00d5004be69f95f6d76150d816fb542ce9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be261308f50cfa4db35cf58e829c102496de88ed0336685562f39e46e498a460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bf08db453e1dd51d190b2bd855af295af04cb4d3a8a42e0020136c50546aef2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:14Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.686056 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"343d6e00-54f7-4228-a8da-a43041894b26\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:4
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee00c7d6720b72fb9ecc27cc1ad2f762db9ac30bc3e108e37e2d4e2a96639cf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:14Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.699973 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:14Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.710932 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:14Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.719620 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fdg6t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56dfa7df-2ee8-4408-a283-5a8521175a0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5173ff239c6373649911372a6d6c1665958296afec2528b9e0a492a0722f5b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z44lf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fdg6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:14Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.736915 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-555j6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04b00e982993068dd2f274ce749a8b994561657d124122969adbb79c53658ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-555j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:14Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.740988 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.741024 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.741036 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.741053 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.741065 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:14Z","lastTransitionTime":"2025-12-08T08:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.755728 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e518469-5b3b-4055-a0f0-075dc48b1c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f688c89d388f271d5313cac8f933d6e43da463b2d6da546151a2bbff7350f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://712accfff1a4a51efc4d3d2a90e3e2d6c47eb04f5cd3d72b800e0778f0963b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a95beb9eadee827797268eb99d707a8782239962485a6f258adea6f3ee994c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cf78bd635e2922e1ade70a5fb24aebf3845f59d6aae829d8cba7c1c44da75ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afb59149e1f7b1462fc1ae949150d5c137905504ba2366db083abd13b4db6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e06c373c06904c837e3c6b4b8d26d2b2e34aa17db8079de76e1e33594d284b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fa4ef35cbccec29c2e64d71b738e1432df60785145aca17763c0e2581a92439\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fa4ef35cbccec29c2e64d71b738e1432df60785145aca17763c0e2581a92439\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T08:59:14Z\\\",\\\"message\\\":\\\"v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1208 08:59:13.436245 6046 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1208 08:59:13.436384 6046 
reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1208 08:59:13.436569 6046 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1208 08:59:13.436657 6046 factory.go:656] Stopping watch factory\\\\nI1208 08:59:13.436680 6046 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1208 08:59:13.436897 6046 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1208 08:59:13.437034 6046 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1208 08:59:13.437208 6046 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1208 08:59:13.437451 6046 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1208 08:59:13.437785 6046 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8d4d85ae7a43d2800c11a9bfdc7f1c827bdb6628891b9a0c29140bf869307ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab
6defba292b3c82fafd028387e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-swbsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:14Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.769631 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b411334ce6d2d6fc026625d538807bc1dce879f0053babe7d97f302819a4e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e498b38932d3fd2da572e5ca5d560b7725a394f13fd9a30543244b1f3c0dca61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:14Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.783941 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da360484aaa537076345b7a6830a83ac2a89ffb47f86d6865049c4f19688546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-08T08:59:14Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.796356 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9788ab1-1031-4103-a769-a4b3177c7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c2915bc116e8d911ac3c615649564174eb7e760c7e63fe06aee942d7cc2e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860aab951ce65f5efd9e17e4d93c383ea508088d26c2f758bf8d7176d2fb96f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jkmbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:14Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.815365 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed339964d431767a3616559947e832417f1052991849f141c380c6e64c3c03f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:14Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.828973 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8k6qx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5628a2b5-b886-4883-93a3-fefc471f19e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df73be62036047cccc38e737abc2818aa510b308934c0bfbcac348aba382f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58b2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8k6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:14Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.843632 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.843681 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.843691 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.843709 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.843720 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:14Z","lastTransitionTime":"2025-12-08T08:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.945890 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.945936 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.945951 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.945971 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:14 crc kubenswrapper[4776]: I1208 08:59:14.945986 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:14Z","lastTransitionTime":"2025-12-08T08:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.049356 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.049407 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.049426 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.049455 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.049473 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:15Z","lastTransitionTime":"2025-12-08T08:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.153260 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.153330 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.153353 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.153382 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.153406 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:15Z","lastTransitionTime":"2025-12-08T08:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.268019 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.268092 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.268108 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.268132 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.268152 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:15Z","lastTransitionTime":"2025-12-08T08:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.343537 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 08:59:15 crc kubenswrapper[4776]: E1208 08:59:15.343682 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.370503 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.370545 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.370553 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.370570 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.370580 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:15Z","lastTransitionTime":"2025-12-08T08:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.473521 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.473562 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.473572 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.473593 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.473604 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:15Z","lastTransitionTime":"2025-12-08T08:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.576669 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.576726 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.576738 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.576760 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.576772 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:15Z","lastTransitionTime":"2025-12-08T08:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.578224 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.578272 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.578281 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.578301 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.578316 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:15Z","lastTransitionTime":"2025-12-08T08:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:15 crc kubenswrapper[4776]: E1208 08:59:15.591307 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ebf5967-b40e-4612-8f34-c965ce3a7e5b\\\",\\\"systemUUID\\\":\\\"c2909369-742b-49a0-ae37-af59748afd08\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:15Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.596370 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.596423 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.596438 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.596463 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.596478 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:15Z","lastTransitionTime":"2025-12-08T08:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.600093 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swbsc_1e518469-5b3b-4055-a0f0-075dc48b1c79/ovnkube-controller/0.log" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.602594 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" event={"ID":"1e518469-5b3b-4055-a0f0-075dc48b1c79","Type":"ContainerStarted","Data":"61ac2e26e184fc559a20772131feba0a567e2b6c5fb50046986ccad84a947c4f"} Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.603464 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:15 crc kubenswrapper[4776]: E1208 08:59:15.612865 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ebf5967-b40e-4612-8f34-c965ce3a7e5b\\\",\\\"systemUUID\\\":\\\"c2909369-742b-49a0-ae37-af59748afd08\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:15Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.616704 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed339964d431767a3616559947e832417f1052991849f141c380c6e64c3c03f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:15Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.617958 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.618030 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.618053 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.618083 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.618105 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:15Z","lastTransitionTime":"2025-12-08T08:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:15 crc kubenswrapper[4776]: E1208 08:59:15.632962 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ebf5967-b40e-4612-8f34-c965ce3a7e5b\\\",\\\"systemUUID\\\":\\\"c2909369-742b-49a0-ae37-af59748afd08\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:15Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.633679 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b411334ce6d2d6fc026625d538807bc1dce879f0053babe7d97f302819a4e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e498b38932d3fd2da572e5ca5d560b7725a394f13fd9a30543244b1f3c0dca61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:15Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:15 crc 
kubenswrapper[4776]: I1208 08:59:15.637263 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.637320 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.637338 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.637367 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.637384 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:15Z","lastTransitionTime":"2025-12-08T08:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.646850 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da360484aaa537076345b7a6830a83ac2a89ffb47f86d6865049c4f19688546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:15Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:15 crc kubenswrapper[4776]: E1208 08:59:15.650118 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"message\\\":\\\"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redh
at/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99
d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815
\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\"
:448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ebf5967-b40e-4612-8f34-c965ce3a7e5b\\\",\\\"systemUUID\\\":\\\"c2909369-742b-49a0-ae37-af59748afd08\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:15Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.653350 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.653397 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.653407 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.653425 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.653438 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:15Z","lastTransitionTime":"2025-12-08T08:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.659642 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9788ab1-1031-4103-a769-a4b3177c7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c2915bc116e8d911ac3c615649564174eb7e760c7e63fe06aee942d7cc2e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\
\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860aab951ce65f5efd9e17e4d93c383ea508088d26c2f758bf8d7176d2fb96f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jkmbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:15Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:15 crc kubenswrapper[4776]: E1208 08:59:15.663876 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ebf5967-b40e-4612-8f34-c965ce3a7e5b\\\",\\\"systemUUID\\\":\\\"c2909369-742b-49a0-ae37-af59748afd08\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:15Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:15 crc kubenswrapper[4776]: E1208 08:59:15.664030 4776 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.671557 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8k6qx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5628a2b5-b886-4883-93a3-fefc471f19e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df73be62036047cccc38e737abc2818aa510b308934c0bfbcac348aba382f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58b2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8k6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:15Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.679471 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.679533 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.679548 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.679571 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.679590 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:15Z","lastTransitionTime":"2025-12-08T08:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.694274 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977ead65-d7ad-477b-8656-2b55e70b918b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7ab99c039bd5c0c0ab52768821797dd9e303afb9c6065ef021e8c2be372df9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3089eb6893beac1862850a94ef99af25f4ce9c0f020cc795611d9a620966e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86e9bc666e17ca52777d0275d4a50cbb70aad0c77e3413328fa6bd7b42d8a339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7cac944c0f3bfb4cc7d00471deb5d44d498a7131b7d950305a78bd8a5d2ea7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e7656c1155b52195abce09bca5ecd214a957ae7009293b99f16b3390c671fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:15Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.709614 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:15Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.724864 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5x9ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58507405-6bea-4859-a4e8-6ed046b50323\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06ef510d9c943d7c5d8a7a801dc1e7ca8cbfa7f6995ac8bedc835e0e8a762f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa552
85a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa55285a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5x9ft\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:15Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.743108 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:15Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.755328 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fdg6t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56dfa7df-2ee8-4408-a283-5a8521175a0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5173ff239c6373649911372a6d6c1665958296afec2528b9e0a492a0722f5b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z44lf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fdg6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:15Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.766737 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-555j6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04b00e982993068dd2f274ce749a8b994561657d124122969adbb79c53658ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-555j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:15Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.781663 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.781712 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.781726 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.781749 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.781764 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:15Z","lastTransitionTime":"2025-12-08T08:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.785275 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e518469-5b3b-4055-a0f0-075dc48b1c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f688c89d388f271d5313cac8f933d6e43da463b2d6da546151a2bbff7350f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://712accfff1a4a51efc4d3d2a90e3e2d6c47eb04f5cd3d72b800e0778f0963b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a95beb9eadee827797268eb99d707a8782239962485a6f258adea6f3ee994c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cf78bd635e2922e1ade70a5fb24aebf3845f59d6aae829d8cba7c1c44da75ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afb59149e1f7b1462fc1ae949150d5c137905504ba2366db083abd13b4db6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e06c373c06904c837e3c6b4b8d26d2b2e34aa17db8079de76e1e33594d284b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ac2e26e184fc559a20772131feba0a567e2b6c5fb50046986ccad84a947c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fa4ef35cbccec29c2e64d71b738e1432df60785145aca17763c0e2581a92439\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T08:59:14Z\\\",\\\"message\\\":\\\"v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1208 08:59:13.436245 6046 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1208 08:59:13.436384 6046 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1208 08:59:13.436569 6046 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1208 08:59:13.436657 6046 factory.go:656] Stopping watch factory\\\\nI1208 08:59:13.436680 6046 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1208 08:59:13.436897 6046 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1208 08:59:13.437034 6046 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1208 08:59:13.437208 6046 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1208 08:59:13.437451 6046 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1208 08:59:13.437785 6046 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnku
be-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8d4d85ae7a43d2800c11a9bfdc7f1c827bdb6628891b9a0c29140bf869307ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
ecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-swbsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:15Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.800094 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d4304c3-81ab-4418-b5c2-017f74581e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d02311f25656790164bb155ddb4e4d32fb9e7919033dd47c38c9fb321c185743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb3d5082006dae792f107dc632bc00d5004be69f95f6d76150d816fb542ce9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be261308f50cfa4db35cf58e829c102496de88ed0336685562f39e46e498a460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bf08db453e1dd51d190b2bd855af295af04cb4d3a8a42e0020136c50546aef2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:15Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.822641 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"343d6e00-54f7-4228-a8da-a43041894b26\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:4
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee00c7d6720b72fb9ecc27cc1ad2f762db9ac30bc3e108e37e2d4e2a96639cf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:15Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.837303 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:15Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.884374 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.884649 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.884782 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.884917 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.885017 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:15Z","lastTransitionTime":"2025-12-08T08:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.936193 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q85ld"] Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.936875 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q85ld" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.939554 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.939718 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.954042 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:15Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.971483 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5x9ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58507405-6bea-4859-a4e8-6ed046b50323\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06ef510d9c943d7c5d8a7a801dc1e7ca8cbfa7f6995ac8bedc835e0e8a762f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa552
85a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa55285a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5x9ft\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:15Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.974144 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2d62ee56-e13f-4e44-8abf-9d0fc3e423b8-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-q85ld\" (UID: \"2d62ee56-e13f-4e44-8abf-9d0fc3e423b8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q85ld" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.974305 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2d62ee56-e13f-4e44-8abf-9d0fc3e423b8-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-q85ld\" (UID: \"2d62ee56-e13f-4e44-8abf-9d0fc3e423b8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q85ld" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.974366 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plplv\" (UniqueName: \"kubernetes.io/projected/2d62ee56-e13f-4e44-8abf-9d0fc3e423b8-kube-api-access-plplv\") pod \"ovnkube-control-plane-749d76644c-q85ld\" (UID: \"2d62ee56-e13f-4e44-8abf-9d0fc3e423b8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q85ld" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.974438 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2d62ee56-e13f-4e44-8abf-9d0fc3e423b8-env-overrides\") pod \"ovnkube-control-plane-749d76644c-q85ld\" (UID: 
\"2d62ee56-e13f-4e44-8abf-9d0fc3e423b8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q85ld" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.986899 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.986953 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.986968 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.986990 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.987004 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:15Z","lastTransitionTime":"2025-12-08T08:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:15 crc kubenswrapper[4776]: I1208 08:59:15.996718 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977ead65-d7ad-477b-8656-2b55e70b918b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7ab99c039bd5c0c0ab52768821797dd9e303afb9c6065ef021e8c2be372df9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3089eb6893beac1862850a94ef99af25f4ce9c0f020cc795611d9a620966e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86e9bc666e17ca52777d0275d4a50cbb70aad0c77e3413328fa6bd7b42d8a339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7cac944c0f3bfb4cc7d00471deb5d44d498a7131b7d950305a78bd8a5d2ea7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e7656c1155b52195abce09bca5ecd214a957ae7009293b99f16b3390c671fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:15Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.011698 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d4304c3-81ab-4418-b5c2-017f74581e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d02311f25656790164bb155ddb4e4d32fb9e7919033dd47c38c9fb321c185743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb3d5082006dae792f107dc632bc00d5004be69f95f6d76150d816fb542ce9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be261308f50cfa4db35cf58e829c102496de88ed0336685562f39e46e498a460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bf08db453e1dd51d190b2bd855af295af04cb4d3a8a42e0020136c50546aef2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:16Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.032274 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"343d6e00-54f7-4228-a8da-a43041894b26\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:4
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee00c7d6720b72fb9ecc27cc1ad2f762db9ac30bc3e108e37e2d4e2a96639cf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:16Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.045823 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:16Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.061084 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:16Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.075467 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2d62ee56-e13f-4e44-8abf-9d0fc3e423b8-env-overrides\") pod \"ovnkube-control-plane-749d76644c-q85ld\" (UID: \"2d62ee56-e13f-4e44-8abf-9d0fc3e423b8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q85ld" Dec 08 
08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.075575 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2d62ee56-e13f-4e44-8abf-9d0fc3e423b8-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-q85ld\" (UID: \"2d62ee56-e13f-4e44-8abf-9d0fc3e423b8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q85ld" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.075660 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2d62ee56-e13f-4e44-8abf-9d0fc3e423b8-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-q85ld\" (UID: \"2d62ee56-e13f-4e44-8abf-9d0fc3e423b8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q85ld" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.075711 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plplv\" (UniqueName: \"kubernetes.io/projected/2d62ee56-e13f-4e44-8abf-9d0fc3e423b8-kube-api-access-plplv\") pod \"ovnkube-control-plane-749d76644c-q85ld\" (UID: \"2d62ee56-e13f-4e44-8abf-9d0fc3e423b8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q85ld" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.075812 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fdg6t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56dfa7df-2ee8-4408-a283-5a8521175a0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5173ff239c6373649911372a6d6c1665958296afec2528b9e0a492a0722f5b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z44lf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fdg6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:16Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.076165 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2d62ee56-e13f-4e44-8abf-9d0fc3e423b8-env-overrides\") pod \"ovnkube-control-plane-749d76644c-q85ld\" (UID: \"2d62ee56-e13f-4e44-8abf-9d0fc3e423b8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q85ld" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.076978 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2d62ee56-e13f-4e44-8abf-9d0fc3e423b8-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-q85ld\" (UID: \"2d62ee56-e13f-4e44-8abf-9d0fc3e423b8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q85ld" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.088223 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2d62ee56-e13f-4e44-8abf-9d0fc3e423b8-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-q85ld\" (UID: \"2d62ee56-e13f-4e44-8abf-9d0fc3e423b8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q85ld" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.090197 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.090263 4776 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.090278 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.090303 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.090323 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:16Z","lastTransitionTime":"2025-12-08T08:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.093943 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plplv\" (UniqueName: \"kubernetes.io/projected/2d62ee56-e13f-4e44-8abf-9d0fc3e423b8-kube-api-access-plplv\") pod \"ovnkube-control-plane-749d76644c-q85ld\" (UID: \"2d62ee56-e13f-4e44-8abf-9d0fc3e423b8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q85ld" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.099848 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-555j6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04b00e982993068dd2f274ce749a8b994561657d124122969adbb79c53658ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-555j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:16Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.123042 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e518469-5b3b-4055-a0f0-075dc48b1c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f688c89d388f271d5313cac8f933d6e43da463b2d6da546151a2bbff7350f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://712accfff1a4a51efc4d3d2a90e3e2d6c47eb04f5cd3d72b800e0778f0963b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a95beb9eadee827797268eb99d707a8782239962485a6f258adea6f3ee994c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cf78bd635e2922e1ade70a5fb24aebf3845f59d6aae829d8cba7c1c44da75ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afb59149e1f7b1462fc1ae949150d5c137905504ba2366db083abd13b4db6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e06c373c06904c837e3c6b4b8d26d2b2e34aa17db8079de76e1e33594d284b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ac2e26e184fc559a20772131feba0a567e2b6c5fb50046986ccad84a947c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fa4ef35cbccec29c2e64d71b738e1432df60785145aca17763c0e2581a92439\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T08:59:14Z\\\",\\\"message\\\":\\\"v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1208 08:59:13.436245 6046 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1208 08:59:13.436384 6046 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1208 08:59:13.436569 6046 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1208 08:59:13.436657 6046 factory.go:656] Stopping watch factory\\\\nI1208 08:59:13.436680 6046 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1208 08:59:13.436897 6046 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1208 08:59:13.437034 6046 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1208 08:59:13.437208 6046 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1208 08:59:13.437451 6046 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1208 08:59:13.437785 6046 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnku
be-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8d4d85ae7a43d2800c11a9bfdc7f1c827bdb6628891b9a0c29140bf869307ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
ecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-swbsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:16Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.138410 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b411334ce6d2d6fc026625d538807bc1dce879f0053babe7d97f302819a4e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e498b38932d3fd2da572e5ca5d560b7725a394f13fd9a30543244b1f3c0dca61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:16Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.152871 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da360484aaa537076345b7a6830a83ac2a89ffb47f86d6865049c4f19688546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-08T08:59:16Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.164977 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9788ab1-1031-4103-a769-a4b3177c7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c2915bc116e8d911ac3c615649564174eb7e760c7e63fe06aee942d7cc2e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860aab951ce65f5efd9e17e4d93c383ea508088d26c2f758bf8d7176d2fb96f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jkmbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:16Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.182217 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed339964d431767a3616559947e832417f1052991849f141c380c6e64c3c03f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:16Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.192899 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.192938 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.192950 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.192967 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.192981 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:16Z","lastTransitionTime":"2025-12-08T08:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.194003 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8k6qx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5628a2b5-b886-4883-93a3-fefc471f19e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df73be62036047cccc38e737abc2818aa510b308934c0bfbcac348aba382f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58b2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8k6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:16Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.208378 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q85ld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d62ee56-e13f-4e44-8abf-9d0fc3e423b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plplv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plplv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q85ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:16Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.251993 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q85ld" Dec 08 08:59:16 crc kubenswrapper[4776]: W1208 08:59:16.277241 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d62ee56_e13f_4e44_8abf_9d0fc3e423b8.slice/crio-f95232676e8594dc0355088c6ce34f78bd4cd20b6a72764390785afde303af23 WatchSource:0}: Error finding container f95232676e8594dc0355088c6ce34f78bd4cd20b6a72764390785afde303af23: Status 404 returned error can't find the container with id f95232676e8594dc0355088c6ce34f78bd4cd20b6a72764390785afde303af23 Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.296364 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.297161 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.297245 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.297286 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:16 crc 
kubenswrapper[4776]: I1208 08:59:16.297314 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:16Z","lastTransitionTime":"2025-12-08T08:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.343043 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.343118 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 08:59:16 crc kubenswrapper[4776]: E1208 08:59:16.343295 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 08:59:16 crc kubenswrapper[4776]: E1208 08:59:16.343468 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.401395 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.401449 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.401465 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.401487 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.401501 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:16Z","lastTransitionTime":"2025-12-08T08:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.505926 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.505974 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.505985 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.506007 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.506022 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:16Z","lastTransitionTime":"2025-12-08T08:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.608743 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.608781 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.608791 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.608811 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.608822 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:16Z","lastTransitionTime":"2025-12-08T08:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.610816 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swbsc_1e518469-5b3b-4055-a0f0-075dc48b1c79/ovnkube-controller/1.log" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.611700 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swbsc_1e518469-5b3b-4055-a0f0-075dc48b1c79/ovnkube-controller/0.log" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.615296 4776 generic.go:334] "Generic (PLEG): container finished" podID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerID="61ac2e26e184fc559a20772131feba0a567e2b6c5fb50046986ccad84a947c4f" exitCode=1 Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.615403 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" event={"ID":"1e518469-5b3b-4055-a0f0-075dc48b1c79","Type":"ContainerDied","Data":"61ac2e26e184fc559a20772131feba0a567e2b6c5fb50046986ccad84a947c4f"} Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.615477 4776 scope.go:117] "RemoveContainer" containerID="3fa4ef35cbccec29c2e64d71b738e1432df60785145aca17763c0e2581a92439" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.616947 4776 scope.go:117] "RemoveContainer" containerID="61ac2e26e184fc559a20772131feba0a567e2b6c5fb50046986ccad84a947c4f" Dec 08 08:59:16 crc kubenswrapper[4776]: E1208 08:59:16.617249 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-swbsc_openshift-ovn-kubernetes(1e518469-5b3b-4055-a0f0-075dc48b1c79)\"" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.621614 4776 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q85ld" event={"ID":"2d62ee56-e13f-4e44-8abf-9d0fc3e423b8","Type":"ContainerStarted","Data":"f95232676e8594dc0355088c6ce34f78bd4cd20b6a72764390785afde303af23"} Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.645748 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e518469-5b3b-4055-a0f0-075dc48b1c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f688c89d388f271d5313cac8f933d6e43da463b2d6da546151a2bbff7350f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://712accfff1a4a51efc4d3d2a90e3e2d6c47eb04f5cd3d72b800e0778f0963b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a95beb9eadee827797268eb99d707a8782239962485a6f258adea6f3ee994c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cf78bd635e2922e1ade70a5fb24aebf3845f59d6aae829d8cba7c1c44da75ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afb59149e1f7b1462fc1ae949150d5c137905504ba2366db083abd13b4db6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e06c373c06904c837e3c6b4b8d26d2b2e34aa17db8079de76e1e33594d284b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ac2e26e184fc559a20772131feba0a567e2b6c5fb50046986ccad84a947c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fa4ef35cbccec29c2e64d71b738e1432df60785145aca17763c0e2581a92439\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T08:59:14Z\\\",\\\"message\\\":\\\"v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1208 08:59:13.436245 6046 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1208 08:59:13.436384 6046 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1208 08:59:13.436569 6046 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1208 08:59:13.436657 6046 factory.go:656] Stopping watch factory\\\\nI1208 08:59:13.436680 6046 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1208 08:59:13.436897 6046 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1208 08:59:13.437034 6046 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1208 08:59:13.437208 6046 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1208 08:59:13.437451 6046 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1208 08:59:13.437785 6046 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ac2e26e184fc559a20772131feba0a567e2b6c5fb50046986ccad84a947c4f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"message\\\":\\\"bj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1208 08:59:15.721761 6173 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1208 08:59:15.721766 6173 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI1208 08:59:15.721771 6173 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI1208 08:59:15.721775 6173 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1208 08:59:15.721624 6173 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1208 08:59:15.721793 6173 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1208 08:59:15.721797 6173 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1208 08:59:15.721633 6173 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1208 08:59:15.721806 6173 ovn.go:134] Ensuring zone local for Pod opens\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"
/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8d4d85ae7a43d2800c11a9bfdc7f1c827bdb6628891b9a0c29140bf869307ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-swbsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:16Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.664458 4776 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d4304c3-81ab-4418-b5c2-017f74581e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d02311f25656790164bb155ddb4e4d32fb9e7919033dd47c38c9fb321c185743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb3d5082006dae792f107dc632bc00d5004be69f95f6d76150d816fb542ce9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825
771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be261308f50cfa4db35cf58e829c102496de88ed0336685562f39e46e498a460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bf08db453e1dd51d190b2bd855af295af04cb4d3a8a42e0020136c50546aef2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-co
ntroller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:16Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.686825 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"343d6e00-54f7-4228-a8da-a43041894b26\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:4
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee00c7d6720b72fb9ecc27cc1ad2f762db9ac30bc3e108e37e2d4e2a96639cf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:16Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.703312 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:16Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.712692 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.712731 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.712742 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.712761 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.712774 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:16Z","lastTransitionTime":"2025-12-08T08:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.722763 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:16Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.738238 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fdg6t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56dfa7df-2ee8-4408-a283-5a8521175a0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5173ff239c6373649911372a6d6c1665958296afec2528b9e0a492a0722f5b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z44lf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fdg6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:16Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.760393 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-555j6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04b00e982993068dd2f274ce749a8b994561657d124122969adbb79c53658ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-555j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:16Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.779593 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed339964d431767a3616559947e832417f1052991849f141c380c6e64c3c03f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:16Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.793994 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b411334ce6d2d6fc026625d538807bc1dce879f0053babe7d97f302819a4e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e498b38932d3fd2da572e5ca5d560b7725a394f13fd9a30543244b1f3c0dca61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:16Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.812080 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da360484aaa537076345b7a6830a83ac2a89ffb47f86d6865049c4f19688546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-08T08:59:16Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.826683 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.826757 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.826774 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.826799 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.826816 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:16Z","lastTransitionTime":"2025-12-08T08:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.839804 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9788ab1-1031-4103-a769-a4b3177c7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c2915bc116e8d911ac3c615649564174eb7e760c7e63fe06aee942d7cc2e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860aab951ce65f5efd9e17e4d93c383ea508088d26c2f758bf8d7176d2fb96f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jkmbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:16Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.855210 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8k6qx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5628a2b5-b886-4883-93a3-fefc471f19e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df73be62036047cccc38e737abc2818aa510b308934c0bfbcac348aba382f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58b2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8k6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:16Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.872055 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q85ld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d62ee56-e13f-4e44-8abf-9d0fc3e423b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plplv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plplv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q85ld\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:16Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.895329 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977ead65-d7ad-477b-8656-2b55e70b918b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7ab99c039bd5c0c0ab52768821797dd9e303afb9c6065ef021e8c2be372df9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\
"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3089eb6893beac1862850a94ef99af25f4ce9c0f020cc795611d9a620966e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86e9bc666e17ca52777d0275d4a50cbb70aad0c77e3413328fa6bd7b42d8a339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7cac944c0f3bfb4cc7d00471
deb5d44d498a7131b7d950305a78bd8a5d2ea7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e7656c1155b52195abce09bca5ecd214a957ae7009293b99f16b3390c671fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"t
erminated\\\":{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:16Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.914753 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:16Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.930111 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.930170 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:16 crc 
kubenswrapper[4776]: I1208 08:59:16.930222 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.930251 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.930272 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:16Z","lastTransitionTime":"2025-12-08T08:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:16 crc kubenswrapper[4776]: I1208 08:59:16.933111 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5x9ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58507405-6bea-4859-a4e8-6ed046b50323\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06ef510d9c943d7c5d8a7a801dc1e7ca8cbfa7f6995ac8bedc835e0e8a762f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa552
85a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa55285a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5x9ft\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:16Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.034216 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.034676 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.034697 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.034726 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.034745 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:17Z","lastTransitionTime":"2025-12-08T08:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.142836 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.142937 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.142962 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.142993 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.143018 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:17Z","lastTransitionTime":"2025-12-08T08:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.247273 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.247340 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.247359 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.247388 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.247442 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:17Z","lastTransitionTime":"2025-12-08T08:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.343342 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 08:59:17 crc kubenswrapper[4776]: E1208 08:59:17.343630 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.351622 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.351688 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.351705 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.351727 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.351741 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:17Z","lastTransitionTime":"2025-12-08T08:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.456539 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.456601 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.456611 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.456628 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.456639 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:17Z","lastTransitionTime":"2025-12-08T08:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.559453 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.559482 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.559492 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.559506 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.559515 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:17Z","lastTransitionTime":"2025-12-08T08:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.625718 4776 scope.go:117] "RemoveContainer" containerID="61ac2e26e184fc559a20772131feba0a567e2b6c5fb50046986ccad84a947c4f" Dec 08 08:59:17 crc kubenswrapper[4776]: E1208 08:59:17.625885 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-swbsc_openshift-ovn-kubernetes(1e518469-5b3b-4055-a0f0-075dc48b1c79)\"" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.637743 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:17Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.648672 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fdg6t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56dfa7df-2ee8-4408-a283-5a8521175a0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5173ff239c6373649911372a6d6c1665958296afec2528b9e0a492a0722f5b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z44lf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fdg6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:17Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.662033 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.662113 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.662149 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.662220 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.662243 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:17Z","lastTransitionTime":"2025-12-08T08:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.662726 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-555j6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04b00e982993068dd2f274ce749a8b994561657d124122969adbb79c53658ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-555j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:17Z 
is after 2025-08-24T17:21:41Z" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.684261 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e518469-5b3b-4055-a0f0-075dc48b1c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f688c89d388f271d5313cac8f933d6e43da463b2d6da546151a2bbff7350f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://712accfff1a4a51efc4d3d2a90e3e2d6c47eb04f5cd3d72b800e0778f0963b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a95beb9eadee827797268eb99d707a8782239962485a6f258adea6f3ee994c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cf78bd635e2922e1ade70a5fb24aebf3845f59d6aae829d8cba7c1c44da75ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afb59149e1f7b1462fc1ae949150d5c137905504ba2366db083abd13b4db6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e06c373c06904c837e3c6b4b8d26d2b2e34aa17db8079de76e1e33594d284b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ac2e26e184fc559a20772131feba0a567e2b6c5fb50046986ccad84a947c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ac2e26e184fc559a20772131feba0a567e2b6c5fb50046986ccad84a947c4f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"message\\\":\\\"bj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1208 08:59:15.721761 6173 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1208 08:59:15.721766 6173 ovn.go:134] Ensuring zone local for Pod 
openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI1208 08:59:15.721771 6173 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI1208 08:59:15.721775 6173 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1208 08:59:15.721624 6173 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1208 08:59:15.721793 6173 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1208 08:59:15.721797 6173 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1208 08:59:15.721633 6173 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1208 08:59:15.721806 6173 ovn.go:134] Ensuring zone local for Pod opens\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-swbsc_openshift-ovn-kubernetes(1e518469-5b3b-4055-a0f0-075dc48b1c79)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8d4d85ae7a43d2800c11a9bfdc7f1c827bdb6628891b9a0c29140bf869307ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101b
d0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-swbsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:17Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.706598 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 08:59:17 crc kubenswrapper[4776]: E1208 08:59:17.706724 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 08:59:33.706704848 +0000 UTC m=+49.969929870 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.706804 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.706858 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 08:59:17 crc kubenswrapper[4776]: E1208 08:59:17.706994 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 08 08:59:17 crc kubenswrapper[4776]: E1208 08:59:17.707012 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 08 08:59:17 crc kubenswrapper[4776]: E1208 08:59:17.707026 4776 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 08:59:17 crc kubenswrapper[4776]: E1208 08:59:17.707066 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-08 08:59:33.707053377 +0000 UTC m=+49.970278399 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.707488 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 08:59:17 crc kubenswrapper[4776]: E1208 08:59:17.707577 4776 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 08 08:59:17 crc kubenswrapper[4776]: E1208 08:59:17.707611 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-08 08:59:33.707601471 +0000 UTC m=+49.970826493 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 08 08:59:17 crc kubenswrapper[4776]: E1208 08:59:17.707373 4776 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 08 08:59:17 crc kubenswrapper[4776]: E1208 08:59:17.707653 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-08 08:59:33.707646092 +0000 UTC m=+49.970871114 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.711210 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d4304c3-81ab-4418-b5c2-017f74581e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d02311f25656790164bb155ddb4e4d32fb9e7919033dd47c38c9fb321c185743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
5-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb3d5082006dae792f107dc632bc00d5004be69f95f6d76150d816fb542ce9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be261308f50cfa4db35cf58e829c102496de88ed0336685562f39e46e498a460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bf08db453e1dd51d190b2bd855af295af04cb4d3a8a42e0020
136c50546aef2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:17Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.735057 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"343d6e00-54f7-4228-a8da-a43041894b26\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:4
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee00c7d6720b72fb9ecc27cc1ad2f762db9ac30bc3e108e37e2d4e2a96639cf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:17Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.757035 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:17Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.764638 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.764701 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.764717 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.764751 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.764765 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:17Z","lastTransitionTime":"2025-12-08T08:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.777277 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed339964d431767a3616559947e832417f1052991849f141c380c6e64c3c03f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:17Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.794153 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b411334ce6d2d6fc026625d538807bc1dce879f0053babe7d97f302819a4e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e498b38932d3fd2da572e5ca5d560b7725a394f13fd9a30543244b1f3c0dca61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:17Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.807935 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 08:59:17 crc kubenswrapper[4776]: E1208 08:59:17.808081 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 08 08:59:17 crc kubenswrapper[4776]: E1208 08:59:17.808099 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 08 08:59:17 crc kubenswrapper[4776]: E1208 08:59:17.808111 4776 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 08:59:17 crc kubenswrapper[4776]: E1208 08:59:17.808152 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-08 08:59:33.808138419 +0000 UTC m=+50.071363441 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.808424 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da360484aaa537076345b7a6830a83ac2a89ffb47f86d6865049c4f19688546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-scri
pt\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:17Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.822527 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9788ab1-1031-4103-a769-a4b3177c7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c2915bc116e8d911ac3c615649564174eb7e760c7e63fe06aee942d7cc2e31\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860aab951ce65f5efd9e17e4d93c383ea508088d26c2f758bf8d7176d2fb96f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-jkmbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:17Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.826310 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-kkhjg"] Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.827215 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 08:59:17 crc kubenswrapper[4776]: E1208 08:59:17.827385 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.836805 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8k6qx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5628a2b5-b886-4883-93a3-fefc471f19e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df73be62036047cccc38e737abc2818aa510b308934c0bfbcac348aba382f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\
"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58b2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8k6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:17Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.850688 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q85ld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d62ee56-e13f-4e44-8abf-9d0fc3e423b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plplv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plplv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q85ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:17Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.868119 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.868164 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.868198 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.868220 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.868232 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:17Z","lastTransitionTime":"2025-12-08T08:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.872591 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977ead65-d7ad-477b-8656-2b55e70b918b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7ab99c039bd5c0c0ab52768821797dd9e303afb9c6065ef021e8c2be372df9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3089eb6893beac1862850a94ef99af25f4ce9c0f020cc795611d9a620966e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86e9bc666e17ca52777d0275d4a50cbb70aad0c77e3413328fa6bd7b42d8a339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7cac944c0f3bfb4cc7d00471deb5d44d498a7131b7d950305a78bd8a5d2ea7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e7656c1155b52195abce09bca5ecd214a957ae7009293b99f16b3390c671fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:17Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.884281 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:17Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.897782 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5x9ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58507405-6bea-4859-a4e8-6ed046b50323\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06ef510d9c943d7c5d8a7a801dc1e7ca8cbfa7f6995ac8bedc835e0e8a762f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa552
85a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa55285a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5x9ft\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:17Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.908687 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vlfn\" (UniqueName: \"kubernetes.io/projected/99143b9c-a541-4c0e-8387-0dff0d557974-kube-api-access-7vlfn\") pod \"network-metrics-daemon-kkhjg\" (UID: \"99143b9c-a541-4c0e-8387-0dff0d557974\") " pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.908744 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/99143b9c-a541-4c0e-8387-0dff0d557974-metrics-certs\") pod \"network-metrics-daemon-kkhjg\" (UID: \"99143b9c-a541-4c0e-8387-0dff0d557974\") " pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.910602 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-555j6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04b00e982993068dd2f274ce749a8b994561657d124122969adbb79c53658ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-555j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:17Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.930761 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e518469-5b3b-4055-a0f0-075dc48b1c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f688c89d388f271d5313cac8f933d6e43da463b2d6da546151a2bbff7350f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://712accfff1a4a51efc4d3d2a90e3e2d6c47eb04f5cd3d72b800e0778f0963b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a95beb9eadee827797268eb99d707a8782239962485a6f258adea6f3ee994c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cf78bd635e2922e1ade70a5fb24aebf3845f59d6aae829d8cba7c1c44da75ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afb59149e1f7b1462fc1ae949150d5c137905504ba2366db083abd13b4db6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e06c373c06904c837e3c6b4b8d26d2b2e34aa17db8079de76e1e33594d284b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ac2e26e184fc559a20772131feba0a567e2b6c5fb50046986ccad84a947c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ac2e26e184fc559a20772131feba0a567e2b6c5fb50046986ccad84a947c4f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"message\\\":\\\"bj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1208 08:59:15.721761 6173 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1208 08:59:15.721766 6173 ovn.go:134] Ensuring zone local for Pod 
openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI1208 08:59:15.721771 6173 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI1208 08:59:15.721775 6173 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1208 08:59:15.721624 6173 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1208 08:59:15.721793 6173 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1208 08:59:15.721797 6173 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1208 08:59:15.721633 6173 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1208 08:59:15.721806 6173 ovn.go:134] Ensuring zone local for Pod opens\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-swbsc_openshift-ovn-kubernetes(1e518469-5b3b-4055-a0f0-075dc48b1c79)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8d4d85ae7a43d2800c11a9bfdc7f1c827bdb6628891b9a0c29140bf869307ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101b
d0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-swbsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:17Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.941556 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d4304c3-81ab-4418-b5c2-017f74581e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d02311f25656790164bb155ddb4e4d32fb9e7919033dd47c38c9fb321c185743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb3d5082006dae792f107dc632bc00d5004be69f95f6d76150d816fb542ce9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be261308f50cfa4db35cf58e829c102496de88ed0336685562f39e46e498a460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bf08db453e1dd51d190b2bd855af295af04cb4d3a8a42e0020136c50546aef2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:17Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.954958 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"343d6e00-54f7-4228-a8da-a43041894b26\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:4
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee00c7d6720b72fb9ecc27cc1ad2f762db9ac30bc3e108e37e2d4e2a96639cf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:17Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.968860 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:17Z is after 2025-08-24T17:21:41Z"
Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.970558 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.970627 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.970641 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.970665 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.970680 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:17Z","lastTransitionTime":"2025-12-08T08:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.980988 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:17Z is after 2025-08-24T17:21:41Z"
Dec 08 08:59:17 crc kubenswrapper[4776]: I1208 08:59:17.991299 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fdg6t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56dfa7df-2ee8-4408-a283-5a8521175a0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5173ff239c6373649911372a6d6c1665958296afec2528b9e0a492a0722f5b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z44lf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fdg6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:17Z is after 2025-08-24T17:21:41Z"
Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.003832 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed339964d431767a3616559947e832417f1052991849f141c380c6e64c3c03f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:18Z is after 2025-08-24T17:21:41Z"
Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.009643 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vlfn\" (UniqueName: \"kubernetes.io/projected/99143b9c-a541-4c0e-8387-0dff0d557974-kube-api-access-7vlfn\") pod \"network-metrics-daemon-kkhjg\" (UID: \"99143b9c-a541-4c0e-8387-0dff0d557974\") " pod="openshift-multus/network-metrics-daemon-kkhjg"
Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.009711 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/99143b9c-a541-4c0e-8387-0dff0d557974-metrics-certs\") pod \"network-metrics-daemon-kkhjg\" (UID: \"99143b9c-a541-4c0e-8387-0dff0d557974\") " pod="openshift-multus/network-metrics-daemon-kkhjg"
Dec 08 08:59:18 crc kubenswrapper[4776]: E1208 08:59:18.009845 4776 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 08 08:59:18 crc kubenswrapper[4776]: E1208 08:59:18.009889 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99143b9c-a541-4c0e-8387-0dff0d557974-metrics-certs podName:99143b9c-a541-4c0e-8387-0dff0d557974 nodeName:}" failed. No retries permitted until 2025-12-08 08:59:18.50987691 +0000 UTC m=+34.773101932 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/99143b9c-a541-4c0e-8387-0dff0d557974-metrics-certs") pod "network-metrics-daemon-kkhjg" (UID: "99143b9c-a541-4c0e-8387-0dff0d557974") : object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.019704 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b411334ce6d2d6fc026625d538807bc1dce879f0053babe7d97f302819a4e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e498b38932d3fd2da572e5ca5d560b7725a394f13fd9a30543244b1f3c0dca61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:18Z is after 2025-08-24T17:21:41Z"
Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.025017 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vlfn\" (UniqueName: \"kubernetes.io/projected/99143b9c-a541-4c0e-8387-0dff0d557974-kube-api-access-7vlfn\") pod \"network-metrics-daemon-kkhjg\" (UID: \"99143b9c-a541-4c0e-8387-0dff0d557974\") " pod="openshift-multus/network-metrics-daemon-kkhjg"
Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.032430 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da360484aaa537076345b7a6830a83ac2a89ffb47f86d6865049c4f19688546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:18Z is after 2025-08-24T17:21:41Z"
Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.048923 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9788ab1-1031-4103-a769-a4b3177c7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c2915bc116e8d911ac3c615649564174eb7e760c7e63fe06aee942d7cc2e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860aab951ce65f5efd9e17e4d93c383ea508088d26c2f758bf8d7176d2fb96f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jkmbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:18Z is after 2025-08-24T17:21:41Z"
Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.059793 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8k6qx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5628a2b5-b886-4883-93a3-fefc471f19e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df73be62036047cccc38e737abc2818aa510b308934c0bfbcac348aba382f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58b2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8k6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:18Z is after 2025-08-24T17:21:41Z"
Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.072350 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q85ld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d62ee56-e13f-4e44-8abf-9d0fc3e423b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plplv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plplv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q85ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:18Z is after 2025-08-24T17:21:41Z"
Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.073366 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.073459 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.073485 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.073522 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.073552 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:18Z","lastTransitionTime":"2025-12-08T08:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.082565 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kkhjg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99143b9c-a541-4c0e-8387-0dff0d557974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vlfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vlfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kkhjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:18Z is after 2025-08-24T17:21:41Z"
Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.106203 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977ead65-d7ad-477b-8656-2b55e70b918b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7ab99c039bd5c0c0ab52768821797dd9e303afb9c6065ef021e8c2be372df9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3089eb6893beac1862850a94ef99af25f4ce9c0f020cc795611d9a620966e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86e9bc666e17ca52777d0275d4a50cbb70aad0c77e3413328fa6bd7b42d8a339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7cac944c0f3bfb4cc7d00471deb5d44d498a7131b7d950305a78bd8a5d2ea7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e7656c1155b52195abce09bca5ecd214a957ae7009293b99f16b3390c671fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:18Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.122463 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:18Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.140167 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5x9ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58507405-6bea-4859-a4e8-6ed046b50323\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06ef510d9c943d7c5d8a7a801dc1e7ca8cbfa7f6995ac8bedc835e0e8a762f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa552
85a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa55285a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5x9ft\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:18Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.175901 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.175954 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.175965 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.175982 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.175996 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:18Z","lastTransitionTime":"2025-12-08T08:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.278292 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.278337 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.278346 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.278359 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.278368 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:18Z","lastTransitionTime":"2025-12-08T08:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.343320 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.343372 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 08:59:18 crc kubenswrapper[4776]: E1208 08:59:18.343522 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 08:59:18 crc kubenswrapper[4776]: E1208 08:59:18.343620 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.380946 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.381040 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.381064 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.381094 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.381117 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:18Z","lastTransitionTime":"2025-12-08T08:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.483020 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.483054 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.483069 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.483091 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.483102 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:18Z","lastTransitionTime":"2025-12-08T08:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.513581 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/99143b9c-a541-4c0e-8387-0dff0d557974-metrics-certs\") pod \"network-metrics-daemon-kkhjg\" (UID: \"99143b9c-a541-4c0e-8387-0dff0d557974\") " pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 08:59:18 crc kubenswrapper[4776]: E1208 08:59:18.513765 4776 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 08 08:59:18 crc kubenswrapper[4776]: E1208 08:59:18.513840 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99143b9c-a541-4c0e-8387-0dff0d557974-metrics-certs podName:99143b9c-a541-4c0e-8387-0dff0d557974 nodeName:}" failed. No retries permitted until 2025-12-08 08:59:19.51382378 +0000 UTC m=+35.777048802 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/99143b9c-a541-4c0e-8387-0dff0d557974-metrics-certs") pod "network-metrics-daemon-kkhjg" (UID: "99143b9c-a541-4c0e-8387-0dff0d557974") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.585428 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.585491 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.585500 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.585515 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.585525 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:18Z","lastTransitionTime":"2025-12-08T08:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.630189 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q85ld" event={"ID":"2d62ee56-e13f-4e44-8abf-9d0fc3e423b8","Type":"ContainerStarted","Data":"db449e03630aa1b44e8a2812e502a46239824bb59283396bfe92bb818df29fe2"} Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.632024 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swbsc_1e518469-5b3b-4055-a0f0-075dc48b1c79/ovnkube-controller/1.log" Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.687923 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.688000 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.688034 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.688066 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.688086 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:18Z","lastTransitionTime":"2025-12-08T08:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.792791 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.792834 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.792846 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.792860 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.792874 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:18Z","lastTransitionTime":"2025-12-08T08:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.896201 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.896262 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.896275 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.896298 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.896312 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:18Z","lastTransitionTime":"2025-12-08T08:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.998678 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.998716 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.998725 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.998740 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:18 crc kubenswrapper[4776]: I1208 08:59:18.998751 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:18Z","lastTransitionTime":"2025-12-08T08:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.132221 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.132266 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.132275 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.132289 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.132299 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:19Z","lastTransitionTime":"2025-12-08T08:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.234731 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.234776 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.234789 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.234806 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.234819 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:19Z","lastTransitionTime":"2025-12-08T08:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.336517 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.336587 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.336599 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.336621 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.336635 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:19Z","lastTransitionTime":"2025-12-08T08:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.342769 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 08:59:19 crc kubenswrapper[4776]: E1208 08:59:19.342882 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.342778 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 08:59:19 crc kubenswrapper[4776]: E1208 08:59:19.342945 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.439038 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.439496 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.439539 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.439559 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.439571 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:19Z","lastTransitionTime":"2025-12-08T08:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.536026 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/99143b9c-a541-4c0e-8387-0dff0d557974-metrics-certs\") pod \"network-metrics-daemon-kkhjg\" (UID: \"99143b9c-a541-4c0e-8387-0dff0d557974\") " pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 08:59:19 crc kubenswrapper[4776]: E1208 08:59:19.536226 4776 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 08 08:59:19 crc kubenswrapper[4776]: E1208 08:59:19.536313 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99143b9c-a541-4c0e-8387-0dff0d557974-metrics-certs podName:99143b9c-a541-4c0e-8387-0dff0d557974 nodeName:}" failed. No retries permitted until 2025-12-08 08:59:21.536294856 +0000 UTC m=+37.799519878 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/99143b9c-a541-4c0e-8387-0dff0d557974-metrics-certs") pod "network-metrics-daemon-kkhjg" (UID: "99143b9c-a541-4c0e-8387-0dff0d557974") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.541977 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.542032 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.542045 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.542063 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.542075 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:19Z","lastTransitionTime":"2025-12-08T08:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.643623 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q85ld" event={"ID":"2d62ee56-e13f-4e44-8abf-9d0fc3e423b8","Type":"ContainerStarted","Data":"46b483925a2a4fefd06621185c21e74da28d1f0ababf703e2637a8686881671f"} Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.644481 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.644546 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.644558 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.644579 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.644592 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:19Z","lastTransitionTime":"2025-12-08T08:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.662320 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fdg6t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56dfa7df-2ee8-4408-a283-5a8521175a0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5173ff239c6373649911372a6d6c1665958296afec2528b9e0a492a0722f5b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z44lf\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fdg6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:19Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.681054 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-555j6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04b00e982993068dd2f274ce749a8b994561657d124122969adbb79c53658ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvg\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-555j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:19Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.713793 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e518469-5b3b-4055-a0f0-075dc48b1c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f688c89d388f271d5313cac8f933d6e43da463b2d6da546151a2bbff7350f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://712accfff1a4a51efc4d3d2a90e3e2d6c47eb04f5cd3d72b800e0778f0963b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a95beb9eadee827797268eb99d707a8782239962485a6f258adea6f3ee994c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cf78bd635e2922e1ade70a5fb24aebf3845f59d6aae829d8cba7c1c44da75ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afb59149e1f7b1462fc1ae949150d5c137905504ba2366db083abd13b4db6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e06c373c06904c837e3c6b4b8d26d2b2e34aa17db8079de76e1e33594d284b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ac2e26e184fc559a20772131feba0a567e2b6c5fb50046986ccad84a947c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ac2e26e184fc559a20772131feba0a567e2b6c5fb50046986ccad84a947c4f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"message\\\":\\\"bj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1208 08:59:15.721761 6173 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1208 08:59:15.721766 6173 ovn.go:134] Ensuring zone local for Pod 
openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI1208 08:59:15.721771 6173 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI1208 08:59:15.721775 6173 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1208 08:59:15.721624 6173 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1208 08:59:15.721793 6173 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1208 08:59:15.721797 6173 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1208 08:59:15.721633 6173 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1208 08:59:15.721806 6173 ovn.go:134] Ensuring zone local for Pod opens\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-swbsc_openshift-ovn-kubernetes(1e518469-5b3b-4055-a0f0-075dc48b1c79)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8d4d85ae7a43d2800c11a9bfdc7f1c827bdb6628891b9a0c29140bf869307ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101b
d0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-swbsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:19Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.731305 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d4304c3-81ab-4418-b5c2-017f74581e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d02311f25656790164bb155ddb4e4d32fb9e7919033dd47c38c9fb321c185743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb3d5082006dae792f107dc632bc00d5004be69f95f6d76150d816fb542ce9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be261308f50cfa4db35cf58e829c102496de88ed0336685562f39e46e498a460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bf08db453e1dd51d190b2bd855af295af04cb4d3a8a42e0020136c50546aef2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:19Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.747346 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.747395 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.747404 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.747426 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.747437 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:19Z","lastTransitionTime":"2025-12-08T08:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.750896 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343d6e00-54f7-4228-a8da-a43041894b26\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountP
ath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube
-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee00c7d6720b72fb9ecc27cc1ad2f762db9ac30bc3e108e37e2d4e2a96639cf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\
\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:19Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.763950 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:19Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.777874 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:19Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.793072 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed339964d431767a3616559947e832417f1052991849f141c380c6e64c3c03f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:19Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.809112 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b411334ce6d2d6fc026625d538807bc1dce879f0053babe7d97f302819a4e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://e498b38932d3fd2da572e5ca5d560b7725a394f13fd9a30543244b1f3c0dca61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:19Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.820990 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da360484aaa537076345b7a6830a83ac2a89ffb47f86d6865049c4f19688546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-08T08:59:19Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.832690 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9788ab1-1031-4103-a769-a4b3177c7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c2915bc116e8d911ac3c615649564174eb7e760c7e63fe06aee942d7cc2e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860aab951ce65f5efd9e17e4d93c383ea508088d26c2f758bf8d7176d2fb96f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jkmbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:19Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.844236 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8k6qx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5628a2b5-b886-4883-93a3-fefc471f19e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df73be62036047cccc38e737abc2818aa510b308934c0bfbcac348aba382f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58b2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8k6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:19Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.849369 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.849443 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.849463 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.849489 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.849534 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:19Z","lastTransitionTime":"2025-12-08T08:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.858143 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q85ld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d62ee56-e13f-4e44-8abf-9d0fc3e423b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db449e03630aa1b44e8a2812e502a46239824bb59283396bfe92bb818df29fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plplv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b483925a2a4fefd06621185c21e74da28d1f0ababf703e2637a8686881671f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plplv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q85ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:19Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.870142 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kkhjg" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99143b9c-a541-4c0e-8387-0dff0d557974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vlfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vlfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kkhjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:19Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:19 crc 
kubenswrapper[4776]: I1208 08:59:19.889843 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977ead65-d7ad-477b-8656-2b55e70b918b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7ab99c039bd5c0c0ab52768821797dd9e303afb9c6065ef021e8c2be372df9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://6f3089eb6893beac1862850a94ef99af25f4ce9c0f020cc795611d9a620966e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86e9bc666e17ca52777d0275d4a50cbb70aad0c77e3413328fa6bd7b42d8a339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7cac944c0f3bfb4cc7d00471deb5d44d498a7131b7d950305a78bd8a5d2ea7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e7656c1155b52195abce09bca5ecd214a957ae7009293b99f16b3390c671fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:19Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.906418 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:19Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.921707 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5x9ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58507405-6bea-4859-a4e8-6ed046b50323\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06ef510d9c943d7c5d8a7a801dc1e7ca8cbfa7f6995ac8bedc835e0e8a762f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa552
85a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa55285a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5x9ft\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:19Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.952162 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.952231 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.952245 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.952265 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:19 crc kubenswrapper[4776]: I1208 08:59:19.952280 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:19Z","lastTransitionTime":"2025-12-08T08:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:20 crc kubenswrapper[4776]: I1208 08:59:20.055502 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:20 crc kubenswrapper[4776]: I1208 08:59:20.055558 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:20 crc kubenswrapper[4776]: I1208 08:59:20.055569 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:20 crc kubenswrapper[4776]: I1208 08:59:20.055590 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:20 crc kubenswrapper[4776]: I1208 08:59:20.055604 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:20Z","lastTransitionTime":"2025-12-08T08:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:20 crc kubenswrapper[4776]: I1208 08:59:20.158648 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:20 crc kubenswrapper[4776]: I1208 08:59:20.159050 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:20 crc kubenswrapper[4776]: I1208 08:59:20.159258 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:20 crc kubenswrapper[4776]: I1208 08:59:20.159483 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:20 crc kubenswrapper[4776]: I1208 08:59:20.159628 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:20Z","lastTransitionTime":"2025-12-08T08:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:20 crc kubenswrapper[4776]: I1208 08:59:20.263728 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:20 crc kubenswrapper[4776]: I1208 08:59:20.264205 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:20 crc kubenswrapper[4776]: I1208 08:59:20.264345 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:20 crc kubenswrapper[4776]: I1208 08:59:20.264539 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:20 crc kubenswrapper[4776]: I1208 08:59:20.264664 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:20Z","lastTransitionTime":"2025-12-08T08:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:20 crc kubenswrapper[4776]: I1208 08:59:20.346105 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 08:59:20 crc kubenswrapper[4776]: E1208 08:59:20.346508 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 08:59:20 crc kubenswrapper[4776]: I1208 08:59:20.347315 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 08:59:20 crc kubenswrapper[4776]: E1208 08:59:20.347483 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 08:59:20 crc kubenswrapper[4776]: I1208 08:59:20.366965 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:20 crc kubenswrapper[4776]: I1208 08:59:20.367007 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:20 crc kubenswrapper[4776]: I1208 08:59:20.367019 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:20 crc kubenswrapper[4776]: I1208 08:59:20.367038 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:20 crc kubenswrapper[4776]: I1208 08:59:20.367050 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:20Z","lastTransitionTime":"2025-12-08T08:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:20 crc kubenswrapper[4776]: I1208 08:59:20.472340 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:20 crc kubenswrapper[4776]: I1208 08:59:20.472400 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:20 crc kubenswrapper[4776]: I1208 08:59:20.472420 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:20 crc kubenswrapper[4776]: I1208 08:59:20.472448 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:20 crc kubenswrapper[4776]: I1208 08:59:20.472469 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:20Z","lastTransitionTime":"2025-12-08T08:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:20 crc kubenswrapper[4776]: I1208 08:59:20.575436 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:20 crc kubenswrapper[4776]: I1208 08:59:20.575518 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:20 crc kubenswrapper[4776]: I1208 08:59:20.575542 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:20 crc kubenswrapper[4776]: I1208 08:59:20.575573 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:20 crc kubenswrapper[4776]: I1208 08:59:20.575601 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:20Z","lastTransitionTime":"2025-12-08T08:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:20 crc kubenswrapper[4776]: I1208 08:59:20.679983 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:20 crc kubenswrapper[4776]: I1208 08:59:20.680050 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:20 crc kubenswrapper[4776]: I1208 08:59:20.680068 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:20 crc kubenswrapper[4776]: I1208 08:59:20.680095 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:20 crc kubenswrapper[4776]: I1208 08:59:20.680114 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:20Z","lastTransitionTime":"2025-12-08T08:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:20 crc kubenswrapper[4776]: I1208 08:59:20.783733 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:20 crc kubenswrapper[4776]: I1208 08:59:20.783799 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:20 crc kubenswrapper[4776]: I1208 08:59:20.783816 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:20 crc kubenswrapper[4776]: I1208 08:59:20.783845 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:20 crc kubenswrapper[4776]: I1208 08:59:20.783864 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:20Z","lastTransitionTime":"2025-12-08T08:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:20 crc kubenswrapper[4776]: I1208 08:59:20.887574 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:20 crc kubenswrapper[4776]: I1208 08:59:20.888016 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:20 crc kubenswrapper[4776]: I1208 08:59:20.888232 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:20 crc kubenswrapper[4776]: I1208 08:59:20.888390 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:20 crc kubenswrapper[4776]: I1208 08:59:20.888532 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:20Z","lastTransitionTime":"2025-12-08T08:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:20 crc kubenswrapper[4776]: I1208 08:59:20.992403 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:20 crc kubenswrapper[4776]: I1208 08:59:20.992474 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:20 crc kubenswrapper[4776]: I1208 08:59:20.992492 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:20 crc kubenswrapper[4776]: I1208 08:59:20.992519 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:20 crc kubenswrapper[4776]: I1208 08:59:20.992540 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:20Z","lastTransitionTime":"2025-12-08T08:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:21 crc kubenswrapper[4776]: I1208 08:59:21.096043 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:21 crc kubenswrapper[4776]: I1208 08:59:21.096499 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:21 crc kubenswrapper[4776]: I1208 08:59:21.096604 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:21 crc kubenswrapper[4776]: I1208 08:59:21.096701 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:21 crc kubenswrapper[4776]: I1208 08:59:21.096795 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:21Z","lastTransitionTime":"2025-12-08T08:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:21 crc kubenswrapper[4776]: I1208 08:59:21.201520 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:21 crc kubenswrapper[4776]: I1208 08:59:21.201601 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:21 crc kubenswrapper[4776]: I1208 08:59:21.201620 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:21 crc kubenswrapper[4776]: I1208 08:59:21.201650 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:21 crc kubenswrapper[4776]: I1208 08:59:21.201671 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:21Z","lastTransitionTime":"2025-12-08T08:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:21 crc kubenswrapper[4776]: I1208 08:59:21.304965 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:21 crc kubenswrapper[4776]: I1208 08:59:21.305019 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:21 crc kubenswrapper[4776]: I1208 08:59:21.305031 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:21 crc kubenswrapper[4776]: I1208 08:59:21.305051 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:21 crc kubenswrapper[4776]: I1208 08:59:21.305065 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:21Z","lastTransitionTime":"2025-12-08T08:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:21 crc kubenswrapper[4776]: I1208 08:59:21.343351 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 08:59:21 crc kubenswrapper[4776]: I1208 08:59:21.343363 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 08:59:21 crc kubenswrapper[4776]: E1208 08:59:21.343527 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 08:59:21 crc kubenswrapper[4776]: E1208 08:59:21.343664 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 08:59:21 crc kubenswrapper[4776]: I1208 08:59:21.408061 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:21 crc kubenswrapper[4776]: I1208 08:59:21.408130 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:21 crc kubenswrapper[4776]: I1208 08:59:21.408145 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:21 crc kubenswrapper[4776]: I1208 08:59:21.408169 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:21 crc kubenswrapper[4776]: I1208 08:59:21.408217 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:21Z","lastTransitionTime":"2025-12-08T08:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:21 crc kubenswrapper[4776]: I1208 08:59:21.513874 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:21 crc kubenswrapper[4776]: I1208 08:59:21.514412 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:21 crc kubenswrapper[4776]: I1208 08:59:21.514571 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:21 crc kubenswrapper[4776]: I1208 08:59:21.514717 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:21 crc kubenswrapper[4776]: I1208 08:59:21.514844 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:21Z","lastTransitionTime":"2025-12-08T08:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:21 crc kubenswrapper[4776]: I1208 08:59:21.561719 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/99143b9c-a541-4c0e-8387-0dff0d557974-metrics-certs\") pod \"network-metrics-daemon-kkhjg\" (UID: \"99143b9c-a541-4c0e-8387-0dff0d557974\") " pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 08:59:21 crc kubenswrapper[4776]: E1208 08:59:21.561960 4776 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 08 08:59:21 crc kubenswrapper[4776]: E1208 08:59:21.562416 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99143b9c-a541-4c0e-8387-0dff0d557974-metrics-certs podName:99143b9c-a541-4c0e-8387-0dff0d557974 nodeName:}" failed. No retries permitted until 2025-12-08 08:59:25.562386149 +0000 UTC m=+41.825611211 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/99143b9c-a541-4c0e-8387-0dff0d557974-metrics-certs") pod "network-metrics-daemon-kkhjg" (UID: "99143b9c-a541-4c0e-8387-0dff0d557974") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 08 08:59:21 crc kubenswrapper[4776]: I1208 08:59:21.618627 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:21 crc kubenswrapper[4776]: I1208 08:59:21.618731 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:21 crc kubenswrapper[4776]: I1208 08:59:21.618793 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:21 crc kubenswrapper[4776]: I1208 08:59:21.618823 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:21 crc kubenswrapper[4776]: I1208 08:59:21.618882 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:21Z","lastTransitionTime":"2025-12-08T08:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:21 crc kubenswrapper[4776]: I1208 08:59:21.722676 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:21 crc kubenswrapper[4776]: I1208 08:59:21.722738 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:21 crc kubenswrapper[4776]: I1208 08:59:21.722760 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:21 crc kubenswrapper[4776]: I1208 08:59:21.722791 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:21 crc kubenswrapper[4776]: I1208 08:59:21.722814 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:21Z","lastTransitionTime":"2025-12-08T08:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:21 crc kubenswrapper[4776]: I1208 08:59:21.826509 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:21 crc kubenswrapper[4776]: I1208 08:59:21.826582 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:21 crc kubenswrapper[4776]: I1208 08:59:21.826600 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:21 crc kubenswrapper[4776]: I1208 08:59:21.826630 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:21 crc kubenswrapper[4776]: I1208 08:59:21.826650 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:21Z","lastTransitionTime":"2025-12-08T08:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:21 crc kubenswrapper[4776]: I1208 08:59:21.929992 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:21 crc kubenswrapper[4776]: I1208 08:59:21.930069 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:21 crc kubenswrapper[4776]: I1208 08:59:21.930087 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:21 crc kubenswrapper[4776]: I1208 08:59:21.930116 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:21 crc kubenswrapper[4776]: I1208 08:59:21.930136 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:21Z","lastTransitionTime":"2025-12-08T08:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:22 crc kubenswrapper[4776]: I1208 08:59:22.033643 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:22 crc kubenswrapper[4776]: I1208 08:59:22.033691 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:22 crc kubenswrapper[4776]: I1208 08:59:22.033705 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:22 crc kubenswrapper[4776]: I1208 08:59:22.033725 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:22 crc kubenswrapper[4776]: I1208 08:59:22.033738 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:22Z","lastTransitionTime":"2025-12-08T08:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:22 crc kubenswrapper[4776]: I1208 08:59:22.136896 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:22 crc kubenswrapper[4776]: I1208 08:59:22.136949 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:22 crc kubenswrapper[4776]: I1208 08:59:22.136958 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:22 crc kubenswrapper[4776]: I1208 08:59:22.136976 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:22 crc kubenswrapper[4776]: I1208 08:59:22.136986 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:22Z","lastTransitionTime":"2025-12-08T08:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:22 crc kubenswrapper[4776]: I1208 08:59:22.239650 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:22 crc kubenswrapper[4776]: I1208 08:59:22.239717 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:22 crc kubenswrapper[4776]: I1208 08:59:22.239736 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:22 crc kubenswrapper[4776]: I1208 08:59:22.239762 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:22 crc kubenswrapper[4776]: I1208 08:59:22.239779 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:22Z","lastTransitionTime":"2025-12-08T08:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:22 crc kubenswrapper[4776]: I1208 08:59:22.342832 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 08:59:22 crc kubenswrapper[4776]: I1208 08:59:22.342880 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 08:59:22 crc kubenswrapper[4776]: E1208 08:59:22.343040 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 08:59:22 crc kubenswrapper[4776]: E1208 08:59:22.343323 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 08:59:22 crc kubenswrapper[4776]: I1208 08:59:22.343977 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:22 crc kubenswrapper[4776]: I1208 08:59:22.344042 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:22 crc kubenswrapper[4776]: I1208 08:59:22.344062 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:22 crc kubenswrapper[4776]: I1208 08:59:22.344094 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:22 crc kubenswrapper[4776]: I1208 08:59:22.344123 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:22Z","lastTransitionTime":"2025-12-08T08:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:22 crc kubenswrapper[4776]: I1208 08:59:22.447135 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:22 crc kubenswrapper[4776]: I1208 08:59:22.447236 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:22 crc kubenswrapper[4776]: I1208 08:59:22.447251 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:22 crc kubenswrapper[4776]: I1208 08:59:22.447275 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:22 crc kubenswrapper[4776]: I1208 08:59:22.447289 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:22Z","lastTransitionTime":"2025-12-08T08:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:22 crc kubenswrapper[4776]: I1208 08:59:22.550867 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:22 crc kubenswrapper[4776]: I1208 08:59:22.550916 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:22 crc kubenswrapper[4776]: I1208 08:59:22.550928 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:22 crc kubenswrapper[4776]: I1208 08:59:22.550945 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:22 crc kubenswrapper[4776]: I1208 08:59:22.550956 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:22Z","lastTransitionTime":"2025-12-08T08:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:22 crc kubenswrapper[4776]: I1208 08:59:22.655339 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:22 crc kubenswrapper[4776]: I1208 08:59:22.655447 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:22 crc kubenswrapper[4776]: I1208 08:59:22.655464 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:22 crc kubenswrapper[4776]: I1208 08:59:22.655487 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:22 crc kubenswrapper[4776]: I1208 08:59:22.655506 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:22Z","lastTransitionTime":"2025-12-08T08:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:22 crc kubenswrapper[4776]: I1208 08:59:22.758393 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:22 crc kubenswrapper[4776]: I1208 08:59:22.758429 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:22 crc kubenswrapper[4776]: I1208 08:59:22.758438 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:22 crc kubenswrapper[4776]: I1208 08:59:22.758454 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:22 crc kubenswrapper[4776]: I1208 08:59:22.758464 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:22Z","lastTransitionTime":"2025-12-08T08:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:22 crc kubenswrapper[4776]: I1208 08:59:22.862000 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:22 crc kubenswrapper[4776]: I1208 08:59:22.862081 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:22 crc kubenswrapper[4776]: I1208 08:59:22.862101 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:22 crc kubenswrapper[4776]: I1208 08:59:22.862131 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:22 crc kubenswrapper[4776]: I1208 08:59:22.862152 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:22Z","lastTransitionTime":"2025-12-08T08:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:22 crc kubenswrapper[4776]: I1208 08:59:22.966069 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:22 crc kubenswrapper[4776]: I1208 08:59:22.966130 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:22 crc kubenswrapper[4776]: I1208 08:59:22.966146 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:22 crc kubenswrapper[4776]: I1208 08:59:22.966217 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:22 crc kubenswrapper[4776]: I1208 08:59:22.966256 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:22Z","lastTransitionTime":"2025-12-08T08:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:23 crc kubenswrapper[4776]: I1208 08:59:23.070861 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:23 crc kubenswrapper[4776]: I1208 08:59:23.070933 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:23 crc kubenswrapper[4776]: I1208 08:59:23.070951 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:23 crc kubenswrapper[4776]: I1208 08:59:23.070981 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:23 crc kubenswrapper[4776]: I1208 08:59:23.071004 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:23Z","lastTransitionTime":"2025-12-08T08:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:23 crc kubenswrapper[4776]: I1208 08:59:23.174245 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:23 crc kubenswrapper[4776]: I1208 08:59:23.174306 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:23 crc kubenswrapper[4776]: I1208 08:59:23.174323 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:23 crc kubenswrapper[4776]: I1208 08:59:23.174351 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:23 crc kubenswrapper[4776]: I1208 08:59:23.174372 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:23Z","lastTransitionTime":"2025-12-08T08:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:23 crc kubenswrapper[4776]: I1208 08:59:23.277508 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:23 crc kubenswrapper[4776]: I1208 08:59:23.277555 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:23 crc kubenswrapper[4776]: I1208 08:59:23.277564 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:23 crc kubenswrapper[4776]: I1208 08:59:23.277581 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:23 crc kubenswrapper[4776]: I1208 08:59:23.277591 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:23Z","lastTransitionTime":"2025-12-08T08:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:23 crc kubenswrapper[4776]: I1208 08:59:23.343296 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 08:59:23 crc kubenswrapper[4776]: I1208 08:59:23.343348 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 08:59:23 crc kubenswrapper[4776]: E1208 08:59:23.343586 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 08:59:23 crc kubenswrapper[4776]: E1208 08:59:23.343699 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 08:59:23 crc kubenswrapper[4776]: I1208 08:59:23.380722 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:23 crc kubenswrapper[4776]: I1208 08:59:23.380778 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:23 crc kubenswrapper[4776]: I1208 08:59:23.380787 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:23 crc kubenswrapper[4776]: I1208 08:59:23.380803 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:23 crc kubenswrapper[4776]: I1208 08:59:23.380830 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:23Z","lastTransitionTime":"2025-12-08T08:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:23 crc kubenswrapper[4776]: I1208 08:59:23.484223 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:23 crc kubenswrapper[4776]: I1208 08:59:23.484300 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:23 crc kubenswrapper[4776]: I1208 08:59:23.484322 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:23 crc kubenswrapper[4776]: I1208 08:59:23.484350 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:23 crc kubenswrapper[4776]: I1208 08:59:23.484371 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:23Z","lastTransitionTime":"2025-12-08T08:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:23 crc kubenswrapper[4776]: I1208 08:59:23.588594 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:23 crc kubenswrapper[4776]: I1208 08:59:23.588645 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:23 crc kubenswrapper[4776]: I1208 08:59:23.588659 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:23 crc kubenswrapper[4776]: I1208 08:59:23.588681 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:23 crc kubenswrapper[4776]: I1208 08:59:23.588696 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:23Z","lastTransitionTime":"2025-12-08T08:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:23 crc kubenswrapper[4776]: I1208 08:59:23.691780 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:23 crc kubenswrapper[4776]: I1208 08:59:23.691840 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:23 crc kubenswrapper[4776]: I1208 08:59:23.691852 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:23 crc kubenswrapper[4776]: I1208 08:59:23.691873 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:23 crc kubenswrapper[4776]: I1208 08:59:23.691887 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:23Z","lastTransitionTime":"2025-12-08T08:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:23 crc kubenswrapper[4776]: I1208 08:59:23.795782 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:23 crc kubenswrapper[4776]: I1208 08:59:23.795852 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:23 crc kubenswrapper[4776]: I1208 08:59:23.795866 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:23 crc kubenswrapper[4776]: I1208 08:59:23.795890 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:23 crc kubenswrapper[4776]: I1208 08:59:23.795908 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:23Z","lastTransitionTime":"2025-12-08T08:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:23 crc kubenswrapper[4776]: I1208 08:59:23.898664 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:23 crc kubenswrapper[4776]: I1208 08:59:23.898714 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:23 crc kubenswrapper[4776]: I1208 08:59:23.898730 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:23 crc kubenswrapper[4776]: I1208 08:59:23.898749 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:23 crc kubenswrapper[4776]: I1208 08:59:23.898766 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:23Z","lastTransitionTime":"2025-12-08T08:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.001035 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.001082 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.001093 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.001108 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.001119 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:24Z","lastTransitionTime":"2025-12-08T08:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.103839 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.103884 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.103892 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.103907 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.103917 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:24Z","lastTransitionTime":"2025-12-08T08:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.206553 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.206597 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.206606 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.206620 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.206629 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:24Z","lastTransitionTime":"2025-12-08T08:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.310151 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.310267 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.310283 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.310308 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.310326 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:24Z","lastTransitionTime":"2025-12-08T08:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.343029 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.343060 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 08:59:24 crc kubenswrapper[4776]: E1208 08:59:24.343165 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 08:59:24 crc kubenswrapper[4776]: E1208 08:59:24.343290 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.357186 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9788ab1-1031-4103-a769-a4b3177c7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c2915bc116e8d911ac3c615649564174eb7e760c7e63fe06aee942d7cc2e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2
42b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860aab951ce65f5efd9e17e4d93c383ea508088d26c2f758bf8d7176d2fb96f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jkmbn\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:24Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.372403 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed339964d431767a3616559947e832417f1052991849f141c380c6e64c3c03f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:24Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.386488 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b411334ce6d2d6fc026625d538807bc1dce879f0053babe7d97f302819a4e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e498b38932d3fd2da572e5ca5d560b7725a394f13fd9a30543244b1f3c0dca61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:24Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 
08:59:24.403617 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da360484aaa537076345b7a6830a83ac2a89ffb47f86d6865049c4f19688546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:24Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.414681 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.414765 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.414793 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.414829 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.414854 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:24Z","lastTransitionTime":"2025-12-08T08:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.423060 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kkhjg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99143b9c-a541-4c0e-8387-0dff0d557974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vlfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vlfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kkhjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:24Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:24 crc 
kubenswrapper[4776]: I1208 08:59:24.438005 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8k6qx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5628a2b5-b886-4883-93a3-fefc471f19e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df73be62036047cccc38e737abc2818aa510b308934c0bfbcac348aba382f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5
8b2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8k6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:24Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.455256 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q85ld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d62ee56-e13f-4e44-8abf-9d0fc3e423b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db449e03630aa1b44e8a2812e502a46239824bb59283396bfe92bb818df29fe2\\
\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plplv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b483925a2a4fefd06621185c21e74da28d1f0ababf703e2637a8686881671f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plplv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q85ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:24Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.483018 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977ead65-d7ad-477b-8656-2b55e70b918b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7ab99c039bd5c0c0ab52768821797dd9e303afb9c6065ef021e8c2be372df9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3089eb6893beac1862850a94ef99af25f4ce9c0f020cc795611d9a620966e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86e9bc666e17ca52777d0275d4a50cbb70aad0c77e3413328fa6bd7b42d8a339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7cac944c0f3bfb4cc7d00471deb5d44d498a7131b7d950305a78bd8a5d2ea7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e7656c1155b52195abce09bca5ecd214a957ae7009293b99f16b3390c671fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\
\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877
441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:24Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.504354 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:24Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.519441 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.519573 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.519596 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.519630 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.519653 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:24Z","lastTransitionTime":"2025-12-08T08:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.525367 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5x9ft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58507405-6bea-4859-a4e8-6ed046b50323\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06ef510d9c943d7c5d8a7a801dc1e7ca8cbfa7f6995ac8bedc835e0e8a762f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa55285a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa55285a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5x9ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:24Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.547503 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:24Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.564062 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:24Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.581474 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fdg6t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56dfa7df-2ee8-4408-a283-5a8521175a0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5173ff239c6373649911372a6d6c1665958296afec2528b9e0a492a0722f5b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z44lf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fdg6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:24Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.600794 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-555j6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04b00e982993068dd2f274ce749a8b994561657d124122969adbb79c53658ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-555j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:24Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.624337 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.624967 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.625471 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.626091 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.626132 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:24Z","lastTransitionTime":"2025-12-08T08:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.627363 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e518469-5b3b-4055-a0f0-075dc48b1c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f688c89d388f271d5313cac8f933d6e43da463b2d6da546151a2bbff7350f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://712accfff1a4a51efc4d3d2a90e3e2d6c47eb04f5cd3d72b800e0778f0963b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a95beb9eadee827797268eb99d707a8782239962485a6f258adea6f3ee994c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cf78bd635e2922e1ade70a5fb24aebf3845f59d6aae829d8cba7c1c44da75ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afb59149e1f7b1462fc1ae949150d5c137905504ba2366db083abd13b4db6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e06c373c06904c837e3c6b4b8d26d2b2e34aa17db8079de76e1e33594d284b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ac2e26e184fc559a20772131feba0a567e2b6c5fb50046986ccad84a947c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ac2e26e184fc559a20772131feba0a567e2b6c5fb50046986ccad84a947c4f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"message\\\":\\\"bj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1208 08:59:15.721761 6173 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1208 08:59:15.721766 6173 ovn.go:134] Ensuring zone local for Pod 
openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI1208 08:59:15.721771 6173 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI1208 08:59:15.721775 6173 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1208 08:59:15.721624 6173 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1208 08:59:15.721793 6173 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1208 08:59:15.721797 6173 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1208 08:59:15.721633 6173 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1208 08:59:15.721806 6173 ovn.go:134] Ensuring zone local for Pod opens\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-swbsc_openshift-ovn-kubernetes(1e518469-5b3b-4055-a0f0-075dc48b1c79)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8d4d85ae7a43d2800c11a9bfdc7f1c827bdb6628891b9a0c29140bf869307ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101b
d0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-swbsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:24Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.645738 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d4304c3-81ab-4418-b5c2-017f74581e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d02311f25656790164bb155ddb4e4d32fb9e7919033dd47c38c9fb321c185743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb3d5082006dae792f107dc632bc00d5004be69f95f6d76150d816fb542ce9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be261308f50cfa4db35cf58e829c102496de88ed0336685562f39e46e498a460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bf08db453e1dd51d190b2bd855af295af04cb4d3a8a42e0020136c50546aef2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:24Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.667349 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"343d6e00-54f7-4228-a8da-a43041894b26\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:4
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee00c7d6720b72fb9ecc27cc1ad2f762db9ac30bc3e108e37e2d4e2a96639cf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:24Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.730376 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.730519 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.730548 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.730631 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.730660 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:24Z","lastTransitionTime":"2025-12-08T08:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.833650 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.833747 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.833797 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.833824 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.833886 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:24Z","lastTransitionTime":"2025-12-08T08:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.937331 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.937450 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.937470 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.937549 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:24 crc kubenswrapper[4776]: I1208 08:59:24.937621 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:24Z","lastTransitionTime":"2025-12-08T08:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.040687 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.040776 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.040799 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.040826 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.040844 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:25Z","lastTransitionTime":"2025-12-08T08:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.145908 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.145958 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.145969 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.145991 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.146002 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:25Z","lastTransitionTime":"2025-12-08T08:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.255869 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.256749 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.256803 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.256838 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.256892 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:25Z","lastTransitionTime":"2025-12-08T08:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.343163 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.343351 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 08:59:25 crc kubenswrapper[4776]: E1208 08:59:25.343429 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 08:59:25 crc kubenswrapper[4776]: E1208 08:59:25.343625 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.360482 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.360518 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.360529 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.360544 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.360556 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:25Z","lastTransitionTime":"2025-12-08T08:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.464889 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.464954 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.464967 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.464990 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.465003 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:25Z","lastTransitionTime":"2025-12-08T08:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.568618 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.568691 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.568699 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.568712 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.568721 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:25Z","lastTransitionTime":"2025-12-08T08:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.612484 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/99143b9c-a541-4c0e-8387-0dff0d557974-metrics-certs\") pod \"network-metrics-daemon-kkhjg\" (UID: \"99143b9c-a541-4c0e-8387-0dff0d557974\") " pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 08:59:25 crc kubenswrapper[4776]: E1208 08:59:25.612701 4776 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 08 08:59:25 crc kubenswrapper[4776]: E1208 08:59:25.612798 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99143b9c-a541-4c0e-8387-0dff0d557974-metrics-certs podName:99143b9c-a541-4c0e-8387-0dff0d557974 nodeName:}" failed. No retries permitted until 2025-12-08 08:59:33.61277813 +0000 UTC m=+49.876003152 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/99143b9c-a541-4c0e-8387-0dff0d557974-metrics-certs") pod "network-metrics-daemon-kkhjg" (UID: "99143b9c-a541-4c0e-8387-0dff0d557974") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.671943 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.672007 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.672020 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.672060 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.672075 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:25Z","lastTransitionTime":"2025-12-08T08:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.774060 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.774101 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.774112 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.774127 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.774137 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:25Z","lastTransitionTime":"2025-12-08T08:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.878261 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.878325 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.878336 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.878370 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.878381 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:25Z","lastTransitionTime":"2025-12-08T08:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.883638 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.883715 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.883734 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.883765 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.883785 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:25Z","lastTransitionTime":"2025-12-08T08:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:25 crc kubenswrapper[4776]: E1208 08:59:25.904151 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ebf5967-b40e-4612-8f34-c965ce3a7e5b\\\",\\\"systemUUID\\\":\\\"c2909369-742b-49a0-ae37-af59748afd08\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:25Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.908262 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.908325 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.908341 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.908363 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.908379 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:25Z","lastTransitionTime":"2025-12-08T08:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:25 crc kubenswrapper[4776]: E1208 08:59:25.922738 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ebf5967-b40e-4612-8f34-c965ce3a7e5b\\\",\\\"systemUUID\\\":\\\"c2909369-742b-49a0-ae37-af59748afd08\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:25Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.926704 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.926875 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.926971 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.927062 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.927147 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:25Z","lastTransitionTime":"2025-12-08T08:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:25 crc kubenswrapper[4776]: E1208 08:59:25.943066 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ebf5967-b40e-4612-8f34-c965ce3a7e5b\\\",\\\"systemUUID\\\":\\\"c2909369-742b-49a0-ae37-af59748afd08\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:25Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.948542 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.948592 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.948602 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.948619 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.948629 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:25Z","lastTransitionTime":"2025-12-08T08:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:25 crc kubenswrapper[4776]: E1208 08:59:25.963818 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ebf5967-b40e-4612-8f34-c965ce3a7e5b\\\",\\\"systemUUID\\\":\\\"c2909369-742b-49a0-ae37-af59748afd08\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:25Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.966826 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.966928 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.966937 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.966972 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.966984 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:25Z","lastTransitionTime":"2025-12-08T08:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:25 crc kubenswrapper[4776]: E1208 08:59:25.983148 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ebf5967-b40e-4612-8f34-c965ce3a7e5b\\\",\\\"systemUUID\\\":\\\"c2909369-742b-49a0-ae37-af59748afd08\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:25Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:25 crc kubenswrapper[4776]: E1208 08:59:25.983441 4776 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.985067 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.985120 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.985135 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.985156 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:25 crc kubenswrapper[4776]: I1208 08:59:25.985197 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:25Z","lastTransitionTime":"2025-12-08T08:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:26 crc kubenswrapper[4776]: I1208 08:59:26.088154 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:26 crc kubenswrapper[4776]: I1208 08:59:26.088783 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:26 crc kubenswrapper[4776]: I1208 08:59:26.088950 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:26 crc kubenswrapper[4776]: I1208 08:59:26.089109 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:26 crc kubenswrapper[4776]: I1208 08:59:26.089294 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:26Z","lastTransitionTime":"2025-12-08T08:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:26 crc kubenswrapper[4776]: I1208 08:59:26.192359 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:26 crc kubenswrapper[4776]: I1208 08:59:26.192408 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:26 crc kubenswrapper[4776]: I1208 08:59:26.192420 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:26 crc kubenswrapper[4776]: I1208 08:59:26.192447 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:26 crc kubenswrapper[4776]: I1208 08:59:26.192463 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:26Z","lastTransitionTime":"2025-12-08T08:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:26 crc kubenswrapper[4776]: I1208 08:59:26.295565 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:26 crc kubenswrapper[4776]: I1208 08:59:26.295618 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:26 crc kubenswrapper[4776]: I1208 08:59:26.295645 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:26 crc kubenswrapper[4776]: I1208 08:59:26.295677 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:26 crc kubenswrapper[4776]: I1208 08:59:26.295692 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:26Z","lastTransitionTime":"2025-12-08T08:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:26 crc kubenswrapper[4776]: I1208 08:59:26.343774 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 08:59:26 crc kubenswrapper[4776]: E1208 08:59:26.344027 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 08:59:26 crc kubenswrapper[4776]: I1208 08:59:26.344426 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 08:59:26 crc kubenswrapper[4776]: E1208 08:59:26.344631 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 08:59:26 crc kubenswrapper[4776]: I1208 08:59:26.399656 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:26 crc kubenswrapper[4776]: I1208 08:59:26.399708 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:26 crc kubenswrapper[4776]: I1208 08:59:26.399717 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:26 crc kubenswrapper[4776]: I1208 08:59:26.399734 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:26 crc kubenswrapper[4776]: I1208 08:59:26.399745 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:26Z","lastTransitionTime":"2025-12-08T08:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:26 crc kubenswrapper[4776]: I1208 08:59:26.502617 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:26 crc kubenswrapper[4776]: I1208 08:59:26.502698 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:26 crc kubenswrapper[4776]: I1208 08:59:26.502735 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:26 crc kubenswrapper[4776]: I1208 08:59:26.502757 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:26 crc kubenswrapper[4776]: I1208 08:59:26.502768 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:26Z","lastTransitionTime":"2025-12-08T08:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:26 crc kubenswrapper[4776]: I1208 08:59:26.605867 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:26 crc kubenswrapper[4776]: I1208 08:59:26.605910 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:26 crc kubenswrapper[4776]: I1208 08:59:26.605920 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:26 crc kubenswrapper[4776]: I1208 08:59:26.605937 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:26 crc kubenswrapper[4776]: I1208 08:59:26.605950 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:26Z","lastTransitionTime":"2025-12-08T08:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:26 crc kubenswrapper[4776]: I1208 08:59:26.712238 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:26 crc kubenswrapper[4776]: I1208 08:59:26.712299 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:26 crc kubenswrapper[4776]: I1208 08:59:26.712326 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:26 crc kubenswrapper[4776]: I1208 08:59:26.712348 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:26 crc kubenswrapper[4776]: I1208 08:59:26.712363 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:26Z","lastTransitionTime":"2025-12-08T08:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:26 crc kubenswrapper[4776]: I1208 08:59:26.814696 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:26 crc kubenswrapper[4776]: I1208 08:59:26.814930 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:26 crc kubenswrapper[4776]: I1208 08:59:26.814992 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:26 crc kubenswrapper[4776]: I1208 08:59:26.815098 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:26 crc kubenswrapper[4776]: I1208 08:59:26.815156 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:26Z","lastTransitionTime":"2025-12-08T08:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:26 crc kubenswrapper[4776]: I1208 08:59:26.919036 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:26 crc kubenswrapper[4776]: I1208 08:59:26.920308 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:26 crc kubenswrapper[4776]: I1208 08:59:26.920477 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:26 crc kubenswrapper[4776]: I1208 08:59:26.920641 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:26 crc kubenswrapper[4776]: I1208 08:59:26.920826 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:26Z","lastTransitionTime":"2025-12-08T08:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:27 crc kubenswrapper[4776]: I1208 08:59:27.025208 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:27 crc kubenswrapper[4776]: I1208 08:59:27.025648 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:27 crc kubenswrapper[4776]: I1208 08:59:27.025826 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:27 crc kubenswrapper[4776]: I1208 08:59:27.025979 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:27 crc kubenswrapper[4776]: I1208 08:59:27.026111 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:27Z","lastTransitionTime":"2025-12-08T08:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:27 crc kubenswrapper[4776]: I1208 08:59:27.130579 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:27 crc kubenswrapper[4776]: I1208 08:59:27.130627 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:27 crc kubenswrapper[4776]: I1208 08:59:27.130641 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:27 crc kubenswrapper[4776]: I1208 08:59:27.130663 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:27 crc kubenswrapper[4776]: I1208 08:59:27.130681 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:27Z","lastTransitionTime":"2025-12-08T08:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:27 crc kubenswrapper[4776]: I1208 08:59:27.234631 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:27 crc kubenswrapper[4776]: I1208 08:59:27.235401 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:27 crc kubenswrapper[4776]: I1208 08:59:27.235454 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:27 crc kubenswrapper[4776]: I1208 08:59:27.235490 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:27 crc kubenswrapper[4776]: I1208 08:59:27.235521 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:27Z","lastTransitionTime":"2025-12-08T08:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:27 crc kubenswrapper[4776]: I1208 08:59:27.339222 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:27 crc kubenswrapper[4776]: I1208 08:59:27.339655 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:27 crc kubenswrapper[4776]: I1208 08:59:27.339924 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:27 crc kubenswrapper[4776]: I1208 08:59:27.340128 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:27 crc kubenswrapper[4776]: I1208 08:59:27.340388 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:27Z","lastTransitionTime":"2025-12-08T08:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:27 crc kubenswrapper[4776]: I1208 08:59:27.342715 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 08:59:27 crc kubenswrapper[4776]: I1208 08:59:27.342715 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 08:59:27 crc kubenswrapper[4776]: E1208 08:59:27.342979 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 08:59:27 crc kubenswrapper[4776]: E1208 08:59:27.343146 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 08:59:27 crc kubenswrapper[4776]: I1208 08:59:27.444271 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:27 crc kubenswrapper[4776]: I1208 08:59:27.444346 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:27 crc kubenswrapper[4776]: I1208 08:59:27.444369 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:27 crc kubenswrapper[4776]: I1208 08:59:27.444397 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:27 crc kubenswrapper[4776]: I1208 08:59:27.444416 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:27Z","lastTransitionTime":"2025-12-08T08:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:27 crc kubenswrapper[4776]: I1208 08:59:27.548518 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:27 crc kubenswrapper[4776]: I1208 08:59:27.548593 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:27 crc kubenswrapper[4776]: I1208 08:59:27.548617 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:27 crc kubenswrapper[4776]: I1208 08:59:27.548653 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:27 crc kubenswrapper[4776]: I1208 08:59:27.548679 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:27Z","lastTransitionTime":"2025-12-08T08:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:27 crc kubenswrapper[4776]: I1208 08:59:27.652030 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:27 crc kubenswrapper[4776]: I1208 08:59:27.652101 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:27 crc kubenswrapper[4776]: I1208 08:59:27.652113 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:27 crc kubenswrapper[4776]: I1208 08:59:27.652138 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:27 crc kubenswrapper[4776]: I1208 08:59:27.652153 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:27Z","lastTransitionTime":"2025-12-08T08:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:27 crc kubenswrapper[4776]: I1208 08:59:27.756072 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:27 crc kubenswrapper[4776]: I1208 08:59:27.756118 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:27 crc kubenswrapper[4776]: I1208 08:59:27.756145 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:27 crc kubenswrapper[4776]: I1208 08:59:27.756163 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:27 crc kubenswrapper[4776]: I1208 08:59:27.756186 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:27Z","lastTransitionTime":"2025-12-08T08:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:27 crc kubenswrapper[4776]: I1208 08:59:27.859464 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:27 crc kubenswrapper[4776]: I1208 08:59:27.860103 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:27 crc kubenswrapper[4776]: I1208 08:59:27.860396 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:27 crc kubenswrapper[4776]: I1208 08:59:27.860630 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:27 crc kubenswrapper[4776]: I1208 08:59:27.860840 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:27Z","lastTransitionTime":"2025-12-08T08:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:27 crc kubenswrapper[4776]: I1208 08:59:27.964523 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:27 crc kubenswrapper[4776]: I1208 08:59:27.964570 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:27 crc kubenswrapper[4776]: I1208 08:59:27.964584 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:27 crc kubenswrapper[4776]: I1208 08:59:27.964601 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:27 crc kubenswrapper[4776]: I1208 08:59:27.964612 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:27Z","lastTransitionTime":"2025-12-08T08:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:28 crc kubenswrapper[4776]: I1208 08:59:28.067942 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:28 crc kubenswrapper[4776]: I1208 08:59:28.068004 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:28 crc kubenswrapper[4776]: I1208 08:59:28.068022 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:28 crc kubenswrapper[4776]: I1208 08:59:28.068044 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:28 crc kubenswrapper[4776]: I1208 08:59:28.068061 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:28Z","lastTransitionTime":"2025-12-08T08:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:28 crc kubenswrapper[4776]: I1208 08:59:28.172945 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:28 crc kubenswrapper[4776]: I1208 08:59:28.173620 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:28 crc kubenswrapper[4776]: I1208 08:59:28.173829 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:28 crc kubenswrapper[4776]: I1208 08:59:28.174038 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:28 crc kubenswrapper[4776]: I1208 08:59:28.174269 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:28Z","lastTransitionTime":"2025-12-08T08:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:28 crc kubenswrapper[4776]: I1208 08:59:28.277340 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:28 crc kubenswrapper[4776]: I1208 08:59:28.277395 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:28 crc kubenswrapper[4776]: I1208 08:59:28.277413 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:28 crc kubenswrapper[4776]: I1208 08:59:28.277433 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:28 crc kubenswrapper[4776]: I1208 08:59:28.277444 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:28Z","lastTransitionTime":"2025-12-08T08:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:28 crc kubenswrapper[4776]: I1208 08:59:28.343246 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 08:59:28 crc kubenswrapper[4776]: E1208 08:59:28.343770 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 08:59:28 crc kubenswrapper[4776]: I1208 08:59:28.343253 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 08:59:28 crc kubenswrapper[4776]: E1208 08:59:28.344035 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 08:59:28 crc kubenswrapper[4776]: I1208 08:59:28.379805 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:28 crc kubenswrapper[4776]: I1208 08:59:28.379855 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:28 crc kubenswrapper[4776]: I1208 08:59:28.379869 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:28 crc kubenswrapper[4776]: I1208 08:59:28.379891 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:28 crc kubenswrapper[4776]: I1208 08:59:28.379906 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:28Z","lastTransitionTime":"2025-12-08T08:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:28 crc kubenswrapper[4776]: I1208 08:59:28.483350 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:28 crc kubenswrapper[4776]: I1208 08:59:28.483817 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:28 crc kubenswrapper[4776]: I1208 08:59:28.484040 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:28 crc kubenswrapper[4776]: I1208 08:59:28.484250 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:28 crc kubenswrapper[4776]: I1208 08:59:28.484399 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:28Z","lastTransitionTime":"2025-12-08T08:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:28 crc kubenswrapper[4776]: I1208 08:59:28.588122 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:28 crc kubenswrapper[4776]: I1208 08:59:28.588167 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:28 crc kubenswrapper[4776]: I1208 08:59:28.588197 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:28 crc kubenswrapper[4776]: I1208 08:59:28.588215 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:28 crc kubenswrapper[4776]: I1208 08:59:28.588229 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:28Z","lastTransitionTime":"2025-12-08T08:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:28 crc kubenswrapper[4776]: I1208 08:59:28.690125 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:28 crc kubenswrapper[4776]: I1208 08:59:28.690188 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:28 crc kubenswrapper[4776]: I1208 08:59:28.690200 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:28 crc kubenswrapper[4776]: I1208 08:59:28.690217 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:28 crc kubenswrapper[4776]: I1208 08:59:28.690228 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:28Z","lastTransitionTime":"2025-12-08T08:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:28 crc kubenswrapper[4776]: I1208 08:59:28.792643 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:28 crc kubenswrapper[4776]: I1208 08:59:28.792679 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:28 crc kubenswrapper[4776]: I1208 08:59:28.792688 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:28 crc kubenswrapper[4776]: I1208 08:59:28.792701 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:28 crc kubenswrapper[4776]: I1208 08:59:28.792711 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:28Z","lastTransitionTime":"2025-12-08T08:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:28 crc kubenswrapper[4776]: I1208 08:59:28.895780 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:28 crc kubenswrapper[4776]: I1208 08:59:28.896025 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:28 crc kubenswrapper[4776]: I1208 08:59:28.896090 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:28 crc kubenswrapper[4776]: I1208 08:59:28.896158 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:28 crc kubenswrapper[4776]: I1208 08:59:28.896263 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:28Z","lastTransitionTime":"2025-12-08T08:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:29 crc kubenswrapper[4776]: I1208 08:59:29.001787 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:29 crc kubenswrapper[4776]: I1208 08:59:29.001845 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:29 crc kubenswrapper[4776]: I1208 08:59:29.001858 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:29 crc kubenswrapper[4776]: I1208 08:59:29.001876 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:29 crc kubenswrapper[4776]: I1208 08:59:29.001887 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:29Z","lastTransitionTime":"2025-12-08T08:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:29 crc kubenswrapper[4776]: I1208 08:59:29.105022 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:29 crc kubenswrapper[4776]: I1208 08:59:29.105088 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:29 crc kubenswrapper[4776]: I1208 08:59:29.105103 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:29 crc kubenswrapper[4776]: I1208 08:59:29.105128 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:29 crc kubenswrapper[4776]: I1208 08:59:29.105144 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:29Z","lastTransitionTime":"2025-12-08T08:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:29 crc kubenswrapper[4776]: I1208 08:59:29.208111 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:29 crc kubenswrapper[4776]: I1208 08:59:29.208156 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:29 crc kubenswrapper[4776]: I1208 08:59:29.208190 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:29 crc kubenswrapper[4776]: I1208 08:59:29.208210 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:29 crc kubenswrapper[4776]: I1208 08:59:29.208222 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:29Z","lastTransitionTime":"2025-12-08T08:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:29 crc kubenswrapper[4776]: I1208 08:59:29.310356 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:29 crc kubenswrapper[4776]: I1208 08:59:29.310388 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:29 crc kubenswrapper[4776]: I1208 08:59:29.310396 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:29 crc kubenswrapper[4776]: I1208 08:59:29.310411 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:29 crc kubenswrapper[4776]: I1208 08:59:29.310421 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:29Z","lastTransitionTime":"2025-12-08T08:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:29 crc kubenswrapper[4776]: I1208 08:59:29.342891 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 08:59:29 crc kubenswrapper[4776]: I1208 08:59:29.343013 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 08:59:29 crc kubenswrapper[4776]: E1208 08:59:29.343025 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 08:59:29 crc kubenswrapper[4776]: E1208 08:59:29.343287 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 08:59:29 crc kubenswrapper[4776]: I1208 08:59:29.413286 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:29 crc kubenswrapper[4776]: I1208 08:59:29.413327 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:29 crc kubenswrapper[4776]: I1208 08:59:29.413337 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:29 crc kubenswrapper[4776]: I1208 08:59:29.413353 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:29 crc kubenswrapper[4776]: I1208 08:59:29.413364 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:29Z","lastTransitionTime":"2025-12-08T08:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:29 crc kubenswrapper[4776]: I1208 08:59:29.516402 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:29 crc kubenswrapper[4776]: I1208 08:59:29.516463 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:29 crc kubenswrapper[4776]: I1208 08:59:29.516477 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:29 crc kubenswrapper[4776]: I1208 08:59:29.516493 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:29 crc kubenswrapper[4776]: I1208 08:59:29.516505 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:29Z","lastTransitionTime":"2025-12-08T08:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:29 crc kubenswrapper[4776]: I1208 08:59:29.619330 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:29 crc kubenswrapper[4776]: I1208 08:59:29.619377 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:29 crc kubenswrapper[4776]: I1208 08:59:29.619388 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:29 crc kubenswrapper[4776]: I1208 08:59:29.619401 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:29 crc kubenswrapper[4776]: I1208 08:59:29.619413 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:29Z","lastTransitionTime":"2025-12-08T08:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:29 crc kubenswrapper[4776]: I1208 08:59:29.722260 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:29 crc kubenswrapper[4776]: I1208 08:59:29.722304 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:29 crc kubenswrapper[4776]: I1208 08:59:29.722317 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:29 crc kubenswrapper[4776]: I1208 08:59:29.722332 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:29 crc kubenswrapper[4776]: I1208 08:59:29.722342 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:29Z","lastTransitionTime":"2025-12-08T08:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:29 crc kubenswrapper[4776]: I1208 08:59:29.825499 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:29 crc kubenswrapper[4776]: I1208 08:59:29.825607 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:29 crc kubenswrapper[4776]: I1208 08:59:29.825656 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:29 crc kubenswrapper[4776]: I1208 08:59:29.825679 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:29 crc kubenswrapper[4776]: I1208 08:59:29.825693 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:29Z","lastTransitionTime":"2025-12-08T08:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:29 crc kubenswrapper[4776]: I1208 08:59:29.929769 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:29 crc kubenswrapper[4776]: I1208 08:59:29.929827 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:29 crc kubenswrapper[4776]: I1208 08:59:29.929846 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:29 crc kubenswrapper[4776]: I1208 08:59:29.929870 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:29 crc kubenswrapper[4776]: I1208 08:59:29.929888 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:29Z","lastTransitionTime":"2025-12-08T08:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:30 crc kubenswrapper[4776]: I1208 08:59:30.033644 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:30 crc kubenswrapper[4776]: I1208 08:59:30.033708 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:30 crc kubenswrapper[4776]: I1208 08:59:30.033718 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:30 crc kubenswrapper[4776]: I1208 08:59:30.033734 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:30 crc kubenswrapper[4776]: I1208 08:59:30.033744 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:30Z","lastTransitionTime":"2025-12-08T08:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:30 crc kubenswrapper[4776]: I1208 08:59:30.138476 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:30 crc kubenswrapper[4776]: I1208 08:59:30.138562 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:30 crc kubenswrapper[4776]: I1208 08:59:30.138587 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:30 crc kubenswrapper[4776]: I1208 08:59:30.138620 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:30 crc kubenswrapper[4776]: I1208 08:59:30.138639 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:30Z","lastTransitionTime":"2025-12-08T08:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:30 crc kubenswrapper[4776]: I1208 08:59:30.241985 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:30 crc kubenswrapper[4776]: I1208 08:59:30.242051 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:30 crc kubenswrapper[4776]: I1208 08:59:30.242068 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:30 crc kubenswrapper[4776]: I1208 08:59:30.242097 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:30 crc kubenswrapper[4776]: I1208 08:59:30.242116 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:30Z","lastTransitionTime":"2025-12-08T08:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:30 crc kubenswrapper[4776]: I1208 08:59:30.342737 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 08:59:30 crc kubenswrapper[4776]: I1208 08:59:30.342771 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 08:59:30 crc kubenswrapper[4776]: E1208 08:59:30.342869 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 08:59:30 crc kubenswrapper[4776]: E1208 08:59:30.343083 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 08:59:30 crc kubenswrapper[4776]: I1208 08:59:30.344289 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:30 crc kubenswrapper[4776]: I1208 08:59:30.344400 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:30 crc kubenswrapper[4776]: I1208 08:59:30.344488 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:30 crc kubenswrapper[4776]: I1208 08:59:30.344566 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:30 crc kubenswrapper[4776]: I1208 08:59:30.344667 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:30Z","lastTransitionTime":"2025-12-08T08:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:30 crc kubenswrapper[4776]: I1208 08:59:30.446569 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:30 crc kubenswrapper[4776]: I1208 08:59:30.446676 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:30 crc kubenswrapper[4776]: I1208 08:59:30.446701 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:30 crc kubenswrapper[4776]: I1208 08:59:30.446720 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:30 crc kubenswrapper[4776]: I1208 08:59:30.446731 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:30Z","lastTransitionTime":"2025-12-08T08:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:30 crc kubenswrapper[4776]: I1208 08:59:30.549072 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:30 crc kubenswrapper[4776]: I1208 08:59:30.549149 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:30 crc kubenswrapper[4776]: I1208 08:59:30.549163 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:30 crc kubenswrapper[4776]: I1208 08:59:30.549202 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:30 crc kubenswrapper[4776]: I1208 08:59:30.549218 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:30Z","lastTransitionTime":"2025-12-08T08:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:30 crc kubenswrapper[4776]: I1208 08:59:30.654087 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:30 crc kubenswrapper[4776]: I1208 08:59:30.654145 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:30 crc kubenswrapper[4776]: I1208 08:59:30.654157 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:30 crc kubenswrapper[4776]: I1208 08:59:30.654191 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:30 crc kubenswrapper[4776]: I1208 08:59:30.654205 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:30Z","lastTransitionTime":"2025-12-08T08:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:30 crc kubenswrapper[4776]: I1208 08:59:30.756605 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:30 crc kubenswrapper[4776]: I1208 08:59:30.756646 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:30 crc kubenswrapper[4776]: I1208 08:59:30.756655 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:30 crc kubenswrapper[4776]: I1208 08:59:30.756670 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:30 crc kubenswrapper[4776]: I1208 08:59:30.756679 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:30Z","lastTransitionTime":"2025-12-08T08:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:30 crc kubenswrapper[4776]: I1208 08:59:30.859270 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:30 crc kubenswrapper[4776]: I1208 08:59:30.859333 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:30 crc kubenswrapper[4776]: I1208 08:59:30.859350 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:30 crc kubenswrapper[4776]: I1208 08:59:30.859373 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:30 crc kubenswrapper[4776]: I1208 08:59:30.859390 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:30Z","lastTransitionTime":"2025-12-08T08:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:30 crc kubenswrapper[4776]: I1208 08:59:30.963545 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:30 crc kubenswrapper[4776]: I1208 08:59:30.963616 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:30 crc kubenswrapper[4776]: I1208 08:59:30.963636 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:30 crc kubenswrapper[4776]: I1208 08:59:30.963664 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:30 crc kubenswrapper[4776]: I1208 08:59:30.963683 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:30Z","lastTransitionTime":"2025-12-08T08:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.066831 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.066874 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.066889 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.066907 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.066918 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:31Z","lastTransitionTime":"2025-12-08T08:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.172272 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.172347 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.172365 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.172395 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.172419 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:31Z","lastTransitionTime":"2025-12-08T08:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.276225 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.276291 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.276308 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.276334 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.276350 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:31Z","lastTransitionTime":"2025-12-08T08:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.342924 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.342924 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 08:59:31 crc kubenswrapper[4776]: E1208 08:59:31.343144 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 08:59:31 crc kubenswrapper[4776]: E1208 08:59:31.343288 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.379547 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.379614 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.379627 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.379648 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.379663 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:31Z","lastTransitionTime":"2025-12-08T08:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.463998 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.476573 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.483155 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.483262 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.483325 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.483352 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.483405 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:31Z","lastTransitionTime":"2025-12-08T08:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.485995 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed339964d431767a3616559947e832417f1052991849f141c380c6e64c3c03f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:31Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.506019 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b411334ce6d2d6fc026625d538807bc1dce879f0053babe7d97f302819a4e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e498b38932d3fd2da572e5ca5d560b7725a394f13fd9a30543244b1f3c0dca61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:31Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.521486 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da360484aaa537076345b7a6830a83ac2a89ffb47f86d6865049c4f19688546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-08T08:59:31Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.539468 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9788ab1-1031-4103-a769-a4b3177c7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c2915bc116e8d911ac3c615649564174eb7e760c7e63fe06aee942d7cc2e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860aab951ce65f5efd9e17e4d93c383ea508088d26c2f758bf8d7176d2fb96f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jkmbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:31Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.556165 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8k6qx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5628a2b5-b886-4883-93a3-fefc471f19e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df73be62036047cccc38e737abc2818aa510b308934c0bfbcac348aba382f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58b2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8k6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:31Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.573191 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q85ld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d62ee56-e13f-4e44-8abf-9d0fc3e423b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db449e03630aa1b44e8a2812e502a46239824bb59283396bfe92bb818df29fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plplv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b483925a2a4fefd06621185c21e74da28d1f0ababf703e2637a8686881671f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plplv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q85ld\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:31Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.590676 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kkhjg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99143b9c-a541-4c0e-8387-0dff0d557974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vlfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vlfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kkhjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:31Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:31 crc 
kubenswrapper[4776]: I1208 08:59:31.591277 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.591352 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.591372 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.591405 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.591424 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:31Z","lastTransitionTime":"2025-12-08T08:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.618551 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977ead65-d7ad-477b-8656-2b55e70b918b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7ab99c039bd5c0c0ab52768821797dd9e303afb9c6065ef021e8c2be372df9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3089eb6893beac1862850a94ef99af25f4ce9c0f020cc795611d9a620966e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86e9bc666e17ca52777d0275d4a50cbb70aad0c77e3413328fa6bd7b42d8a339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7cac944c0f3bfb4cc7d00471deb5d44d498a7131b7d950305a78bd8a5d2ea7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e7656c1155b52195abce09bca5ecd214a957ae7009293b99f16b3390c671fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:31Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.641721 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:31Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.663219 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5x9ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58507405-6bea-4859-a4e8-6ed046b50323\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06ef510d9c943d7c5d8a7a801dc1e7ca8cbfa7f6995ac8bedc835e0e8a762f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa552
85a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa55285a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5x9ft\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:31Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.685053 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fdg6t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56dfa7df-2ee8-4408-a283-5a8521175a0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5173ff239c6373649911372a6d6c1665958296afec2528b9e0a492a0722f5b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z44lf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fdg6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:31Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.696923 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.696995 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.697013 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.697036 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.697051 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:31Z","lastTransitionTime":"2025-12-08T08:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.708887 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-555j6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04b00e982993068dd2f274ce749a8b994561657d124122969adbb79c53658ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-555j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:31Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.737885 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e518469-5b3b-4055-a0f0-075dc48b1c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f688c89d388f271d5313cac8f933d6e43da463b2d6da546151a2bbff7350f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://712accfff1a4a51efc4d3d2a90e3e2d6c47eb04f5cd3d72b800e0778f0963b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a95beb9eadee827797268eb99d707a8782239962485a6f258adea6f3ee994c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cf78bd635e2922e1ade70a5fb24aebf3845f59d6aae829d8cba7c1c44da75ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afb59149e1f7b1462fc1ae949150d5c137905504ba2366db083abd13b4db6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e06c373c06904c837e3c6b4b8d26d2b2e34aa17db8079de76e1e33594d284b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ac2e26e184fc559a20772131feba0a567e2b6c5fb50046986ccad84a947c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ac2e26e184fc559a20772131feba0a567e2b6c5fb50046986ccad84a947c4f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"message\\\":\\\"bj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1208 08:59:15.721761 6173 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1208 08:59:15.721766 6173 ovn.go:134] Ensuring zone local for Pod 
openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI1208 08:59:15.721771 6173 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI1208 08:59:15.721775 6173 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1208 08:59:15.721624 6173 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1208 08:59:15.721793 6173 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1208 08:59:15.721797 6173 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1208 08:59:15.721633 6173 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1208 08:59:15.721806 6173 ovn.go:134] Ensuring zone local for Pod opens\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-swbsc_openshift-ovn-kubernetes(1e518469-5b3b-4055-a0f0-075dc48b1c79)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8d4d85ae7a43d2800c11a9bfdc7f1c827bdb6628891b9a0c29140bf869307ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101b
d0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-swbsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:31Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.760871 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d4304c3-81ab-4418-b5c2-017f74581e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d02311f25656790164bb155ddb4e4d32fb9e7919033dd47c38c9fb321c185743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb3d5082006dae792f107dc632bc00d5004be69f95f6d76150d816fb542ce9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be261308f50cfa4db35cf58e829c102496de88ed0336685562f39e46e498a460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bf08db453e1dd51d190b2bd855af295af04cb4d3a8a42e0020136c50546aef2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:31Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.782685 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"343d6e00-54f7-4228-a8da-a43041894b26\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:4
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee00c7d6720b72fb9ecc27cc1ad2f762db9ac30bc3e108e37e2d4e2a96639cf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:31Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.800808 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.800879 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.800894 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.800939 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.800954 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:31Z","lastTransitionTime":"2025-12-08T08:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.806523 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:31Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.821160 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:31Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.904010 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.904077 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.904095 4776 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.904126 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:31 crc kubenswrapper[4776]: I1208 08:59:31.904145 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:31Z","lastTransitionTime":"2025-12-08T08:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:32 crc kubenswrapper[4776]: I1208 08:59:32.008453 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:32 crc kubenswrapper[4776]: I1208 08:59:32.008553 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:32 crc kubenswrapper[4776]: I1208 08:59:32.008584 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:32 crc kubenswrapper[4776]: I1208 08:59:32.008622 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:32 crc kubenswrapper[4776]: I1208 08:59:32.008647 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:32Z","lastTransitionTime":"2025-12-08T08:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:32 crc kubenswrapper[4776]: I1208 08:59:32.112413 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:32 crc kubenswrapper[4776]: I1208 08:59:32.112492 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:32 crc kubenswrapper[4776]: I1208 08:59:32.112508 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:32 crc kubenswrapper[4776]: I1208 08:59:32.112532 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:32 crc kubenswrapper[4776]: I1208 08:59:32.112548 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:32Z","lastTransitionTime":"2025-12-08T08:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:32 crc kubenswrapper[4776]: I1208 08:59:32.216292 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:32 crc kubenswrapper[4776]: I1208 08:59:32.216367 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:32 crc kubenswrapper[4776]: I1208 08:59:32.216387 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:32 crc kubenswrapper[4776]: I1208 08:59:32.216417 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:32 crc kubenswrapper[4776]: I1208 08:59:32.216435 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:32Z","lastTransitionTime":"2025-12-08T08:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:32 crc kubenswrapper[4776]: I1208 08:59:32.319497 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:32 crc kubenswrapper[4776]: I1208 08:59:32.319551 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:32 crc kubenswrapper[4776]: I1208 08:59:32.319562 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:32 crc kubenswrapper[4776]: I1208 08:59:32.319578 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:32 crc kubenswrapper[4776]: I1208 08:59:32.319591 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:32Z","lastTransitionTime":"2025-12-08T08:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:32 crc kubenswrapper[4776]: I1208 08:59:32.342940 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 08:59:32 crc kubenswrapper[4776]: E1208 08:59:32.343080 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 08:59:32 crc kubenswrapper[4776]: I1208 08:59:32.342938 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 08:59:32 crc kubenswrapper[4776]: E1208 08:59:32.343369 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 08:59:32 crc kubenswrapper[4776]: I1208 08:59:32.421515 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:32 crc kubenswrapper[4776]: I1208 08:59:32.421558 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:32 crc kubenswrapper[4776]: I1208 08:59:32.421569 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:32 crc kubenswrapper[4776]: I1208 08:59:32.421585 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:32 crc kubenswrapper[4776]: I1208 08:59:32.421596 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:32Z","lastTransitionTime":"2025-12-08T08:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:32 crc kubenswrapper[4776]: I1208 08:59:32.524283 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:32 crc kubenswrapper[4776]: I1208 08:59:32.524385 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:32 crc kubenswrapper[4776]: I1208 08:59:32.524409 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:32 crc kubenswrapper[4776]: I1208 08:59:32.524428 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:32 crc kubenswrapper[4776]: I1208 08:59:32.524443 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:32Z","lastTransitionTime":"2025-12-08T08:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:32 crc kubenswrapper[4776]: I1208 08:59:32.627693 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:32 crc kubenswrapper[4776]: I1208 08:59:32.627794 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:32 crc kubenswrapper[4776]: I1208 08:59:32.627875 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:32 crc kubenswrapper[4776]: I1208 08:59:32.627909 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:32 crc kubenswrapper[4776]: I1208 08:59:32.627951 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:32Z","lastTransitionTime":"2025-12-08T08:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:32 crc kubenswrapper[4776]: I1208 08:59:32.732097 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:32 crc kubenswrapper[4776]: I1208 08:59:32.732354 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:32 crc kubenswrapper[4776]: I1208 08:59:32.732375 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:32 crc kubenswrapper[4776]: I1208 08:59:32.732396 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:32 crc kubenswrapper[4776]: I1208 08:59:32.732410 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:32Z","lastTransitionTime":"2025-12-08T08:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:32 crc kubenswrapper[4776]: I1208 08:59:32.835759 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:32 crc kubenswrapper[4776]: I1208 08:59:32.835854 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:32 crc kubenswrapper[4776]: I1208 08:59:32.835868 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:32 crc kubenswrapper[4776]: I1208 08:59:32.835891 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:32 crc kubenswrapper[4776]: I1208 08:59:32.835907 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:32Z","lastTransitionTime":"2025-12-08T08:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:32 crc kubenswrapper[4776]: I1208 08:59:32.938947 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:32 crc kubenswrapper[4776]: I1208 08:59:32.939017 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:32 crc kubenswrapper[4776]: I1208 08:59:32.939038 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:32 crc kubenswrapper[4776]: I1208 08:59:32.939064 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:32 crc kubenswrapper[4776]: I1208 08:59:32.939082 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:32Z","lastTransitionTime":"2025-12-08T08:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.042288 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.042341 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.042354 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.042373 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.042384 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:33Z","lastTransitionTime":"2025-12-08T08:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.145234 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.145268 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.145278 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.145291 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.145304 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:33Z","lastTransitionTime":"2025-12-08T08:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.247818 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.247907 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.247921 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.247942 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.247956 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:33Z","lastTransitionTime":"2025-12-08T08:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.343262 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.343346 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 08:59:33 crc kubenswrapper[4776]: E1208 08:59:33.343627 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.343894 4776 scope.go:117] "RemoveContainer" containerID="61ac2e26e184fc559a20772131feba0a567e2b6c5fb50046986ccad84a947c4f" Dec 08 08:59:33 crc kubenswrapper[4776]: E1208 08:59:33.343884 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.350117 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.350353 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.350434 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.350538 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.350619 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:33Z","lastTransitionTime":"2025-12-08T08:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.453299 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.453336 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.453348 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.453367 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.453379 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:33Z","lastTransitionTime":"2025-12-08T08:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.556269 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.556327 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.556337 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.556360 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.556377 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:33Z","lastTransitionTime":"2025-12-08T08:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.659078 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.659112 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.659123 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.659139 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.659151 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:33Z","lastTransitionTime":"2025-12-08T08:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.705769 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/99143b9c-a541-4c0e-8387-0dff0d557974-metrics-certs\") pod \"network-metrics-daemon-kkhjg\" (UID: \"99143b9c-a541-4c0e-8387-0dff0d557974\") " pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 08:59:33 crc kubenswrapper[4776]: E1208 08:59:33.705932 4776 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 08 08:59:33 crc kubenswrapper[4776]: E1208 08:59:33.706000 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99143b9c-a541-4c0e-8387-0dff0d557974-metrics-certs podName:99143b9c-a541-4c0e-8387-0dff0d557974 nodeName:}" failed. No retries permitted until 2025-12-08 08:59:49.705981235 +0000 UTC m=+65.969206267 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/99143b9c-a541-4c0e-8387-0dff0d557974-metrics-certs") pod "network-metrics-daemon-kkhjg" (UID: "99143b9c-a541-4c0e-8387-0dff0d557974") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.709396 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swbsc_1e518469-5b3b-4055-a0f0-075dc48b1c79/ovnkube-controller/1.log" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.713133 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" event={"ID":"1e518469-5b3b-4055-a0f0-075dc48b1c79","Type":"ContainerStarted","Data":"46c3d0b09700f82a78bbde1ae9d9c6ff3538958396aac791116ea0eff0f5a9cf"} Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.715307 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.740450 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed339964d431767a3616559947e832417f1052991849f141c380c6e64c3c03f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:33Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.753946 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b411334ce6d2d6fc026625d538807bc1dce879f0053babe7d97f302819a4e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://e498b38932d3fd2da572e5ca5d560b7725a394f13fd9a30543244b1f3c0dca61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:33Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.761823 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.761854 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.761862 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.761877 4776 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.761892 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:33Z","lastTransitionTime":"2025-12-08T08:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.769589 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da360484aaa537076345b7a6830a83ac2a89ffb47f86d6865049c4f19688546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\
\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:33Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.790098 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9788ab1-1031-4103-a769-a4b3177c7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c2915bc116e8d911ac3c615649564174eb7e760c7e63fe06aee942d7cc2e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860aab951ce65f5efd9e17e4d93c383ea508088d
26c2f758bf8d7176d2fb96f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jkmbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:33Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.806400 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19bd7d27-06e1-4574-8857-6adbe88633c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d32a10a86fe749d233a68a8e7583294e21c634dc47febe04e56220b591d505e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c567d34bcaecb124f79504fee8f22c148f78bb039741a7b52883ab3188edaa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://146374df9edb9e0092cf2e4cac4a5955d7d0980be93df8188f4b55ad12901572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc26419aaafbbe056695868d3b76642b62774f620306fc60a3c0a07788ca8b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://bc26419aaafbbe056695868d3b76642b62774f620306fc60a3c0a07788ca8b7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:33Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:33 crc kubenswrapper[4776]: E1208 08:59:33.806565 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 09:00:05.806542493 +0000 UTC m=+82.069767515 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.806482 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.806755 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.806842 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.806903 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 08:59:33 crc kubenswrapper[4776]: E1208 08:59:33.806924 4776 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 08 08:59:33 crc kubenswrapper[4776]: E1208 08:59:33.807007 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-08 09:00:05.806977764 +0000 UTC m=+82.070202986 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 08 08:59:33 crc kubenswrapper[4776]: E1208 08:59:33.807047 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 08 08:59:33 crc kubenswrapper[4776]: E1208 08:59:33.807059 4776 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 08 08:59:33 crc kubenswrapper[4776]: E1208 08:59:33.807153 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-08 09:00:05.807128878 +0000 UTC m=+82.070354090 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 08 08:59:33 crc kubenswrapper[4776]: E1208 08:59:33.807068 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 08 08:59:33 crc kubenswrapper[4776]: E1208 08:59:33.807218 4776 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 08:59:33 crc kubenswrapper[4776]: E1208 08:59:33.807253 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-08 09:00:05.807243971 +0000 UTC m=+82.070469183 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.823259 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8k6qx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5628a2b5-b886-4883-93a3-fefc471f19e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df73be62036047cccc38e737abc2818aa510b308934c0bfbcac348aba382f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58b2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8k6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:33Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.836386 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q85ld" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d62ee56-e13f-4e44-8abf-9d0fc3e423b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db449e03630aa1b44e8a2812e502a46239824bb59283396bfe92bb818df29fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plplv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b483925a2a4fefd06621185c21e74da28d1
f0ababf703e2637a8686881671f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plplv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q85ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:33Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.847485 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kkhjg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99143b9c-a541-4c0e-8387-0dff0d557974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vlfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vlfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kkhjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:33Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:33 crc 
kubenswrapper[4776]: I1208 08:59:33.865164 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.865231 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.865240 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.865260 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.865272 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:33Z","lastTransitionTime":"2025-12-08T08:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.877407 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977ead65-d7ad-477b-8656-2b55e70b918b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7ab99c039bd5c0c0ab52768821797dd9e303afb9c6065ef021e8c2be372df9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3089eb6893beac1862850a94ef99af25f4ce9c0f020cc795611d9a620966e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86e9bc666e17ca52777d0275d4a50cbb70aad0c77e3413328fa6bd7b42d8a339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7cac944c0f3bfb4cc7d00471deb5d44d498a7131b7d950305a78bd8a5d2ea7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e7656c1155b52195abce09bca5ecd214a957ae7009293b99f16b3390c671fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:33Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.891758 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:33Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.907648 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5x9ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58507405-6bea-4859-a4e8-6ed046b50323\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06ef510d9c943d7c5d8a7a801dc1e7ca8cbfa7f6995ac8bedc835e0e8a762f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa552
85a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa55285a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5x9ft\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:33Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.908162 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 08:59:33 crc kubenswrapper[4776]: E1208 08:59:33.908354 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 08 08:59:33 crc kubenswrapper[4776]: E1208 08:59:33.908381 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 08 08:59:33 crc kubenswrapper[4776]: E1208 08:59:33.908393 4776 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 08:59:33 crc kubenswrapper[4776]: E1208 08:59:33.908439 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-08 09:00:05.908423375 +0000 UTC m=+82.171648397 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.923036 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d4304c3-81ab-4418-b5c2-017f74581e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d02311f25656790164bb155ddb4e4d32fb9e7919033dd47c38c9fb321c185743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb3d5082006dae792f107dc632bc00d5004be69f95f6d76150d816fb542ce9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be261308f50cfa4db35cf58e829c102496de88ed0336685562f39e46e498a460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert
-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bf08db453e1dd51d190b2bd855af295af04cb4d3a8a42e0020136c50546aef2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:33Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.937390 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"343d6e00-54f7-4228-a8da-a43041894b26\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:4
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee00c7d6720b72fb9ecc27cc1ad2f762db9ac30bc3e108e37e2d4e2a96639cf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:33Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.949806 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:33Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.967997 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.968041 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.968051 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.968070 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.968082 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:33Z","lastTransitionTime":"2025-12-08T08:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.983012 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:33Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:33 crc kubenswrapper[4776]: I1208 08:59:33.999795 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fdg6t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56dfa7df-2ee8-4408-a283-5a8521175a0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5173ff239c6373649911372a6d6c1665958296afec2528b9e0a492a0722f5b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z44lf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fdg6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:33Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.026920 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-555j6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04b00e982993068dd2f274ce749a8b994561657d124122969adbb79c53658ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-555j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:34Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.047576 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e518469-5b3b-4055-a0f0-075dc48b1c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f688c89d388f271d5313cac8f933d6e43da463b2d6da546151a2bbff7350f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://712accfff1a4a51efc4d3d2a90e3e2d6c47eb04f5cd3d72b800e0778f0963b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a95beb9eadee827797268eb99d707a8782239962485a6f258adea6f3ee994c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cf78bd635e2922e1ade70a5fb24aebf3845f59d6aae829d8cba7c1c44da75ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afb59149e1f7b1462fc1ae949150d5c137905504ba2366db083abd13b4db6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e06c373c06904c837e3c6b4b8d26d2b2e34aa17db8079de76e1e33594d284b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c3d0b09700f82a78bbde1ae9d9c6ff3538958396aac791116ea0eff0f5a9cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ac2e26e184fc559a20772131feba0a567e2b6c5fb50046986ccad84a947c4f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"message\\\":\\\"bj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1208 08:59:15.721761 6173 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1208 08:59:15.721766 6173 ovn.go:134] Ensuring zone local for Pod 
openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI1208 08:59:15.721771 6173 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI1208 08:59:15.721775 6173 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1208 08:59:15.721624 6173 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1208 08:59:15.721793 6173 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1208 08:59:15.721797 6173 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1208 08:59:15.721633 6173 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1208 08:59:15.721806 6173 ovn.go:134] Ensuring zone local for Pod 
opens\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name
\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8d4d85ae7a43d2800c11a9bfdc7f1c827bdb6628891b9a0c29140bf869307ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-swbsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:34Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.070209 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.070267 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.070279 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.070301 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.070313 4776 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:34Z","lastTransitionTime":"2025-12-08T08:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.173142 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.173229 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.173244 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.173266 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.173280 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:34Z","lastTransitionTime":"2025-12-08T08:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.276571 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.276625 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.276641 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.276659 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.276671 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:34Z","lastTransitionTime":"2025-12-08T08:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.343646 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.343768 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 08:59:34 crc kubenswrapper[4776]: E1208 08:59:34.343905 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 08:59:34 crc kubenswrapper[4776]: E1208 08:59:34.344017 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.371152 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed339964d431767a3616559947e832417f1052991849f141c380c6e64c3c03f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:34Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.380367 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.380447 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.380458 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.380478 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.380496 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:34Z","lastTransitionTime":"2025-12-08T08:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.386498 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b411334ce6d2d6fc026625d538807bc1dce879f0053babe7d97f302819a4e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\
\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e498b38932d3fd2da572e5ca5d560b7725a394f13fd9a30543244b1f3c0dca61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:34Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.401590 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da360484aaa537076345b7a6830a83ac2a89ffb47f86d6865049c4f19688546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-08T08:59:34Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.416749 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9788ab1-1031-4103-a769-a4b3177c7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c2915bc116e8d911ac3c615649564174eb7e760c7e63fe06aee942d7cc2e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860aab951ce65f5efd9e17e4d93c383ea508088d26c2f758bf8d7176d2fb96f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jkmbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:34Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.434896 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19bd7d27-06e1-4574-8857-6adbe88633c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d32a10a86fe749d233a68a8e7583294e21c634dc47febe04e56220b591d505e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c567d34bcaecb124f79504fee8f22c148f78bb039741a7b52883ab3188edaa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://146374df9edb9e0092cf2e4cac4a5955d7d0980be93df8188f4b55ad12901572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc26419aaafbbe056695868d3b76642b62774f620306fc60a3c0a07788ca8b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://bc26419aaafbbe056695868d3b76642b62774f620306fc60a3c0a07788ca8b7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:34Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.449008 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8k6qx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5628a2b5-b886-4883-93a3-fefc471f19e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7
3be62036047cccc38e737abc2818aa510b308934c0bfbcac348aba382f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58b2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8k6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:34Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.460929 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q85ld" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d62ee56-e13f-4e44-8abf-9d0fc3e423b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db449e03630aa1b44e8a2812e502a46239824bb59283396bfe92bb818df29fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plplv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b483925a2a4fefd06621185c21e74da28d1
f0ababf703e2637a8686881671f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plplv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q85ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:34Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.471805 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kkhjg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99143b9c-a541-4c0e-8387-0dff0d557974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vlfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vlfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kkhjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:34Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:34 crc 
kubenswrapper[4776]: I1208 08:59:34.482657 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.482712 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.482728 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.482748 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.482763 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:34Z","lastTransitionTime":"2025-12-08T08:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.493012 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977ead65-d7ad-477b-8656-2b55e70b918b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7ab99c039bd5c0c0ab52768821797dd9e303afb9c6065ef021e8c2be372df9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3089eb6893beac1862850a94ef99af25f4ce9c0f020cc795611d9a620966e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86e9bc666e17ca52777d0275d4a50cbb70aad0c77e3413328fa6bd7b42d8a339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7cac944c0f3bfb4cc7d00471deb5d44d498a7131b7d950305a78bd8a5d2ea7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e7656c1155b52195abce09bca5ecd214a957ae7009293b99f16b3390c671fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:34Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.509480 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:34Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.525428 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5x9ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58507405-6bea-4859-a4e8-6ed046b50323\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06ef510d9c943d7c5d8a7a801dc1e7ca8cbfa7f6995ac8bedc835e0e8a762f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa552
85a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa55285a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5x9ft\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:34Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.537197 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-555j6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04b00e982993068dd2f274ce749a8b994561657d124122969adbb79c53658ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-555j6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:34Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.561653 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e518469-5b3b-4055-a0f0-075dc48b1c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f688c89d388f271d5313cac8f933d6e43da463b2d6da546151a2bbff7350f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://712accfff1a4a51efc4d3d2a90e3e2d6c47eb04f5cd3d72b800e0778f0963b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a95beb9eadee827797268eb99d707a8782239962485a6f258adea6f3ee994c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cf78bd635e2922e1ade70a5fb24aebf3845f59d6aae829d8cba7c1c44da75ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afb59149e1f7b1462fc1ae949150d5c137905504ba2366db083abd13b4db6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e06c373c06904c837e3c6b4b8d26d2b2e34aa17db8079de76e1e33594d284b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c3d0b09700f82a78bbde1ae9d9c6ff3538958396aac791116ea0eff0f5a9cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ac2e26e184fc559a20772131feba0a567e2b6c5fb50046986ccad84a947c4f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"message\\\":\\\"bj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1208 08:59:15.721761 6173 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1208 08:59:15.721766 6173 ovn.go:134] Ensuring zone local for Pod 
openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI1208 08:59:15.721771 6173 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI1208 08:59:15.721775 6173 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1208 08:59:15.721624 6173 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1208 08:59:15.721793 6173 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1208 08:59:15.721797 6173 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1208 08:59:15.721633 6173 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1208 08:59:15.721806 6173 ovn.go:134] Ensuring zone local for Pod 
opens\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name
\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8d4d85ae7a43d2800c11a9bfdc7f1c827bdb6628891b9a0c29140bf869307ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-swbsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:34Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.574158 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d4304c3-81ab-4418-b5c2-017f74581e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d02311f25656790164bb155ddb4e4d32fb9e7919033dd47c38c9fb321c185743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb3d5082006dae792f107dc632bc00d5004be69f95f6d76150d816fb542ce9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be261308f50cfa4db35cf58e829c102496de88ed0336685562f39e46e498a460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bf08db453e1dd51d190b2bd855af295af04cb4d3a8a42e0020136c50546aef2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:34Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.584700 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.584746 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.584756 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.584776 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.584788 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:34Z","lastTransitionTime":"2025-12-08T08:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.594815 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343d6e00-54f7-4228-a8da-a43041894b26\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountP
ath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube
-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee00c7d6720b72fb9ecc27cc1ad2f762db9ac30bc3e108e37e2d4e2a96639cf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\
\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:34Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.612382 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:34Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.632536 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:34Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.646544 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fdg6t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56dfa7df-2ee8-4408-a283-5a8521175a0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5173ff239c6373649911372a6d6c1665958296afec2528b9e0a492a0722f5b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z44lf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fdg6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:34Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.687485 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.687528 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.687545 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.687566 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.687582 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:34Z","lastTransitionTime":"2025-12-08T08:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.718922 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swbsc_1e518469-5b3b-4055-a0f0-075dc48b1c79/ovnkube-controller/2.log" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.719872 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swbsc_1e518469-5b3b-4055-a0f0-075dc48b1c79/ovnkube-controller/1.log" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.723426 4776 generic.go:334] "Generic (PLEG): container finished" podID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerID="46c3d0b09700f82a78bbde1ae9d9c6ff3538958396aac791116ea0eff0f5a9cf" exitCode=1 Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.723474 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" event={"ID":"1e518469-5b3b-4055-a0f0-075dc48b1c79","Type":"ContainerDied","Data":"46c3d0b09700f82a78bbde1ae9d9c6ff3538958396aac791116ea0eff0f5a9cf"} Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.723505 4776 scope.go:117] "RemoveContainer" containerID="61ac2e26e184fc559a20772131feba0a567e2b6c5fb50046986ccad84a947c4f" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.724624 4776 scope.go:117] "RemoveContainer" containerID="46c3d0b09700f82a78bbde1ae9d9c6ff3538958396aac791116ea0eff0f5a9cf" Dec 08 08:59:34 crc kubenswrapper[4776]: E1208 08:59:34.724953 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-swbsc_openshift-ovn-kubernetes(1e518469-5b3b-4055-a0f0-075dc48b1c79)\"" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.744519 4776 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:34Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.755028 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fdg6t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56dfa7df-2ee8-4408-a283-5a8521175a0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5173ff239c6373649911372a6d6c1665958296afec2528b9e0a492a0722f5b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z44lf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fdg6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:34Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.768905 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-555j6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04b00e982993068dd2f274ce749a8b994561657d124122969adbb79c53658ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-555j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:34Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.789462 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e518469-5b3b-4055-a0f0-075dc48b1c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f688c89d388f271d5313cac8f933d6e43da463b2d6da546151a2bbff7350f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://712accfff1a4a51efc4d3d2a90e3e2d6c47eb04f5cd3d72b800e0778f0963b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a95beb9eadee827797268eb99d707a8782239962485a6f258adea6f3ee994c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cf78bd635e2922e1ade70a5fb24aebf3845f59d6aae829d8cba7c1c44da75ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afb59149e1f7b1462fc1ae949150d5c137905504ba2366db083abd13b4db6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e06c373c06904c837e3c6b4b8d26d2b2e34aa17db8079de76e1e33594d284b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c3d0b09700f82a78bbde1ae9d9c6ff3538958396aac791116ea0eff0f5a9cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ac2e26e184fc559a20772131feba0a567e2b6c5fb50046986ccad84a947c4f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"message\\\":\\\"bj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1208 08:59:15.721761 6173 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1208 08:59:15.721766 6173 ovn.go:134] Ensuring zone local for Pod 
openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI1208 08:59:15.721771 6173 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI1208 08:59:15.721775 6173 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1208 08:59:15.721624 6173 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1208 08:59:15.721793 6173 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1208 08:59:15.721797 6173 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1208 08:59:15.721633 6173 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1208 08:59:15.721806 6173 ovn.go:134] Ensuring zone local for Pod opens\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46c3d0b09700f82a78bbde1ae9d9c6ff3538958396aac791116ea0eff0f5a9cf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T08:59:34Z\\\",\\\"message\\\":\\\"\\\\nI1208 08:59:34.244549 6400 services_controller.go:444] Built service openshift-network-diagnostics/network-check-target LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1208 08:59:34.244563 6400 services_controller.go:445] Built service openshift-network-diagnostics/network-check-target LB template configs for network=default: []services.lbConfig(nil)\\\\nF1208 08:59:34.244135 6400 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network 
controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:34Z is after 2025-08-24T17:21:41Z]\\\\nI1208 08:59:34.244585 6400 services_controller.go:451] Built service openshift-network-diagnostics/network-check-target cluster-wide \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"
host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8d4d85ae7a43d2800c11a9bfdc7f1c827bdb6628891b9a0c29140bf869307ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-swbsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:34Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.791012 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.791053 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.791065 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.791082 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.791093 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:34Z","lastTransitionTime":"2025-12-08T08:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.802084 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d4304c3-81ab-4418-b5c2-017f74581e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d02311f25656790164bb155ddb4e4d32fb9e7919033dd47c38c9fb321c185743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb3d5082006
dae792f107dc632bc00d5004be69f95f6d76150d816fb542ce9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be261308f50cfa4db35cf58e829c102496de88ed0336685562f39e46e498a460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bf08db453e1dd51d190b2bd855af295af04cb4d3a8a42e0020136c50546aef2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:34Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.817813 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"343d6e00-54f7-4228-a8da-a43041894b26\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:4
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee00c7d6720b72fb9ecc27cc1ad2f762db9ac30bc3e108e37e2d4e2a96639cf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:34Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.830058 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:34Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.846479 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed339964d431767a3616559947e832417f1052991849f141c380c6e64c3c03f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:34Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.860796 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b411334ce6d2d6fc026625d538807bc1dce879f0053babe7d97f302819a4e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://e498b38932d3fd2da572e5ca5d560b7725a394f13fd9a30543244b1f3c0dca61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:34Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.875795 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da360484aaa537076345b7a6830a83ac2a89ffb47f86d6865049c4f19688546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-08T08:59:34Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.885996 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9788ab1-1031-4103-a769-a4b3177c7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c2915bc116e8d911ac3c615649564174eb7e760c7e63fe06aee942d7cc2e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860aab951ce65f5efd9e17e4d93c383ea508088d26c2f758bf8d7176d2fb96f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jkmbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:34Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.893162 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 
08:59:34.893223 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.893238 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.893259 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.893274 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:34Z","lastTransitionTime":"2025-12-08T08:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.898939 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19bd7d27-06e1-4574-8857-6adbe88633c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d32a10a86fe749d233a68a8e7583294e21c634dc47febe04e56220b591d505e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c567d34bcaecb124f79504fee8f22c148f78bb039741a7b52883ab3188edaa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://146374df9edb9e0092cf2e4cac4a5955d7d0980be93df8188f4b55ad12901572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc26419aaafbbe056695868d3b76642b62774f620306fc60a3c0a07788ca8b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://bc26419aaafbbe056695868d3b76642b62774f620306fc60a3c0a07788ca8b7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:34Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.908103 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8k6qx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5628a2b5-b886-4883-93a3-fefc471f19e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7
3be62036047cccc38e737abc2818aa510b308934c0bfbcac348aba382f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58b2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8k6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:34Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.917896 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q85ld" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d62ee56-e13f-4e44-8abf-9d0fc3e423b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db449e03630aa1b44e8a2812e502a46239824bb59283396bfe92bb818df29fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plplv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b483925a2a4fefd06621185c21e74da28d1
f0ababf703e2637a8686881671f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plplv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q85ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:34Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.927496 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kkhjg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99143b9c-a541-4c0e-8387-0dff0d557974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vlfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vlfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kkhjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:34Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:34 crc 
kubenswrapper[4776]: I1208 08:59:34.945065 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977ead65-d7ad-477b-8656-2b55e70b918b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7ab99c039bd5c0c0ab52768821797dd9e303afb9c6065ef021e8c2be372df9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://6f3089eb6893beac1862850a94ef99af25f4ce9c0f020cc795611d9a620966e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86e9bc666e17ca52777d0275d4a50cbb70aad0c77e3413328fa6bd7b42d8a339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7cac944c0f3bfb4cc7d00471deb5d44d498a7131b7d950305a78bd8a5d2ea7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e7656c1155b52195abce09bca5ecd214a957ae7009293b99f16b3390c671fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:34Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.957878 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:34Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.969075 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5x9ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58507405-6bea-4859-a4e8-6ed046b50323\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06ef510d9c943d7c5d8a7a801dc1e7ca8cbfa7f6995ac8bedc835e0e8a762f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa552
85a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa55285a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5x9ft\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:34Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.994905 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.994933 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.994957 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.994971 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:34 crc kubenswrapper[4776]: I1208 08:59:34.994980 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:34Z","lastTransitionTime":"2025-12-08T08:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.096639 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.096672 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.096679 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.096692 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.096706 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:35Z","lastTransitionTime":"2025-12-08T08:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.198856 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.199195 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.199206 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.199220 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.199230 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:35Z","lastTransitionTime":"2025-12-08T08:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.301434 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.301497 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.301513 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.301541 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.301560 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:35Z","lastTransitionTime":"2025-12-08T08:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.342806 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.342896 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 08:59:35 crc kubenswrapper[4776]: E1208 08:59:35.342937 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 08:59:35 crc kubenswrapper[4776]: E1208 08:59:35.343054 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.403593 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.403630 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.403638 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.403652 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.403660 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:35Z","lastTransitionTime":"2025-12-08T08:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.506588 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.506625 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.506634 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.506648 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.506657 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:35Z","lastTransitionTime":"2025-12-08T08:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.609632 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.609667 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.609675 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.609690 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.609698 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:35Z","lastTransitionTime":"2025-12-08T08:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.711809 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.711847 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.711855 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.711867 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.711876 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:35Z","lastTransitionTime":"2025-12-08T08:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.728423 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swbsc_1e518469-5b3b-4055-a0f0-075dc48b1c79/ovnkube-controller/2.log" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.732348 4776 scope.go:117] "RemoveContainer" containerID="46c3d0b09700f82a78bbde1ae9d9c6ff3538958396aac791116ea0eff0f5a9cf" Dec 08 08:59:35 crc kubenswrapper[4776]: E1208 08:59:35.732615 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-swbsc_openshift-ovn-kubernetes(1e518469-5b3b-4055-a0f0-075dc48b1c79)\"" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.752221 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8k6qx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5628a2b5-b886-4883-93a3-fefc471f19e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df73be62036047cccc38e737abc2818aa510b308934c0bfbcac348aba382f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58b2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8k6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:35Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.763912 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q85ld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d62ee56-e13f-4e44-8abf-9d0fc3e423b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db449e03630aa1b44e8a2812e502a46239824bb59283396bfe92bb818df29fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plplv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b483925a2a4fefd06621185c21e74da28d1f0ababf703e2637a8686881671f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plplv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q85ld\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:35Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.778553 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kkhjg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99143b9c-a541-4c0e-8387-0dff0d557974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vlfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vlfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kkhjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:35Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:35 crc 
kubenswrapper[4776]: I1208 08:59:35.795703 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19bd7d27-06e1-4574-8857-6adbe88633c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d32a10a86fe749d233a68a8e7583294e21c634dc47febe04e56220b591d505e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c567d34bcaecb124f79504fee8f22c148f78bb039741a7b52883ab3188edaa4\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://146374df9edb9e0092cf2e4cac4a5955d7d0980be93df8188f4b55ad12901572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc26419aaafbbe056695868d3b76642b62774f620306fc60a3c0a07788ca8b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc26419aaafbbe056695868d3b76642b62774f620306fc60a3c0a07788ca8b7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:35Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.810351 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:35Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.814681 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.814722 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.814731 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 
08:59:35.814746 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.814757 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:35Z","lastTransitionTime":"2025-12-08T08:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.827215 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5x9ft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58507405-6bea-4859-a4e8-6ed046b50323\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06ef510d9c943d7c5d8a7a801dc1e7ca8cbfa7f6995ac8bedc835e0e8a762f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcab
b9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa55285a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa55285a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b00cd36b0df5ecff7d32b6b17dfac
9de1d434e33dcac1057135f3992f9866af3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5x9ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:35Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.854735 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"977ead65-d7ad-477b-8656-2b55e70b918b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7ab99c039bd5c0c0ab52768821797dd9e303afb9c6065ef021e8c2be372df9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3089eb6893beac1862850a94ef99af25f4ce9c0f020cc795611d9a620966e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86e9bc666e17ca52777d0275d4a50cbb70aad0c77e3413328fa6bd7b42d8a339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7cac944c0f3bfb4cc7d00471deb5d44d498a7131b7d950305a78bd8a5d2ea7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e7656c1155b52195abce09bca5ecd214a957ae7009293b99f16b3390c671fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:35Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.868738 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d4304c3-81ab-4418-b5c2-017f74581e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d02311f25656790164bb155ddb4e4d32fb9e7919033dd47c38c9fb321c185743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb3d5082006dae792f107dc632bc00d5004be69f95f6d76150d816fb542ce9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be261308f50cfa4db35cf58e829c102496de88ed0336685562f39e46e498a460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:
45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bf08db453e1dd51d190b2bd855af295af04cb4d3a8a42e0020136c50546aef2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:35Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.884596 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"343d6e00-54f7-4228-a8da-a43041894b26\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:4
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee00c7d6720b72fb9ecc27cc1ad2f762db9ac30bc3e108e37e2d4e2a96639cf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:35Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.898857 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:35Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.912437 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:35Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.916536 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.916568 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.916576 4776 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.916588 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.916599 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:35Z","lastTransitionTime":"2025-12-08T08:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.924685 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fdg6t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56dfa7df-2ee8-4408-a283-5a8521175a0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5173ff239c6373649911372a6d6c1665958296a
fec2528b9e0a492a0722f5b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z44lf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fdg6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:35Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.941898 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-555j6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04b00e982993068dd2f274ce749a8b994561657d124122969adbb79c53658ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-555j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:35Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.960465 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e518469-5b3b-4055-a0f0-075dc48b1c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f688c89d388f271d5313cac8f933d6e43da463b2d6da546151a2bbff7350f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://712accfff1a4a51efc4d3d2a90e3e2d6c47eb04f5cd3d72b800e0778f0963b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a95beb9eadee827797268eb99d707a8782239962485a6f258adea6f3ee994c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cf78bd635e2922e1ade70a5fb24aebf3845f59d6aae829d8cba7c1c44da75ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afb59149e1f7b1462fc1ae949150d5c137905504ba2366db083abd13b4db6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e06c373c06904c837e3c6b4b8d26d2b2e34aa17db8079de76e1e33594d284b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c3d0b09700f82a78bbde1ae9d9c6ff3538958396aac791116ea0eff0f5a9cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46c3d0b09700f82a78bbde1ae9d9c6ff3538958396aac791116ea0eff0f5a9cf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T08:59:34Z\\\",\\\"message\\\":\\\"\\\\nI1208 08:59:34.244549 6400 services_controller.go:444] Built service openshift-network-diagnostics/network-check-target LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1208 08:59:34.244563 6400 services_controller.go:445] Built service openshift-network-diagnostics/network-check-target LB template configs for 
network=default: []services.lbConfig(nil)\\\\nF1208 08:59:34.244135 6400 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:34Z is after 2025-08-24T17:21:41Z]\\\\nI1208 08:59:34.244585 6400 services_controller.go:451] Built service openshift-network-diagnostics/network-check-target cluster-wide \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-swbsc_openshift-ovn-kubernetes(1e518469-5b3b-4055-a0f0-075dc48b1c79)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8d4d85ae7a43d2800c11a9bfdc7f1c827bdb6628891b9a0c29140bf869307ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101b
d0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-swbsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:35Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.972110 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b411334ce6d2d6fc026625d538807bc1dce879f0053babe7d97f302819a4e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e498b38932d3fd2da572e5ca5d560b7725a394f13fd9a30543244b1f3c0dca61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:35Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.984384 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da360484aaa537076345b7a6830a83ac2a89ffb47f86d6865049c4f19688546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-08T08:59:35Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:35 crc kubenswrapper[4776]: I1208 08:59:35.995633 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9788ab1-1031-4103-a769-a4b3177c7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c2915bc116e8d911ac3c615649564174eb7e760c7e63fe06aee942d7cc2e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860aab951ce65f5efd9e17e4d93c383ea508088d26c2f758bf8d7176d2fb96f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jkmbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:35Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.006499 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed339964d431767a3616559947e832417f1052991849f141c380c6e64c3c03f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:36Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.019257 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.019317 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.019326 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.019338 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.019348 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:36Z","lastTransitionTime":"2025-12-08T08:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.121766 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.121805 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.121817 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.121832 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.121845 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:36Z","lastTransitionTime":"2025-12-08T08:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.224106 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.224150 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.224165 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.224197 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.224209 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:36Z","lastTransitionTime":"2025-12-08T08:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.326160 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.326221 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.326231 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.326249 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.326259 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:36Z","lastTransitionTime":"2025-12-08T08:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:36 crc kubenswrapper[4776]: E1208 08:59:36.337443 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ebf5967-b40e-4612-8f34-c965ce3a7e5b\\\",\\\"systemUUID\\\":\\\"c2909369-742b-49a0-ae37-af59748afd08\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:36Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.340824 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.340854 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.340863 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.340877 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.340887 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:36Z","lastTransitionTime":"2025-12-08T08:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.342814 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.342877 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 08:59:36 crc kubenswrapper[4776]: E1208 08:59:36.342916 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 08:59:36 crc kubenswrapper[4776]: E1208 08:59:36.343038 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 08:59:36 crc kubenswrapper[4776]: E1208 08:59:36.356272 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ebf5967-b40e-4612-8f34-c965ce3a7e5b\\\",\\\"systemUUID\\\":\\\"c2909369-742b-49a0-ae37-af59748afd08\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:36Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.359513 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.359562 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.359580 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.359599 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.359613 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:36Z","lastTransitionTime":"2025-12-08T08:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:36 crc kubenswrapper[4776]: E1208 08:59:36.373907 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ebf5967-b40e-4612-8f34-c965ce3a7e5b\\\",\\\"systemUUID\\\":\\\"c2909369-742b-49a0-ae37-af59748afd08\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:36Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.377564 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.377594 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.377603 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.377617 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.377628 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:36Z","lastTransitionTime":"2025-12-08T08:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:36 crc kubenswrapper[4776]: E1208 08:59:36.389290 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ebf5967-b40e-4612-8f34-c965ce3a7e5b\\\",\\\"systemUUID\\\":\\\"c2909369-742b-49a0-ae37-af59748afd08\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:36Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.392934 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.392979 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.392990 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.393006 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.393017 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:36Z","lastTransitionTime":"2025-12-08T08:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:36 crc kubenswrapper[4776]: E1208 08:59:36.411338 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ebf5967-b40e-4612-8f34-c965ce3a7e5b\\\",\\\"systemUUID\\\":\\\"c2909369-742b-49a0-ae37-af59748afd08\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:36Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:36 crc kubenswrapper[4776]: E1208 08:59:36.411457 4776 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.413197 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.413240 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.413250 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.413263 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.413272 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:36Z","lastTransitionTime":"2025-12-08T08:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.516227 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.516287 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.516305 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.516329 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.516348 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:36Z","lastTransitionTime":"2025-12-08T08:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.619514 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.619563 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.619579 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.619603 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.619621 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:36Z","lastTransitionTime":"2025-12-08T08:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.722866 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.722963 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.722984 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.723009 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.723026 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:36Z","lastTransitionTime":"2025-12-08T08:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.826786 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.826858 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.826876 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.826900 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.826920 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:36Z","lastTransitionTime":"2025-12-08T08:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.930102 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.930167 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.930240 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.930273 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:36 crc kubenswrapper[4776]: I1208 08:59:36.930298 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:36Z","lastTransitionTime":"2025-12-08T08:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:37 crc kubenswrapper[4776]: I1208 08:59:37.033289 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:37 crc kubenswrapper[4776]: I1208 08:59:37.033356 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:37 crc kubenswrapper[4776]: I1208 08:59:37.033376 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:37 crc kubenswrapper[4776]: I1208 08:59:37.033403 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:37 crc kubenswrapper[4776]: I1208 08:59:37.033425 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:37Z","lastTransitionTime":"2025-12-08T08:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:37 crc kubenswrapper[4776]: I1208 08:59:37.138310 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:37 crc kubenswrapper[4776]: I1208 08:59:37.138344 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:37 crc kubenswrapper[4776]: I1208 08:59:37.138353 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:37 crc kubenswrapper[4776]: I1208 08:59:37.138368 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:37 crc kubenswrapper[4776]: I1208 08:59:37.138379 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:37Z","lastTransitionTime":"2025-12-08T08:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:37 crc kubenswrapper[4776]: I1208 08:59:37.240548 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:37 crc kubenswrapper[4776]: I1208 08:59:37.240624 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:37 crc kubenswrapper[4776]: I1208 08:59:37.240644 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:37 crc kubenswrapper[4776]: I1208 08:59:37.240666 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:37 crc kubenswrapper[4776]: I1208 08:59:37.240684 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:37Z","lastTransitionTime":"2025-12-08T08:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:37 crc kubenswrapper[4776]: I1208 08:59:37.342640 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 08:59:37 crc kubenswrapper[4776]: E1208 08:59:37.342904 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 08:59:37 crc kubenswrapper[4776]: I1208 08:59:37.342936 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 08:59:37 crc kubenswrapper[4776]: I1208 08:59:37.343011 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:37 crc kubenswrapper[4776]: I1208 08:59:37.343053 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:37 crc kubenswrapper[4776]: I1208 08:59:37.343069 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:37 crc kubenswrapper[4776]: I1208 08:59:37.343090 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:37 crc kubenswrapper[4776]: I1208 08:59:37.343104 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:37Z","lastTransitionTime":"2025-12-08T08:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:37 crc kubenswrapper[4776]: E1208 08:59:37.343268 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 08:59:37 crc kubenswrapper[4776]: I1208 08:59:37.445747 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:37 crc kubenswrapper[4776]: I1208 08:59:37.445808 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:37 crc kubenswrapper[4776]: I1208 08:59:37.445823 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:37 crc kubenswrapper[4776]: I1208 08:59:37.445844 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:37 crc kubenswrapper[4776]: I1208 08:59:37.445859 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:37Z","lastTransitionTime":"2025-12-08T08:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:37 crc kubenswrapper[4776]: I1208 08:59:37.548772 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:37 crc kubenswrapper[4776]: I1208 08:59:37.548821 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:37 crc kubenswrapper[4776]: I1208 08:59:37.548832 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:37 crc kubenswrapper[4776]: I1208 08:59:37.548848 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:37 crc kubenswrapper[4776]: I1208 08:59:37.548859 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:37Z","lastTransitionTime":"2025-12-08T08:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:37 crc kubenswrapper[4776]: I1208 08:59:37.652349 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:37 crc kubenswrapper[4776]: I1208 08:59:37.652392 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:37 crc kubenswrapper[4776]: I1208 08:59:37.652404 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:37 crc kubenswrapper[4776]: I1208 08:59:37.652420 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:37 crc kubenswrapper[4776]: I1208 08:59:37.652431 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:37Z","lastTransitionTime":"2025-12-08T08:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:37 crc kubenswrapper[4776]: I1208 08:59:37.754325 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:37 crc kubenswrapper[4776]: I1208 08:59:37.754415 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:37 crc kubenswrapper[4776]: I1208 08:59:37.754440 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:37 crc kubenswrapper[4776]: I1208 08:59:37.754471 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:37 crc kubenswrapper[4776]: I1208 08:59:37.754495 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:37Z","lastTransitionTime":"2025-12-08T08:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:37 crc kubenswrapper[4776]: I1208 08:59:37.856995 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:37 crc kubenswrapper[4776]: I1208 08:59:37.857063 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:37 crc kubenswrapper[4776]: I1208 08:59:37.857084 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:37 crc kubenswrapper[4776]: I1208 08:59:37.857116 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:37 crc kubenswrapper[4776]: I1208 08:59:37.857141 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:37Z","lastTransitionTime":"2025-12-08T08:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:37 crc kubenswrapper[4776]: I1208 08:59:37.960431 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:37 crc kubenswrapper[4776]: I1208 08:59:37.960638 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:37 crc kubenswrapper[4776]: I1208 08:59:37.960659 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:37 crc kubenswrapper[4776]: I1208 08:59:37.960685 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:37 crc kubenswrapper[4776]: I1208 08:59:37.960708 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:37Z","lastTransitionTime":"2025-12-08T08:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:38 crc kubenswrapper[4776]: I1208 08:59:38.063841 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:38 crc kubenswrapper[4776]: I1208 08:59:38.063926 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:38 crc kubenswrapper[4776]: I1208 08:59:38.063947 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:38 crc kubenswrapper[4776]: I1208 08:59:38.063969 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:38 crc kubenswrapper[4776]: I1208 08:59:38.063988 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:38Z","lastTransitionTime":"2025-12-08T08:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:38 crc kubenswrapper[4776]: I1208 08:59:38.168007 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:38 crc kubenswrapper[4776]: I1208 08:59:38.168202 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:38 crc kubenswrapper[4776]: I1208 08:59:38.168224 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:38 crc kubenswrapper[4776]: I1208 08:59:38.168278 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:38 crc kubenswrapper[4776]: I1208 08:59:38.168297 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:38Z","lastTransitionTime":"2025-12-08T08:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:38 crc kubenswrapper[4776]: I1208 08:59:38.271155 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:38 crc kubenswrapper[4776]: I1208 08:59:38.271347 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:38 crc kubenswrapper[4776]: I1208 08:59:38.271702 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:38 crc kubenswrapper[4776]: I1208 08:59:38.271761 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:38 crc kubenswrapper[4776]: I1208 08:59:38.271791 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:38Z","lastTransitionTime":"2025-12-08T08:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:38 crc kubenswrapper[4776]: I1208 08:59:38.343763 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 08:59:38 crc kubenswrapper[4776]: I1208 08:59:38.343812 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 08:59:38 crc kubenswrapper[4776]: E1208 08:59:38.343921 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 08:59:38 crc kubenswrapper[4776]: E1208 08:59:38.343985 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 08:59:38 crc kubenswrapper[4776]: I1208 08:59:38.374955 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:38 crc kubenswrapper[4776]: I1208 08:59:38.375001 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:38 crc kubenswrapper[4776]: I1208 08:59:38.375018 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:38 crc kubenswrapper[4776]: I1208 08:59:38.375043 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:38 crc kubenswrapper[4776]: I1208 08:59:38.375062 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:38Z","lastTransitionTime":"2025-12-08T08:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:38 crc kubenswrapper[4776]: I1208 08:59:38.478360 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:38 crc kubenswrapper[4776]: I1208 08:59:38.478433 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:38 crc kubenswrapper[4776]: I1208 08:59:38.478456 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:38 crc kubenswrapper[4776]: I1208 08:59:38.478487 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:38 crc kubenswrapper[4776]: I1208 08:59:38.478508 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:38Z","lastTransitionTime":"2025-12-08T08:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:38 crc kubenswrapper[4776]: I1208 08:59:38.582399 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:38 crc kubenswrapper[4776]: I1208 08:59:38.582460 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:38 crc kubenswrapper[4776]: I1208 08:59:38.582476 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:38 crc kubenswrapper[4776]: I1208 08:59:38.582499 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:38 crc kubenswrapper[4776]: I1208 08:59:38.582516 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:38Z","lastTransitionTime":"2025-12-08T08:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:38 crc kubenswrapper[4776]: I1208 08:59:38.685664 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:38 crc kubenswrapper[4776]: I1208 08:59:38.685726 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:38 crc kubenswrapper[4776]: I1208 08:59:38.685745 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:38 crc kubenswrapper[4776]: I1208 08:59:38.685766 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:38 crc kubenswrapper[4776]: I1208 08:59:38.685783 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:38Z","lastTransitionTime":"2025-12-08T08:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:38 crc kubenswrapper[4776]: I1208 08:59:38.789015 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:38 crc kubenswrapper[4776]: I1208 08:59:38.789079 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:38 crc kubenswrapper[4776]: I1208 08:59:38.789111 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:38 crc kubenswrapper[4776]: I1208 08:59:38.789158 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:38 crc kubenswrapper[4776]: I1208 08:59:38.789223 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:38Z","lastTransitionTime":"2025-12-08T08:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:38 crc kubenswrapper[4776]: I1208 08:59:38.892031 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:38 crc kubenswrapper[4776]: I1208 08:59:38.892135 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:38 crc kubenswrapper[4776]: I1208 08:59:38.892157 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:38 crc kubenswrapper[4776]: I1208 08:59:38.892217 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:38 crc kubenswrapper[4776]: I1208 08:59:38.892243 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:38Z","lastTransitionTime":"2025-12-08T08:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:38 crc kubenswrapper[4776]: I1208 08:59:38.994627 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:38 crc kubenswrapper[4776]: I1208 08:59:38.994705 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:38 crc kubenswrapper[4776]: I1208 08:59:38.994731 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:38 crc kubenswrapper[4776]: I1208 08:59:38.994762 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:38 crc kubenswrapper[4776]: I1208 08:59:38.994782 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:38Z","lastTransitionTime":"2025-12-08T08:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:39 crc kubenswrapper[4776]: I1208 08:59:39.098658 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:39 crc kubenswrapper[4776]: I1208 08:59:39.098729 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:39 crc kubenswrapper[4776]: I1208 08:59:39.098753 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:39 crc kubenswrapper[4776]: I1208 08:59:39.098781 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:39 crc kubenswrapper[4776]: I1208 08:59:39.098804 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:39Z","lastTransitionTime":"2025-12-08T08:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:39 crc kubenswrapper[4776]: I1208 08:59:39.201426 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:39 crc kubenswrapper[4776]: I1208 08:59:39.201466 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:39 crc kubenswrapper[4776]: I1208 08:59:39.201474 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:39 crc kubenswrapper[4776]: I1208 08:59:39.201486 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:39 crc kubenswrapper[4776]: I1208 08:59:39.201494 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:39Z","lastTransitionTime":"2025-12-08T08:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:39 crc kubenswrapper[4776]: I1208 08:59:39.305887 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:39 crc kubenswrapper[4776]: I1208 08:59:39.305943 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:39 crc kubenswrapper[4776]: I1208 08:59:39.305959 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:39 crc kubenswrapper[4776]: I1208 08:59:39.305982 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:39 crc kubenswrapper[4776]: I1208 08:59:39.306002 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:39Z","lastTransitionTime":"2025-12-08T08:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:39 crc kubenswrapper[4776]: I1208 08:59:39.343559 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 08:59:39 crc kubenswrapper[4776]: I1208 08:59:39.343589 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 08:59:39 crc kubenswrapper[4776]: E1208 08:59:39.343744 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 08:59:39 crc kubenswrapper[4776]: E1208 08:59:39.343898 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 08:59:39 crc kubenswrapper[4776]: I1208 08:59:39.408484 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:39 crc kubenswrapper[4776]: I1208 08:59:39.408554 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:39 crc kubenswrapper[4776]: I1208 08:59:39.408566 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:39 crc kubenswrapper[4776]: I1208 08:59:39.408589 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:39 crc kubenswrapper[4776]: I1208 08:59:39.408607 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:39Z","lastTransitionTime":"2025-12-08T08:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:39 crc kubenswrapper[4776]: I1208 08:59:39.511307 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:39 crc kubenswrapper[4776]: I1208 08:59:39.511374 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:39 crc kubenswrapper[4776]: I1208 08:59:39.511390 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:39 crc kubenswrapper[4776]: I1208 08:59:39.511418 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:39 crc kubenswrapper[4776]: I1208 08:59:39.511435 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:39Z","lastTransitionTime":"2025-12-08T08:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:39 crc kubenswrapper[4776]: I1208 08:59:39.614056 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:39 crc kubenswrapper[4776]: I1208 08:59:39.614119 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:39 crc kubenswrapper[4776]: I1208 08:59:39.614137 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:39 crc kubenswrapper[4776]: I1208 08:59:39.614161 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:39 crc kubenswrapper[4776]: I1208 08:59:39.614219 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:39Z","lastTransitionTime":"2025-12-08T08:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:39 crc kubenswrapper[4776]: I1208 08:59:39.717521 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:39 crc kubenswrapper[4776]: I1208 08:59:39.717570 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:39 crc kubenswrapper[4776]: I1208 08:59:39.717584 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:39 crc kubenswrapper[4776]: I1208 08:59:39.717604 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:39 crc kubenswrapper[4776]: I1208 08:59:39.717621 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:39Z","lastTransitionTime":"2025-12-08T08:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:39 crc kubenswrapper[4776]: I1208 08:59:39.821360 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:39 crc kubenswrapper[4776]: I1208 08:59:39.821427 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:39 crc kubenswrapper[4776]: I1208 08:59:39.821447 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:39 crc kubenswrapper[4776]: I1208 08:59:39.821474 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:39 crc kubenswrapper[4776]: I1208 08:59:39.821493 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:39Z","lastTransitionTime":"2025-12-08T08:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:39 crc kubenswrapper[4776]: I1208 08:59:39.924398 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:39 crc kubenswrapper[4776]: I1208 08:59:39.924460 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:39 crc kubenswrapper[4776]: I1208 08:59:39.924476 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:39 crc kubenswrapper[4776]: I1208 08:59:39.924499 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:39 crc kubenswrapper[4776]: I1208 08:59:39.924517 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:39Z","lastTransitionTime":"2025-12-08T08:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:40 crc kubenswrapper[4776]: I1208 08:59:40.027193 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:40 crc kubenswrapper[4776]: I1208 08:59:40.027218 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:40 crc kubenswrapper[4776]: I1208 08:59:40.027226 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:40 crc kubenswrapper[4776]: I1208 08:59:40.027239 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:40 crc kubenswrapper[4776]: I1208 08:59:40.027247 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:40Z","lastTransitionTime":"2025-12-08T08:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:40 crc kubenswrapper[4776]: I1208 08:59:40.131629 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:40 crc kubenswrapper[4776]: I1208 08:59:40.131695 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:40 crc kubenswrapper[4776]: I1208 08:59:40.131706 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:40 crc kubenswrapper[4776]: I1208 08:59:40.131723 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:40 crc kubenswrapper[4776]: I1208 08:59:40.131735 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:40Z","lastTransitionTime":"2025-12-08T08:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:40 crc kubenswrapper[4776]: I1208 08:59:40.235388 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:40 crc kubenswrapper[4776]: I1208 08:59:40.235470 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:40 crc kubenswrapper[4776]: I1208 08:59:40.235487 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:40 crc kubenswrapper[4776]: I1208 08:59:40.235963 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:40 crc kubenswrapper[4776]: I1208 08:59:40.236011 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:40Z","lastTransitionTime":"2025-12-08T08:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:40 crc kubenswrapper[4776]: I1208 08:59:40.340085 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:40 crc kubenswrapper[4776]: I1208 08:59:40.340220 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:40 crc kubenswrapper[4776]: I1208 08:59:40.340243 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:40 crc kubenswrapper[4776]: I1208 08:59:40.340268 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:40 crc kubenswrapper[4776]: I1208 08:59:40.340288 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:40Z","lastTransitionTime":"2025-12-08T08:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:40 crc kubenswrapper[4776]: I1208 08:59:40.343407 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 08:59:40 crc kubenswrapper[4776]: I1208 08:59:40.343418 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 08:59:40 crc kubenswrapper[4776]: E1208 08:59:40.343568 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 08:59:40 crc kubenswrapper[4776]: E1208 08:59:40.343827 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 08:59:40 crc kubenswrapper[4776]: I1208 08:59:40.443965 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:40 crc kubenswrapper[4776]: I1208 08:59:40.444040 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:40 crc kubenswrapper[4776]: I1208 08:59:40.444067 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:40 crc kubenswrapper[4776]: I1208 08:59:40.444102 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:40 crc kubenswrapper[4776]: I1208 08:59:40.444159 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:40Z","lastTransitionTime":"2025-12-08T08:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:40 crc kubenswrapper[4776]: I1208 08:59:40.547371 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:40 crc kubenswrapper[4776]: I1208 08:59:40.547448 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:40 crc kubenswrapper[4776]: I1208 08:59:40.547474 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:40 crc kubenswrapper[4776]: I1208 08:59:40.547510 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:40 crc kubenswrapper[4776]: I1208 08:59:40.547535 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:40Z","lastTransitionTime":"2025-12-08T08:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:40 crc kubenswrapper[4776]: I1208 08:59:40.651018 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:40 crc kubenswrapper[4776]: I1208 08:59:40.651084 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:40 crc kubenswrapper[4776]: I1208 08:59:40.651101 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:40 crc kubenswrapper[4776]: I1208 08:59:40.651128 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:40 crc kubenswrapper[4776]: I1208 08:59:40.651151 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:40Z","lastTransitionTime":"2025-12-08T08:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:40 crc kubenswrapper[4776]: I1208 08:59:40.754467 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:40 crc kubenswrapper[4776]: I1208 08:59:40.754536 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:40 crc kubenswrapper[4776]: I1208 08:59:40.754553 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:40 crc kubenswrapper[4776]: I1208 08:59:40.754580 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:40 crc kubenswrapper[4776]: I1208 08:59:40.754601 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:40Z","lastTransitionTime":"2025-12-08T08:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:40 crc kubenswrapper[4776]: I1208 08:59:40.859975 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:40 crc kubenswrapper[4776]: I1208 08:59:40.860044 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:40 crc kubenswrapper[4776]: I1208 08:59:40.860068 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:40 crc kubenswrapper[4776]: I1208 08:59:40.860100 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:40 crc kubenswrapper[4776]: I1208 08:59:40.860168 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:40Z","lastTransitionTime":"2025-12-08T08:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:40 crc kubenswrapper[4776]: I1208 08:59:40.963663 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:40 crc kubenswrapper[4776]: I1208 08:59:40.963705 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:40 crc kubenswrapper[4776]: I1208 08:59:40.963716 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:40 crc kubenswrapper[4776]: I1208 08:59:40.963734 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:40 crc kubenswrapper[4776]: I1208 08:59:40.963774 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:40Z","lastTransitionTime":"2025-12-08T08:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:41 crc kubenswrapper[4776]: I1208 08:59:41.066363 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:41 crc kubenswrapper[4776]: I1208 08:59:41.066402 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:41 crc kubenswrapper[4776]: I1208 08:59:41.066413 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:41 crc kubenswrapper[4776]: I1208 08:59:41.066428 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:41 crc kubenswrapper[4776]: I1208 08:59:41.066440 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:41Z","lastTransitionTime":"2025-12-08T08:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:41 crc kubenswrapper[4776]: I1208 08:59:41.169576 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:41 crc kubenswrapper[4776]: I1208 08:59:41.169723 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:41 crc kubenswrapper[4776]: I1208 08:59:41.170090 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:41 crc kubenswrapper[4776]: I1208 08:59:41.170140 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:41 crc kubenswrapper[4776]: I1208 08:59:41.170159 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:41Z","lastTransitionTime":"2025-12-08T08:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:41 crc kubenswrapper[4776]: I1208 08:59:41.273342 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:41 crc kubenswrapper[4776]: I1208 08:59:41.273376 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:41 crc kubenswrapper[4776]: I1208 08:59:41.273384 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:41 crc kubenswrapper[4776]: I1208 08:59:41.273397 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:41 crc kubenswrapper[4776]: I1208 08:59:41.273406 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:41Z","lastTransitionTime":"2025-12-08T08:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:41 crc kubenswrapper[4776]: I1208 08:59:41.343301 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 08:59:41 crc kubenswrapper[4776]: E1208 08:59:41.343418 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 08:59:41 crc kubenswrapper[4776]: I1208 08:59:41.343542 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 08:59:41 crc kubenswrapper[4776]: E1208 08:59:41.345390 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 08:59:41 crc kubenswrapper[4776]: I1208 08:59:41.375916 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:41 crc kubenswrapper[4776]: I1208 08:59:41.375987 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:41 crc kubenswrapper[4776]: I1208 08:59:41.376004 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:41 crc kubenswrapper[4776]: I1208 08:59:41.376028 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:41 crc kubenswrapper[4776]: I1208 08:59:41.376046 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:41Z","lastTransitionTime":"2025-12-08T08:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:41 crc kubenswrapper[4776]: I1208 08:59:41.479145 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:41 crc kubenswrapper[4776]: I1208 08:59:41.479250 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:41 crc kubenswrapper[4776]: I1208 08:59:41.479275 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:41 crc kubenswrapper[4776]: I1208 08:59:41.479302 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:41 crc kubenswrapper[4776]: I1208 08:59:41.479325 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:41Z","lastTransitionTime":"2025-12-08T08:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:41 crc kubenswrapper[4776]: I1208 08:59:41.581635 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:41 crc kubenswrapper[4776]: I1208 08:59:41.581666 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:41 crc kubenswrapper[4776]: I1208 08:59:41.581674 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:41 crc kubenswrapper[4776]: I1208 08:59:41.581688 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:41 crc kubenswrapper[4776]: I1208 08:59:41.581697 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:41Z","lastTransitionTime":"2025-12-08T08:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:41 crc kubenswrapper[4776]: I1208 08:59:41.683966 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:41 crc kubenswrapper[4776]: I1208 08:59:41.684027 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:41 crc kubenswrapper[4776]: I1208 08:59:41.684050 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:41 crc kubenswrapper[4776]: I1208 08:59:41.684078 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:41 crc kubenswrapper[4776]: I1208 08:59:41.684098 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:41Z","lastTransitionTime":"2025-12-08T08:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:41 crc kubenswrapper[4776]: I1208 08:59:41.787846 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:41 crc kubenswrapper[4776]: I1208 08:59:41.787924 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:41 crc kubenswrapper[4776]: I1208 08:59:41.787955 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:41 crc kubenswrapper[4776]: I1208 08:59:41.787984 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:41 crc kubenswrapper[4776]: I1208 08:59:41.788005 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:41Z","lastTransitionTime":"2025-12-08T08:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:41 crc kubenswrapper[4776]: I1208 08:59:41.891148 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:41 crc kubenswrapper[4776]: I1208 08:59:41.891225 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:41 crc kubenswrapper[4776]: I1208 08:59:41.891240 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:41 crc kubenswrapper[4776]: I1208 08:59:41.891259 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:41 crc kubenswrapper[4776]: I1208 08:59:41.891273 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:41Z","lastTransitionTime":"2025-12-08T08:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:41 crc kubenswrapper[4776]: I1208 08:59:41.994655 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:41 crc kubenswrapper[4776]: I1208 08:59:41.994702 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:41 crc kubenswrapper[4776]: I1208 08:59:41.994722 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:41 crc kubenswrapper[4776]: I1208 08:59:41.994742 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:41 crc kubenswrapper[4776]: I1208 08:59:41.994754 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:41Z","lastTransitionTime":"2025-12-08T08:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:42 crc kubenswrapper[4776]: I1208 08:59:42.097662 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:42 crc kubenswrapper[4776]: I1208 08:59:42.097711 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:42 crc kubenswrapper[4776]: I1208 08:59:42.097724 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:42 crc kubenswrapper[4776]: I1208 08:59:42.097742 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:42 crc kubenswrapper[4776]: I1208 08:59:42.097758 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:42Z","lastTransitionTime":"2025-12-08T08:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:42 crc kubenswrapper[4776]: I1208 08:59:42.200235 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:42 crc kubenswrapper[4776]: I1208 08:59:42.200413 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:42 crc kubenswrapper[4776]: I1208 08:59:42.200626 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:42 crc kubenswrapper[4776]: I1208 08:59:42.200649 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:42 crc kubenswrapper[4776]: I1208 08:59:42.200661 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:42Z","lastTransitionTime":"2025-12-08T08:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:42 crc kubenswrapper[4776]: I1208 08:59:42.303711 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:42 crc kubenswrapper[4776]: I1208 08:59:42.303794 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:42 crc kubenswrapper[4776]: I1208 08:59:42.303819 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:42 crc kubenswrapper[4776]: I1208 08:59:42.303848 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:42 crc kubenswrapper[4776]: I1208 08:59:42.303868 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:42Z","lastTransitionTime":"2025-12-08T08:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:42 crc kubenswrapper[4776]: I1208 08:59:42.343433 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 08:59:42 crc kubenswrapper[4776]: I1208 08:59:42.343435 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 08:59:42 crc kubenswrapper[4776]: E1208 08:59:42.343619 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 08:59:42 crc kubenswrapper[4776]: E1208 08:59:42.343708 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 08:59:42 crc kubenswrapper[4776]: I1208 08:59:42.407160 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:42 crc kubenswrapper[4776]: I1208 08:59:42.407272 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:42 crc kubenswrapper[4776]: I1208 08:59:42.407292 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:42 crc kubenswrapper[4776]: I1208 08:59:42.407320 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:42 crc kubenswrapper[4776]: I1208 08:59:42.407338 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:42Z","lastTransitionTime":"2025-12-08T08:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:42 crc kubenswrapper[4776]: I1208 08:59:42.511560 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:42 crc kubenswrapper[4776]: I1208 08:59:42.511890 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:42 crc kubenswrapper[4776]: I1208 08:59:42.511930 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:42 crc kubenswrapper[4776]: I1208 08:59:42.512131 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:42 crc kubenswrapper[4776]: I1208 08:59:42.512493 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:42Z","lastTransitionTime":"2025-12-08T08:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:42 crc kubenswrapper[4776]: I1208 08:59:42.616666 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:42 crc kubenswrapper[4776]: I1208 08:59:42.616715 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:42 crc kubenswrapper[4776]: I1208 08:59:42.616725 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:42 crc kubenswrapper[4776]: I1208 08:59:42.616744 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:42 crc kubenswrapper[4776]: I1208 08:59:42.616756 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:42Z","lastTransitionTime":"2025-12-08T08:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:42 crc kubenswrapper[4776]: I1208 08:59:42.719682 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:42 crc kubenswrapper[4776]: I1208 08:59:42.719748 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:42 crc kubenswrapper[4776]: I1208 08:59:42.719769 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:42 crc kubenswrapper[4776]: I1208 08:59:42.719796 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:42 crc kubenswrapper[4776]: I1208 08:59:42.719815 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:42Z","lastTransitionTime":"2025-12-08T08:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:42 crc kubenswrapper[4776]: I1208 08:59:42.824032 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:42 crc kubenswrapper[4776]: I1208 08:59:42.824097 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:42 crc kubenswrapper[4776]: I1208 08:59:42.824114 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:42 crc kubenswrapper[4776]: I1208 08:59:42.824140 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:42 crc kubenswrapper[4776]: I1208 08:59:42.824161 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:42Z","lastTransitionTime":"2025-12-08T08:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:42 crc kubenswrapper[4776]: I1208 08:59:42.927472 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:42 crc kubenswrapper[4776]: I1208 08:59:42.927553 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:42 crc kubenswrapper[4776]: I1208 08:59:42.927572 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:42 crc kubenswrapper[4776]: I1208 08:59:42.927609 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:42 crc kubenswrapper[4776]: I1208 08:59:42.927632 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:42Z","lastTransitionTime":"2025-12-08T08:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:43 crc kubenswrapper[4776]: I1208 08:59:43.030776 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:43 crc kubenswrapper[4776]: I1208 08:59:43.030867 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:43 crc kubenswrapper[4776]: I1208 08:59:43.030886 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:43 crc kubenswrapper[4776]: I1208 08:59:43.030917 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:43 crc kubenswrapper[4776]: I1208 08:59:43.030939 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:43Z","lastTransitionTime":"2025-12-08T08:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:43 crc kubenswrapper[4776]: I1208 08:59:43.134693 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:43 crc kubenswrapper[4776]: I1208 08:59:43.134770 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:43 crc kubenswrapper[4776]: I1208 08:59:43.134791 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:43 crc kubenswrapper[4776]: I1208 08:59:43.134821 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:43 crc kubenswrapper[4776]: I1208 08:59:43.134840 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:43Z","lastTransitionTime":"2025-12-08T08:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:43 crc kubenswrapper[4776]: I1208 08:59:43.237678 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:43 crc kubenswrapper[4776]: I1208 08:59:43.237744 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:43 crc kubenswrapper[4776]: I1208 08:59:43.237763 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:43 crc kubenswrapper[4776]: I1208 08:59:43.237791 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:43 crc kubenswrapper[4776]: I1208 08:59:43.237813 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:43Z","lastTransitionTime":"2025-12-08T08:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:43 crc kubenswrapper[4776]: I1208 08:59:43.341420 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:43 crc kubenswrapper[4776]: I1208 08:59:43.341518 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:43 crc kubenswrapper[4776]: I1208 08:59:43.341547 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:43 crc kubenswrapper[4776]: I1208 08:59:43.341582 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:43 crc kubenswrapper[4776]: I1208 08:59:43.341608 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:43Z","lastTransitionTime":"2025-12-08T08:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:43 crc kubenswrapper[4776]: I1208 08:59:43.342887 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 08:59:43 crc kubenswrapper[4776]: I1208 08:59:43.342941 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 08:59:43 crc kubenswrapper[4776]: E1208 08:59:43.343142 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 08:59:43 crc kubenswrapper[4776]: E1208 08:59:43.343263 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 08:59:43 crc kubenswrapper[4776]: I1208 08:59:43.445260 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:43 crc kubenswrapper[4776]: I1208 08:59:43.445347 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:43 crc kubenswrapper[4776]: I1208 08:59:43.445367 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:43 crc kubenswrapper[4776]: I1208 08:59:43.445397 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:43 crc kubenswrapper[4776]: I1208 08:59:43.445418 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:43Z","lastTransitionTime":"2025-12-08T08:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:43 crc kubenswrapper[4776]: I1208 08:59:43.550150 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:43 crc kubenswrapper[4776]: I1208 08:59:43.550257 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:43 crc kubenswrapper[4776]: I1208 08:59:43.550278 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:43 crc kubenswrapper[4776]: I1208 08:59:43.550308 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:43 crc kubenswrapper[4776]: I1208 08:59:43.550328 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:43Z","lastTransitionTime":"2025-12-08T08:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:43 crc kubenswrapper[4776]: I1208 08:59:43.654534 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:43 crc kubenswrapper[4776]: I1208 08:59:43.654594 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:43 crc kubenswrapper[4776]: I1208 08:59:43.654606 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:43 crc kubenswrapper[4776]: I1208 08:59:43.654626 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:43 crc kubenswrapper[4776]: I1208 08:59:43.654643 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:43Z","lastTransitionTime":"2025-12-08T08:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:43 crc kubenswrapper[4776]: I1208 08:59:43.757679 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:43 crc kubenswrapper[4776]: I1208 08:59:43.757735 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:43 crc kubenswrapper[4776]: I1208 08:59:43.757781 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:43 crc kubenswrapper[4776]: I1208 08:59:43.757810 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:43 crc kubenswrapper[4776]: I1208 08:59:43.757834 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:43Z","lastTransitionTime":"2025-12-08T08:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:43 crc kubenswrapper[4776]: I1208 08:59:43.861587 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:43 crc kubenswrapper[4776]: I1208 08:59:43.861646 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:43 crc kubenswrapper[4776]: I1208 08:59:43.861666 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:43 crc kubenswrapper[4776]: I1208 08:59:43.861695 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:43 crc kubenswrapper[4776]: I1208 08:59:43.861721 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:43Z","lastTransitionTime":"2025-12-08T08:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:43 crc kubenswrapper[4776]: I1208 08:59:43.971123 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:43 crc kubenswrapper[4776]: I1208 08:59:43.971242 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:43 crc kubenswrapper[4776]: I1208 08:59:43.971368 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:43 crc kubenswrapper[4776]: I1208 08:59:43.971447 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:43 crc kubenswrapper[4776]: I1208 08:59:43.971488 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:43Z","lastTransitionTime":"2025-12-08T08:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.074327 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.074386 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.074403 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.074428 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.074447 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:44Z","lastTransitionTime":"2025-12-08T08:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.176704 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.176792 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.176818 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.176867 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.176896 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:44Z","lastTransitionTime":"2025-12-08T08:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.280344 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.280384 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.280393 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.280408 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.280420 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:44Z","lastTransitionTime":"2025-12-08T08:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.343367 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 08:59:44 crc kubenswrapper[4776]: E1208 08:59:44.343544 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.343611 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 08:59:44 crc kubenswrapper[4776]: E1208 08:59:44.343729 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.356058 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19bd7d27-06e1-4574-8857-6adbe88633c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d32a10a86fe749d233a68a8e7583294e21c634dc47febe04e56220b591d505e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c567d34bcaecb124f79504fee8f22c148f78bb039741a7b52883ab3188edaa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://146374df9edb9e0092cf2e4cac4a5955d7d0980be93df8188f4b55ad12901572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc26419aaafbbe056695868d3b76642b62774f620306fc60a3c0a07788ca8b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://bc26419aaafbbe056695868d3b76642b62774f620306fc60a3c0a07788ca8b7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:44Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.366470 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8k6qx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5628a2b5-b886-4883-93a3-fefc471f19e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7
3be62036047cccc38e737abc2818aa510b308934c0bfbcac348aba382f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58b2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8k6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:44Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.377331 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q85ld" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d62ee56-e13f-4e44-8abf-9d0fc3e423b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db449e03630aa1b44e8a2812e502a46239824bb59283396bfe92bb818df29fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plplv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b483925a2a4fefd06621185c21e74da28d1
f0ababf703e2637a8686881671f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plplv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q85ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:44Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.385584 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.385630 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.385640 4776 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.385660 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.385674 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:44Z","lastTransitionTime":"2025-12-08T08:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.387270 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kkhjg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99143b9c-a541-4c0e-8387-0dff0d557974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vlfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vlfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kkhjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:44Z is after 2025-08-24T17:21:41Z" Dec 
08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.406215 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977ead65-d7ad-477b-8656-2b55e70b918b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7ab99c039bd5c0c0ab52768821797dd9e303afb9c6065ef021e8c2be372df9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\
\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3089eb6893beac1862850a94ef99af25f4ce9c0f020cc795611d9a620966e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86e9bc666e17ca52777d0275d4a50cbb70aad0c77e3413328fa6bd7b42d8a339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7cac944c0f3bfb4cc7d00471deb5d44d498a7131b7d950305a78bd8a5d2ea7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847
b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e7656c1155b52195abce09bca5ecd214a957ae7009293b99f16b3390c671fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\
\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:44Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.419700 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:44Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.438140 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5x9ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58507405-6bea-4859-a4e8-6ed046b50323\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06ef510d9c943d7c5d8a7a801dc1e7ca8cbfa7f6995ac8bedc835e0e8a762f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa552
85a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa55285a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5x9ft\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:44Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.459196 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e518469-5b3b-4055-a0f0-075dc48b1c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f688c89d388f271d5313cac8f933d6e43da463b2d6da546151a2bbff7350f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://712accfff1a4a51efc4d3d2a90e3e2d6c47eb04f5cd3d72b800e0778f0963b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a95beb9eadee827797268eb99d707a8782239962485a6f258adea6f3ee994c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cf78bd635e2922e1ade70a5fb24aebf3845f59d6aae829d8cba7c1c44da75ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afb59149e1f7b1462fc1ae949150d5c137905504ba2366db083abd13b4db6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e06c373c06904c837e3c6b4b8d26d2b2e34aa17db8079de76e1e33594d284b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c3d0b09700f82a78bbde1ae9d9c6ff3538958396aac791116ea0eff0f5a9cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46c3d0b09700f82a78bbde1ae9d9c6ff3538958396aac791116ea0eff0f5a9cf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T08:59:34Z\\\",\\\"message\\\":\\\"\\\\nI1208 08:59:34.244549 6400 services_controller.go:444] Built service openshift-network-diagnostics/network-check-target LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1208 08:59:34.244563 6400 services_controller.go:445] Built service openshift-network-diagnostics/network-check-target LB template configs for 
network=default: []services.lbConfig(nil)\\\\nF1208 08:59:34.244135 6400 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:34Z is after 2025-08-24T17:21:41Z]\\\\nI1208 08:59:34.244585 6400 services_controller.go:451] Built service openshift-network-diagnostics/network-check-target cluster-wide \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-swbsc_openshift-ovn-kubernetes(1e518469-5b3b-4055-a0f0-075dc48b1c79)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8d4d85ae7a43d2800c11a9bfdc7f1c827bdb6628891b9a0c29140bf869307ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101b
d0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-swbsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:44Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.471954 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d4304c3-81ab-4418-b5c2-017f74581e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d02311f25656790164bb155ddb4e4d32fb9e7919033dd47c38c9fb321c185743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb3d5082006dae792f107dc632bc00d5004be69f95f6d76150d816fb542ce9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be261308f50cfa4db35cf58e829c102496de88ed0336685562f39e46e498a460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bf08db453e1dd51d190b2bd855af295af04cb4d3a8a42e0020136c50546aef2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:44Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.487674 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.487703 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.487710 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.487725 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.487733 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:44Z","lastTransitionTime":"2025-12-08T08:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.490811 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343d6e00-54f7-4228-a8da-a43041894b26\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountP
ath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube
-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee00c7d6720b72fb9ecc27cc1ad2f762db9ac30bc3e108e37e2d4e2a96639cf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\
\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:44Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.504709 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:44Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.519034 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:44Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.529714 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fdg6t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56dfa7df-2ee8-4408-a283-5a8521175a0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5173ff239c6373649911372a6d6c1665958296afec2528b9e0a492a0722f5b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z44lf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fdg6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:44Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.544709 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-555j6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04b00e982993068dd2f274ce749a8b994561657d124122969adbb79c53658ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-555j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:44Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.557396 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed339964d431767a3616559947e832417f1052991849f141c380c6e64c3c03f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:44Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.569608 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b411334ce6d2d6fc026625d538807bc1dce879f0053babe7d97f302819a4e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e498b38932d3fd2da572e5ca5d560b7725a394f13fd9a30543244b1f3c0dca61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:44Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.580120 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da360484aaa537076345b7a6830a83ac2a89ffb47f86d6865049c4f19688546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-08T08:59:44Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.590315 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9788ab1-1031-4103-a769-a4b3177c7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c2915bc116e8d911ac3c615649564174eb7e760c7e63fe06aee942d7cc2e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860aab951ce65f5efd9e17e4d93c383ea508088d26c2f758bf8d7176d2fb96f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jkmbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:44Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.590880 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 
08:59:44.590992 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.591004 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.591019 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.591029 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:44Z","lastTransitionTime":"2025-12-08T08:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.693434 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.693475 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.693483 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.693497 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.693507 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:44Z","lastTransitionTime":"2025-12-08T08:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.796061 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.796114 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.796127 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.796145 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.796159 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:44Z","lastTransitionTime":"2025-12-08T08:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.898936 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.898989 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.899001 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.899018 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:44 crc kubenswrapper[4776]: I1208 08:59:44.899028 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:44Z","lastTransitionTime":"2025-12-08T08:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:45 crc kubenswrapper[4776]: I1208 08:59:45.002265 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:45 crc kubenswrapper[4776]: I1208 08:59:45.002297 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:45 crc kubenswrapper[4776]: I1208 08:59:45.002307 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:45 crc kubenswrapper[4776]: I1208 08:59:45.002319 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:45 crc kubenswrapper[4776]: I1208 08:59:45.002327 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:45Z","lastTransitionTime":"2025-12-08T08:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:45 crc kubenswrapper[4776]: I1208 08:59:45.106447 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:45 crc kubenswrapper[4776]: I1208 08:59:45.106518 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:45 crc kubenswrapper[4776]: I1208 08:59:45.106532 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:45 crc kubenswrapper[4776]: I1208 08:59:45.106579 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:45 crc kubenswrapper[4776]: I1208 08:59:45.106599 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:45Z","lastTransitionTime":"2025-12-08T08:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:45 crc kubenswrapper[4776]: I1208 08:59:45.209057 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:45 crc kubenswrapper[4776]: I1208 08:59:45.209096 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:45 crc kubenswrapper[4776]: I1208 08:59:45.209105 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:45 crc kubenswrapper[4776]: I1208 08:59:45.209119 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:45 crc kubenswrapper[4776]: I1208 08:59:45.209129 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:45Z","lastTransitionTime":"2025-12-08T08:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:45 crc kubenswrapper[4776]: I1208 08:59:45.311361 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:45 crc kubenswrapper[4776]: I1208 08:59:45.311395 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:45 crc kubenswrapper[4776]: I1208 08:59:45.311404 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:45 crc kubenswrapper[4776]: I1208 08:59:45.311420 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:45 crc kubenswrapper[4776]: I1208 08:59:45.311429 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:45Z","lastTransitionTime":"2025-12-08T08:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:45 crc kubenswrapper[4776]: I1208 08:59:45.342596 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 08:59:45 crc kubenswrapper[4776]: E1208 08:59:45.342710 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 08:59:45 crc kubenswrapper[4776]: I1208 08:59:45.342596 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 08:59:45 crc kubenswrapper[4776]: E1208 08:59:45.342767 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 08:59:45 crc kubenswrapper[4776]: I1208 08:59:45.413886 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:45 crc kubenswrapper[4776]: I1208 08:59:45.413924 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:45 crc kubenswrapper[4776]: I1208 08:59:45.413936 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:45 crc kubenswrapper[4776]: I1208 08:59:45.413953 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:45 crc kubenswrapper[4776]: I1208 08:59:45.413964 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:45Z","lastTransitionTime":"2025-12-08T08:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:45 crc kubenswrapper[4776]: I1208 08:59:45.516734 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:45 crc kubenswrapper[4776]: I1208 08:59:45.516788 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:45 crc kubenswrapper[4776]: I1208 08:59:45.516800 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:45 crc kubenswrapper[4776]: I1208 08:59:45.516818 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:45 crc kubenswrapper[4776]: I1208 08:59:45.516830 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:45Z","lastTransitionTime":"2025-12-08T08:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:45 crc kubenswrapper[4776]: I1208 08:59:45.619322 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:45 crc kubenswrapper[4776]: I1208 08:59:45.619421 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:45 crc kubenswrapper[4776]: I1208 08:59:45.619439 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:45 crc kubenswrapper[4776]: I1208 08:59:45.619459 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:45 crc kubenswrapper[4776]: I1208 08:59:45.619473 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:45Z","lastTransitionTime":"2025-12-08T08:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:45 crc kubenswrapper[4776]: I1208 08:59:45.722391 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:45 crc kubenswrapper[4776]: I1208 08:59:45.722438 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:45 crc kubenswrapper[4776]: I1208 08:59:45.722450 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:45 crc kubenswrapper[4776]: I1208 08:59:45.722466 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:45 crc kubenswrapper[4776]: I1208 08:59:45.722477 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:45Z","lastTransitionTime":"2025-12-08T08:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:45 crc kubenswrapper[4776]: I1208 08:59:45.825290 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:45 crc kubenswrapper[4776]: I1208 08:59:45.825332 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:45 crc kubenswrapper[4776]: I1208 08:59:45.825357 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:45 crc kubenswrapper[4776]: I1208 08:59:45.825377 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:45 crc kubenswrapper[4776]: I1208 08:59:45.825387 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:45Z","lastTransitionTime":"2025-12-08T08:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:45 crc kubenswrapper[4776]: I1208 08:59:45.927560 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:45 crc kubenswrapper[4776]: I1208 08:59:45.927596 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:45 crc kubenswrapper[4776]: I1208 08:59:45.927608 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:45 crc kubenswrapper[4776]: I1208 08:59:45.927627 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:45 crc kubenswrapper[4776]: I1208 08:59:45.927639 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:45Z","lastTransitionTime":"2025-12-08T08:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.030119 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.030159 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.030200 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.030216 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.030227 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:46Z","lastTransitionTime":"2025-12-08T08:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.132535 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.132582 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.132600 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.132621 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.132637 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:46Z","lastTransitionTime":"2025-12-08T08:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.235487 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.235530 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.235539 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.235554 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.235563 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:46Z","lastTransitionTime":"2025-12-08T08:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.338277 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.338350 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.338376 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.338406 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.338432 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:46Z","lastTransitionTime":"2025-12-08T08:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.342659 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 08:59:46 crc kubenswrapper[4776]: E1208 08:59:46.342808 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.343545 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 08:59:46 crc kubenswrapper[4776]: E1208 08:59:46.343708 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.440760 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.440835 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.440864 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.440902 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.440920 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:46Z","lastTransitionTime":"2025-12-08T08:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.544588 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.544658 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.544680 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.544710 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.544733 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:46Z","lastTransitionTime":"2025-12-08T08:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.646652 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.646713 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.646735 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.646756 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.646772 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:46Z","lastTransitionTime":"2025-12-08T08:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.693053 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.693135 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.693158 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.693234 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.693254 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:46Z","lastTransitionTime":"2025-12-08T08:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:46 crc kubenswrapper[4776]: E1208 08:59:46.709030 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ebf5967-b40e-4612-8f34-c965ce3a7e5b\\\",\\\"systemUUID\\\":\\\"c2909369-742b-49a0-ae37-af59748afd08\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:46Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.712669 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.712706 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.712717 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.712732 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.712745 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:46Z","lastTransitionTime":"2025-12-08T08:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:46 crc kubenswrapper[4776]: E1208 08:59:46.728008 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ebf5967-b40e-4612-8f34-c965ce3a7e5b\\\",\\\"systemUUID\\\":\\\"c2909369-742b-49a0-ae37-af59748afd08\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:46Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.732525 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.732587 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.732598 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.732615 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.732628 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:46Z","lastTransitionTime":"2025-12-08T08:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:46 crc kubenswrapper[4776]: E1208 08:59:46.751133 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ebf5967-b40e-4612-8f34-c965ce3a7e5b\\\",\\\"systemUUID\\\":\\\"c2909369-742b-49a0-ae37-af59748afd08\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:46Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.755206 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.755248 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.755260 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.755278 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.755291 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:46Z","lastTransitionTime":"2025-12-08T08:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:46 crc kubenswrapper[4776]: E1208 08:59:46.768913 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ebf5967-b40e-4612-8f34-c965ce3a7e5b\\\",\\\"systemUUID\\\":\\\"c2909369-742b-49a0-ae37-af59748afd08\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:46Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.771922 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.771952 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.771960 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.771972 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.771980 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:46Z","lastTransitionTime":"2025-12-08T08:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:46 crc kubenswrapper[4776]: E1208 08:59:46.785964 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ebf5967-b40e-4612-8f34-c965ce3a7e5b\\\",\\\"systemUUID\\\":\\\"c2909369-742b-49a0-ae37-af59748afd08\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:46Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:46 crc kubenswrapper[4776]: E1208 08:59:46.786164 4776 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.787733 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.787779 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.787795 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.787814 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.787825 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:46Z","lastTransitionTime":"2025-12-08T08:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.890664 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.890696 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.890708 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.890724 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.890735 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:46Z","lastTransitionTime":"2025-12-08T08:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.992644 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.992686 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.992698 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.992713 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:46 crc kubenswrapper[4776]: I1208 08:59:46.992754 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:46Z","lastTransitionTime":"2025-12-08T08:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:47 crc kubenswrapper[4776]: I1208 08:59:47.094696 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:47 crc kubenswrapper[4776]: I1208 08:59:47.094727 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:47 crc kubenswrapper[4776]: I1208 08:59:47.094736 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:47 crc kubenswrapper[4776]: I1208 08:59:47.094750 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:47 crc kubenswrapper[4776]: I1208 08:59:47.094760 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:47Z","lastTransitionTime":"2025-12-08T08:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:47 crc kubenswrapper[4776]: I1208 08:59:47.197647 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:47 crc kubenswrapper[4776]: I1208 08:59:47.197716 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:47 crc kubenswrapper[4776]: I1208 08:59:47.197737 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:47 crc kubenswrapper[4776]: I1208 08:59:47.197765 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:47 crc kubenswrapper[4776]: I1208 08:59:47.197789 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:47Z","lastTransitionTime":"2025-12-08T08:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:47 crc kubenswrapper[4776]: I1208 08:59:47.301465 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:47 crc kubenswrapper[4776]: I1208 08:59:47.301526 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:47 crc kubenswrapper[4776]: I1208 08:59:47.301552 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:47 crc kubenswrapper[4776]: I1208 08:59:47.301658 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:47 crc kubenswrapper[4776]: I1208 08:59:47.301685 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:47Z","lastTransitionTime":"2025-12-08T08:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:47 crc kubenswrapper[4776]: I1208 08:59:47.343117 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 08:59:47 crc kubenswrapper[4776]: I1208 08:59:47.343357 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 08:59:47 crc kubenswrapper[4776]: E1208 08:59:47.343474 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 08:59:47 crc kubenswrapper[4776]: E1208 08:59:47.343607 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 08:59:47 crc kubenswrapper[4776]: I1208 08:59:47.405468 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:47 crc kubenswrapper[4776]: I1208 08:59:47.405546 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:47 crc kubenswrapper[4776]: I1208 08:59:47.405604 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:47 crc kubenswrapper[4776]: I1208 08:59:47.405637 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:47 crc kubenswrapper[4776]: I1208 08:59:47.405658 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:47Z","lastTransitionTime":"2025-12-08T08:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:47 crc kubenswrapper[4776]: I1208 08:59:47.509071 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:47 crc kubenswrapper[4776]: I1208 08:59:47.509145 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:47 crc kubenswrapper[4776]: I1208 08:59:47.509223 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:47 crc kubenswrapper[4776]: I1208 08:59:47.509258 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:47 crc kubenswrapper[4776]: I1208 08:59:47.509288 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:47Z","lastTransitionTime":"2025-12-08T08:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:47 crc kubenswrapper[4776]: I1208 08:59:47.612520 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:47 crc kubenswrapper[4776]: I1208 08:59:47.612595 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:47 crc kubenswrapper[4776]: I1208 08:59:47.612620 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:47 crc kubenswrapper[4776]: I1208 08:59:47.612654 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:47 crc kubenswrapper[4776]: I1208 08:59:47.612672 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:47Z","lastTransitionTime":"2025-12-08T08:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:47 crc kubenswrapper[4776]: I1208 08:59:47.716665 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:47 crc kubenswrapper[4776]: I1208 08:59:47.716706 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:47 crc kubenswrapper[4776]: I1208 08:59:47.716715 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:47 crc kubenswrapper[4776]: I1208 08:59:47.716732 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:47 crc kubenswrapper[4776]: I1208 08:59:47.716746 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:47Z","lastTransitionTime":"2025-12-08T08:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:47 crc kubenswrapper[4776]: I1208 08:59:47.820030 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:47 crc kubenswrapper[4776]: I1208 08:59:47.820430 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:47 crc kubenswrapper[4776]: I1208 08:59:47.820526 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:47 crc kubenswrapper[4776]: I1208 08:59:47.820640 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:47 crc kubenswrapper[4776]: I1208 08:59:47.820735 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:47Z","lastTransitionTime":"2025-12-08T08:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:47 crc kubenswrapper[4776]: I1208 08:59:47.923766 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:47 crc kubenswrapper[4776]: I1208 08:59:47.923812 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:47 crc kubenswrapper[4776]: I1208 08:59:47.923824 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:47 crc kubenswrapper[4776]: I1208 08:59:47.923840 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:47 crc kubenswrapper[4776]: I1208 08:59:47.923851 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:47Z","lastTransitionTime":"2025-12-08T08:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:48 crc kubenswrapper[4776]: I1208 08:59:48.026712 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:48 crc kubenswrapper[4776]: I1208 08:59:48.026748 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:48 crc kubenswrapper[4776]: I1208 08:59:48.026757 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:48 crc kubenswrapper[4776]: I1208 08:59:48.026772 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:48 crc kubenswrapper[4776]: I1208 08:59:48.026784 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:48Z","lastTransitionTime":"2025-12-08T08:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:48 crc kubenswrapper[4776]: I1208 08:59:48.130128 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:48 crc kubenswrapper[4776]: I1208 08:59:48.130192 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:48 crc kubenswrapper[4776]: I1208 08:59:48.130202 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:48 crc kubenswrapper[4776]: I1208 08:59:48.130217 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:48 crc kubenswrapper[4776]: I1208 08:59:48.130229 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:48Z","lastTransitionTime":"2025-12-08T08:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:48 crc kubenswrapper[4776]: I1208 08:59:48.233223 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:48 crc kubenswrapper[4776]: I1208 08:59:48.233269 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:48 crc kubenswrapper[4776]: I1208 08:59:48.233282 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:48 crc kubenswrapper[4776]: I1208 08:59:48.233303 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:48 crc kubenswrapper[4776]: I1208 08:59:48.233316 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:48Z","lastTransitionTime":"2025-12-08T08:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:48 crc kubenswrapper[4776]: I1208 08:59:48.335850 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:48 crc kubenswrapper[4776]: I1208 08:59:48.335904 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:48 crc kubenswrapper[4776]: I1208 08:59:48.335913 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:48 crc kubenswrapper[4776]: I1208 08:59:48.335927 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:48 crc kubenswrapper[4776]: I1208 08:59:48.335937 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:48Z","lastTransitionTime":"2025-12-08T08:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:48 crc kubenswrapper[4776]: I1208 08:59:48.343233 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 08:59:48 crc kubenswrapper[4776]: I1208 08:59:48.343241 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 08:59:48 crc kubenswrapper[4776]: E1208 08:59:48.343480 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 08:59:48 crc kubenswrapper[4776]: E1208 08:59:48.343586 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 08:59:48 crc kubenswrapper[4776]: I1208 08:59:48.439338 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:48 crc kubenswrapper[4776]: I1208 08:59:48.439402 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:48 crc kubenswrapper[4776]: I1208 08:59:48.439412 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:48 crc kubenswrapper[4776]: I1208 08:59:48.439429 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:48 crc kubenswrapper[4776]: I1208 08:59:48.439440 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:48Z","lastTransitionTime":"2025-12-08T08:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:48 crc kubenswrapper[4776]: I1208 08:59:48.541697 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:48 crc kubenswrapper[4776]: I1208 08:59:48.541734 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:48 crc kubenswrapper[4776]: I1208 08:59:48.541748 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:48 crc kubenswrapper[4776]: I1208 08:59:48.541768 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:48 crc kubenswrapper[4776]: I1208 08:59:48.541779 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:48Z","lastTransitionTime":"2025-12-08T08:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:48 crc kubenswrapper[4776]: I1208 08:59:48.644085 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:48 crc kubenswrapper[4776]: I1208 08:59:48.644125 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:48 crc kubenswrapper[4776]: I1208 08:59:48.644138 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:48 crc kubenswrapper[4776]: I1208 08:59:48.644154 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:48 crc kubenswrapper[4776]: I1208 08:59:48.644166 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:48Z","lastTransitionTime":"2025-12-08T08:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:48 crc kubenswrapper[4776]: I1208 08:59:48.746530 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:48 crc kubenswrapper[4776]: I1208 08:59:48.746589 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:48 crc kubenswrapper[4776]: I1208 08:59:48.746602 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:48 crc kubenswrapper[4776]: I1208 08:59:48.746623 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:48 crc kubenswrapper[4776]: I1208 08:59:48.746636 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:48Z","lastTransitionTime":"2025-12-08T08:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:48 crc kubenswrapper[4776]: I1208 08:59:48.849733 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:48 crc kubenswrapper[4776]: I1208 08:59:48.849781 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:48 crc kubenswrapper[4776]: I1208 08:59:48.849793 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:48 crc kubenswrapper[4776]: I1208 08:59:48.849811 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:48 crc kubenswrapper[4776]: I1208 08:59:48.849822 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:48Z","lastTransitionTime":"2025-12-08T08:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:48 crc kubenswrapper[4776]: I1208 08:59:48.952464 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:48 crc kubenswrapper[4776]: I1208 08:59:48.952502 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:48 crc kubenswrapper[4776]: I1208 08:59:48.952512 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:48 crc kubenswrapper[4776]: I1208 08:59:48.952528 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:48 crc kubenswrapper[4776]: I1208 08:59:48.952538 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:48Z","lastTransitionTime":"2025-12-08T08:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.055499 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.055552 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.055565 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.055583 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.055594 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:49Z","lastTransitionTime":"2025-12-08T08:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.158570 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.158628 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.158641 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.158662 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.158674 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:49Z","lastTransitionTime":"2025-12-08T08:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.261729 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.261813 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.261836 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.261863 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.261883 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:49Z","lastTransitionTime":"2025-12-08T08:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.342962 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.342962 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 08:59:49 crc kubenswrapper[4776]: E1208 08:59:49.343268 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 08:59:49 crc kubenswrapper[4776]: E1208 08:59:49.343315 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.365653 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.365820 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.365850 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.365880 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.365902 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:49Z","lastTransitionTime":"2025-12-08T08:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.469123 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.469249 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.469269 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.469295 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.469314 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:49Z","lastTransitionTime":"2025-12-08T08:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.573058 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.573131 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.573158 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.573226 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.573247 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:49Z","lastTransitionTime":"2025-12-08T08:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.677196 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.677260 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.677280 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.677311 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.677334 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:49Z","lastTransitionTime":"2025-12-08T08:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.779828 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.779914 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.779933 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.779964 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.779988 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:49Z","lastTransitionTime":"2025-12-08T08:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.799476 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/99143b9c-a541-4c0e-8387-0dff0d557974-metrics-certs\") pod \"network-metrics-daemon-kkhjg\" (UID: \"99143b9c-a541-4c0e-8387-0dff0d557974\") " pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 08:59:49 crc kubenswrapper[4776]: E1208 08:59:49.799894 4776 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 08 08:59:49 crc kubenswrapper[4776]: E1208 08:59:49.800219 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99143b9c-a541-4c0e-8387-0dff0d557974-metrics-certs podName:99143b9c-a541-4c0e-8387-0dff0d557974 nodeName:}" failed. No retries permitted until 2025-12-08 09:00:21.800146056 +0000 UTC m=+98.063371118 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/99143b9c-a541-4c0e-8387-0dff0d557974-metrics-certs") pod "network-metrics-daemon-kkhjg" (UID: "99143b9c-a541-4c0e-8387-0dff0d557974") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.883504 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.883578 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.883602 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.883633 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.883656 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:49Z","lastTransitionTime":"2025-12-08T08:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.987230 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.987327 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.987348 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.987376 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:49 crc kubenswrapper[4776]: I1208 08:59:49.987397 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:49Z","lastTransitionTime":"2025-12-08T08:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:50 crc kubenswrapper[4776]: I1208 08:59:50.090783 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:50 crc kubenswrapper[4776]: I1208 08:59:50.090869 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:50 crc kubenswrapper[4776]: I1208 08:59:50.090901 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:50 crc kubenswrapper[4776]: I1208 08:59:50.090936 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:50 crc kubenswrapper[4776]: I1208 08:59:50.090959 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:50Z","lastTransitionTime":"2025-12-08T08:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:50 crc kubenswrapper[4776]: I1208 08:59:50.194282 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:50 crc kubenswrapper[4776]: I1208 08:59:50.194331 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:50 crc kubenswrapper[4776]: I1208 08:59:50.194339 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:50 crc kubenswrapper[4776]: I1208 08:59:50.194354 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:50 crc kubenswrapper[4776]: I1208 08:59:50.194366 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:50Z","lastTransitionTime":"2025-12-08T08:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:50 crc kubenswrapper[4776]: I1208 08:59:50.298096 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:50 crc kubenswrapper[4776]: I1208 08:59:50.298201 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:50 crc kubenswrapper[4776]: I1208 08:59:50.298221 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:50 crc kubenswrapper[4776]: I1208 08:59:50.298245 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:50 crc kubenswrapper[4776]: I1208 08:59:50.298262 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:50Z","lastTransitionTime":"2025-12-08T08:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:50 crc kubenswrapper[4776]: I1208 08:59:50.343221 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 08:59:50 crc kubenswrapper[4776]: I1208 08:59:50.343281 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 08:59:50 crc kubenswrapper[4776]: E1208 08:59:50.343376 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 08:59:50 crc kubenswrapper[4776]: E1208 08:59:50.343569 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 08:59:50 crc kubenswrapper[4776]: I1208 08:59:50.345383 4776 scope.go:117] "RemoveContainer" containerID="46c3d0b09700f82a78bbde1ae9d9c6ff3538958396aac791116ea0eff0f5a9cf" Dec 08 08:59:50 crc kubenswrapper[4776]: E1208 08:59:50.345934 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-swbsc_openshift-ovn-kubernetes(1e518469-5b3b-4055-a0f0-075dc48b1c79)\"" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" Dec 08 08:59:50 crc kubenswrapper[4776]: I1208 08:59:50.401287 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:50 crc kubenswrapper[4776]: I1208 08:59:50.401327 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:50 crc kubenswrapper[4776]: I1208 08:59:50.401336 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:50 crc kubenswrapper[4776]: I1208 08:59:50.401352 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:50 crc kubenswrapper[4776]: I1208 
08:59:50.401361 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:50Z","lastTransitionTime":"2025-12-08T08:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:50 crc kubenswrapper[4776]: I1208 08:59:50.504239 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:50 crc kubenswrapper[4776]: I1208 08:59:50.504270 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:50 crc kubenswrapper[4776]: I1208 08:59:50.504279 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:50 crc kubenswrapper[4776]: I1208 08:59:50.504292 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:50 crc kubenswrapper[4776]: I1208 08:59:50.504301 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:50Z","lastTransitionTime":"2025-12-08T08:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:50 crc kubenswrapper[4776]: I1208 08:59:50.607268 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:50 crc kubenswrapper[4776]: I1208 08:59:50.607339 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:50 crc kubenswrapper[4776]: I1208 08:59:50.607355 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:50 crc kubenswrapper[4776]: I1208 08:59:50.607379 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:50 crc kubenswrapper[4776]: I1208 08:59:50.607395 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:50Z","lastTransitionTime":"2025-12-08T08:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:50 crc kubenswrapper[4776]: I1208 08:59:50.709885 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:50 crc kubenswrapper[4776]: I1208 08:59:50.709929 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:50 crc kubenswrapper[4776]: I1208 08:59:50.709940 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:50 crc kubenswrapper[4776]: I1208 08:59:50.709957 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:50 crc kubenswrapper[4776]: I1208 08:59:50.709969 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:50Z","lastTransitionTime":"2025-12-08T08:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:50 crc kubenswrapper[4776]: I1208 08:59:50.811791 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:50 crc kubenswrapper[4776]: I1208 08:59:50.811818 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:50 crc kubenswrapper[4776]: I1208 08:59:50.811827 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:50 crc kubenswrapper[4776]: I1208 08:59:50.811843 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:50 crc kubenswrapper[4776]: I1208 08:59:50.811864 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:50Z","lastTransitionTime":"2025-12-08T08:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:50 crc kubenswrapper[4776]: I1208 08:59:50.913902 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:50 crc kubenswrapper[4776]: I1208 08:59:50.913941 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:50 crc kubenswrapper[4776]: I1208 08:59:50.913951 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:50 crc kubenswrapper[4776]: I1208 08:59:50.913968 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:50 crc kubenswrapper[4776]: I1208 08:59:50.913981 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:50Z","lastTransitionTime":"2025-12-08T08:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:51 crc kubenswrapper[4776]: I1208 08:59:51.016235 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:51 crc kubenswrapper[4776]: I1208 08:59:51.016268 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:51 crc kubenswrapper[4776]: I1208 08:59:51.016278 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:51 crc kubenswrapper[4776]: I1208 08:59:51.016292 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:51 crc kubenswrapper[4776]: I1208 08:59:51.016304 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:51Z","lastTransitionTime":"2025-12-08T08:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:51 crc kubenswrapper[4776]: I1208 08:59:51.118658 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:51 crc kubenswrapper[4776]: I1208 08:59:51.118915 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:51 crc kubenswrapper[4776]: I1208 08:59:51.118927 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:51 crc kubenswrapper[4776]: I1208 08:59:51.118946 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:51 crc kubenswrapper[4776]: I1208 08:59:51.118957 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:51Z","lastTransitionTime":"2025-12-08T08:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:51 crc kubenswrapper[4776]: I1208 08:59:51.222372 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:51 crc kubenswrapper[4776]: I1208 08:59:51.222407 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:51 crc kubenswrapper[4776]: I1208 08:59:51.222415 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:51 crc kubenswrapper[4776]: I1208 08:59:51.222429 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:51 crc kubenswrapper[4776]: I1208 08:59:51.222439 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:51Z","lastTransitionTime":"2025-12-08T08:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:51 crc kubenswrapper[4776]: I1208 08:59:51.324449 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:51 crc kubenswrapper[4776]: I1208 08:59:51.324499 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:51 crc kubenswrapper[4776]: I1208 08:59:51.324526 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:51 crc kubenswrapper[4776]: I1208 08:59:51.324544 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:51 crc kubenswrapper[4776]: I1208 08:59:51.324554 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:51Z","lastTransitionTime":"2025-12-08T08:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:51 crc kubenswrapper[4776]: I1208 08:59:51.342726 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 08:59:51 crc kubenswrapper[4776]: I1208 08:59:51.342726 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 08:59:51 crc kubenswrapper[4776]: E1208 08:59:51.342848 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 08:59:51 crc kubenswrapper[4776]: E1208 08:59:51.342951 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 08:59:51 crc kubenswrapper[4776]: I1208 08:59:51.427426 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:51 crc kubenswrapper[4776]: I1208 08:59:51.427462 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:51 crc kubenswrapper[4776]: I1208 08:59:51.427473 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:51 crc kubenswrapper[4776]: I1208 08:59:51.427489 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:51 crc kubenswrapper[4776]: I1208 08:59:51.427500 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:51Z","lastTransitionTime":"2025-12-08T08:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:51 crc kubenswrapper[4776]: I1208 08:59:51.530161 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:51 crc kubenswrapper[4776]: I1208 08:59:51.530254 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:51 crc kubenswrapper[4776]: I1208 08:59:51.530271 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:51 crc kubenswrapper[4776]: I1208 08:59:51.530296 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:51 crc kubenswrapper[4776]: I1208 08:59:51.530316 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:51Z","lastTransitionTime":"2025-12-08T08:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:51 crc kubenswrapper[4776]: I1208 08:59:51.632847 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:51 crc kubenswrapper[4776]: I1208 08:59:51.632880 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:51 crc kubenswrapper[4776]: I1208 08:59:51.632889 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:51 crc kubenswrapper[4776]: I1208 08:59:51.632902 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:51 crc kubenswrapper[4776]: I1208 08:59:51.632916 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:51Z","lastTransitionTime":"2025-12-08T08:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:51 crc kubenswrapper[4776]: I1208 08:59:51.735222 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:51 crc kubenswrapper[4776]: I1208 08:59:51.735273 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:51 crc kubenswrapper[4776]: I1208 08:59:51.735284 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:51 crc kubenswrapper[4776]: I1208 08:59:51.735301 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:51 crc kubenswrapper[4776]: I1208 08:59:51.735312 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:51Z","lastTransitionTime":"2025-12-08T08:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:51 crc kubenswrapper[4776]: I1208 08:59:51.838421 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:51 crc kubenswrapper[4776]: I1208 08:59:51.838495 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:51 crc kubenswrapper[4776]: I1208 08:59:51.838514 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:51 crc kubenswrapper[4776]: I1208 08:59:51.838545 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:51 crc kubenswrapper[4776]: I1208 08:59:51.838564 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:51Z","lastTransitionTime":"2025-12-08T08:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:51 crc kubenswrapper[4776]: I1208 08:59:51.942201 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:51 crc kubenswrapper[4776]: I1208 08:59:51.942251 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:51 crc kubenswrapper[4776]: I1208 08:59:51.942261 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:51 crc kubenswrapper[4776]: I1208 08:59:51.942280 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:51 crc kubenswrapper[4776]: I1208 08:59:51.942295 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:51Z","lastTransitionTime":"2025-12-08T08:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.046008 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.046059 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.046069 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.046086 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.046096 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:52Z","lastTransitionTime":"2025-12-08T08:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.149784 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.149880 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.149910 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.149942 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.149961 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:52Z","lastTransitionTime":"2025-12-08T08:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.252580 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.252628 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.252662 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.252680 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.252691 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:52Z","lastTransitionTime":"2025-12-08T08:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.343444 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.343517 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 08:59:52 crc kubenswrapper[4776]: E1208 08:59:52.343650 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 08:59:52 crc kubenswrapper[4776]: E1208 08:59:52.343867 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.355383 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.355433 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.355443 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.355457 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.355466 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:52Z","lastTransitionTime":"2025-12-08T08:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.458129 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.458209 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.458223 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.458241 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.458253 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:52Z","lastTransitionTime":"2025-12-08T08:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.560604 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.560643 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.560653 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.560668 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.560682 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:52Z","lastTransitionTime":"2025-12-08T08:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.663010 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.663082 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.663101 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.663131 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.663155 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:52Z","lastTransitionTime":"2025-12-08T08:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.766055 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.766096 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.766111 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.766125 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.766136 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:52Z","lastTransitionTime":"2025-12-08T08:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.796836 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-555j6_775b9e97-3ad5-4003-a2c2-fc8dd58b69cc/kube-multus/0.log" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.796894 4776 generic.go:334] "Generic (PLEG): container finished" podID="775b9e97-3ad5-4003-a2c2-fc8dd58b69cc" containerID="04b00e982993068dd2f274ce749a8b994561657d124122969adbb79c53658ecf" exitCode=1 Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.796925 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-555j6" event={"ID":"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc","Type":"ContainerDied","Data":"04b00e982993068dd2f274ce749a8b994561657d124122969adbb79c53658ecf"} Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.797327 4776 scope.go:117] "RemoveContainer" containerID="04b00e982993068dd2f274ce749a8b994561657d124122969adbb79c53658ecf" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.813517 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed339964d431767a3616559947e832417f1052991849f141c380c6e64c3c03f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:52Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.829970 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b411334ce6d2d6fc026625d538807bc1dce879f0053babe7d97f302819a4e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://e498b38932d3fd2da572e5ca5d560b7725a394f13fd9a30543244b1f3c0dca61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:52Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.843320 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da360484aaa537076345b7a6830a83ac2a89ffb47f86d6865049c4f19688546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-08T08:59:52Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.857114 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9788ab1-1031-4103-a769-a4b3177c7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c2915bc116e8d911ac3c615649564174eb7e760c7e63fe06aee942d7cc2e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860aab951ce65f5efd9e17e4d93c383ea508088d26c2f758bf8d7176d2fb96f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jkmbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:52Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.870115 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 
08:59:52.870238 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.870256 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.870277 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.870289 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:52Z","lastTransitionTime":"2025-12-08T08:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.873210 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19bd7d27-06e1-4574-8857-6adbe88633c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d32a10a86fe749d233a68a8e7583294e21c634dc47febe04e56220b591d505e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c567d34bcaecb124f79504fee8f22c148f78bb039741a7b52883ab3188edaa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://146374df9edb9e0092cf2e4cac4a5955d7d0980be93df8188f4b55ad12901572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc26419aaafbbe056695868d3b76642b62774f620306fc60a3c0a07788ca8b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://bc26419aaafbbe056695868d3b76642b62774f620306fc60a3c0a07788ca8b7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:52Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.885800 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8k6qx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5628a2b5-b886-4883-93a3-fefc471f19e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7
3be62036047cccc38e737abc2818aa510b308934c0bfbcac348aba382f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58b2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8k6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:52Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.899432 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q85ld" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d62ee56-e13f-4e44-8abf-9d0fc3e423b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db449e03630aa1b44e8a2812e502a46239824bb59283396bfe92bb818df29fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plplv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b483925a2a4fefd06621185c21e74da28d1
f0ababf703e2637a8686881671f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plplv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q85ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:52Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.911555 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kkhjg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99143b9c-a541-4c0e-8387-0dff0d557974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vlfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vlfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kkhjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:52Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:52 crc 
kubenswrapper[4776]: I1208 08:59:52.932976 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977ead65-d7ad-477b-8656-2b55e70b918b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7ab99c039bd5c0c0ab52768821797dd9e303afb9c6065ef021e8c2be372df9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://6f3089eb6893beac1862850a94ef99af25f4ce9c0f020cc795611d9a620966e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86e9bc666e17ca52777d0275d4a50cbb70aad0c77e3413328fa6bd7b42d8a339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7cac944c0f3bfb4cc7d00471deb5d44d498a7131b7d950305a78bd8a5d2ea7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e7656c1155b52195abce09bca5ecd214a957ae7009293b99f16b3390c671fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:52Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.947322 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:52Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.960987 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5x9ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58507405-6bea-4859-a4e8-6ed046b50323\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06ef510d9c943d7c5d8a7a801dc1e7ca8cbfa7f6995ac8bedc835e0e8a762f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa552
85a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa55285a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5x9ft\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:52Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.973243 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.973295 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.973308 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.973325 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.973336 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:52Z","lastTransitionTime":"2025-12-08T08:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.974633 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d4304c3-81ab-4418-b5c2-017f74581e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d02311f25656790164bb155ddb4e4d32fb9e7919033dd47c38c9fb321c185743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb3d5082006
dae792f107dc632bc00d5004be69f95f6d76150d816fb542ce9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be261308f50cfa4db35cf58e829c102496de88ed0336685562f39e46e498a460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bf08db453e1dd51d190b2bd855af295af04cb4d3a8a42e0020136c50546aef2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:52Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.987081 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"343d6e00-54f7-4228-a8da-a43041894b26\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:4
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee00c7d6720b72fb9ecc27cc1ad2f762db9ac30bc3e108e37e2d4e2a96639cf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:52Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:52 crc kubenswrapper[4776]: I1208 08:59:52.999797 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:52Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.012163 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:53Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.023815 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fdg6t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56dfa7df-2ee8-4408-a283-5a8521175a0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5173ff239c6373649911372a6d6c1665958296afec2528b9e0a492a0722f5b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z44lf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fdg6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:53Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.038722 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-555j6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04b00e982993068dd2f274ce749a8b994561657d124122969adbb79c53658ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04b00e982993068dd2f274ce749a8b994561657d124122969adbb79c53658ecf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T08:59:52Z\\\",\\\"message\\\":\\\"2025-12-08T08:59:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a3a44f90-75bc-4af4-8170-8121c4c73300\\\\n2025-12-08T08:59:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a3a44f90-75bc-4af4-8170-8121c4c73300 to /host/opt/cni/bin/\\\\n2025-12-08T08:59:05Z [verbose] multus-daemon started\\\\n2025-12-08T08:59:05Z [verbose] Readiness Indicator file check\\\\n2025-12-08T08:59:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-555j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:53Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.057901 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e518469-5b3b-4055-a0f0-075dc48b1c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f688c89d388f271d5313cac8f933d6e43da463b2d6da546151a2bbff7350f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://712accfff1a4a51efc4d3d2a90e3e2d6c47eb04f5cd3d72b800e0778f0963b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a95beb9eadee827797268eb99d707a8782239962485a6f258adea6f3ee994c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cf78bd635e2922e1ade70a5fb24aebf3845f59d6aae829d8cba7c1c44da75ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afb59149e1f7b1462fc1ae949150d5c137905504ba2366db083abd13b4db6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e06c373c06904c837e3c6b4b8d26d2b2e34aa17db8079de76e1e33594d284b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c3d0b09700f82a78bbde1ae9d9c6ff3538958396aac791116ea0eff0f5a9cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46c3d0b09700f82a78bbde1ae9d9c6ff3538958396aac791116ea0eff0f5a9cf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T08:59:34Z\\\",\\\"message\\\":\\\"\\\\nI1208 08:59:34.244549 6400 services_controller.go:444] Built service openshift-network-diagnostics/network-check-target LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1208 08:59:34.244563 6400 services_controller.go:445] Built service openshift-network-diagnostics/network-check-target LB template configs for 
network=default: []services.lbConfig(nil)\\\\nF1208 08:59:34.244135 6400 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:34Z is after 2025-08-24T17:21:41Z]\\\\nI1208 08:59:34.244585 6400 services_controller.go:451] Built service openshift-network-diagnostics/network-check-target cluster-wide \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-swbsc_openshift-ovn-kubernetes(1e518469-5b3b-4055-a0f0-075dc48b1c79)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8d4d85ae7a43d2800c11a9bfdc7f1c827bdb6628891b9a0c29140bf869307ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101b
d0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-swbsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:53Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.075190 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.075219 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.075229 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.075242 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.075252 4776 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:53Z","lastTransitionTime":"2025-12-08T08:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.177547 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.177611 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.177629 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.177658 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.177680 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:53Z","lastTransitionTime":"2025-12-08T08:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.281204 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.281254 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.281282 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.281301 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.281312 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:53Z","lastTransitionTime":"2025-12-08T08:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.343109 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.343286 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 08:59:53 crc kubenswrapper[4776]: E1208 08:59:53.343474 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 08:59:53 crc kubenswrapper[4776]: E1208 08:59:53.343321 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.384129 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.384205 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.384220 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.384240 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.384251 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:53Z","lastTransitionTime":"2025-12-08T08:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.487253 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.487307 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.487324 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.487347 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.487365 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:53Z","lastTransitionTime":"2025-12-08T08:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.589671 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.589723 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.589741 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.589762 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.589780 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:53Z","lastTransitionTime":"2025-12-08T08:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.692331 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.692376 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.692388 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.692405 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.692417 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:53Z","lastTransitionTime":"2025-12-08T08:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.795657 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.795700 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.795710 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.795727 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.795738 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:53Z","lastTransitionTime":"2025-12-08T08:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.801922 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-555j6_775b9e97-3ad5-4003-a2c2-fc8dd58b69cc/kube-multus/0.log" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.801991 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-555j6" event={"ID":"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc","Type":"ContainerStarted","Data":"bf6eb111dbbc6dec73baadf4e88ff08f03050658b7682b28c960ecdb80973eae"} Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.813217 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da360484aaa537076345b7a6830a83ac2a89ffb47f86d6865049c4f19688546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt
\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:53Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.822028 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9788ab1-1031-4103-a769-a4b3177c7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c2915bc116e8d911ac3c615649564174eb7e760c7e63fe06aee942d7cc2e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860aab951ce65f5efd9e17e4d93c383ea508088d
26c2f758bf8d7176d2fb96f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jkmbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:53Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.835547 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed339964d431767a3616559947e832417f1052991849f141c380c6e64c3c03f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:53Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.850679 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b411334ce6d2d6fc026625d538807bc1dce879f0053babe7d97f302819a4e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://e498b38932d3fd2da572e5ca5d560b7725a394f13fd9a30543244b1f3c0dca61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:53Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.861314 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q85ld" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d62ee56-e13f-4e44-8abf-9d0fc3e423b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db449e03630aa1b44e8a2812e502a46239824bb59283396bfe92bb818df29fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plplv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b483925a2a4fefd06621185c21e74da28d1
f0ababf703e2637a8686881671f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plplv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q85ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:53Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.870487 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kkhjg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99143b9c-a541-4c0e-8387-0dff0d557974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vlfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vlfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kkhjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:53Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:53 crc 
kubenswrapper[4776]: I1208 08:59:53.882602 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19bd7d27-06e1-4574-8857-6adbe88633c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d32a10a86fe749d233a68a8e7583294e21c634dc47febe04e56220b591d505e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c567d34bcaecb124f79504fee8f22c148f78bb039741a7b52883ab3188edaa4\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://146374df9edb9e0092cf2e4cac4a5955d7d0980be93df8188f4b55ad12901572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc26419aaafbbe056695868d3b76642b62774f620306fc60a3c0a07788ca8b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc26419aaafbbe056695868d3b76642b62774f620306fc60a3c0a07788ca8b7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:53Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.892715 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8k6qx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5628a2b5-b886-4883-93a3-fefc471f19e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df73be62036047cccc38e737abc2818aa510b308934c0bfbcac348aba382f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58b2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8k6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:53Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.897585 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.897611 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.897621 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.897636 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.897649 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:53Z","lastTransitionTime":"2025-12-08T08:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.908680 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5x9ft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58507405-6bea-4859-a4e8-6ed046b50323\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06ef510d9c943d7c5d8a7a801dc1e7ca8cbfa7f6995ac8bedc835e0e8a762f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa55285a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa55285a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5x9ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:53Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.924707 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977ead65-d7ad-477b-8656-2b55e70b918b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7ab99c039bd5c0c0ab52768821797dd9e303afb9c6065ef021e8c2be372df9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3089eb6893beac1862850a94ef99af25f4ce9c0f020cc795611d9a620966e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86e9bc666e17ca52777d0275d4a50cbb70aad0c77e3413328fa6bd7b42d8a339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7cac944c0f3bfb4cc7d00471deb5d44d498a7131b7d950305a78bd8a5d2ea7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e7656c1155b52195abce09bca5ecd214a957ae7009293b99f16b3390c671fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:53Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.935034 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:53Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.948618 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"343d6e00-54f7-4228-a8da-a43041894b26\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:4
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee00c7d6720b72fb9ecc27cc1ad2f762db9ac30bc3e108e37e2d4e2a96639cf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:53Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.960058 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:53Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.972465 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:53Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.982432 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fdg6t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56dfa7df-2ee8-4408-a283-5a8521175a0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5173ff239c6373649911372a6d6c1665958296afec2528b9e0a492a0722f5b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z44lf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fdg6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:53Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:53 crc kubenswrapper[4776]: I1208 08:59:53.996589 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-555j6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6eb111dbbc6dec73baadf4e88ff08f03050658b7682b28c960ecdb80973eae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04b00e982993068dd2f274ce749a8b994561657d124122969adbb79c53658ecf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T08:59:52Z\\\",\\\"message\\\":\\\"2025-12-08T08:59:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a3a44f90-75bc-4af4-8170-8121c4c73300\\\\n2025-12-08T08:59:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a3a44f90-75bc-4af4-8170-8121c4c73300 to /host/opt/cni/bin/\\\\n2025-12-08T08:59:05Z [verbose] multus-daemon started\\\\n2025-12-08T08:59:05Z [verbose] Readiness Indicator file check\\\\n2025-12-08T08:59:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-555j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:53Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.000522 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.000566 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.000575 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 
08:59:54.000589 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.000623 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:54Z","lastTransitionTime":"2025-12-08T08:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.022677 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e518469-5b3b-4055-a0f0-075dc48b1c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f688c89d388f271d5313cac8f933d6e43da463b2d6da546151a2bbff7350f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://712accfff1a4a51efc4d3d2a90e3e2d6c47eb04f5cd3d72b800e0778f0963b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a95beb9eadee827797268eb99d707a8782239962485a6f258adea6f3ee994c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cf78bd635e2922e1ade70a5fb24aebf3845f59d6aae829d8cba7c1c44da75ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afb59149e1f7b1462fc1ae949150d5c137905504ba2366db083abd13b4db6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e06c373c06904c837e3c6b4b8d26d2b2e34aa17db8079de76e1e33594d284b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c3d0b09700f82a78bbde1ae9d9c6ff3538958396aac791116ea0eff0f5a9cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46c3d0b09700f82a78bbde1ae9d9c6ff3538958396aac791116ea0eff0f5a9cf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T08:59:34Z\\\",\\\"message\\\":\\\"\\\\nI1208 08:59:34.244549 6400 services_controller.go:444] Built service openshift-network-diagnostics/network-check-target LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1208 08:59:34.244563 6400 services_controller.go:445] Built service openshift-network-diagnostics/network-check-target LB template configs for 
network=default: []services.lbConfig(nil)\\\\nF1208 08:59:34.244135 6400 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:34Z is after 2025-08-24T17:21:41Z]\\\\nI1208 08:59:34.244585 6400 services_controller.go:451] Built service openshift-network-diagnostics/network-check-target cluster-wide \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-swbsc_openshift-ovn-kubernetes(1e518469-5b3b-4055-a0f0-075dc48b1c79)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8d4d85ae7a43d2800c11a9bfdc7f1c827bdb6628891b9a0c29140bf869307ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101b
d0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-swbsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:54Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.039057 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d4304c3-81ab-4418-b5c2-017f74581e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d02311f25656790164bb155ddb4e4d32fb9e7919033dd47c38c9fb321c185743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb3d5082006dae792f107dc632bc00d5004be69f95f6d76150d816fb542ce9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be261308f50cfa4db35cf58e829c102496de88ed0336685562f39e46e498a460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bf08db453e1dd51d190b2bd855af295af04cb4d3a8a42e0020136c50546aef2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:54Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.103451 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.103505 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.103522 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.103547 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.103563 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:54Z","lastTransitionTime":"2025-12-08T08:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.205917 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.205970 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.205980 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.205995 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.206005 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:54Z","lastTransitionTime":"2025-12-08T08:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.308807 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.308849 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.308858 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.308876 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.308887 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:54Z","lastTransitionTime":"2025-12-08T08:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.343163 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.343274 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 08:59:54 crc kubenswrapper[4776]: E1208 08:59:54.343292 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 08:59:54 crc kubenswrapper[4776]: E1208 08:59:54.343460 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.362880 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:54Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.378691 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fdg6t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56dfa7df-2ee8-4408-a283-5a8521175a0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5173ff239c6373649911372a6d6c1665958296afec2528b9e0a492a0722f5b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z44lf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fdg6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:54Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.392908 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-555j6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6eb111dbbc6dec73baadf4e88ff08f03050658b7682b28c960ecdb80973eae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04b00e982993068dd2f274ce749a8b994561657d124122969adbb79c53658ecf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T08:59:52Z\\\",\\\"message\\\":\\\"2025-12-08T08:59:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a3a44f90-75bc-4af4-8170-8121c4c73300\\\\n2025-12-08T08:59:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a3a44f90-75bc-4af4-8170-8121c4c73300 to /host/opt/cni/bin/\\\\n2025-12-08T08:59:05Z [verbose] multus-daemon started\\\\n2025-12-08T08:59:05Z [verbose] Readiness Indicator file check\\\\n2025-12-08T08:59:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-555j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:54Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.411633 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.411672 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.411683 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 
08:59:54.411700 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.411712 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:54Z","lastTransitionTime":"2025-12-08T08:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.414598 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e518469-5b3b-4055-a0f0-075dc48b1c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f688c89d388f271d5313cac8f933d6e43da463b2d6da546151a2bbff7350f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://712accfff1a4a51efc4d3d2a90e3e2d6c47eb04f5cd3d72b800e0778f0963b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a95beb9eadee827797268eb99d707a8782239962485a6f258adea6f3ee994c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cf78bd635e2922e1ade70a5fb24aebf3845f59d6aae829d8cba7c1c44da75ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afb59149e1f7b1462fc1ae949150d5c137905504ba2366db083abd13b4db6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e06c373c06904c837e3c6b4b8d26d2b2e34aa17db8079de76e1e33594d284b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c3d0b09700f82a78bbde1ae9d9c6ff3538958396aac791116ea0eff0f5a9cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46c3d0b09700f82a78bbde1ae9d9c6ff3538958396aac791116ea0eff0f5a9cf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T08:59:34Z\\\",\\\"message\\\":\\\"\\\\nI1208 08:59:34.244549 6400 services_controller.go:444] Built service openshift-network-diagnostics/network-check-target LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1208 08:59:34.244563 6400 services_controller.go:445] Built service openshift-network-diagnostics/network-check-target LB template configs for 
network=default: []services.lbConfig(nil)\\\\nF1208 08:59:34.244135 6400 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:34Z is after 2025-08-24T17:21:41Z]\\\\nI1208 08:59:34.244585 6400 services_controller.go:451] Built service openshift-network-diagnostics/network-check-target cluster-wide \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-swbsc_openshift-ovn-kubernetes(1e518469-5b3b-4055-a0f0-075dc48b1c79)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8d4d85ae7a43d2800c11a9bfdc7f1c827bdb6628891b9a0c29140bf869307ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101b
d0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-swbsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:54Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.427932 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d4304c3-81ab-4418-b5c2-017f74581e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d02311f25656790164bb155ddb4e4d32fb9e7919033dd47c38c9fb321c185743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb3d5082006dae792f107dc632bc00d5004be69f95f6d76150d816fb542ce9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be261308f50cfa4db35cf58e829c102496de88ed0336685562f39e46e498a460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bf08db453e1dd51d190b2bd855af295af04cb4d3a8a42e0020136c50546aef2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:54Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.439874 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"343d6e00-54f7-4228-a8da-a43041894b26\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:4
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee00c7d6720b72fb9ecc27cc1ad2f762db9ac30bc3e108e37e2d4e2a96639cf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:54Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.454568 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:54Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.468775 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed339964d431767a3616559947e832417f1052991849f141c380c6e64c3c03f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:54Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.483600 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b411334ce6d2d6fc026625d538807bc1dce879f0053babe7d97f302819a4e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://e498b38932d3fd2da572e5ca5d560b7725a394f13fd9a30543244b1f3c0dca61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:54Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.500554 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da360484aaa537076345b7a6830a83ac2a89ffb47f86d6865049c4f19688546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-08T08:59:54Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.513582 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9788ab1-1031-4103-a769-a4b3177c7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c2915bc116e8d911ac3c615649564174eb7e760c7e63fe06aee942d7cc2e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860aab951ce65f5efd9e17e4d93c383ea508088d26c2f758bf8d7176d2fb96f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jkmbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:54Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.514224 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 
08:59:54.514261 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.514276 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.514298 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.514314 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:54Z","lastTransitionTime":"2025-12-08T08:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.529514 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19bd7d27-06e1-4574-8857-6adbe88633c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d32a10a86fe749d233a68a8e7583294e21c634dc47febe04e56220b591d505e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c567d34bcaecb124f79504fee8f22c148f78bb039741a7b52883ab3188edaa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://146374df9edb9e0092cf2e4cac4a5955d7d0980be93df8188f4b55ad12901572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc26419aaafbbe056695868d3b76642b62774f620306fc60a3c0a07788ca8b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://bc26419aaafbbe056695868d3b76642b62774f620306fc60a3c0a07788ca8b7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:54Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.539919 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8k6qx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5628a2b5-b886-4883-93a3-fefc471f19e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7
3be62036047cccc38e737abc2818aa510b308934c0bfbcac348aba382f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58b2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8k6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:54Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.553350 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q85ld" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d62ee56-e13f-4e44-8abf-9d0fc3e423b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db449e03630aa1b44e8a2812e502a46239824bb59283396bfe92bb818df29fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plplv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b483925a2a4fefd06621185c21e74da28d1
f0ababf703e2637a8686881671f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plplv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q85ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:54Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.564212 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kkhjg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99143b9c-a541-4c0e-8387-0dff0d557974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vlfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vlfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kkhjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:54Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:54 crc 
kubenswrapper[4776]: I1208 08:59:54.582441 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977ead65-d7ad-477b-8656-2b55e70b918b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7ab99c039bd5c0c0ab52768821797dd9e303afb9c6065ef021e8c2be372df9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://6f3089eb6893beac1862850a94ef99af25f4ce9c0f020cc795611d9a620966e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86e9bc666e17ca52777d0275d4a50cbb70aad0c77e3413328fa6bd7b42d8a339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7cac944c0f3bfb4cc7d00471deb5d44d498a7131b7d950305a78bd8a5d2ea7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e7656c1155b52195abce09bca5ecd214a957ae7009293b99f16b3390c671fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:54Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.596653 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:54Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.610428 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5x9ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58507405-6bea-4859-a4e8-6ed046b50323\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06ef510d9c943d7c5d8a7a801dc1e7ca8cbfa7f6995ac8bedc835e0e8a762f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa552
85a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa55285a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5x9ft\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:54Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.617285 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.617321 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.617332 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.617352 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.617364 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:54Z","lastTransitionTime":"2025-12-08T08:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.719552 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.719598 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.719609 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.719625 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.719637 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:54Z","lastTransitionTime":"2025-12-08T08:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.822412 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.822450 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.822461 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.822475 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.822485 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:54Z","lastTransitionTime":"2025-12-08T08:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.952580 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.952616 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.952656 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.952673 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:54 crc kubenswrapper[4776]: I1208 08:59:54.952686 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:54Z","lastTransitionTime":"2025-12-08T08:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:55 crc kubenswrapper[4776]: I1208 08:59:55.055243 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:55 crc kubenswrapper[4776]: I1208 08:59:55.055286 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:55 crc kubenswrapper[4776]: I1208 08:59:55.055294 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:55 crc kubenswrapper[4776]: I1208 08:59:55.055309 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:55 crc kubenswrapper[4776]: I1208 08:59:55.055323 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:55Z","lastTransitionTime":"2025-12-08T08:59:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:55 crc kubenswrapper[4776]: I1208 08:59:55.157481 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:55 crc kubenswrapper[4776]: I1208 08:59:55.157518 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:55 crc kubenswrapper[4776]: I1208 08:59:55.157526 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:55 crc kubenswrapper[4776]: I1208 08:59:55.157541 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:55 crc kubenswrapper[4776]: I1208 08:59:55.157550 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:55Z","lastTransitionTime":"2025-12-08T08:59:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:55 crc kubenswrapper[4776]: I1208 08:59:55.260383 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:55 crc kubenswrapper[4776]: I1208 08:59:55.260418 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:55 crc kubenswrapper[4776]: I1208 08:59:55.260429 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:55 crc kubenswrapper[4776]: I1208 08:59:55.260446 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:55 crc kubenswrapper[4776]: I1208 08:59:55.260457 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:55Z","lastTransitionTime":"2025-12-08T08:59:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:55 crc kubenswrapper[4776]: I1208 08:59:55.342936 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 08:59:55 crc kubenswrapper[4776]: E1208 08:59:55.343154 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 08:59:55 crc kubenswrapper[4776]: I1208 08:59:55.342948 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 08:59:55 crc kubenswrapper[4776]: E1208 08:59:55.343338 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 08:59:55 crc kubenswrapper[4776]: I1208 08:59:55.362787 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:55 crc kubenswrapper[4776]: I1208 08:59:55.362835 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:55 crc kubenswrapper[4776]: I1208 08:59:55.362847 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:55 crc kubenswrapper[4776]: I1208 08:59:55.362868 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:55 crc kubenswrapper[4776]: I1208 08:59:55.362882 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:55Z","lastTransitionTime":"2025-12-08T08:59:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:55 crc kubenswrapper[4776]: I1208 08:59:55.465329 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:55 crc kubenswrapper[4776]: I1208 08:59:55.465408 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:55 crc kubenswrapper[4776]: I1208 08:59:55.465429 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:55 crc kubenswrapper[4776]: I1208 08:59:55.465454 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:55 crc kubenswrapper[4776]: I1208 08:59:55.465473 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:55Z","lastTransitionTime":"2025-12-08T08:59:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:55 crc kubenswrapper[4776]: I1208 08:59:55.567578 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:55 crc kubenswrapper[4776]: I1208 08:59:55.567623 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:55 crc kubenswrapper[4776]: I1208 08:59:55.567636 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:55 crc kubenswrapper[4776]: I1208 08:59:55.567655 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:55 crc kubenswrapper[4776]: I1208 08:59:55.567665 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:55Z","lastTransitionTime":"2025-12-08T08:59:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:55 crc kubenswrapper[4776]: I1208 08:59:55.670645 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:55 crc kubenswrapper[4776]: I1208 08:59:55.670685 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:55 crc kubenswrapper[4776]: I1208 08:59:55.670696 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:55 crc kubenswrapper[4776]: I1208 08:59:55.670713 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:55 crc kubenswrapper[4776]: I1208 08:59:55.670727 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:55Z","lastTransitionTime":"2025-12-08T08:59:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:55 crc kubenswrapper[4776]: I1208 08:59:55.773772 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:55 crc kubenswrapper[4776]: I1208 08:59:55.773815 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:55 crc kubenswrapper[4776]: I1208 08:59:55.773831 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:55 crc kubenswrapper[4776]: I1208 08:59:55.773848 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:55 crc kubenswrapper[4776]: I1208 08:59:55.773861 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:55Z","lastTransitionTime":"2025-12-08T08:59:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:55 crc kubenswrapper[4776]: I1208 08:59:55.876853 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:55 crc kubenswrapper[4776]: I1208 08:59:55.876956 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:55 crc kubenswrapper[4776]: I1208 08:59:55.876975 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:55 crc kubenswrapper[4776]: I1208 08:59:55.877029 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:55 crc kubenswrapper[4776]: I1208 08:59:55.877052 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:55Z","lastTransitionTime":"2025-12-08T08:59:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:55 crc kubenswrapper[4776]: I1208 08:59:55.979876 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:55 crc kubenswrapper[4776]: I1208 08:59:55.979904 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:55 crc kubenswrapper[4776]: I1208 08:59:55.979911 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:55 crc kubenswrapper[4776]: I1208 08:59:55.979924 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:55 crc kubenswrapper[4776]: I1208 08:59:55.979933 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:55Z","lastTransitionTime":"2025-12-08T08:59:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.083037 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.083108 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.083128 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.083161 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.083211 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:56Z","lastTransitionTime":"2025-12-08T08:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.191750 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.191799 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.191809 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.191826 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.191837 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:56Z","lastTransitionTime":"2025-12-08T08:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.294387 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.294448 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.294460 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.294483 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.294495 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:56Z","lastTransitionTime":"2025-12-08T08:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.342722 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.342827 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 08:59:56 crc kubenswrapper[4776]: E1208 08:59:56.342891 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 08:59:56 crc kubenswrapper[4776]: E1208 08:59:56.343170 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.397744 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.397795 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.397805 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.397822 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.397834 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:56Z","lastTransitionTime":"2025-12-08T08:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.501115 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.501156 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.501168 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.501205 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.501217 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:56Z","lastTransitionTime":"2025-12-08T08:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.603167 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.603215 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.603223 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.603235 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.603244 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:56Z","lastTransitionTime":"2025-12-08T08:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.705518 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.705578 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.705592 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.705608 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.705619 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:56Z","lastTransitionTime":"2025-12-08T08:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.807789 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.807836 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.807847 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.807861 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.807869 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:56Z","lastTransitionTime":"2025-12-08T08:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.851636 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.851695 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.851707 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.851727 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.851739 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:56Z","lastTransitionTime":"2025-12-08T08:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:56 crc kubenswrapper[4776]: E1208 08:59:56.864279 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ebf5967-b40e-4612-8f34-c965ce3a7e5b\\\",\\\"systemUUID\\\":\\\"c2909369-742b-49a0-ae37-af59748afd08\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:56Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.867941 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.867979 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.867989 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.868004 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.868013 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:56Z","lastTransitionTime":"2025-12-08T08:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:56 crc kubenswrapper[4776]: E1208 08:59:56.881016 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ebf5967-b40e-4612-8f34-c965ce3a7e5b\\\",\\\"systemUUID\\\":\\\"c2909369-742b-49a0-ae37-af59748afd08\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:56Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.885812 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.886106 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.886133 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.886163 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.886211 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:56Z","lastTransitionTime":"2025-12-08T08:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:56 crc kubenswrapper[4776]: E1208 08:59:56.906106 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ebf5967-b40e-4612-8f34-c965ce3a7e5b\\\",\\\"systemUUID\\\":\\\"c2909369-742b-49a0-ae37-af59748afd08\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:56Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.911121 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.911212 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.911226 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.911244 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.911255 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:56Z","lastTransitionTime":"2025-12-08T08:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:56 crc kubenswrapper[4776]: E1208 08:59:56.927681 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ebf5967-b40e-4612-8f34-c965ce3a7e5b\\\",\\\"systemUUID\\\":\\\"c2909369-742b-49a0-ae37-af59748afd08\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:56Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.934821 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.934867 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.934882 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.934905 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.934921 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:56Z","lastTransitionTime":"2025-12-08T08:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:56 crc kubenswrapper[4776]: E1208 08:59:56.951132 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T08:59:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ebf5967-b40e-4612-8f34-c965ce3a7e5b\\\",\\\"systemUUID\\\":\\\"c2909369-742b-49a0-ae37-af59748afd08\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:56Z is after 2025-08-24T17:21:41Z" Dec 08 08:59:56 crc kubenswrapper[4776]: E1208 08:59:56.951310 4776 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.953818 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.953879 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.953895 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.953920 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:56 crc kubenswrapper[4776]: I1208 08:59:56.953963 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:56Z","lastTransitionTime":"2025-12-08T08:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:57 crc kubenswrapper[4776]: I1208 08:59:57.056543 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:57 crc kubenswrapper[4776]: I1208 08:59:57.056574 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:57 crc kubenswrapper[4776]: I1208 08:59:57.056585 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:57 crc kubenswrapper[4776]: I1208 08:59:57.056723 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:57 crc kubenswrapper[4776]: I1208 08:59:57.056750 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:57Z","lastTransitionTime":"2025-12-08T08:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:57 crc kubenswrapper[4776]: I1208 08:59:57.159131 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:57 crc kubenswrapper[4776]: I1208 08:59:57.159220 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:57 crc kubenswrapper[4776]: I1208 08:59:57.159234 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:57 crc kubenswrapper[4776]: I1208 08:59:57.159250 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:57 crc kubenswrapper[4776]: I1208 08:59:57.159285 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:57Z","lastTransitionTime":"2025-12-08T08:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:57 crc kubenswrapper[4776]: I1208 08:59:57.262486 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:57 crc kubenswrapper[4776]: I1208 08:59:57.262531 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:57 crc kubenswrapper[4776]: I1208 08:59:57.262541 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:57 crc kubenswrapper[4776]: I1208 08:59:57.262556 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:57 crc kubenswrapper[4776]: I1208 08:59:57.262568 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:57Z","lastTransitionTime":"2025-12-08T08:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:57 crc kubenswrapper[4776]: I1208 08:59:57.343602 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 08:59:57 crc kubenswrapper[4776]: E1208 08:59:57.343764 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 08:59:57 crc kubenswrapper[4776]: I1208 08:59:57.343603 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 08:59:57 crc kubenswrapper[4776]: E1208 08:59:57.343896 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 08:59:57 crc kubenswrapper[4776]: I1208 08:59:57.365665 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:57 crc kubenswrapper[4776]: I1208 08:59:57.365728 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:57 crc kubenswrapper[4776]: I1208 08:59:57.365751 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:57 crc kubenswrapper[4776]: I1208 08:59:57.365781 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:57 crc kubenswrapper[4776]: I1208 08:59:57.365806 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:57Z","lastTransitionTime":"2025-12-08T08:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:57 crc kubenswrapper[4776]: I1208 08:59:57.469026 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:57 crc kubenswrapper[4776]: I1208 08:59:57.469118 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:57 crc kubenswrapper[4776]: I1208 08:59:57.469142 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:57 crc kubenswrapper[4776]: I1208 08:59:57.469209 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:57 crc kubenswrapper[4776]: I1208 08:59:57.469233 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:57Z","lastTransitionTime":"2025-12-08T08:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:57 crc kubenswrapper[4776]: I1208 08:59:57.574594 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:57 crc kubenswrapper[4776]: I1208 08:59:57.574631 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:57 crc kubenswrapper[4776]: I1208 08:59:57.574641 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:57 crc kubenswrapper[4776]: I1208 08:59:57.574657 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:57 crc kubenswrapper[4776]: I1208 08:59:57.574669 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:57Z","lastTransitionTime":"2025-12-08T08:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:57 crc kubenswrapper[4776]: I1208 08:59:57.678233 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:57 crc kubenswrapper[4776]: I1208 08:59:57.678300 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:57 crc kubenswrapper[4776]: I1208 08:59:57.678324 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:57 crc kubenswrapper[4776]: I1208 08:59:57.678352 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:57 crc kubenswrapper[4776]: I1208 08:59:57.678373 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:57Z","lastTransitionTime":"2025-12-08T08:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:57 crc kubenswrapper[4776]: I1208 08:59:57.781790 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:57 crc kubenswrapper[4776]: I1208 08:59:57.781860 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:57 crc kubenswrapper[4776]: I1208 08:59:57.781882 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:57 crc kubenswrapper[4776]: I1208 08:59:57.781912 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:57 crc kubenswrapper[4776]: I1208 08:59:57.781930 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:57Z","lastTransitionTime":"2025-12-08T08:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:57 crc kubenswrapper[4776]: I1208 08:59:57.886207 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:57 crc kubenswrapper[4776]: I1208 08:59:57.886262 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:57 crc kubenswrapper[4776]: I1208 08:59:57.886281 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:57 crc kubenswrapper[4776]: I1208 08:59:57.886307 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:57 crc kubenswrapper[4776]: I1208 08:59:57.886326 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:57Z","lastTransitionTime":"2025-12-08T08:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:57 crc kubenswrapper[4776]: I1208 08:59:57.990191 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:57 crc kubenswrapper[4776]: I1208 08:59:57.990264 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:57 crc kubenswrapper[4776]: I1208 08:59:57.990278 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:57 crc kubenswrapper[4776]: I1208 08:59:57.990312 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:57 crc kubenswrapper[4776]: I1208 08:59:57.990328 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:57Z","lastTransitionTime":"2025-12-08T08:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:58 crc kubenswrapper[4776]: I1208 08:59:58.092789 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:58 crc kubenswrapper[4776]: I1208 08:59:58.092818 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:58 crc kubenswrapper[4776]: I1208 08:59:58.092827 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:58 crc kubenswrapper[4776]: I1208 08:59:58.092840 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:58 crc kubenswrapper[4776]: I1208 08:59:58.092848 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:58Z","lastTransitionTime":"2025-12-08T08:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:58 crc kubenswrapper[4776]: I1208 08:59:58.195664 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:58 crc kubenswrapper[4776]: I1208 08:59:58.195702 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:58 crc kubenswrapper[4776]: I1208 08:59:58.195710 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:58 crc kubenswrapper[4776]: I1208 08:59:58.195755 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:58 crc kubenswrapper[4776]: I1208 08:59:58.195769 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:58Z","lastTransitionTime":"2025-12-08T08:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:58 crc kubenswrapper[4776]: I1208 08:59:58.298751 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:58 crc kubenswrapper[4776]: I1208 08:59:58.298782 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:58 crc kubenswrapper[4776]: I1208 08:59:58.298790 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:58 crc kubenswrapper[4776]: I1208 08:59:58.298802 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:58 crc kubenswrapper[4776]: I1208 08:59:58.298811 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:58Z","lastTransitionTime":"2025-12-08T08:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:58 crc kubenswrapper[4776]: I1208 08:59:58.343447 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 08:59:58 crc kubenswrapper[4776]: E1208 08:59:58.343660 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 08:59:58 crc kubenswrapper[4776]: I1208 08:59:58.344576 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 08:59:58 crc kubenswrapper[4776]: E1208 08:59:58.344818 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 08:59:58 crc kubenswrapper[4776]: I1208 08:59:58.357103 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 08 08:59:58 crc kubenswrapper[4776]: I1208 08:59:58.401098 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:58 crc kubenswrapper[4776]: I1208 08:59:58.401145 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:58 crc kubenswrapper[4776]: I1208 08:59:58.401153 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:58 crc kubenswrapper[4776]: I1208 08:59:58.401167 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:58 crc kubenswrapper[4776]: I1208 08:59:58.401195 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:58Z","lastTransitionTime":"2025-12-08T08:59:58Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:58 crc kubenswrapper[4776]: I1208 08:59:58.504382 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:58 crc kubenswrapper[4776]: I1208 08:59:58.504439 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:58 crc kubenswrapper[4776]: I1208 08:59:58.504460 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:58 crc kubenswrapper[4776]: I1208 08:59:58.504488 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:58 crc kubenswrapper[4776]: I1208 08:59:58.504509 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:58Z","lastTransitionTime":"2025-12-08T08:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:58 crc kubenswrapper[4776]: I1208 08:59:58.608132 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:58 crc kubenswrapper[4776]: I1208 08:59:58.608248 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:58 crc kubenswrapper[4776]: I1208 08:59:58.608274 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:58 crc kubenswrapper[4776]: I1208 08:59:58.608308 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:58 crc kubenswrapper[4776]: I1208 08:59:58.608328 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:58Z","lastTransitionTime":"2025-12-08T08:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:58 crc kubenswrapper[4776]: I1208 08:59:58.731597 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:58 crc kubenswrapper[4776]: I1208 08:59:58.731656 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:58 crc kubenswrapper[4776]: I1208 08:59:58.731672 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:58 crc kubenswrapper[4776]: I1208 08:59:58.731697 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:58 crc kubenswrapper[4776]: I1208 08:59:58.731714 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:58Z","lastTransitionTime":"2025-12-08T08:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:58 crc kubenswrapper[4776]: I1208 08:59:58.833609 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:58 crc kubenswrapper[4776]: I1208 08:59:58.833638 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:58 crc kubenswrapper[4776]: I1208 08:59:58.833646 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:58 crc kubenswrapper[4776]: I1208 08:59:58.833658 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:58 crc kubenswrapper[4776]: I1208 08:59:58.833668 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:58Z","lastTransitionTime":"2025-12-08T08:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:58 crc kubenswrapper[4776]: I1208 08:59:58.936629 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:58 crc kubenswrapper[4776]: I1208 08:59:58.936700 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:58 crc kubenswrapper[4776]: I1208 08:59:58.936724 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:58 crc kubenswrapper[4776]: I1208 08:59:58.936751 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:58 crc kubenswrapper[4776]: I1208 08:59:58.936808 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:58Z","lastTransitionTime":"2025-12-08T08:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:59 crc kubenswrapper[4776]: I1208 08:59:59.040526 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:59 crc kubenswrapper[4776]: I1208 08:59:59.040604 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:59 crc kubenswrapper[4776]: I1208 08:59:59.040627 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:59 crc kubenswrapper[4776]: I1208 08:59:59.040665 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:59 crc kubenswrapper[4776]: I1208 08:59:59.040696 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:59Z","lastTransitionTime":"2025-12-08T08:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:59 crc kubenswrapper[4776]: I1208 08:59:59.148266 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:59 crc kubenswrapper[4776]: I1208 08:59:59.148335 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:59 crc kubenswrapper[4776]: I1208 08:59:59.148352 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:59 crc kubenswrapper[4776]: I1208 08:59:59.148381 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:59 crc kubenswrapper[4776]: I1208 08:59:59.148405 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:59Z","lastTransitionTime":"2025-12-08T08:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:59 crc kubenswrapper[4776]: I1208 08:59:59.252573 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:59 crc kubenswrapper[4776]: I1208 08:59:59.253102 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:59 crc kubenswrapper[4776]: I1208 08:59:59.253234 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:59 crc kubenswrapper[4776]: I1208 08:59:59.253341 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:59 crc kubenswrapper[4776]: I1208 08:59:59.253438 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:59Z","lastTransitionTime":"2025-12-08T08:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 08:59:59 crc kubenswrapper[4776]: I1208 08:59:59.343508 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 08:59:59 crc kubenswrapper[4776]: I1208 08:59:59.343542 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 08:59:59 crc kubenswrapper[4776]: E1208 08:59:59.343755 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 08:59:59 crc kubenswrapper[4776]: E1208 08:59:59.343854 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 08:59:59 crc kubenswrapper[4776]: I1208 08:59:59.357451 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:59 crc kubenswrapper[4776]: I1208 08:59:59.357521 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:59 crc kubenswrapper[4776]: I1208 08:59:59.357538 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:59 crc kubenswrapper[4776]: I1208 08:59:59.357567 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:59 crc kubenswrapper[4776]: I1208 08:59:59.357587 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:59Z","lastTransitionTime":"2025-12-08T08:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:59 crc kubenswrapper[4776]: I1208 08:59:59.460948 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:59 crc kubenswrapper[4776]: I1208 08:59:59.461330 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:59 crc kubenswrapper[4776]: I1208 08:59:59.461455 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:59 crc kubenswrapper[4776]: I1208 08:59:59.461537 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:59 crc kubenswrapper[4776]: I1208 08:59:59.461611 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:59Z","lastTransitionTime":"2025-12-08T08:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:59 crc kubenswrapper[4776]: I1208 08:59:59.565132 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:59 crc kubenswrapper[4776]: I1208 08:59:59.565465 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:59 crc kubenswrapper[4776]: I1208 08:59:59.565576 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:59 crc kubenswrapper[4776]: I1208 08:59:59.565693 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:59 crc kubenswrapper[4776]: I1208 08:59:59.565786 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:59Z","lastTransitionTime":"2025-12-08T08:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:59 crc kubenswrapper[4776]: I1208 08:59:59.668972 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:59 crc kubenswrapper[4776]: I1208 08:59:59.669301 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:59 crc kubenswrapper[4776]: I1208 08:59:59.669407 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:59 crc kubenswrapper[4776]: I1208 08:59:59.669513 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:59 crc kubenswrapper[4776]: I1208 08:59:59.669656 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:59Z","lastTransitionTime":"2025-12-08T08:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:59 crc kubenswrapper[4776]: I1208 08:59:59.773378 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:59 crc kubenswrapper[4776]: I1208 08:59:59.773713 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:59 crc kubenswrapper[4776]: I1208 08:59:59.773887 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:59 crc kubenswrapper[4776]: I1208 08:59:59.774155 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:59 crc kubenswrapper[4776]: I1208 08:59:59.774472 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:59Z","lastTransitionTime":"2025-12-08T08:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:59 crc kubenswrapper[4776]: I1208 08:59:59.878111 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:59 crc kubenswrapper[4776]: I1208 08:59:59.878204 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:59 crc kubenswrapper[4776]: I1208 08:59:59.878222 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:59 crc kubenswrapper[4776]: I1208 08:59:59.878248 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:59 crc kubenswrapper[4776]: I1208 08:59:59.878269 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:59Z","lastTransitionTime":"2025-12-08T08:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 08:59:59 crc kubenswrapper[4776]: I1208 08:59:59.981675 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 08:59:59 crc kubenswrapper[4776]: I1208 08:59:59.982366 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 08:59:59 crc kubenswrapper[4776]: I1208 08:59:59.982424 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 08:59:59 crc kubenswrapper[4776]: I1208 08:59:59.982489 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 08:59:59 crc kubenswrapper[4776]: I1208 08:59:59.982514 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T08:59:59Z","lastTransitionTime":"2025-12-08T08:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:00 crc kubenswrapper[4776]: I1208 09:00:00.085513 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:00 crc kubenswrapper[4776]: I1208 09:00:00.085568 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:00 crc kubenswrapper[4776]: I1208 09:00:00.085585 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:00 crc kubenswrapper[4776]: I1208 09:00:00.085608 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:00 crc kubenswrapper[4776]: I1208 09:00:00.085627 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:00Z","lastTransitionTime":"2025-12-08T09:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:00 crc kubenswrapper[4776]: I1208 09:00:00.188485 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:00 crc kubenswrapper[4776]: I1208 09:00:00.188529 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:00 crc kubenswrapper[4776]: I1208 09:00:00.188541 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:00 crc kubenswrapper[4776]: I1208 09:00:00.188557 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:00 crc kubenswrapper[4776]: I1208 09:00:00.188569 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:00Z","lastTransitionTime":"2025-12-08T09:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:00 crc kubenswrapper[4776]: I1208 09:00:00.291299 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:00 crc kubenswrapper[4776]: I1208 09:00:00.291536 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:00 crc kubenswrapper[4776]: I1208 09:00:00.291646 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:00 crc kubenswrapper[4776]: I1208 09:00:00.291747 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:00 crc kubenswrapper[4776]: I1208 09:00:00.291845 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:00Z","lastTransitionTime":"2025-12-08T09:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 09:00:00 crc kubenswrapper[4776]: I1208 09:00:00.342709 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 09:00:00 crc kubenswrapper[4776]: I1208 09:00:00.342808 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 09:00:00 crc kubenswrapper[4776]: E1208 09:00:00.343120 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 09:00:00 crc kubenswrapper[4776]: E1208 09:00:00.343024 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 09:00:00 crc kubenswrapper[4776]: I1208 09:00:00.394921 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:00 crc kubenswrapper[4776]: I1208 09:00:00.394979 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:00 crc kubenswrapper[4776]: I1208 09:00:00.394993 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:00 crc kubenswrapper[4776]: I1208 09:00:00.395016 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:00 crc kubenswrapper[4776]: I1208 09:00:00.395033 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:00Z","lastTransitionTime":"2025-12-08T09:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:00 crc kubenswrapper[4776]: I1208 09:00:00.497771 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:00 crc kubenswrapper[4776]: I1208 09:00:00.497806 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:00 crc kubenswrapper[4776]: I1208 09:00:00.497815 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:00 crc kubenswrapper[4776]: I1208 09:00:00.497828 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:00 crc kubenswrapper[4776]: I1208 09:00:00.497837 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:00Z","lastTransitionTime":"2025-12-08T09:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:00 crc kubenswrapper[4776]: I1208 09:00:00.600954 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:00 crc kubenswrapper[4776]: I1208 09:00:00.601019 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:00 crc kubenswrapper[4776]: I1208 09:00:00.601040 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:00 crc kubenswrapper[4776]: I1208 09:00:00.601071 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:00 crc kubenswrapper[4776]: I1208 09:00:00.601098 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:00Z","lastTransitionTime":"2025-12-08T09:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:00 crc kubenswrapper[4776]: I1208 09:00:00.703499 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:00 crc kubenswrapper[4776]: I1208 09:00:00.703622 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:00 crc kubenswrapper[4776]: I1208 09:00:00.703638 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:00 crc kubenswrapper[4776]: I1208 09:00:00.703660 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:00 crc kubenswrapper[4776]: I1208 09:00:00.703673 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:00Z","lastTransitionTime":"2025-12-08T09:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:00 crc kubenswrapper[4776]: I1208 09:00:00.805635 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:00 crc kubenswrapper[4776]: I1208 09:00:00.805685 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:00 crc kubenswrapper[4776]: I1208 09:00:00.805700 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:00 crc kubenswrapper[4776]: I1208 09:00:00.805719 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:00 crc kubenswrapper[4776]: I1208 09:00:00.805735 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:00Z","lastTransitionTime":"2025-12-08T09:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:00 crc kubenswrapper[4776]: I1208 09:00:00.908110 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:00 crc kubenswrapper[4776]: I1208 09:00:00.908223 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:00 crc kubenswrapper[4776]: I1208 09:00:00.908247 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:00 crc kubenswrapper[4776]: I1208 09:00:00.908279 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:00 crc kubenswrapper[4776]: I1208 09:00:00.908301 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:00Z","lastTransitionTime":"2025-12-08T09:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:01 crc kubenswrapper[4776]: I1208 09:00:01.011207 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:01 crc kubenswrapper[4776]: I1208 09:00:01.011245 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:01 crc kubenswrapper[4776]: I1208 09:00:01.011259 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:01 crc kubenswrapper[4776]: I1208 09:00:01.011280 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:01 crc kubenswrapper[4776]: I1208 09:00:01.011295 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:01Z","lastTransitionTime":"2025-12-08T09:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:01 crc kubenswrapper[4776]: I1208 09:00:01.114277 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:01 crc kubenswrapper[4776]: I1208 09:00:01.114387 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:01 crc kubenswrapper[4776]: I1208 09:00:01.114409 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:01 crc kubenswrapper[4776]: I1208 09:00:01.114437 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:01 crc kubenswrapper[4776]: I1208 09:00:01.114458 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:01Z","lastTransitionTime":"2025-12-08T09:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:01 crc kubenswrapper[4776]: I1208 09:00:01.216934 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:01 crc kubenswrapper[4776]: I1208 09:00:01.216969 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:01 crc kubenswrapper[4776]: I1208 09:00:01.216978 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:01 crc kubenswrapper[4776]: I1208 09:00:01.216992 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:01 crc kubenswrapper[4776]: I1208 09:00:01.217002 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:01Z","lastTransitionTime":"2025-12-08T09:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:01 crc kubenswrapper[4776]: I1208 09:00:01.320620 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:01 crc kubenswrapper[4776]: I1208 09:00:01.320681 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:01 crc kubenswrapper[4776]: I1208 09:00:01.320697 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:01 crc kubenswrapper[4776]: I1208 09:00:01.320722 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:01 crc kubenswrapper[4776]: I1208 09:00:01.320740 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:01Z","lastTransitionTime":"2025-12-08T09:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 09:00:01 crc kubenswrapper[4776]: I1208 09:00:01.343240 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 09:00:01 crc kubenswrapper[4776]: E1208 09:00:01.343408 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 09:00:01 crc kubenswrapper[4776]: I1208 09:00:01.343234 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 09:00:01 crc kubenswrapper[4776]: E1208 09:00:01.343624 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 09:00:01 crc kubenswrapper[4776]: I1208 09:00:01.423746 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:01 crc kubenswrapper[4776]: I1208 09:00:01.424099 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:01 crc kubenswrapper[4776]: I1208 09:00:01.424233 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:01 crc kubenswrapper[4776]: I1208 09:00:01.424345 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:01 crc kubenswrapper[4776]: I1208 09:00:01.424442 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:01Z","lastTransitionTime":"2025-12-08T09:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:01 crc kubenswrapper[4776]: I1208 09:00:01.527284 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:01 crc kubenswrapper[4776]: I1208 09:00:01.527332 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:01 crc kubenswrapper[4776]: I1208 09:00:01.527342 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:01 crc kubenswrapper[4776]: I1208 09:00:01.527358 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:01 crc kubenswrapper[4776]: I1208 09:00:01.527368 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:01Z","lastTransitionTime":"2025-12-08T09:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:01 crc kubenswrapper[4776]: I1208 09:00:01.629755 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:01 crc kubenswrapper[4776]: I1208 09:00:01.629795 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:01 crc kubenswrapper[4776]: I1208 09:00:01.629805 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:01 crc kubenswrapper[4776]: I1208 09:00:01.629822 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:01 crc kubenswrapper[4776]: I1208 09:00:01.629833 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:01Z","lastTransitionTime":"2025-12-08T09:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:01 crc kubenswrapper[4776]: I1208 09:00:01.732577 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:01 crc kubenswrapper[4776]: I1208 09:00:01.732630 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:01 crc kubenswrapper[4776]: I1208 09:00:01.732646 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:01 crc kubenswrapper[4776]: I1208 09:00:01.732667 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:01 crc kubenswrapper[4776]: I1208 09:00:01.732681 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:01Z","lastTransitionTime":"2025-12-08T09:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:01 crc kubenswrapper[4776]: I1208 09:00:01.835499 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:01 crc kubenswrapper[4776]: I1208 09:00:01.835561 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:01 crc kubenswrapper[4776]: I1208 09:00:01.835577 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:01 crc kubenswrapper[4776]: I1208 09:00:01.835602 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:01 crc kubenswrapper[4776]: I1208 09:00:01.835620 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:01Z","lastTransitionTime":"2025-12-08T09:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:01 crc kubenswrapper[4776]: I1208 09:00:01.937564 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:01 crc kubenswrapper[4776]: I1208 09:00:01.937596 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:01 crc kubenswrapper[4776]: I1208 09:00:01.937604 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:01 crc kubenswrapper[4776]: I1208 09:00:01.937616 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:01 crc kubenswrapper[4776]: I1208 09:00:01.937642 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:01Z","lastTransitionTime":"2025-12-08T09:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:02 crc kubenswrapper[4776]: I1208 09:00:02.041306 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:02 crc kubenswrapper[4776]: I1208 09:00:02.041394 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:02 crc kubenswrapper[4776]: I1208 09:00:02.041422 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:02 crc kubenswrapper[4776]: I1208 09:00:02.041452 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:02 crc kubenswrapper[4776]: I1208 09:00:02.041474 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:02Z","lastTransitionTime":"2025-12-08T09:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:02 crc kubenswrapper[4776]: I1208 09:00:02.145999 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:02 crc kubenswrapper[4776]: I1208 09:00:02.146046 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:02 crc kubenswrapper[4776]: I1208 09:00:02.146060 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:02 crc kubenswrapper[4776]: I1208 09:00:02.146077 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:02 crc kubenswrapper[4776]: I1208 09:00:02.146089 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:02Z","lastTransitionTime":"2025-12-08T09:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:02 crc kubenswrapper[4776]: I1208 09:00:02.248799 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:02 crc kubenswrapper[4776]: I1208 09:00:02.248829 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:02 crc kubenswrapper[4776]: I1208 09:00:02.248839 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:02 crc kubenswrapper[4776]: I1208 09:00:02.248855 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:02 crc kubenswrapper[4776]: I1208 09:00:02.248866 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:02Z","lastTransitionTime":"2025-12-08T09:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 09:00:02 crc kubenswrapper[4776]: I1208 09:00:02.343161 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 09:00:02 crc kubenswrapper[4776]: I1208 09:00:02.343251 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 09:00:02 crc kubenswrapper[4776]: E1208 09:00:02.343309 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 09:00:02 crc kubenswrapper[4776]: E1208 09:00:02.343417 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 09:00:02 crc kubenswrapper[4776]: I1208 09:00:02.350611 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:02 crc kubenswrapper[4776]: I1208 09:00:02.350843 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:02 crc kubenswrapper[4776]: I1208 09:00:02.351163 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:02 crc kubenswrapper[4776]: I1208 09:00:02.351325 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:02 crc kubenswrapper[4776]: I1208 09:00:02.351420 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:02Z","lastTransitionTime":"2025-12-08T09:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:02 crc kubenswrapper[4776]: I1208 09:00:02.453669 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:02 crc kubenswrapper[4776]: I1208 09:00:02.454031 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:02 crc kubenswrapper[4776]: I1208 09:00:02.454275 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:02 crc kubenswrapper[4776]: I1208 09:00:02.454462 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:02 crc kubenswrapper[4776]: I1208 09:00:02.454657 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:02Z","lastTransitionTime":"2025-12-08T09:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:02 crc kubenswrapper[4776]: I1208 09:00:02.556968 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:02 crc kubenswrapper[4776]: I1208 09:00:02.557022 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:02 crc kubenswrapper[4776]: I1208 09:00:02.557143 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:02 crc kubenswrapper[4776]: I1208 09:00:02.557208 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:02 crc kubenswrapper[4776]: I1208 09:00:02.557231 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:02Z","lastTransitionTime":"2025-12-08T09:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:02 crc kubenswrapper[4776]: I1208 09:00:02.659768 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:02 crc kubenswrapper[4776]: I1208 09:00:02.665220 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:02 crc kubenswrapper[4776]: I1208 09:00:02.665276 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:02 crc kubenswrapper[4776]: I1208 09:00:02.665307 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:02 crc kubenswrapper[4776]: I1208 09:00:02.665326 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:02Z","lastTransitionTime":"2025-12-08T09:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:02 crc kubenswrapper[4776]: I1208 09:00:02.767962 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:02 crc kubenswrapper[4776]: I1208 09:00:02.767986 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:02 crc kubenswrapper[4776]: I1208 09:00:02.767994 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:02 crc kubenswrapper[4776]: I1208 09:00:02.768007 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:02 crc kubenswrapper[4776]: I1208 09:00:02.768016 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:02Z","lastTransitionTime":"2025-12-08T09:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:02 crc kubenswrapper[4776]: I1208 09:00:02.870307 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:02 crc kubenswrapper[4776]: I1208 09:00:02.870587 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:02 crc kubenswrapper[4776]: I1208 09:00:02.870675 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:02 crc kubenswrapper[4776]: I1208 09:00:02.870764 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:02 crc kubenswrapper[4776]: I1208 09:00:02.870847 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:02Z","lastTransitionTime":"2025-12-08T09:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:02 crc kubenswrapper[4776]: I1208 09:00:02.974357 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:02 crc kubenswrapper[4776]: I1208 09:00:02.975269 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:02 crc kubenswrapper[4776]: I1208 09:00:02.975479 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:02 crc kubenswrapper[4776]: I1208 09:00:02.975769 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:02 crc kubenswrapper[4776]: I1208 09:00:02.976089 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:02Z","lastTransitionTime":"2025-12-08T09:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:03 crc kubenswrapper[4776]: I1208 09:00:03.079461 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:03 crc kubenswrapper[4776]: I1208 09:00:03.079496 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:03 crc kubenswrapper[4776]: I1208 09:00:03.079504 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:03 crc kubenswrapper[4776]: I1208 09:00:03.079518 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:03 crc kubenswrapper[4776]: I1208 09:00:03.079529 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:03Z","lastTransitionTime":"2025-12-08T09:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:03 crc kubenswrapper[4776]: I1208 09:00:03.183023 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:03 crc kubenswrapper[4776]: I1208 09:00:03.183091 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:03 crc kubenswrapper[4776]: I1208 09:00:03.183116 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:03 crc kubenswrapper[4776]: I1208 09:00:03.183139 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:03 crc kubenswrapper[4776]: I1208 09:00:03.183153 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:03Z","lastTransitionTime":"2025-12-08T09:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:03 crc kubenswrapper[4776]: I1208 09:00:03.286242 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:03 crc kubenswrapper[4776]: I1208 09:00:03.286288 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:03 crc kubenswrapper[4776]: I1208 09:00:03.286339 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:03 crc kubenswrapper[4776]: I1208 09:00:03.286374 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:03 crc kubenswrapper[4776]: I1208 09:00:03.286397 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:03Z","lastTransitionTime":"2025-12-08T09:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 09:00:03 crc kubenswrapper[4776]: I1208 09:00:03.343750 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 09:00:03 crc kubenswrapper[4776]: I1208 09:00:03.344111 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 09:00:03 crc kubenswrapper[4776]: E1208 09:00:03.344295 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 09:00:03 crc kubenswrapper[4776]: E1208 09:00:03.344515 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 09:00:03 crc kubenswrapper[4776]: I1208 09:00:03.344623 4776 scope.go:117] "RemoveContainer" containerID="46c3d0b09700f82a78bbde1ae9d9c6ff3538958396aac791116ea0eff0f5a9cf" Dec 08 09:00:03 crc kubenswrapper[4776]: I1208 09:00:03.390064 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:03 crc kubenswrapper[4776]: I1208 09:00:03.390133 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:03 crc kubenswrapper[4776]: I1208 09:00:03.390150 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:03 crc kubenswrapper[4776]: I1208 09:00:03.390202 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:03 crc kubenswrapper[4776]: I1208 09:00:03.390220 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:03Z","lastTransitionTime":"2025-12-08T09:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:03 crc kubenswrapper[4776]: I1208 09:00:03.493253 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:03 crc kubenswrapper[4776]: I1208 09:00:03.493309 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:03 crc kubenswrapper[4776]: I1208 09:00:03.493323 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:03 crc kubenswrapper[4776]: I1208 09:00:03.493343 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:03 crc kubenswrapper[4776]: I1208 09:00:03.493354 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:03Z","lastTransitionTime":"2025-12-08T09:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:03 crc kubenswrapper[4776]: I1208 09:00:03.596058 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:03 crc kubenswrapper[4776]: I1208 09:00:03.596104 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:03 crc kubenswrapper[4776]: I1208 09:00:03.596113 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:03 crc kubenswrapper[4776]: I1208 09:00:03.596130 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:03 crc kubenswrapper[4776]: I1208 09:00:03.596139 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:03Z","lastTransitionTime":"2025-12-08T09:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:03 crc kubenswrapper[4776]: I1208 09:00:03.699519 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:03 crc kubenswrapper[4776]: I1208 09:00:03.699612 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:03 crc kubenswrapper[4776]: I1208 09:00:03.699639 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:03 crc kubenswrapper[4776]: I1208 09:00:03.699674 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:03 crc kubenswrapper[4776]: I1208 09:00:03.699699 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:03Z","lastTransitionTime":"2025-12-08T09:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:03 crc kubenswrapper[4776]: I1208 09:00:03.802814 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:03 crc kubenswrapper[4776]: I1208 09:00:03.802850 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:03 crc kubenswrapper[4776]: I1208 09:00:03.802862 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:03 crc kubenswrapper[4776]: I1208 09:00:03.802878 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:03 crc kubenswrapper[4776]: I1208 09:00:03.802887 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:03Z","lastTransitionTime":"2025-12-08T09:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:03 crc kubenswrapper[4776]: I1208 09:00:03.905791 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:03 crc kubenswrapper[4776]: I1208 09:00:03.905834 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:03 crc kubenswrapper[4776]: I1208 09:00:03.905844 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:03 crc kubenswrapper[4776]: I1208 09:00:03.905859 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:03 crc kubenswrapper[4776]: I1208 09:00:03.905869 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:03Z","lastTransitionTime":"2025-12-08T09:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.008886 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.009002 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.009023 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.009049 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.009067 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:04Z","lastTransitionTime":"2025-12-08T09:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.111928 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.111979 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.111989 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.112004 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.112017 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:04Z","lastTransitionTime":"2025-12-08T09:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.215371 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.215434 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.215453 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.215479 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.215496 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:04Z","lastTransitionTime":"2025-12-08T09:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.318051 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.318110 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.318121 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.318139 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.318150 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:04Z","lastTransitionTime":"2025-12-08T09:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.343684 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.343773 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 09:00:04 crc kubenswrapper[4776]: E1208 09:00:04.343902 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 09:00:04 crc kubenswrapper[4776]: E1208 09:00:04.344232 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.380920 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977ead65-d7ad-477b-8656-2b55e70b918b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7ab99c039bd5c0c0ab52768821797dd9e303afb9c6065ef021e8c2be372df9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3089eb6893beac1862850a94ef99af25f4ce9c0f020cc795611d9a620966e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86e9bc666e17ca52777d0275d4a50cbb70aad0c77e3413328fa6bd7b42d8a339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7cac944c0f3bfb4cc7d00471deb5d44d498a7131b7d950305a78bd8a5d2ea7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e7656c1155b52195abce09bca5ecd214a957ae7009293b99f16b3390c671fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\
"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:04Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.399725 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:04Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.422034 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.422087 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.422103 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.422124 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.422139 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:04Z","lastTransitionTime":"2025-12-08T09:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.432073 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5x9ft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58507405-6bea-4859-a4e8-6ed046b50323\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06ef510d9c943d7c5d8a7a801dc1e7ca8cbfa7f6995ac8bedc835e0e8a762f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa55285a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa55285a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5x9ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:04Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.454054 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-555j6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6eb111dbbc6dec73baadf4e88ff08f03050658b7682b28c960ecdb80973eae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1
afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04b00e982993068dd2f274ce749a8b994561657d124122969adbb79c53658ecf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T08:59:52Z\\\",\\\"message\\\":\\\"2025-12-08T08:59:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a3a44f90-75bc-4af4-8170-8121c4c73300\\\\n2025-12-08T08:59:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a3a44f90-75bc-4af4-8170-8121c4c73300 to /host/opt/cni/bin/\\\\n2025-12-08T08:59:05Z [verbose] multus-daemon started\\\\n2025-12-08T08:59:05Z [verbose] Readiness Indicator file check\\\\n2025-12-08T08:59:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-555j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:04Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.473083 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e518469-5b3b-4055-a0f0-075dc48b1c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f688c89d388f271d5313cac8f933d6e43da463b2d6da546151a2bbff7350f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://712accfff1a4a51efc4d3d2a90e3e2d6c47eb04f5cd3d72b800e0778f0963b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a95beb9eadee827797268eb99d707a8782239962485a6f258adea6f3ee994c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cf78bd635e2922e1ade70a5fb24aebf3845f59d6aae829d8cba7c1c44da75ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afb59149e1f7b1462fc1ae949150d5c137905504ba2366db083abd13b4db6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e06c373c06904c837e3c6b4b8d26d2b2e34aa17db8079de76e1e33594d284b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c3d0b09700f82a78bbde1ae9d9c6ff3538958396aac791116ea0eff0f5a9cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46c3d0b09700f82a78bbde1ae9d9c6ff3538958396aac791116ea0eff0f5a9cf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T08:59:34Z\\\",\\\"message\\\":\\\"\\\\nI1208 08:59:34.244549 6400 services_controller.go:444] Built service openshift-network-diagnostics/network-check-target LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1208 08:59:34.244563 6400 services_controller.go:445] Built service openshift-network-diagnostics/network-check-target LB template configs for 
network=default: []services.lbConfig(nil)\\\\nF1208 08:59:34.244135 6400 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:34Z is after 2025-08-24T17:21:41Z]\\\\nI1208 08:59:34.244585 6400 services_controller.go:451] Built service openshift-network-diagnostics/network-check-target cluster-wide \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-swbsc_openshift-ovn-kubernetes(1e518469-5b3b-4055-a0f0-075dc48b1c79)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8d4d85ae7a43d2800c11a9bfdc7f1c827bdb6628891b9a0c29140bf869307ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101b
d0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-swbsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:04Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.483083 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"613586f5-df47-4178-b711-fb8f9f2fdf6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e6618291bd02472481cb1d5469287732dd869be2767bd9209c9f5b846b6b2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53120dd7e9433a30e26bce760e1424a89d586cebf2f627af5885d8e43f9c731b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53120dd7e9433a30e26bce760e1424a89d586cebf2f627af5885d8e43f9c731b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:04Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.495020 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d4304c3-81ab-4418-b5c2-017f74581e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d02311f25656790164bb155ddb4e4d32fb9e7919033dd47c38c9fb321c185743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb3d5082006dae792f107dc632bc00d5004be69f95f6d76150d816fb542ce9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be261308f50cfa4db35cf58e829c102496de88ed0336685562f39e46e498a460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bf08db453e1dd51d190b2bd855af295af04cb4d3a8a42e0020136c50546aef2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:04Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.509591 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"343d6e00-54f7-4228-a8da-a43041894b26\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:4
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee00c7d6720b72fb9ecc27cc1ad2f762db9ac30bc3e108e37e2d4e2a96639cf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:04Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.521473 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:04Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.524933 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.524987 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.525000 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.525017 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.525028 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:04Z","lastTransitionTime":"2025-12-08T09:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.532232 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:04Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.540613 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fdg6t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56dfa7df-2ee8-4408-a283-5a8521175a0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5173ff239c6373649911372a6d6c1665958296afec2528b9e0a492a0722f5b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z44lf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fdg6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:04Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.551473 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed339964d431767a3616559947e832417f1052991849f141c380c6e64c3c03f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:04Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.561958 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b411334ce6d2d6fc026625d538807bc1dce879f0053babe7d97f302819a4e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e498b38932d3fd2da572e5ca5d560b7725a394f13fd9a30543244b1f3c0dca61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:04Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.573406 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da360484aaa537076345b7a6830a83ac2a89ffb47f86d6865049c4f19688546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-08T09:00:04Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.584343 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9788ab1-1031-4103-a769-a4b3177c7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c2915bc116e8d911ac3c615649564174eb7e760c7e63fe06aee942d7cc2e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860aab951ce65f5efd9e17e4d93c383ea508088d26c2f758bf8d7176d2fb96f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jkmbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:04Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.596060 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19bd7d27-06e1-4574-8857-6adbe88633c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d32a10a86fe749d233a68a8e7583294e21c634dc47febe04e56220b591d505e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c567d34bcaecb124f79504fee8f22c148f78bb039741a7b52883ab3188edaa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://146374df9edb9e0092cf2e4cac4a5955d7d0980be93df8188f4b55ad12901572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc26419aaafbbe056695868d3b76642b62774f620306fc60a3c0a07788ca8b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://bc26419aaafbbe056695868d3b76642b62774f620306fc60a3c0a07788ca8b7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:04Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.607671 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8k6qx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5628a2b5-b886-4883-93a3-fefc471f19e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7
3be62036047cccc38e737abc2818aa510b308934c0bfbcac348aba382f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58b2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8k6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:04Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.618194 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q85ld" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d62ee56-e13f-4e44-8abf-9d0fc3e423b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db449e03630aa1b44e8a2812e502a46239824bb59283396bfe92bb818df29fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plplv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b483925a2a4fefd06621185c21e74da28d1
f0ababf703e2637a8686881671f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plplv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q85ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:04Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.627453 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.627508 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.627524 4776 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.627546 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.627560 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:04Z","lastTransitionTime":"2025-12-08T09:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.627699 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kkhjg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99143b9c-a541-4c0e-8387-0dff0d557974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vlfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vlfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kkhjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:04Z is after 2025-08-24T17:21:41Z" Dec 
08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.731153 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.731222 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.731238 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.731257 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.731269 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:04Z","lastTransitionTime":"2025-12-08T09:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.834221 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.834260 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.834270 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.834312 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.834324 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:04Z","lastTransitionTime":"2025-12-08T09:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.973458 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.973538 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.973557 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.973586 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:04 crc kubenswrapper[4776]: I1208 09:00:04.973609 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:04Z","lastTransitionTime":"2025-12-08T09:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.077536 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.077573 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.077581 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.077598 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.077611 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:05Z","lastTransitionTime":"2025-12-08T09:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.181031 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.181077 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.181111 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.181128 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.181138 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:05Z","lastTransitionTime":"2025-12-08T09:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.284307 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.284385 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.284403 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.284435 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.284455 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:05Z","lastTransitionTime":"2025-12-08T09:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.343743 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.343802 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 09:00:05 crc kubenswrapper[4776]: E1208 09:00:05.343974 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 09:00:05 crc kubenswrapper[4776]: E1208 09:00:05.344164 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.388101 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.388209 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.388230 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.388258 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.388282 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:05Z","lastTransitionTime":"2025-12-08T09:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.491810 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.491863 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.491878 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.491899 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.491914 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:05Z","lastTransitionTime":"2025-12-08T09:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.594072 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.594114 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.594134 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.594150 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.594162 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:05Z","lastTransitionTime":"2025-12-08T09:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.696772 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.696807 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.696818 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.696847 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.696858 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:05Z","lastTransitionTime":"2025-12-08T09:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.799260 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.799292 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.799300 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.799312 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.799322 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:05Z","lastTransitionTime":"2025-12-08T09:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.843636 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swbsc_1e518469-5b3b-4055-a0f0-075dc48b1c79/ovnkube-controller/2.log" Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.846015 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" event={"ID":"1e518469-5b3b-4055-a0f0-075dc48b1c79","Type":"ContainerStarted","Data":"b728069c5c670cfef1888e64d211dfcfefb2de8c9ea9cf0a346c4538578b557e"} Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.846434 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.862058 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19bd7d27-06e1-4574-8857-6adbe88633c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d32a10a86fe749d233a68a8e7583294e21c634dc47febe04e56220b591d505e\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c567d34bcaecb124f79504fee8f22c148f78bb039741a7b52883ab3188edaa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://146374df9edb9e0092cf2e4cac4a5955d7d0980be93df8188f4b55ad12901572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc26419aaafbbe056695868d3b76642b62774f620306fc60a3c0a07788ca8b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc26419aaafbbe056695868d3b76642b62774f620306fc60a3c0a07788ca8b7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:05Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.870927 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8k6qx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5628a2b5-b886-4883-93a3-fefc471f19e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df73be62036047cccc38e737abc2818aa510b308934c0bfbcac348aba382f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58b2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8k6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:05Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.878929 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.879035 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.879089 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.879116 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 09:00:05 crc kubenswrapper[4776]: E1208 09:00:05.879136 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 09:01:09.87912086 +0000 UTC m=+146.142345882 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:00:05 crc kubenswrapper[4776]: E1208 09:00:05.879204 4776 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 08 09:00:05 crc kubenswrapper[4776]: E1208 09:00:05.879247 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-08 09:01:09.879236933 +0000 UTC m=+146.142461955 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 08 09:00:05 crc kubenswrapper[4776]: E1208 09:00:05.879492 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 08 09:00:05 crc kubenswrapper[4776]: E1208 09:00:05.879520 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 08 09:00:05 crc kubenswrapper[4776]: E1208 09:00:05.879533 4776 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 09:00:05 crc kubenswrapper[4776]: E1208 09:00:05.879565 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-08 09:01:09.879555641 +0000 UTC m=+146.142780663 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 09:00:05 crc kubenswrapper[4776]: E1208 09:00:05.879730 4776 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 08 09:00:05 crc kubenswrapper[4776]: E1208 09:00:05.879771 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-08 09:01:09.879761097 +0000 UTC m=+146.142986209 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.880449 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q85ld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d62ee56-e13f-4e44-8abf-9d0fc3e423b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db449e03630aa1b44e8a2812e502a46239824bb59283396bfe92bb818df29fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plplv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b483925a2a4fefd06621185c21e74da28d1f0ababf703e2637a8686881671f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plplv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q85ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:05Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.888450 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kkhjg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99143b9c-a541-4c0e-8387-0dff0d557974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vlfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vlfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kkhjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:05Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:05 crc 
kubenswrapper[4776]: I1208 09:00:05.901294 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.901322 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.901331 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.901344 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.901356 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:05Z","lastTransitionTime":"2025-12-08T09:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.907867 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977ead65-d7ad-477b-8656-2b55e70b918b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7ab99c039bd5c0c0ab52768821797dd9e303afb9c6065ef021e8c2be372df9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3089eb6893beac1862850a94ef99af25f4ce9c0f020cc795611d9a620966e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86e9bc666e17ca52777d0275d4a50cbb70aad0c77e3413328fa6bd7b42d8a339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7cac944c0f3bfb4cc7d00471deb5d44d498a7131b7d950305a78bd8a5d2ea7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e7656c1155b52195abce09bca5ecd214a957ae7009293b99f16b3390c671fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:05Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.918000 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:05Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.929609 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5x9ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58507405-6bea-4859-a4e8-6ed046b50323\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06ef510d9c943d7c5d8a7a801dc1e7ca8cbfa7f6995ac8bedc835e0e8a762f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa552
85a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa55285a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5x9ft\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:05Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.950900 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e518469-5b3b-4055-a0f0-075dc48b1c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f688c89d388f271d5313cac8f933d6e43da463b2d6da546151a2bbff7350f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://712accfff1a4a51efc4d3d2a90e3e2d6c47eb04f5cd3d72b800e0778f0963b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a95beb9eadee827797268eb99d707a8782239962485a6f258adea6f3ee994c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cf78bd635e2922e1ade70a5fb24aebf3845f59d6aae829d8cba7c1c44da75ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afb59149e1f7b1462fc1ae949150d5c137905504ba2366db083abd13b4db6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e06c373c06904c837e3c6b4b8d26d2b2e34aa17db8079de76e1e33594d284b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b728069c5c670cfef1888e64d211dfcfefb2de8c9ea9cf0a346c4538578b557e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46c3d0b09700f82a78bbde1ae9d9c6ff3538958396aac791116ea0eff0f5a9cf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T08:59:34Z\\\",\\\"message\\\":\\\"\\\\nI1208 08:59:34.244549 6400 services_controller.go:444] Built service openshift-network-diagnostics/network-check-target LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1208 08:59:34.244563 6400 services_controller.go:445] Built service openshift-network-diagnostics/network-check-target LB template configs for 
network=default: []services.lbConfig(nil)\\\\nF1208 08:59:34.244135 6400 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:34Z is after 2025-08-24T17:21:41Z]\\\\nI1208 08:59:34.244585 6400 services_controller.go:451] Built service openshift-network-diagnostics/network-check-target cluster-wide 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T09:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8d4d85ae7a43d2800c11a9bfdc7f1c827bdb6628891b9a0c29140bf869307ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-swbsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:05Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.961860 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"613586f5-df47-4178-b711-fb8f9f2fdf6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e6618291bd02472481cb1d5469287732dd869be2767bd9209c9f5b846b6b2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53120dd7e9433a30e26bce760e1424a89d586cebf2f627af5885d8e43f9c731b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53120dd7e9433a30e26bce760e1424a89d586cebf2f627af5885d8e43f9c731b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:05Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.980098 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 09:00:05 crc kubenswrapper[4776]: E1208 09:00:05.980270 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 08 09:00:05 crc kubenswrapper[4776]: E1208 09:00:05.980295 4776 projected.go:288] Couldn't get 
configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 08 09:00:05 crc kubenswrapper[4776]: E1208 09:00:05.980307 4776 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 09:00:05 crc kubenswrapper[4776]: E1208 09:00:05.980543 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-08 09:01:09.980527989 +0000 UTC m=+146.243753011 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 09:00:05 crc kubenswrapper[4776]: I1208 09:00:05.982667 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d4304c3-81ab-4418-b5c2-017f74581e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d02311f25656790164bb155ddb4e4d32fb9e7919033dd47c38c9fb321c185743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb3d5082006dae792f107dc632bc00d5004be69f95f6d76150d816fb542ce9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be261308f50cfa4db35cf58e829c102496de88ed0336685562f39e46e498a460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bf08db453e1dd51d190b2bd855af295af04cb4d3a8a42e0020136c50546aef2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:05Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.002076 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"343d6e00-54f7-4228-a8da-a43041894b26\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:4
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee00c7d6720b72fb9ecc27cc1ad2f762db9ac30bc3e108e37e2d4e2a96639cf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:05Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.003297 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.003333 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.003344 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.003359 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.003369 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:06Z","lastTransitionTime":"2025-12-08T09:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.021914 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:06Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.033462 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:06Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.044085 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fdg6t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56dfa7df-2ee8-4408-a283-5a8521175a0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5173ff239c6373649911372a6d6c1665958296afec2528b9e0a492a0722f5b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z44lf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fdg6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:06Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.055838 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-555j6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6eb111dbbc6dec73baadf4e88ff08f03050658b7682b28c960ecdb80973eae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04b00e982993068dd2f274ce749a8b994561657d124122969adbb79c53658ecf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T08:59:52Z\\\",\\\"message\\\":\\\"2025-12-08T08:59:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a3a44f90-75bc-4af4-8170-8121c4c73300\\\\n2025-12-08T08:59:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a3a44f90-75bc-4af4-8170-8121c4c73300 to /host/opt/cni/bin/\\\\n2025-12-08T08:59:05Z [verbose] multus-daemon started\\\\n2025-12-08T08:59:05Z [verbose] Readiness Indicator file check\\\\n2025-12-08T08:59:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-555j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:06Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.066452 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed339964d431767a3616559947e832417f1052991849f141c380c6e64c3c03f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:06Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.078198 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b411334ce6d2d6fc026625d538807bc1dce879f0053babe7d97f302819a4e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://e498b38932d3fd2da572e5ca5d560b7725a394f13fd9a30543244b1f3c0dca61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:06Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.088015 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da360484aaa537076345b7a6830a83ac2a89ffb47f86d6865049c4f19688546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-08T09:00:06Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.096542 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9788ab1-1031-4103-a769-a4b3177c7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c2915bc116e8d911ac3c615649564174eb7e760c7e63fe06aee942d7cc2e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860aab951ce65f5efd9e17e4d93c383ea508088d26c2f758bf8d7176d2fb96f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jkmbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:06Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.106245 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 
09:00:06.106286 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.106296 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.106310 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.106319 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:06Z","lastTransitionTime":"2025-12-08T09:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.208329 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.208361 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.208369 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.208382 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.208391 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:06Z","lastTransitionTime":"2025-12-08T09:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.310982 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.311029 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.311040 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.311057 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.311070 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:06Z","lastTransitionTime":"2025-12-08T09:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.342903 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.342909 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 09:00:06 crc kubenswrapper[4776]: E1208 09:00:06.343055 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 09:00:06 crc kubenswrapper[4776]: E1208 09:00:06.343110 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.413639 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.413680 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.413690 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.413706 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.413717 4776 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:06Z","lastTransitionTime":"2025-12-08T09:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.516199 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.516235 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.516244 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.516258 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.516268 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:06Z","lastTransitionTime":"2025-12-08T09:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.618489 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.618554 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.618571 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.618591 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.618607 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:06Z","lastTransitionTime":"2025-12-08T09:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.721470 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.721513 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.721524 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.721540 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.721552 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:06Z","lastTransitionTime":"2025-12-08T09:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.824400 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.824456 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.824470 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.824488 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.824500 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:06Z","lastTransitionTime":"2025-12-08T09:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.851748 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swbsc_1e518469-5b3b-4055-a0f0-075dc48b1c79/ovnkube-controller/3.log" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.852459 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swbsc_1e518469-5b3b-4055-a0f0-075dc48b1c79/ovnkube-controller/2.log" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.855480 4776 generic.go:334] "Generic (PLEG): container finished" podID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerID="b728069c5c670cfef1888e64d211dfcfefb2de8c9ea9cf0a346c4538578b557e" exitCode=1 Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.855528 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" event={"ID":"1e518469-5b3b-4055-a0f0-075dc48b1c79","Type":"ContainerDied","Data":"b728069c5c670cfef1888e64d211dfcfefb2de8c9ea9cf0a346c4538578b557e"} Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.855579 4776 scope.go:117] "RemoveContainer" containerID="46c3d0b09700f82a78bbde1ae9d9c6ff3538958396aac791116ea0eff0f5a9cf" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.856923 4776 scope.go:117] "RemoveContainer" containerID="b728069c5c670cfef1888e64d211dfcfefb2de8c9ea9cf0a346c4538578b557e" Dec 08 09:00:06 crc kubenswrapper[4776]: E1208 09:00:06.857321 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-swbsc_openshift-ovn-kubernetes(1e518469-5b3b-4055-a0f0-075dc48b1c79)\"" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.878197 4776 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/multus-555j6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6eb111dbbc6dec73baadf4e88ff08f03050658b7682b28c960ecdb80973eae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04b00e982993068dd2f274ce749a8b994561657d124122969adbb79c53658ecf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T08:59:52Z\\\",\\\"message\\\":\\\"2025-12-08T08:59:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a3a44f90-75bc-4af4-8170-8121c4c73300\\\\n2025-12-08T08:59:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a3a44f90-75bc-4af4-8170-8121c4c73300 to 
/host/opt/cni/bin/\\\\n2025-12-08T08:59:05Z [verbose] multus-daemon started\\\\n2025-12-08T08:59:05Z [verbose] Readiness Indicator file check\\\\n2025-12-08T08:59:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-cert
s\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-555j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:06Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.897670 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e518469-5b3b-4055-a0f0-075dc48b1c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f688c89d388f271d5313cac8f933d6e43da463b2d6da546151a2bbff7350f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://712accfff1a4a51efc4d3d2a90e3e2d6c47eb04f5cd3d72b800e0778f0963b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a95beb9eadee827797268eb99d707a8782239962485a6f258adea6f3ee994c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cf78bd635e2922e1ade70a5fb24aebf3845f59d6aae829d8cba7c1c44da75ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afb59149e1f7b1462fc1ae949150d5c137905504ba2366db083abd13b4db6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e06c373c06904c837e3c6b4b8d26d2b2e34aa17db8079de76e1e33594d284b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b728069c5c670cfef1888e64d211dfcfefb2de8c9ea9cf0a346c4538578b557e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46c3d0b09700f82a78bbde1ae9d9c6ff3538958396aac791116ea0eff0f5a9cf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T08:59:34Z\\\",\\\"message\\\":\\\"\\\\nI1208 08:59:34.244549 6400 services_controller.go:444] Built service 
openshift-network-diagnostics/network-check-target LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1208 08:59:34.244563 6400 services_controller.go:445] Built service openshift-network-diagnostics/network-check-target LB template configs for network=default: []services.lbConfig(nil)\\\\nF1208 08:59:34.244135 6400 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T08:59:34Z is after 2025-08-24T17:21:41Z]\\\\nI1208 08:59:34.244585 6400 services_controller.go:451] Built service openshift-network-diagnostics/network-check-target cluster-wide \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b728069c5c670cfef1888e64d211dfcfefb2de8c9ea9cf0a346c4538578b557e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T09:00:06Z\\\",\\\"message\\\":\\\" 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1208 09:00:05.955865 6790 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:05Z is after 2025-08-24T17:21:41Z]\\\\nI1208 09:00:05.955922 6790 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-5x9ft in node crc\\\\nI1208 09:00:05.955964 6790 
model_cli\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T09:00:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8d4d85ae7a43d2800c11a9bfdc7f1c827bdb6628891b9a0c29140bf869307ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd0
28387e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-swbsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:06Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.907707 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"613586f5-df47-4178-b711-fb8f9f2fdf6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e6618291bd02472481cb1d5469287732dd869be2767bd9209c9f5b846b6b2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53120dd7e9433a30e26bce760e1424a89d586cebf2f627af5885d8e43f9c731b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53120dd7e9433a30e26bce760e1424a89d586cebf2f627af5885d8e43f9c731b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:06Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.926660 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.926703 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.926718 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.926738 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.926756 4776 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:06Z","lastTransitionTime":"2025-12-08T09:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.929091 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d4304c3-81ab-4418-b5c2-017f74581e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d02311f25656790164bb155ddb4e4d32fb9e7919033dd47c38c9fb321c185743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"re
startCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb3d5082006dae792f107dc632bc00d5004be69f95f6d76150d816fb542ce9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be261308f50cfa4db35cf58e829c102496de88ed0336685562f39e46e498a460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-
dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bf08db453e1dd51d190b2bd855af295af04cb4d3a8a42e0020136c50546aef2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:06Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.945431 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"343d6e00-54f7-4228-a8da-a43041894b26\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:4
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee00c7d6720b72fb9ecc27cc1ad2f762db9ac30bc3e108e37e2d4e2a96639cf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:06Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.958853 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:06Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.976839 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:06Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:06 crc kubenswrapper[4776]: I1208 09:00:06.989864 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fdg6t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56dfa7df-2ee8-4408-a283-5a8521175a0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5173ff239c6373649911372a6d6c1665958296afec2528b9e0a492a0722f5b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z44lf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fdg6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:06Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.002561 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed339964d431767a3616559947e832417f1052991849f141c380c6e64c3c03f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:07Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.015441 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b411334ce6d2d6fc026625d538807bc1dce879f0053babe7d97f302819a4e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e498b38932d3fd2da572e5ca5d560b7725a394f13fd9a30543244b1f3c0dca61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:07Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.028186 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da360484aaa537076345b7a6830a83ac2a89ffb47f86d6865049c4f19688546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-08T09:00:07Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.029059 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.029097 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.029108 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.029126 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.029139 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:07Z","lastTransitionTime":"2025-12-08T09:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.040442 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9788ab1-1031-4103-a769-a4b3177c7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c2915bc116e8d911ac3c615649564174eb7e760c7e63fe06aee942d7cc2e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860aab951ce65f5efd9e17e4d93c383ea508088d26c2f758bf8d7176d2fb96f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jkmbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:07Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.051106 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19bd7d27-06e1-4574-8857-6adbe88633c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d32a10a86fe749d233a68a8e7583294e21c634dc47febe04e56220b591d505e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c567d34bcaecb124f79504fee8f22c148f78bb039741a7b52883ab3188edaa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://146374df9edb9e0092cf2e4cac4a5955d7d0980be93df8188f4b55ad12901572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc26419aaafbbe056695868d3b76642b62774f620306fc60a3c0a07788ca8b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://bc26419aaafbbe056695868d3b76642b62774f620306fc60a3c0a07788ca8b7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:07Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.060165 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8k6qx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5628a2b5-b886-4883-93a3-fefc471f19e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7
3be62036047cccc38e737abc2818aa510b308934c0bfbcac348aba382f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58b2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8k6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:07Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.069332 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q85ld" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d62ee56-e13f-4e44-8abf-9d0fc3e423b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db449e03630aa1b44e8a2812e502a46239824bb59283396bfe92bb818df29fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plplv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b483925a2a4fefd06621185c21e74da28d1
f0ababf703e2637a8686881671f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plplv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q85ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:07Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.079588 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kkhjg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99143b9c-a541-4c0e-8387-0dff0d557974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vlfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vlfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kkhjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:07Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:07 crc 
kubenswrapper[4776]: I1208 09:00:07.096688 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977ead65-d7ad-477b-8656-2b55e70b918b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7ab99c039bd5c0c0ab52768821797dd9e303afb9c6065ef021e8c2be372df9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://6f3089eb6893beac1862850a94ef99af25f4ce9c0f020cc795611d9a620966e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86e9bc666e17ca52777d0275d4a50cbb70aad0c77e3413328fa6bd7b42d8a339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7cac944c0f3bfb4cc7d00471deb5d44d498a7131b7d950305a78bd8a5d2ea7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e7656c1155b52195abce09bca5ecd214a957ae7009293b99f16b3390c671fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:07Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.108796 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:07Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.122222 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5x9ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58507405-6bea-4859-a4e8-6ed046b50323\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06ef510d9c943d7c5d8a7a801dc1e7ca8cbfa7f6995ac8bedc835e0e8a762f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa552
85a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa55285a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5x9ft\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:07Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.131867 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.131986 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.132047 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.132121 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.132198 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:07Z","lastTransitionTime":"2025-12-08T09:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.235595 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.236298 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.236426 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.236535 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.236632 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:07Z","lastTransitionTime":"2025-12-08T09:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.291751 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.291893 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.291960 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.292031 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.292093 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:07Z","lastTransitionTime":"2025-12-08T09:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:07 crc kubenswrapper[4776]: E1208 09:00:07.304257 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T09:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T09:00:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T09:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T09:00:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T09:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T09:00:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T09:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T09:00:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ebf5967-b40e-4612-8f34-c965ce3a7e5b\\\",\\\"systemUUID\\\":\\\"c2909369-742b-49a0-ae37-af59748afd08\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:07Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.307099 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.307123 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.307131 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.307142 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.307150 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:07Z","lastTransitionTime":"2025-12-08T09:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:07 crc kubenswrapper[4776]: E1208 09:00:07.320765 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T09:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T09:00:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T09:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T09:00:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T09:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T09:00:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T09:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T09:00:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ebf5967-b40e-4612-8f34-c965ce3a7e5b\\\",\\\"systemUUID\\\":\\\"c2909369-742b-49a0-ae37-af59748afd08\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:07Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.324356 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.324386 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.324394 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.324408 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.324417 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:07Z","lastTransitionTime":"2025-12-08T09:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:07 crc kubenswrapper[4776]: E1208 09:00:07.336477 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T09:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T09:00:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T09:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T09:00:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T09:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T09:00:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T09:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T09:00:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ebf5967-b40e-4612-8f34-c965ce3a7e5b\\\",\\\"systemUUID\\\":\\\"c2909369-742b-49a0-ae37-af59748afd08\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:07Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.340238 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.340287 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.340299 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.340316 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.340330 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:07Z","lastTransitionTime":"2025-12-08T09:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.343167 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 09:00:07 crc kubenswrapper[4776]: E1208 09:00:07.343287 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.343378 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 09:00:07 crc kubenswrapper[4776]: E1208 09:00:07.343488 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 09:00:07 crc kubenswrapper[4776]: E1208 09:00:07.356845 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T09:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T09:00:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T09:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T09:00:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T09:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T09:00:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T09:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T09:00:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ebf5967-b40e-4612-8f34-c965ce3a7e5b\\\",\\\"systemUUID\\\":\\\"c2909369-742b-49a0-ae37-af59748afd08\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:07Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.360300 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.360331 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.360340 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.360353 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.360362 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:07Z","lastTransitionTime":"2025-12-08T09:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:07 crc kubenswrapper[4776]: E1208 09:00:07.374112 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T09:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T09:00:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T09:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T09:00:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T09:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T09:00:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T09:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T09:00:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ebf5967-b40e-4612-8f34-c965ce3a7e5b\\\",\\\"systemUUID\\\":\\\"c2909369-742b-49a0-ae37-af59748afd08\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:07Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:07 crc kubenswrapper[4776]: E1208 09:00:07.374436 4776 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.375539 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.375629 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.375688 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.375754 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.375827 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:07Z","lastTransitionTime":"2025-12-08T09:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.478236 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.478498 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.478565 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.478636 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.478826 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:07Z","lastTransitionTime":"2025-12-08T09:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.583584 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.583623 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.583632 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.583645 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.583655 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:07Z","lastTransitionTime":"2025-12-08T09:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.686541 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.686613 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.686627 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.686644 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.686657 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:07Z","lastTransitionTime":"2025-12-08T09:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.789245 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.789736 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.789861 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.789974 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.790068 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:07Z","lastTransitionTime":"2025-12-08T09:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.861352 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swbsc_1e518469-5b3b-4055-a0f0-075dc48b1c79/ovnkube-controller/3.log" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.864616 4776 scope.go:117] "RemoveContainer" containerID="b728069c5c670cfef1888e64d211dfcfefb2de8c9ea9cf0a346c4538578b557e" Dec 08 09:00:07 crc kubenswrapper[4776]: E1208 09:00:07.864757 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-swbsc_openshift-ovn-kubernetes(1e518469-5b3b-4055-a0f0-075dc48b1c79)\"" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.891300 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e518469-5b3b-4055-a0f0-075dc48b1c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f688c89d388f271d5313cac8f933d6e43da463b2d6da546151a2bbff7350f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://712accfff1a4a51efc4d3d2a90e3e2d6c47eb04f5cd3d72b800e0778f0963b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a95beb9eadee827797268eb99d707a8782239962485a6f258adea6f3ee994c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cf78bd635e2922e1ade70a5fb24aebf3845f59d6aae829d8cba7c1c44da75ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afb59149e1f7b1462fc1ae949150d5c137905504ba2366db083abd13b4db6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e06c373c06904c837e3c6b4b8d26d2b2e34aa17db8079de76e1e33594d284b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b728069c5c670cfef1888e64d211dfcfefb2de8c9ea9cf0a346c4538578b557e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b728069c5c670cfef1888e64d211dfcfefb2de8c9ea9cf0a346c4538578b557e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T09:00:06Z\\\",\\\"message\\\":\\\" 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 
requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1208 09:00:05.955865 6790 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:05Z is after 2025-08-24T17:21:41Z]\\\\nI1208 09:00:05.955922 6790 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-5x9ft in node crc\\\\nI1208 09:00:05.955964 6790 model_cli\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T09:00:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-swbsc_openshift-ovn-kubernetes(1e518469-5b3b-4055-a0f0-075dc48b1c79)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8d4d85ae7a43d2800c11a9bfdc7f1c827bdb6628891b9a0c29140bf869307ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101b
d0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-swbsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:07Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.892112 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.892526 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.892704 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.892813 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.892895 4776 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:07Z","lastTransitionTime":"2025-12-08T09:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.907055 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"613586f5-df47-4178-b711-fb8f9f2fdf6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e6618291bd02472481cb1d5469287732dd869be2767bd9209c9f5b846b6b2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53120dd7e9433a30e26bce760e1424a89d586cebf2f627af5885d8e43f9c731b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53120dd7e9433a30e26bce760e1424a89d586cebf2f627af5885d8e43f9c731b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:07Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.924432 4776 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d4304c3-81ab-4418-b5c2-017f74581e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d02311f25656790164bb155ddb4e4d32fb9e7919033dd47c38c9fb321c185743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb3d5082006dae792f107dc632bc00d5004be69f95f6d76150d816fb542ce9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be261308f50cfa4db35cf58e829c102496de88ed0336685562f39e46e498a460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bf08db453e1dd51d190b2bd855af295af04cb4d3a8a42e0020136c50546aef2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:07Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.936271 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"343d6e00-54f7-4228-a8da-a43041894b26\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:4
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee00c7d6720b72fb9ecc27cc1ad2f762db9ac30bc3e108e37e2d4e2a96639cf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:07Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.947213 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:07Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.966654 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:07Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.977925 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fdg6t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56dfa7df-2ee8-4408-a283-5a8521175a0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5173ff239c6373649911372a6d6c1665958296afec2528b9e0a492a0722f5b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z44lf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fdg6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:07Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.995535 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.995875 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.996020 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.996159 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.996328 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:07Z","lastTransitionTime":"2025-12-08T09:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:07 crc kubenswrapper[4776]: I1208 09:00:07.999626 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-555j6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6eb111dbbc6dec73baadf4e88ff08f03050658b7682b28c960ecdb80973eae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04b00e982993068dd2f274ce749a8b994561657d124122969adbb79c53658ecf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T08:59:52Z\\\",\\\"message\\\":\\\"2025-12-08T08:59:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a3a44f90-75bc-4af4-8170-8121c4c73300\\\\n2025-12-08T08:59:05+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a3a44f90-75bc-4af4-8170-8121c4c73300 to /host/opt/cni/bin/\\\\n2025-12-08T08:59:05Z [verbose] multus-daemon started\\\\n2025-12-08T08:59:05Z [verbose] Readiness Indicator file check\\\\n2025-12-08T08:59:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-555j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:07Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.012668 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed339964d431767a3616559947e832417f1052991849f141c380c6e64c3c03f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:08Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.026721 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b411334ce6d2d6fc026625d538807bc1dce879f0053babe7d97f302819a4e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://e498b38932d3fd2da572e5ca5d560b7725a394f13fd9a30543244b1f3c0dca61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:08Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.037843 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da360484aaa537076345b7a6830a83ac2a89ffb47f86d6865049c4f19688546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-08T09:00:08Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.054236 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9788ab1-1031-4103-a769-a4b3177c7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c2915bc116e8d911ac3c615649564174eb7e760c7e63fe06aee942d7cc2e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860aab951ce65f5efd9e17e4d93c383ea508088d26c2f758bf8d7176d2fb96f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jkmbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:08Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.066460 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19bd7d27-06e1-4574-8857-6adbe88633c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d32a10a86fe749d233a68a8e7583294e21c634dc47febe04e56220b591d505e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c567d34bcaecb124f79504fee8f22c148f78bb039741a7b52883ab3188edaa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://146374df9edb9e0092cf2e4cac4a5955d7d0980be93df8188f4b55ad12901572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc26419aaafbbe056695868d3b76642b62774f620306fc60a3c0a07788ca8b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://bc26419aaafbbe056695868d3b76642b62774f620306fc60a3c0a07788ca8b7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:08Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.076618 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8k6qx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5628a2b5-b886-4883-93a3-fefc471f19e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7
3be62036047cccc38e737abc2818aa510b308934c0bfbcac348aba382f20d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58b2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8k6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:08Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.088606 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q85ld" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d62ee56-e13f-4e44-8abf-9d0fc3e423b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db449e03630aa1b44e8a2812e502a46239824bb59283396bfe92bb818df29fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plplv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b483925a2a4fefd06621185c21e74da28d1
f0ababf703e2637a8686881671f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plplv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q85ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:08Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.098872 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.099034 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.099147 4776 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.099258 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.099334 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:08Z","lastTransitionTime":"2025-12-08T09:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.100241 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kkhjg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99143b9c-a541-4c0e-8387-0dff0d557974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vlfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vlfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kkhjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:08Z is after 2025-08-24T17:21:41Z" Dec 
08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.124047 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977ead65-d7ad-477b-8656-2b55e70b918b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7ab99c039bd5c0c0ab52768821797dd9e303afb9c6065ef021e8c2be372df9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\
\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3089eb6893beac1862850a94ef99af25f4ce9c0f020cc795611d9a620966e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86e9bc666e17ca52777d0275d4a50cbb70aad0c77e3413328fa6bd7b42d8a339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7cac944c0f3bfb4cc7d00471deb5d44d498a7131b7d950305a78bd8a5d2ea7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847
b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e7656c1155b52195abce09bca5ecd214a957ae7009293b99f16b3390c671fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\
\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:08Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.141372 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:08Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.155023 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5x9ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58507405-6bea-4859-a4e8-6ed046b50323\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06ef510d9c943d7c5d8a7a801dc1e7ca8cbfa7f6995ac8bedc835e0e8a762f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa552
85a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa55285a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5x9ft\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:08Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.201855 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.201885 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.201894 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.201906 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.201916 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:08Z","lastTransitionTime":"2025-12-08T09:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.304291 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.304808 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.304904 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.304975 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.305033 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:08Z","lastTransitionTime":"2025-12-08T09:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.342986 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 09:00:08 crc kubenswrapper[4776]: E1208 09:00:08.343320 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.343073 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 09:00:08 crc kubenswrapper[4776]: E1208 09:00:08.343606 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.408056 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.408115 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.408136 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.408163 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.408211 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:08Z","lastTransitionTime":"2025-12-08T09:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.511532 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.511583 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.511603 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.511628 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.511645 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:08Z","lastTransitionTime":"2025-12-08T09:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.615527 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.615591 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.615610 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.615640 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.615659 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:08Z","lastTransitionTime":"2025-12-08T09:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.718308 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.718349 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.718360 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.718390 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.718402 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:08Z","lastTransitionTime":"2025-12-08T09:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.820602 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.820637 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.820648 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.820663 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.820675 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:08Z","lastTransitionTime":"2025-12-08T09:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.922780 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.922818 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.922829 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.922844 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:08 crc kubenswrapper[4776]: I1208 09:00:08.922856 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:08Z","lastTransitionTime":"2025-12-08T09:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:09 crc kubenswrapper[4776]: I1208 09:00:09.026024 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:09 crc kubenswrapper[4776]: I1208 09:00:09.026105 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:09 crc kubenswrapper[4776]: I1208 09:00:09.026118 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:09 crc kubenswrapper[4776]: I1208 09:00:09.026138 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:09 crc kubenswrapper[4776]: I1208 09:00:09.026150 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:09Z","lastTransitionTime":"2025-12-08T09:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:09 crc kubenswrapper[4776]: I1208 09:00:09.128838 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:09 crc kubenswrapper[4776]: I1208 09:00:09.128899 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:09 crc kubenswrapper[4776]: I1208 09:00:09.128914 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:09 crc kubenswrapper[4776]: I1208 09:00:09.128937 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:09 crc kubenswrapper[4776]: I1208 09:00:09.128955 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:09Z","lastTransitionTime":"2025-12-08T09:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:09 crc kubenswrapper[4776]: I1208 09:00:09.231477 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:09 crc kubenswrapper[4776]: I1208 09:00:09.231527 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:09 crc kubenswrapper[4776]: I1208 09:00:09.231538 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:09 crc kubenswrapper[4776]: I1208 09:00:09.231556 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:09 crc kubenswrapper[4776]: I1208 09:00:09.231567 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:09Z","lastTransitionTime":"2025-12-08T09:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:09 crc kubenswrapper[4776]: I1208 09:00:09.333881 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:09 crc kubenswrapper[4776]: I1208 09:00:09.333925 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:09 crc kubenswrapper[4776]: I1208 09:00:09.333936 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:09 crc kubenswrapper[4776]: I1208 09:00:09.333956 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:09 crc kubenswrapper[4776]: I1208 09:00:09.333970 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:09Z","lastTransitionTime":"2025-12-08T09:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 09:00:09 crc kubenswrapper[4776]: I1208 09:00:09.343494 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 09:00:09 crc kubenswrapper[4776]: I1208 09:00:09.343582 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 09:00:09 crc kubenswrapper[4776]: E1208 09:00:09.343619 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 09:00:09 crc kubenswrapper[4776]: E1208 09:00:09.343972 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 09:00:09 crc kubenswrapper[4776]: I1208 09:00:09.437384 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:09 crc kubenswrapper[4776]: I1208 09:00:09.437456 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:09 crc kubenswrapper[4776]: I1208 09:00:09.437579 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:09 crc kubenswrapper[4776]: I1208 09:00:09.437604 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:09 crc kubenswrapper[4776]: I1208 09:00:09.437633 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:09Z","lastTransitionTime":"2025-12-08T09:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:09 crc kubenswrapper[4776]: I1208 09:00:09.539772 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:09 crc kubenswrapper[4776]: I1208 09:00:09.539826 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:09 crc kubenswrapper[4776]: I1208 09:00:09.539836 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:09 crc kubenswrapper[4776]: I1208 09:00:09.539855 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:09 crc kubenswrapper[4776]: I1208 09:00:09.539865 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:09Z","lastTransitionTime":"2025-12-08T09:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:09 crc kubenswrapper[4776]: I1208 09:00:09.642285 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:09 crc kubenswrapper[4776]: I1208 09:00:09.642336 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:09 crc kubenswrapper[4776]: I1208 09:00:09.642345 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:09 crc kubenswrapper[4776]: I1208 09:00:09.642362 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:09 crc kubenswrapper[4776]: I1208 09:00:09.642371 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:09Z","lastTransitionTime":"2025-12-08T09:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:09 crc kubenswrapper[4776]: I1208 09:00:09.745322 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:09 crc kubenswrapper[4776]: I1208 09:00:09.745373 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:09 crc kubenswrapper[4776]: I1208 09:00:09.745389 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:09 crc kubenswrapper[4776]: I1208 09:00:09.745409 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:09 crc kubenswrapper[4776]: I1208 09:00:09.745424 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:09Z","lastTransitionTime":"2025-12-08T09:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:09 crc kubenswrapper[4776]: I1208 09:00:09.847831 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:09 crc kubenswrapper[4776]: I1208 09:00:09.847868 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:09 crc kubenswrapper[4776]: I1208 09:00:09.847885 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:09 crc kubenswrapper[4776]: I1208 09:00:09.847906 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:09 crc kubenswrapper[4776]: I1208 09:00:09.847922 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:09Z","lastTransitionTime":"2025-12-08T09:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:09 crc kubenswrapper[4776]: I1208 09:00:09.949887 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:09 crc kubenswrapper[4776]: I1208 09:00:09.949924 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:09 crc kubenswrapper[4776]: I1208 09:00:09.949934 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:09 crc kubenswrapper[4776]: I1208 09:00:09.949948 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:09 crc kubenswrapper[4776]: I1208 09:00:09.949958 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:09Z","lastTransitionTime":"2025-12-08T09:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:10 crc kubenswrapper[4776]: I1208 09:00:10.054565 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:10 crc kubenswrapper[4776]: I1208 09:00:10.054629 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:10 crc kubenswrapper[4776]: I1208 09:00:10.054642 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:10 crc kubenswrapper[4776]: I1208 09:00:10.054662 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:10 crc kubenswrapper[4776]: I1208 09:00:10.054684 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:10Z","lastTransitionTime":"2025-12-08T09:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:10 crc kubenswrapper[4776]: I1208 09:00:10.157328 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:10 crc kubenswrapper[4776]: I1208 09:00:10.157371 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:10 crc kubenswrapper[4776]: I1208 09:00:10.157383 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:10 crc kubenswrapper[4776]: I1208 09:00:10.157401 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:10 crc kubenswrapper[4776]: I1208 09:00:10.157415 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:10Z","lastTransitionTime":"2025-12-08T09:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:10 crc kubenswrapper[4776]: I1208 09:00:10.259662 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:10 crc kubenswrapper[4776]: I1208 09:00:10.259701 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:10 crc kubenswrapper[4776]: I1208 09:00:10.259713 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:10 crc kubenswrapper[4776]: I1208 09:00:10.259730 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:10 crc kubenswrapper[4776]: I1208 09:00:10.259742 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:10Z","lastTransitionTime":"2025-12-08T09:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 09:00:10 crc kubenswrapper[4776]: I1208 09:00:10.343684 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 09:00:10 crc kubenswrapper[4776]: E1208 09:00:10.343827 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 09:00:10 crc kubenswrapper[4776]: I1208 09:00:10.343895 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 09:00:10 crc kubenswrapper[4776]: E1208 09:00:10.344097 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 09:00:10 crc kubenswrapper[4776]: I1208 09:00:10.362427 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:10 crc kubenswrapper[4776]: I1208 09:00:10.362484 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:10 crc kubenswrapper[4776]: I1208 09:00:10.362497 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:10 crc kubenswrapper[4776]: I1208 09:00:10.362515 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:10 crc kubenswrapper[4776]: I1208 09:00:10.362529 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:10Z","lastTransitionTime":"2025-12-08T09:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:10 crc kubenswrapper[4776]: I1208 09:00:10.464798 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:10 crc kubenswrapper[4776]: I1208 09:00:10.464851 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:10 crc kubenswrapper[4776]: I1208 09:00:10.464864 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:10 crc kubenswrapper[4776]: I1208 09:00:10.464883 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:10 crc kubenswrapper[4776]: I1208 09:00:10.464909 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:10Z","lastTransitionTime":"2025-12-08T09:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:10 crc kubenswrapper[4776]: I1208 09:00:10.567284 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:10 crc kubenswrapper[4776]: I1208 09:00:10.567332 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:10 crc kubenswrapper[4776]: I1208 09:00:10.567344 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:10 crc kubenswrapper[4776]: I1208 09:00:10.567361 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:10 crc kubenswrapper[4776]: I1208 09:00:10.567374 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:10Z","lastTransitionTime":"2025-12-08T09:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:10 crc kubenswrapper[4776]: I1208 09:00:10.669983 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:10 crc kubenswrapper[4776]: I1208 09:00:10.670031 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:10 crc kubenswrapper[4776]: I1208 09:00:10.670042 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:10 crc kubenswrapper[4776]: I1208 09:00:10.670062 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:10 crc kubenswrapper[4776]: I1208 09:00:10.670077 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:10Z","lastTransitionTime":"2025-12-08T09:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:10 crc kubenswrapper[4776]: I1208 09:00:10.773139 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:10 crc kubenswrapper[4776]: I1208 09:00:10.773251 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:10 crc kubenswrapper[4776]: I1208 09:00:10.773268 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:10 crc kubenswrapper[4776]: I1208 09:00:10.773290 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:10 crc kubenswrapper[4776]: I1208 09:00:10.773308 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:10Z","lastTransitionTime":"2025-12-08T09:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:10 crc kubenswrapper[4776]: I1208 09:00:10.876078 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:10 crc kubenswrapper[4776]: I1208 09:00:10.876131 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:10 crc kubenswrapper[4776]: I1208 09:00:10.876150 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:10 crc kubenswrapper[4776]: I1208 09:00:10.876194 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:10 crc kubenswrapper[4776]: I1208 09:00:10.876212 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:10Z","lastTransitionTime":"2025-12-08T09:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:10 crc kubenswrapper[4776]: I1208 09:00:10.979463 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:10 crc kubenswrapper[4776]: I1208 09:00:10.979537 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:10 crc kubenswrapper[4776]: I1208 09:00:10.979551 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:10 crc kubenswrapper[4776]: I1208 09:00:10.979569 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:10 crc kubenswrapper[4776]: I1208 09:00:10.979577 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:10Z","lastTransitionTime":"2025-12-08T09:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:11 crc kubenswrapper[4776]: I1208 09:00:11.083443 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:11 crc kubenswrapper[4776]: I1208 09:00:11.083506 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:11 crc kubenswrapper[4776]: I1208 09:00:11.083524 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:11 crc kubenswrapper[4776]: I1208 09:00:11.083549 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:11 crc kubenswrapper[4776]: I1208 09:00:11.083566 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:11Z","lastTransitionTime":"2025-12-08T09:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:11 crc kubenswrapper[4776]: I1208 09:00:11.186702 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:11 crc kubenswrapper[4776]: I1208 09:00:11.186748 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:11 crc kubenswrapper[4776]: I1208 09:00:11.186764 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:11 crc kubenswrapper[4776]: I1208 09:00:11.186783 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:11 crc kubenswrapper[4776]: I1208 09:00:11.186794 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:11Z","lastTransitionTime":"2025-12-08T09:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:11 crc kubenswrapper[4776]: I1208 09:00:11.289047 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:11 crc kubenswrapper[4776]: I1208 09:00:11.289095 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:11 crc kubenswrapper[4776]: I1208 09:00:11.289105 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:11 crc kubenswrapper[4776]: I1208 09:00:11.289119 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:11 crc kubenswrapper[4776]: I1208 09:00:11.289129 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:11Z","lastTransitionTime":"2025-12-08T09:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 09:00:11 crc kubenswrapper[4776]: I1208 09:00:11.343118 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 09:00:11 crc kubenswrapper[4776]: E1208 09:00:11.343276 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 09:00:11 crc kubenswrapper[4776]: I1208 09:00:11.343124 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 09:00:11 crc kubenswrapper[4776]: E1208 09:00:11.343441 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 09:00:11 crc kubenswrapper[4776]: I1208 09:00:11.392387 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:11 crc kubenswrapper[4776]: I1208 09:00:11.392424 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:11 crc kubenswrapper[4776]: I1208 09:00:11.392432 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:11 crc kubenswrapper[4776]: I1208 09:00:11.392449 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:11 crc kubenswrapper[4776]: I1208 09:00:11.392458 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:11Z","lastTransitionTime":"2025-12-08T09:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:11 crc kubenswrapper[4776]: I1208 09:00:11.495466 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:11 crc kubenswrapper[4776]: I1208 09:00:11.495533 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:11 crc kubenswrapper[4776]: I1208 09:00:11.495551 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:11 crc kubenswrapper[4776]: I1208 09:00:11.495579 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:11 crc kubenswrapper[4776]: I1208 09:00:11.495600 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:11Z","lastTransitionTime":"2025-12-08T09:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:11 crc kubenswrapper[4776]: I1208 09:00:11.598611 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:11 crc kubenswrapper[4776]: I1208 09:00:11.598670 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:11 crc kubenswrapper[4776]: I1208 09:00:11.598693 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:11 crc kubenswrapper[4776]: I1208 09:00:11.598715 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:11 crc kubenswrapper[4776]: I1208 09:00:11.598733 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:11Z","lastTransitionTime":"2025-12-08T09:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:11 crc kubenswrapper[4776]: I1208 09:00:11.701413 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:11 crc kubenswrapper[4776]: I1208 09:00:11.701450 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:11 crc kubenswrapper[4776]: I1208 09:00:11.701462 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:11 crc kubenswrapper[4776]: I1208 09:00:11.701481 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:11 crc kubenswrapper[4776]: I1208 09:00:11.701494 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:11Z","lastTransitionTime":"2025-12-08T09:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:11 crc kubenswrapper[4776]: I1208 09:00:11.803686 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:11 crc kubenswrapper[4776]: I1208 09:00:11.803713 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:11 crc kubenswrapper[4776]: I1208 09:00:11.803722 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:11 crc kubenswrapper[4776]: I1208 09:00:11.803734 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:11 crc kubenswrapper[4776]: I1208 09:00:11.803742 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:11Z","lastTransitionTime":"2025-12-08T09:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:11 crc kubenswrapper[4776]: I1208 09:00:11.906428 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:11 crc kubenswrapper[4776]: I1208 09:00:11.906463 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:11 crc kubenswrapper[4776]: I1208 09:00:11.906470 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:11 crc kubenswrapper[4776]: I1208 09:00:11.906482 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:11 crc kubenswrapper[4776]: I1208 09:00:11.906491 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:11Z","lastTransitionTime":"2025-12-08T09:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:12 crc kubenswrapper[4776]: I1208 09:00:12.009056 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:12 crc kubenswrapper[4776]: I1208 09:00:12.009095 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:12 crc kubenswrapper[4776]: I1208 09:00:12.009103 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:12 crc kubenswrapper[4776]: I1208 09:00:12.009118 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:12 crc kubenswrapper[4776]: I1208 09:00:12.009127 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:12Z","lastTransitionTime":"2025-12-08T09:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:12 crc kubenswrapper[4776]: I1208 09:00:12.111327 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:12 crc kubenswrapper[4776]: I1208 09:00:12.111371 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:12 crc kubenswrapper[4776]: I1208 09:00:12.111386 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:12 crc kubenswrapper[4776]: I1208 09:00:12.111403 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:12 crc kubenswrapper[4776]: I1208 09:00:12.111416 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:12Z","lastTransitionTime":"2025-12-08T09:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:12 crc kubenswrapper[4776]: I1208 09:00:12.213767 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:12 crc kubenswrapper[4776]: I1208 09:00:12.213805 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:12 crc kubenswrapper[4776]: I1208 09:00:12.213822 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:12 crc kubenswrapper[4776]: I1208 09:00:12.213838 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:12 crc kubenswrapper[4776]: I1208 09:00:12.213849 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:12Z","lastTransitionTime":"2025-12-08T09:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:12 crc kubenswrapper[4776]: I1208 09:00:12.316415 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:12 crc kubenswrapper[4776]: I1208 09:00:12.316448 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:12 crc kubenswrapper[4776]: I1208 09:00:12.316456 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:12 crc kubenswrapper[4776]: I1208 09:00:12.316488 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:12 crc kubenswrapper[4776]: I1208 09:00:12.316498 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:12Z","lastTransitionTime":"2025-12-08T09:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 09:00:12 crc kubenswrapper[4776]: I1208 09:00:12.343529 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 09:00:12 crc kubenswrapper[4776]: E1208 09:00:12.343733 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 09:00:12 crc kubenswrapper[4776]: I1208 09:00:12.344249 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 09:00:12 crc kubenswrapper[4776]: E1208 09:00:12.344387 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 09:00:12 crc kubenswrapper[4776]: I1208 09:00:12.419201 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:12 crc kubenswrapper[4776]: I1208 09:00:12.419255 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:12 crc kubenswrapper[4776]: I1208 09:00:12.419270 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:12 crc kubenswrapper[4776]: I1208 09:00:12.419289 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:12 crc kubenswrapper[4776]: I1208 09:00:12.419302 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:12Z","lastTransitionTime":"2025-12-08T09:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:12 crc kubenswrapper[4776]: I1208 09:00:12.521970 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:12 crc kubenswrapper[4776]: I1208 09:00:12.522013 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:12 crc kubenswrapper[4776]: I1208 09:00:12.522022 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:12 crc kubenswrapper[4776]: I1208 09:00:12.522040 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:12 crc kubenswrapper[4776]: I1208 09:00:12.522059 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:12Z","lastTransitionTime":"2025-12-08T09:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:12 crc kubenswrapper[4776]: I1208 09:00:12.625290 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:12 crc kubenswrapper[4776]: I1208 09:00:12.625333 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:12 crc kubenswrapper[4776]: I1208 09:00:12.625342 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:12 crc kubenswrapper[4776]: I1208 09:00:12.625356 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:12 crc kubenswrapper[4776]: I1208 09:00:12.625366 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:12Z","lastTransitionTime":"2025-12-08T09:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:12 crc kubenswrapper[4776]: I1208 09:00:12.727917 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:12 crc kubenswrapper[4776]: I1208 09:00:12.727984 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:12 crc kubenswrapper[4776]: I1208 09:00:12.727996 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:12 crc kubenswrapper[4776]: I1208 09:00:12.728013 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:12 crc kubenswrapper[4776]: I1208 09:00:12.728025 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:12Z","lastTransitionTime":"2025-12-08T09:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:12 crc kubenswrapper[4776]: I1208 09:00:12.830607 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:12 crc kubenswrapper[4776]: I1208 09:00:12.830652 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:12 crc kubenswrapper[4776]: I1208 09:00:12.830662 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:12 crc kubenswrapper[4776]: I1208 09:00:12.830676 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:12 crc kubenswrapper[4776]: I1208 09:00:12.830685 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:12Z","lastTransitionTime":"2025-12-08T09:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:12 crc kubenswrapper[4776]: I1208 09:00:12.932991 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:12 crc kubenswrapper[4776]: I1208 09:00:12.933031 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:12 crc kubenswrapper[4776]: I1208 09:00:12.933043 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:12 crc kubenswrapper[4776]: I1208 09:00:12.933059 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:12 crc kubenswrapper[4776]: I1208 09:00:12.933071 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:12Z","lastTransitionTime":"2025-12-08T09:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:13 crc kubenswrapper[4776]: I1208 09:00:13.035647 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:13 crc kubenswrapper[4776]: I1208 09:00:13.035730 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:13 crc kubenswrapper[4776]: I1208 09:00:13.035755 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:13 crc kubenswrapper[4776]: I1208 09:00:13.035786 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:13 crc kubenswrapper[4776]: I1208 09:00:13.035810 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:13Z","lastTransitionTime":"2025-12-08T09:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:13 crc kubenswrapper[4776]: I1208 09:00:13.138530 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:13 crc kubenswrapper[4776]: I1208 09:00:13.138596 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:13 crc kubenswrapper[4776]: I1208 09:00:13.138613 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:13 crc kubenswrapper[4776]: I1208 09:00:13.138636 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:13 crc kubenswrapper[4776]: I1208 09:00:13.138652 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:13Z","lastTransitionTime":"2025-12-08T09:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:13 crc kubenswrapper[4776]: I1208 09:00:13.241808 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:13 crc kubenswrapper[4776]: I1208 09:00:13.241843 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:13 crc kubenswrapper[4776]: I1208 09:00:13.241854 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:13 crc kubenswrapper[4776]: I1208 09:00:13.241869 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:13 crc kubenswrapper[4776]: I1208 09:00:13.241879 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:13Z","lastTransitionTime":"2025-12-08T09:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 09:00:13 crc kubenswrapper[4776]: I1208 09:00:13.342869 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 09:00:13 crc kubenswrapper[4776]: I1208 09:00:13.342869 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 09:00:13 crc kubenswrapper[4776]: E1208 09:00:13.343056 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 09:00:13 crc kubenswrapper[4776]: E1208 09:00:13.343244 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 09:00:13 crc kubenswrapper[4776]: I1208 09:00:13.344819 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:13 crc kubenswrapper[4776]: I1208 09:00:13.344854 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:13 crc kubenswrapper[4776]: I1208 09:00:13.344864 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:13 crc kubenswrapper[4776]: I1208 09:00:13.344877 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:13 crc kubenswrapper[4776]: I1208 09:00:13.344931 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:13Z","lastTransitionTime":"2025-12-08T09:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:13 crc kubenswrapper[4776]: I1208 09:00:13.447049 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:13 crc kubenswrapper[4776]: I1208 09:00:13.447086 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:13 crc kubenswrapper[4776]: I1208 09:00:13.447094 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:13 crc kubenswrapper[4776]: I1208 09:00:13.447108 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:13 crc kubenswrapper[4776]: I1208 09:00:13.447118 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:13Z","lastTransitionTime":"2025-12-08T09:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:13 crc kubenswrapper[4776]: I1208 09:00:13.550537 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:13 crc kubenswrapper[4776]: I1208 09:00:13.550841 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:13 crc kubenswrapper[4776]: I1208 09:00:13.550874 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:13 crc kubenswrapper[4776]: I1208 09:00:13.550907 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:13 crc kubenswrapper[4776]: I1208 09:00:13.550932 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:13Z","lastTransitionTime":"2025-12-08T09:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:13 crc kubenswrapper[4776]: I1208 09:00:13.653299 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:13 crc kubenswrapper[4776]: I1208 09:00:13.653364 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:13 crc kubenswrapper[4776]: I1208 09:00:13.653379 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:13 crc kubenswrapper[4776]: I1208 09:00:13.653405 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:13 crc kubenswrapper[4776]: I1208 09:00:13.653430 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:13Z","lastTransitionTime":"2025-12-08T09:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:13 crc kubenswrapper[4776]: I1208 09:00:13.755850 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:13 crc kubenswrapper[4776]: I1208 09:00:13.755889 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:13 crc kubenswrapper[4776]: I1208 09:00:13.755897 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:13 crc kubenswrapper[4776]: I1208 09:00:13.755910 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:13 crc kubenswrapper[4776]: I1208 09:00:13.755918 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:13Z","lastTransitionTime":"2025-12-08T09:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:13 crc kubenswrapper[4776]: I1208 09:00:13.858604 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:13 crc kubenswrapper[4776]: I1208 09:00:13.858639 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:13 crc kubenswrapper[4776]: I1208 09:00:13.858647 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:13 crc kubenswrapper[4776]: I1208 09:00:13.858663 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:13 crc kubenswrapper[4776]: I1208 09:00:13.858673 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:13Z","lastTransitionTime":"2025-12-08T09:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:13 crc kubenswrapper[4776]: I1208 09:00:13.961852 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:13 crc kubenswrapper[4776]: I1208 09:00:13.961886 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:13 crc kubenswrapper[4776]: I1208 09:00:13.961898 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:13 crc kubenswrapper[4776]: I1208 09:00:13.961912 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:13 crc kubenswrapper[4776]: I1208 09:00:13.961923 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:13Z","lastTransitionTime":"2025-12-08T09:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.065577 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.065822 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.065845 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.065875 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.065895 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:14Z","lastTransitionTime":"2025-12-08T09:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.168543 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.168578 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.168590 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.168606 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.168617 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:14Z","lastTransitionTime":"2025-12-08T09:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.271777 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.271814 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.271825 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.271841 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.271852 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:14Z","lastTransitionTime":"2025-12-08T09:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.343841 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 09:00:14 crc kubenswrapper[4776]: E1208 09:00:14.343983 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.344080 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 09:00:14 crc kubenswrapper[4776]: E1208 09:00:14.344427 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.361505 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d4304c3-81ab-4418-b5c2-017f74581e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d02311f25656790164bb155ddb4e4d32fb9e7919033dd47c38c9fb321c185743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb3d5082006dae792f107dc632bc00d5004be69f95f6d76150d816fb542ce9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be261308f50cfa4db35cf58e829c102496de88ed0336685562f39e46e498a460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bf08db453e1dd51d190b2bd855af295af04cb4d3a8a42e0020136c50546aef2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:14Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.374614 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.374662 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.374688 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.374715 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.374728 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:14Z","lastTransitionTime":"2025-12-08T09:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.374747 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343d6e00-54f7-4228-a8da-a43041894b26\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountP
ath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube
-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee00c7d6720b72fb9ecc27cc1ad2f762db9ac30bc3e108e37e2d4e2a96639cf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\
\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:14Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.390274 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:14Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.404483 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:14Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.415613 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fdg6t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56dfa7df-2ee8-4408-a283-5a8521175a0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5173ff239c6373649911372a6d6c1665958296afec2528b9e0a492a0722f5b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z44lf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fdg6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:14Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.429428 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-555j6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6eb111dbbc6dec73baadf4e88ff08f03050658b7682b28c960ecdb80973eae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04b00e982993068dd2f274ce749a8b994561657d124122969adbb79c53658ecf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T08:59:52Z\\\",\\\"message\\\":\\\"2025-12-08T08:59:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a3a44f90-75bc-4af4-8170-8121c4c73300\\\\n2025-12-08T08:59:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a3a44f90-75bc-4af4-8170-8121c4c73300 to /host/opt/cni/bin/\\\\n2025-12-08T08:59:05Z [verbose] multus-daemon started\\\\n2025-12-08T08:59:05Z [verbose] Readiness Indicator file check\\\\n2025-12-08T08:59:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-555j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:14Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.449925 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e518469-5b3b-4055-a0f0-075dc48b1c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f688c89d388f271d5313cac8f933d6e43da463b2d6da546151a2bbff7350f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://712accfff1a4a51efc4d3d2a90e3e2d6c47eb04f5cd3d72b800e0778f0963b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a95beb9eadee827797268eb99d707a8782239962485a6f258adea6f3ee994c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cf78bd635e2922e1ade70a5fb24aebf3845f59d6aae829d8cba7c1c44da75ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afb59149e1f7b1462fc1ae949150d5c137905504ba2366db083abd13b4db6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e06c373c06904c837e3c6b4b8d26d2b2e34aa17db8079de76e1e33594d284b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b728069c5c670cfef1888e64d211dfcfefb2de8c9ea9cf0a346c4538578b557e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b728069c5c670cfef1888e64d211dfcfefb2de8c9ea9cf0a346c4538578b557e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T09:00:06Z\\\",\\\"message\\\":\\\" 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1208 09:00:05.955865 6790 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:05Z is after 2025-08-24T17:21:41Z]\\\\nI1208 09:00:05.955922 6790 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-5x9ft in node crc\\\\nI1208 09:00:05.955964 6790 model_cli\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T09:00:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-swbsc_openshift-ovn-kubernetes(1e518469-5b3b-4055-a0f0-075dc48b1c79)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8d4d85ae7a43d2800c11a9bfdc7f1c827bdb6628891b9a0c29140bf869307ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1712096eaf2c22101b
d0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xks4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-swbsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:14Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.462338 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"613586f5-df47-4178-b711-fb8f9f2fdf6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e6618291bd02472481cb1d5469287732dd869be2767bd9209c9f5b846b6b2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53120dd7e9433a30e26bce760e1424a89d586cebf2f627af5885d8e43f9c731b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53120dd7e9433a30e26bce760e1424a89d586cebf2f627af5885d8e43f9c731b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:14Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.475882 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b411334ce6d2d6fc026625d538807bc1dce879f0053babe7d97f302819a4e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e498b38932d3fd2da572e5ca5d560b7725a394f13fd9a30543244b1f3c0dca61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:14Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.477307 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.477351 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.477363 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.477381 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.477394 4776 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:14Z","lastTransitionTime":"2025-12-08T09:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.488332 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3da360484aaa537076345b7a6830a83ac2a89ffb47f86d6865049c4f19688546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:14Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.499780 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9788ab1-1031-4103-a769-a4b3177c7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c2915bc116e8d911ac3c615649564174eb7e760c7e63fe06aee942d7cc2e31\\\",\\\"image\\\":\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860aab951ce65f5efd9e17e4d93c383ea508088d26c2f758bf8d7176d2fb96f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42lpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-jkmbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:14Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.513841 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed339964d431767a3616559947e832417f1052991849f141c380c6e64c3c03f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:14Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.525134 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8k6qx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5628a2b5-b886-4883-93a3-fefc471f19e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df73be62036047cccc38e737abc2818aa510b308934c0bfbcac348aba382f20d\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58b2w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8k6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:14Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.543273 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q85ld" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d62ee56-e13f-4e44-8abf-9d0fc3e423b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db449e03630aa1b44e8a2812e502a46239824bb59283396bfe92bb818df29fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plplv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b483925a2a4fefd06621185c21e74da28d1
f0ababf703e2637a8686881671f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plplv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q85ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:14Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.555713 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kkhjg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99143b9c-a541-4c0e-8387-0dff0d557974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vlfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vlfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kkhjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:14Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:14 crc 
kubenswrapper[4776]: I1208 09:00:14.568627 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19bd7d27-06e1-4574-8857-6adbe88633c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d32a10a86fe749d233a68a8e7583294e21c634dc47febe04e56220b591d505e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c567d34bcaecb124f79504fee8f22c148f78bb039741a7b52883ab3188edaa4\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://146374df9edb9e0092cf2e4cac4a5955d7d0980be93df8188f4b55ad12901572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc26419aaafbbe056695868d3b76642b62774f620306fc60a3c0a07788ca8b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc26419aaafbbe056695868d3b76642b62774f620306fc60a3c0a07788ca8b7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:14Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.580163 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.580225 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.580237 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.580297 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.580311 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:14Z","lastTransitionTime":"2025-12-08T09:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.586580 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:14Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.607732 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5x9ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58507405-6bea-4859-a4e8-6ed046b50323\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06ef510d9c943d7c5d8a7a801dc1e7ca8cbfa7f6995ac8bedc835e0e8a762f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c807ce7739e6d9fa8258a6b6701da25c2e053bed5dcf387f2e7cc5baf98c06d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64af97c7f44811eebf307d2209e20884d47235dd531e8e24f7ebeefb07200a41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63ac927baf37edfe8ccdaecc19f4f4d104783e81aa8bd87b24affbf0ecb05fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa552
85a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa55285a5f73f62b88798baa635fc8b2032af91ec615fabcb56af1e96407c4be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51ce4cdb88ff188767405fbefc5d117fe39010c638c347e33a8b674e6105330d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b00cd36b0df5ecff7d32b6b17dfac9de1d434e33dcac1057135f3992f9866af3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlvpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:59:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5x9ft\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:14Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.625631 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977ead65-d7ad-477b-8656-2b55e70b918b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T08:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e7ab99c039bd5c0c0ab52768821797dd9e303afb9c6065ef021e8c2be372df9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3089eb6893beac1862850a94ef99af25f4ce9c0f020cc795611d9a620966e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86e9bc666e17ca52777d0275d4a50cbb70aad0c77e3413328fa6bd7b42d8a339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7cac944c0f3bfb4c
c7d00471deb5d44d498a7131b7d950305a78bd8a5d2ea7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e7656c1155b52195abce09bca5ecd214a957ae7009293b99f16b3390c671fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T08:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905822f40e6914afda4c0887b27399ccde2c237405a877ee5a968979b90bd5f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df5aa5e3286950ba8820b4784a0ad314ca461bb1f8038041a5b6d9463f028670\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ae997fa1b3ef820a2d276a48c2c41e4243b1cdeb91f2effd7307007c3d58388\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T08:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T08:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T08:58:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T09:00:14Z is after 2025-08-24T17:21:41Z" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.682711 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.682756 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.682772 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.682794 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.682810 4776 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:14Z","lastTransitionTime":"2025-12-08T09:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.785560 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.785639 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.785656 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.785679 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.785739 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:14Z","lastTransitionTime":"2025-12-08T09:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.889982 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.890011 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.890021 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.890035 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.890045 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:14Z","lastTransitionTime":"2025-12-08T09:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.993023 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.993076 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.993089 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.993106 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:14 crc kubenswrapper[4776]: I1208 09:00:14.993512 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:14Z","lastTransitionTime":"2025-12-08T09:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:15 crc kubenswrapper[4776]: I1208 09:00:15.096691 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:15 crc kubenswrapper[4776]: I1208 09:00:15.096737 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:15 crc kubenswrapper[4776]: I1208 09:00:15.096749 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:15 crc kubenswrapper[4776]: I1208 09:00:15.096768 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:15 crc kubenswrapper[4776]: I1208 09:00:15.096781 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:15Z","lastTransitionTime":"2025-12-08T09:00:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:15 crc kubenswrapper[4776]: I1208 09:00:15.199486 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:15 crc kubenswrapper[4776]: I1208 09:00:15.199523 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:15 crc kubenswrapper[4776]: I1208 09:00:15.199532 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:15 crc kubenswrapper[4776]: I1208 09:00:15.199545 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:15 crc kubenswrapper[4776]: I1208 09:00:15.199554 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:15Z","lastTransitionTime":"2025-12-08T09:00:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:15 crc kubenswrapper[4776]: I1208 09:00:15.302323 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:15 crc kubenswrapper[4776]: I1208 09:00:15.302370 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:15 crc kubenswrapper[4776]: I1208 09:00:15.302386 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:15 crc kubenswrapper[4776]: I1208 09:00:15.302409 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:15 crc kubenswrapper[4776]: I1208 09:00:15.302427 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:15Z","lastTransitionTime":"2025-12-08T09:00:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 09:00:15 crc kubenswrapper[4776]: I1208 09:00:15.342589 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 09:00:15 crc kubenswrapper[4776]: I1208 09:00:15.342626 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 09:00:15 crc kubenswrapper[4776]: E1208 09:00:15.342869 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 09:00:15 crc kubenswrapper[4776]: E1208 09:00:15.343015 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 09:00:15 crc kubenswrapper[4776]: I1208 09:00:15.404623 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:15 crc kubenswrapper[4776]: I1208 09:00:15.404661 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:15 crc kubenswrapper[4776]: I1208 09:00:15.404671 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:15 crc kubenswrapper[4776]: I1208 09:00:15.404687 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:15 crc kubenswrapper[4776]: I1208 09:00:15.404699 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:15Z","lastTransitionTime":"2025-12-08T09:00:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:15 crc kubenswrapper[4776]: I1208 09:00:15.507305 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:15 crc kubenswrapper[4776]: I1208 09:00:15.507340 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:15 crc kubenswrapper[4776]: I1208 09:00:15.507348 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:15 crc kubenswrapper[4776]: I1208 09:00:15.507361 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:15 crc kubenswrapper[4776]: I1208 09:00:15.507371 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:15Z","lastTransitionTime":"2025-12-08T09:00:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:15 crc kubenswrapper[4776]: I1208 09:00:15.609595 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:15 crc kubenswrapper[4776]: I1208 09:00:15.609622 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:15 crc kubenswrapper[4776]: I1208 09:00:15.609630 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:15 crc kubenswrapper[4776]: I1208 09:00:15.609642 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:15 crc kubenswrapper[4776]: I1208 09:00:15.609651 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:15Z","lastTransitionTime":"2025-12-08T09:00:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:15 crc kubenswrapper[4776]: I1208 09:00:15.712321 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:15 crc kubenswrapper[4776]: I1208 09:00:15.712355 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:15 crc kubenswrapper[4776]: I1208 09:00:15.712363 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:15 crc kubenswrapper[4776]: I1208 09:00:15.712378 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:15 crc kubenswrapper[4776]: I1208 09:00:15.712386 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:15Z","lastTransitionTime":"2025-12-08T09:00:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:15 crc kubenswrapper[4776]: I1208 09:00:15.815235 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:15 crc kubenswrapper[4776]: I1208 09:00:15.815273 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:15 crc kubenswrapper[4776]: I1208 09:00:15.815285 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:15 crc kubenswrapper[4776]: I1208 09:00:15.815301 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:15 crc kubenswrapper[4776]: I1208 09:00:15.815313 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:15Z","lastTransitionTime":"2025-12-08T09:00:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:15 crc kubenswrapper[4776]: I1208 09:00:15.917118 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:15 crc kubenswrapper[4776]: I1208 09:00:15.917535 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:15 crc kubenswrapper[4776]: I1208 09:00:15.917667 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:15 crc kubenswrapper[4776]: I1208 09:00:15.917798 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:15 crc kubenswrapper[4776]: I1208 09:00:15.917913 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:15Z","lastTransitionTime":"2025-12-08T09:00:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:16 crc kubenswrapper[4776]: I1208 09:00:16.021003 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:16 crc kubenswrapper[4776]: I1208 09:00:16.021468 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:16 crc kubenswrapper[4776]: I1208 09:00:16.021629 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:16 crc kubenswrapper[4776]: I1208 09:00:16.021778 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:16 crc kubenswrapper[4776]: I1208 09:00:16.021938 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:16Z","lastTransitionTime":"2025-12-08T09:00:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:16 crc kubenswrapper[4776]: I1208 09:00:16.124679 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:16 crc kubenswrapper[4776]: I1208 09:00:16.124709 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:16 crc kubenswrapper[4776]: I1208 09:00:16.124720 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:16 crc kubenswrapper[4776]: I1208 09:00:16.124735 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:16 crc kubenswrapper[4776]: I1208 09:00:16.124747 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:16Z","lastTransitionTime":"2025-12-08T09:00:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:16 crc kubenswrapper[4776]: I1208 09:00:16.227291 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:16 crc kubenswrapper[4776]: I1208 09:00:16.227551 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:16 crc kubenswrapper[4776]: I1208 09:00:16.227635 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:16 crc kubenswrapper[4776]: I1208 09:00:16.227721 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:16 crc kubenswrapper[4776]: I1208 09:00:16.227785 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:16Z","lastTransitionTime":"2025-12-08T09:00:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:16 crc kubenswrapper[4776]: I1208 09:00:16.330352 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:16 crc kubenswrapper[4776]: I1208 09:00:16.330382 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:16 crc kubenswrapper[4776]: I1208 09:00:16.330391 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:16 crc kubenswrapper[4776]: I1208 09:00:16.330403 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:16 crc kubenswrapper[4776]: I1208 09:00:16.330413 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:16Z","lastTransitionTime":"2025-12-08T09:00:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 09:00:16 crc kubenswrapper[4776]: I1208 09:00:16.343135 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 09:00:16 crc kubenswrapper[4776]: E1208 09:00:16.343466 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 09:00:16 crc kubenswrapper[4776]: I1208 09:00:16.343135 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 09:00:16 crc kubenswrapper[4776]: E1208 09:00:16.343732 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 09:00:16 crc kubenswrapper[4776]: I1208 09:00:16.432271 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:16 crc kubenswrapper[4776]: I1208 09:00:16.432310 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:16 crc kubenswrapper[4776]: I1208 09:00:16.432319 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:16 crc kubenswrapper[4776]: I1208 09:00:16.432334 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:16 crc kubenswrapper[4776]: I1208 09:00:16.432344 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:16Z","lastTransitionTime":"2025-12-08T09:00:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:16 crc kubenswrapper[4776]: I1208 09:00:16.534957 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:16 crc kubenswrapper[4776]: I1208 09:00:16.534994 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:16 crc kubenswrapper[4776]: I1208 09:00:16.535005 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:16 crc kubenswrapper[4776]: I1208 09:00:16.535020 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:16 crc kubenswrapper[4776]: I1208 09:00:16.535030 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:16Z","lastTransitionTime":"2025-12-08T09:00:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:16 crc kubenswrapper[4776]: I1208 09:00:16.638996 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:16 crc kubenswrapper[4776]: I1208 09:00:16.639029 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:16 crc kubenswrapper[4776]: I1208 09:00:16.639039 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:16 crc kubenswrapper[4776]: I1208 09:00:16.639052 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:16 crc kubenswrapper[4776]: I1208 09:00:16.639062 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:16Z","lastTransitionTime":"2025-12-08T09:00:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:16 crc kubenswrapper[4776]: I1208 09:00:16.742229 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:16 crc kubenswrapper[4776]: I1208 09:00:16.742589 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:16 crc kubenswrapper[4776]: I1208 09:00:16.742789 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:16 crc kubenswrapper[4776]: I1208 09:00:16.743015 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:16 crc kubenswrapper[4776]: I1208 09:00:16.743247 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:16Z","lastTransitionTime":"2025-12-08T09:00:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:16 crc kubenswrapper[4776]: I1208 09:00:16.847212 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:16 crc kubenswrapper[4776]: I1208 09:00:16.847588 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:16 crc kubenswrapper[4776]: I1208 09:00:16.847749 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:16 crc kubenswrapper[4776]: I1208 09:00:16.847920 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:16 crc kubenswrapper[4776]: I1208 09:00:16.848079 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:16Z","lastTransitionTime":"2025-12-08T09:00:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:16 crc kubenswrapper[4776]: I1208 09:00:16.951454 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:16 crc kubenswrapper[4776]: I1208 09:00:16.951495 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:16 crc kubenswrapper[4776]: I1208 09:00:16.951505 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:16 crc kubenswrapper[4776]: I1208 09:00:16.951521 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:16 crc kubenswrapper[4776]: I1208 09:00:16.951533 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:16Z","lastTransitionTime":"2025-12-08T09:00:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.054298 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.054338 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.054350 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.054368 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.054379 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:17Z","lastTransitionTime":"2025-12-08T09:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.156978 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.157452 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.157641 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.157801 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.157974 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:17Z","lastTransitionTime":"2025-12-08T09:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.261652 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.261933 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.262031 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.262129 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.262245 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:17Z","lastTransitionTime":"2025-12-08T09:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.342633 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 09:00:17 crc kubenswrapper[4776]: E1208 09:00:17.342852 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.343113 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 09:00:17 crc kubenswrapper[4776]: E1208 09:00:17.343332 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.364490 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.364532 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.364543 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.364560 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.364571 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:17Z","lastTransitionTime":"2025-12-08T09:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.466476 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.466733 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.466833 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.466928 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.467011 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:17Z","lastTransitionTime":"2025-12-08T09:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.569660 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.569695 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.569706 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.569721 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.569731 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:17Z","lastTransitionTime":"2025-12-08T09:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.612098 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.612125 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.612133 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.612146 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.612156 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T09:00:17Z","lastTransitionTime":"2025-12-08T09:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.659063 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmggq"] Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.659661 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmggq" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.661516 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.661525 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.662435 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.665269 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.691352 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0c59a220-4703-4167-b85f-78313fcf5aee-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-jmggq\" (UID: \"0c59a220-4703-4167-b85f-78313fcf5aee\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmggq" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.691426 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0c59a220-4703-4167-b85f-78313fcf5aee-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-jmggq\" (UID: \"0c59a220-4703-4167-b85f-78313fcf5aee\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmggq" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.691453 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/0c59a220-4703-4167-b85f-78313fcf5aee-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-jmggq\" (UID: \"0c59a220-4703-4167-b85f-78313fcf5aee\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmggq" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.691474 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c59a220-4703-4167-b85f-78313fcf5aee-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-jmggq\" (UID: \"0c59a220-4703-4167-b85f-78313fcf5aee\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmggq" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.691506 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0c59a220-4703-4167-b85f-78313fcf5aee-service-ca\") pod \"cluster-version-operator-5c965bbfc6-jmggq\" (UID: \"0c59a220-4703-4167-b85f-78313fcf5aee\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmggq" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.739138 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podStartSLOduration=74.73911884 podStartE2EDuration="1m14.73911884s" podCreationTimestamp="2025-12-08 08:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:00:17.727865337 +0000 UTC m=+93.991090359" watchObservedRunningTime="2025-12-08 09:00:17.73911884 +0000 UTC m=+94.002343862" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.748418 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-8k6qx" podStartSLOduration=74.748399559 podStartE2EDuration="1m14.748399559s" 
podCreationTimestamp="2025-12-08 08:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:00:17.748017169 +0000 UTC m=+94.011242191" watchObservedRunningTime="2025-12-08 09:00:17.748399559 +0000 UTC m=+94.011624591" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.748539 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=46.748535784 podStartE2EDuration="46.748535784s" podCreationTimestamp="2025-12-08 08:59:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:00:17.739780827 +0000 UTC m=+94.003005859" watchObservedRunningTime="2025-12-08 09:00:17.748535784 +0000 UTC m=+94.011760806" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.771915 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q85ld" podStartSLOduration=74.771898082 podStartE2EDuration="1m14.771898082s" podCreationTimestamp="2025-12-08 08:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:00:17.760129696 +0000 UTC m=+94.023354718" watchObservedRunningTime="2025-12-08 09:00:17.771898082 +0000 UTC m=+94.035123104" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.792943 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0c59a220-4703-4167-b85f-78313fcf5aee-service-ca\") pod \"cluster-version-operator-5c965bbfc6-jmggq\" (UID: \"0c59a220-4703-4167-b85f-78313fcf5aee\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmggq" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.793026 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0c59a220-4703-4167-b85f-78313fcf5aee-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-jmggq\" (UID: \"0c59a220-4703-4167-b85f-78313fcf5aee\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmggq" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.793102 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0c59a220-4703-4167-b85f-78313fcf5aee-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-jmggq\" (UID: \"0c59a220-4703-4167-b85f-78313fcf5aee\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmggq" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.793135 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0c59a220-4703-4167-b85f-78313fcf5aee-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-jmggq\" (UID: \"0c59a220-4703-4167-b85f-78313fcf5aee\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmggq" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.793165 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c59a220-4703-4167-b85f-78313fcf5aee-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-jmggq\" (UID: \"0c59a220-4703-4167-b85f-78313fcf5aee\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmggq" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.793564 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0c59a220-4703-4167-b85f-78313fcf5aee-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-jmggq\" (UID: 
\"0c59a220-4703-4167-b85f-78313fcf5aee\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmggq" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.793870 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0c59a220-4703-4167-b85f-78313fcf5aee-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-jmggq\" (UID: \"0c59a220-4703-4167-b85f-78313fcf5aee\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmggq" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.793969 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0c59a220-4703-4167-b85f-78313fcf5aee-service-ca\") pod \"cluster-version-operator-5c965bbfc6-jmggq\" (UID: \"0c59a220-4703-4167-b85f-78313fcf5aee\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmggq" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.806677 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=71.806656727 podStartE2EDuration="1m11.806656727s" podCreationTimestamp="2025-12-08 08:59:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:00:17.800875892 +0000 UTC m=+94.064100934" watchObservedRunningTime="2025-12-08 09:00:17.806656727 +0000 UTC m=+94.069881749" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.807316 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c59a220-4703-4167-b85f-78313fcf5aee-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-jmggq\" (UID: \"0c59a220-4703-4167-b85f-78313fcf5aee\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmggq" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 
09:00:17.817476 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0c59a220-4703-4167-b85f-78313fcf5aee-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-jmggq\" (UID: \"0c59a220-4703-4167-b85f-78313fcf5aee\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmggq" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.853069 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-5x9ft" podStartSLOduration=74.853049887 podStartE2EDuration="1m14.853049887s" podCreationTimestamp="2025-12-08 08:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:00:17.84089523 +0000 UTC m=+94.104120252" watchObservedRunningTime="2025-12-08 09:00:17.853049887 +0000 UTC m=+94.116274919" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.860909 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-fdg6t" podStartSLOduration=74.860885858 podStartE2EDuration="1m14.860885858s" podCreationTimestamp="2025-12-08 08:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:00:17.86059012 +0000 UTC m=+94.123815162" watchObservedRunningTime="2025-12-08 09:00:17.860885858 +0000 UTC m=+94.124110880" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.871747 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-555j6" podStartSLOduration=74.87172855 podStartE2EDuration="1m14.87172855s" podCreationTimestamp="2025-12-08 08:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:00:17.871704079 +0000 UTC 
m=+94.134929101" watchObservedRunningTime="2025-12-08 09:00:17.87172855 +0000 UTC m=+94.134953572" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.906114 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=19.906096464 podStartE2EDuration="19.906096464s" podCreationTimestamp="2025-12-08 08:59:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:00:17.905987952 +0000 UTC m=+94.169212974" watchObservedRunningTime="2025-12-08 09:00:17.906096464 +0000 UTC m=+94.169321486" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.919246 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=73.919230067 podStartE2EDuration="1m13.919230067s" podCreationTimestamp="2025-12-08 08:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:00:17.919145855 +0000 UTC m=+94.182370877" watchObservedRunningTime="2025-12-08 09:00:17.919230067 +0000 UTC m=+94.182455089" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.933091 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=77.93307234 podStartE2EDuration="1m17.93307234s" podCreationTimestamp="2025-12-08 08:59:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:00:17.932860354 +0000 UTC m=+94.196085376" watchObservedRunningTime="2025-12-08 09:00:17.93307234 +0000 UTC m=+94.196297362" Dec 08 09:00:17 crc kubenswrapper[4776]: I1208 09:00:17.974289 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmggq" Dec 08 09:00:18 crc kubenswrapper[4776]: I1208 09:00:18.343422 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 09:00:18 crc kubenswrapper[4776]: I1208 09:00:18.343531 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 09:00:18 crc kubenswrapper[4776]: E1208 09:00:18.343696 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 09:00:18 crc kubenswrapper[4776]: E1208 09:00:18.343851 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 09:00:18 crc kubenswrapper[4776]: I1208 09:00:18.901857 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmggq" event={"ID":"0c59a220-4703-4167-b85f-78313fcf5aee","Type":"ContainerStarted","Data":"7cf5f48819007200f252026d5610c416c85cb9101f631cf52007a9544f1a0ef3"} Dec 08 09:00:18 crc kubenswrapper[4776]: I1208 09:00:18.901921 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmggq" event={"ID":"0c59a220-4703-4167-b85f-78313fcf5aee","Type":"ContainerStarted","Data":"24648ed6cec3ff23860a046af19007ecd08b8c9ee4b11c916f5e8af7cf3d9e40"} Dec 08 09:00:18 crc kubenswrapper[4776]: I1208 09:00:18.915698 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmggq" podStartSLOduration=75.915675779 podStartE2EDuration="1m15.915675779s" podCreationTimestamp="2025-12-08 08:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:00:18.915153305 +0000 UTC m=+95.178378387" watchObservedRunningTime="2025-12-08 09:00:18.915675779 +0000 UTC m=+95.178900801" Dec 08 09:00:19 crc kubenswrapper[4776]: I1208 09:00:19.342904 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 09:00:19 crc kubenswrapper[4776]: E1208 09:00:19.343043 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 09:00:19 crc kubenswrapper[4776]: I1208 09:00:19.343136 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 09:00:19 crc kubenswrapper[4776]: E1208 09:00:19.343657 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 09:00:19 crc kubenswrapper[4776]: I1208 09:00:19.343896 4776 scope.go:117] "RemoveContainer" containerID="b728069c5c670cfef1888e64d211dfcfefb2de8c9ea9cf0a346c4538578b557e" Dec 08 09:00:19 crc kubenswrapper[4776]: E1208 09:00:19.344094 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-swbsc_openshift-ovn-kubernetes(1e518469-5b3b-4055-a0f0-075dc48b1c79)\"" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" Dec 08 09:00:20 crc kubenswrapper[4776]: I1208 09:00:20.343293 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 09:00:20 crc kubenswrapper[4776]: I1208 09:00:20.343580 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 09:00:20 crc kubenswrapper[4776]: E1208 09:00:20.343711 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 09:00:20 crc kubenswrapper[4776]: E1208 09:00:20.343942 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 09:00:21 crc kubenswrapper[4776]: I1208 09:00:21.342788 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 09:00:21 crc kubenswrapper[4776]: E1208 09:00:21.342925 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 09:00:21 crc kubenswrapper[4776]: I1208 09:00:21.343008 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 09:00:21 crc kubenswrapper[4776]: E1208 09:00:21.343135 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 09:00:21 crc kubenswrapper[4776]: I1208 09:00:21.836461 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/99143b9c-a541-4c0e-8387-0dff0d557974-metrics-certs\") pod \"network-metrics-daemon-kkhjg\" (UID: \"99143b9c-a541-4c0e-8387-0dff0d557974\") " pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 09:00:21 crc kubenswrapper[4776]: E1208 09:00:21.836626 4776 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 08 09:00:21 crc kubenswrapper[4776]: E1208 09:00:21.836721 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99143b9c-a541-4c0e-8387-0dff0d557974-metrics-certs podName:99143b9c-a541-4c0e-8387-0dff0d557974 nodeName:}" failed. No retries permitted until 2025-12-08 09:01:25.836694663 +0000 UTC m=+162.099919765 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/99143b9c-a541-4c0e-8387-0dff0d557974-metrics-certs") pod "network-metrics-daemon-kkhjg" (UID: "99143b9c-a541-4c0e-8387-0dff0d557974") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 08 09:00:22 crc kubenswrapper[4776]: I1208 09:00:22.343447 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 09:00:22 crc kubenswrapper[4776]: E1208 09:00:22.343646 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 09:00:22 crc kubenswrapper[4776]: I1208 09:00:22.344381 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 09:00:22 crc kubenswrapper[4776]: E1208 09:00:22.344896 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 09:00:23 crc kubenswrapper[4776]: I1208 09:00:23.343303 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 09:00:23 crc kubenswrapper[4776]: I1208 09:00:23.343977 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 09:00:23 crc kubenswrapper[4776]: E1208 09:00:23.344126 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 09:00:23 crc kubenswrapper[4776]: E1208 09:00:23.344198 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 09:00:24 crc kubenswrapper[4776]: I1208 09:00:24.342868 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 09:00:24 crc kubenswrapper[4776]: E1208 09:00:24.343897 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 09:00:24 crc kubenswrapper[4776]: I1208 09:00:24.344050 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 09:00:24 crc kubenswrapper[4776]: E1208 09:00:24.344093 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 09:00:25 crc kubenswrapper[4776]: I1208 09:00:25.342820 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 09:00:25 crc kubenswrapper[4776]: E1208 09:00:25.342963 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 09:00:25 crc kubenswrapper[4776]: I1208 09:00:25.343161 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 09:00:25 crc kubenswrapper[4776]: E1208 09:00:25.343246 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 09:00:26 crc kubenswrapper[4776]: I1208 09:00:26.342924 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 09:00:26 crc kubenswrapper[4776]: E1208 09:00:26.343082 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 09:00:26 crc kubenswrapper[4776]: I1208 09:00:26.343527 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 09:00:26 crc kubenswrapper[4776]: E1208 09:00:26.343648 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 09:00:27 crc kubenswrapper[4776]: I1208 09:00:27.342974 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 09:00:27 crc kubenswrapper[4776]: E1208 09:00:27.343121 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 09:00:27 crc kubenswrapper[4776]: I1208 09:00:27.343413 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 09:00:27 crc kubenswrapper[4776]: E1208 09:00:27.343620 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 09:00:28 crc kubenswrapper[4776]: I1208 09:00:28.343591 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 09:00:28 crc kubenswrapper[4776]: I1208 09:00:28.343630 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 09:00:28 crc kubenswrapper[4776]: E1208 09:00:28.343744 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 09:00:28 crc kubenswrapper[4776]: E1208 09:00:28.343855 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 09:00:29 crc kubenswrapper[4776]: I1208 09:00:29.354462 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 09:00:29 crc kubenswrapper[4776]: I1208 09:00:29.354483 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 09:00:29 crc kubenswrapper[4776]: E1208 09:00:29.355699 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 09:00:29 crc kubenswrapper[4776]: E1208 09:00:29.356022 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 09:00:30 crc kubenswrapper[4776]: I1208 09:00:30.343300 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 09:00:30 crc kubenswrapper[4776]: E1208 09:00:30.343413 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 09:00:30 crc kubenswrapper[4776]: I1208 09:00:30.343572 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 09:00:30 crc kubenswrapper[4776]: E1208 09:00:30.343615 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 09:00:31 crc kubenswrapper[4776]: I1208 09:00:31.342950 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 09:00:31 crc kubenswrapper[4776]: I1208 09:00:31.342976 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 09:00:31 crc kubenswrapper[4776]: E1208 09:00:31.343098 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 09:00:31 crc kubenswrapper[4776]: E1208 09:00:31.343192 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 09:00:32 crc kubenswrapper[4776]: I1208 09:00:32.343710 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 09:00:32 crc kubenswrapper[4776]: I1208 09:00:32.343967 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 09:00:32 crc kubenswrapper[4776]: E1208 09:00:32.344391 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 09:00:32 crc kubenswrapper[4776]: E1208 09:00:32.345123 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 09:00:32 crc kubenswrapper[4776]: I1208 09:00:32.345558 4776 scope.go:117] "RemoveContainer" containerID="b728069c5c670cfef1888e64d211dfcfefb2de8c9ea9cf0a346c4538578b557e" Dec 08 09:00:32 crc kubenswrapper[4776]: E1208 09:00:32.345853 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-swbsc_openshift-ovn-kubernetes(1e518469-5b3b-4055-a0f0-075dc48b1c79)\"" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" Dec 08 09:00:33 crc kubenswrapper[4776]: I1208 09:00:33.343356 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 09:00:33 crc kubenswrapper[4776]: I1208 09:00:33.344003 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 09:00:33 crc kubenswrapper[4776]: E1208 09:00:33.344298 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 09:00:33 crc kubenswrapper[4776]: E1208 09:00:33.345139 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 09:00:34 crc kubenswrapper[4776]: I1208 09:00:34.343267 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 09:00:34 crc kubenswrapper[4776]: I1208 09:00:34.343374 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 09:00:34 crc kubenswrapper[4776]: E1208 09:00:34.346468 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 09:00:34 crc kubenswrapper[4776]: E1208 09:00:34.346610 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 09:00:35 crc kubenswrapper[4776]: I1208 09:00:35.343249 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 09:00:35 crc kubenswrapper[4776]: I1208 09:00:35.343300 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 09:00:35 crc kubenswrapper[4776]: E1208 09:00:35.343371 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 09:00:35 crc kubenswrapper[4776]: E1208 09:00:35.343474 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 09:00:36 crc kubenswrapper[4776]: I1208 09:00:36.342839 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 09:00:36 crc kubenswrapper[4776]: I1208 09:00:36.342898 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 09:00:36 crc kubenswrapper[4776]: E1208 09:00:36.342954 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 09:00:36 crc kubenswrapper[4776]: E1208 09:00:36.343026 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 09:00:37 crc kubenswrapper[4776]: I1208 09:00:37.342829 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 09:00:37 crc kubenswrapper[4776]: I1208 09:00:37.342875 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 09:00:37 crc kubenswrapper[4776]: E1208 09:00:37.342962 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 09:00:37 crc kubenswrapper[4776]: E1208 09:00:37.343065 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 09:00:38 crc kubenswrapper[4776]: I1208 09:00:38.342769 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 09:00:38 crc kubenswrapper[4776]: E1208 09:00:38.342877 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 09:00:38 crc kubenswrapper[4776]: I1208 09:00:38.343273 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 09:00:38 crc kubenswrapper[4776]: E1208 09:00:38.343400 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 09:00:38 crc kubenswrapper[4776]: I1208 09:00:38.960572 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-555j6_775b9e97-3ad5-4003-a2c2-fc8dd58b69cc/kube-multus/1.log" Dec 08 09:00:38 crc kubenswrapper[4776]: I1208 09:00:38.961021 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-555j6_775b9e97-3ad5-4003-a2c2-fc8dd58b69cc/kube-multus/0.log" Dec 08 09:00:38 crc kubenswrapper[4776]: I1208 09:00:38.961072 4776 generic.go:334] "Generic (PLEG): container finished" podID="775b9e97-3ad5-4003-a2c2-fc8dd58b69cc" containerID="bf6eb111dbbc6dec73baadf4e88ff08f03050658b7682b28c960ecdb80973eae" exitCode=1 Dec 08 09:00:38 crc kubenswrapper[4776]: I1208 09:00:38.961109 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-555j6" event={"ID":"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc","Type":"ContainerDied","Data":"bf6eb111dbbc6dec73baadf4e88ff08f03050658b7682b28c960ecdb80973eae"} Dec 08 09:00:38 crc kubenswrapper[4776]: I1208 09:00:38.961157 4776 scope.go:117] "RemoveContainer" containerID="04b00e982993068dd2f274ce749a8b994561657d124122969adbb79c53658ecf" Dec 08 09:00:38 crc kubenswrapper[4776]: I1208 09:00:38.961468 4776 scope.go:117] "RemoveContainer" containerID="bf6eb111dbbc6dec73baadf4e88ff08f03050658b7682b28c960ecdb80973eae" Dec 08 09:00:38 crc kubenswrapper[4776]: E1208 09:00:38.961626 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-555j6_openshift-multus(775b9e97-3ad5-4003-a2c2-fc8dd58b69cc)\"" pod="openshift-multus/multus-555j6" podUID="775b9e97-3ad5-4003-a2c2-fc8dd58b69cc" Dec 08 09:00:39 crc kubenswrapper[4776]: I1208 09:00:39.342903 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 09:00:39 crc kubenswrapper[4776]: I1208 09:00:39.342921 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 09:00:39 crc kubenswrapper[4776]: E1208 09:00:39.343008 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 09:00:39 crc kubenswrapper[4776]: E1208 09:00:39.343161 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 09:00:39 crc kubenswrapper[4776]: I1208 09:00:39.967505 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-555j6_775b9e97-3ad5-4003-a2c2-fc8dd58b69cc/kube-multus/1.log" Dec 08 09:00:40 crc kubenswrapper[4776]: I1208 09:00:40.343717 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 09:00:40 crc kubenswrapper[4776]: I1208 09:00:40.343765 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 09:00:40 crc kubenswrapper[4776]: E1208 09:00:40.343875 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 09:00:40 crc kubenswrapper[4776]: E1208 09:00:40.344009 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 09:00:41 crc kubenswrapper[4776]: I1208 09:00:41.343539 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 09:00:41 crc kubenswrapper[4776]: I1208 09:00:41.343557 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 09:00:41 crc kubenswrapper[4776]: E1208 09:00:41.343762 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 09:00:41 crc kubenswrapper[4776]: E1208 09:00:41.343874 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 09:00:42 crc kubenswrapper[4776]: I1208 09:00:42.343199 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 09:00:42 crc kubenswrapper[4776]: I1208 09:00:42.343337 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 09:00:42 crc kubenswrapper[4776]: E1208 09:00:42.343497 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 09:00:42 crc kubenswrapper[4776]: E1208 09:00:42.343617 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 09:00:43 crc kubenswrapper[4776]: I1208 09:00:43.343449 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 09:00:43 crc kubenswrapper[4776]: I1208 09:00:43.343565 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 09:00:43 crc kubenswrapper[4776]: E1208 09:00:43.343614 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 09:00:43 crc kubenswrapper[4776]: E1208 09:00:43.343727 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 09:00:43 crc kubenswrapper[4776]: E1208 09:00:43.959602 4776 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 08 09:00:44 crc kubenswrapper[4776]: I1208 09:00:44.343242 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 09:00:44 crc kubenswrapper[4776]: E1208 09:00:44.344458 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 09:00:44 crc kubenswrapper[4776]: I1208 09:00:44.344567 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 09:00:44 crc kubenswrapper[4776]: E1208 09:00:44.344676 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 09:00:44 crc kubenswrapper[4776]: E1208 09:00:44.445973 4776 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 08 09:00:45 crc kubenswrapper[4776]: I1208 09:00:45.343705 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 09:00:45 crc kubenswrapper[4776]: I1208 09:00:45.343782 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 09:00:45 crc kubenswrapper[4776]: E1208 09:00:45.343913 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 09:00:45 crc kubenswrapper[4776]: E1208 09:00:45.344037 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 09:00:46 crc kubenswrapper[4776]: I1208 09:00:46.343553 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 09:00:46 crc kubenswrapper[4776]: E1208 09:00:46.343707 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 09:00:46 crc kubenswrapper[4776]: I1208 09:00:46.343759 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 09:00:46 crc kubenswrapper[4776]: E1208 09:00:46.343898 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 09:00:47 crc kubenswrapper[4776]: I1208 09:00:47.342601 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 09:00:47 crc kubenswrapper[4776]: E1208 09:00:47.342737 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 09:00:47 crc kubenswrapper[4776]: I1208 09:00:47.342609 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 09:00:47 crc kubenswrapper[4776]: E1208 09:00:47.343167 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 09:00:47 crc kubenswrapper[4776]: I1208 09:00:47.343511 4776 scope.go:117] "RemoveContainer" containerID="b728069c5c670cfef1888e64d211dfcfefb2de8c9ea9cf0a346c4538578b557e" Dec 08 09:00:47 crc kubenswrapper[4776]: I1208 09:00:47.994341 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swbsc_1e518469-5b3b-4055-a0f0-075dc48b1c79/ovnkube-controller/3.log" Dec 08 09:00:47 crc kubenswrapper[4776]: I1208 09:00:47.996981 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" event={"ID":"1e518469-5b3b-4055-a0f0-075dc48b1c79","Type":"ContainerStarted","Data":"e8a5e6b5f6ff41d95ccfe47343f422e5875677f90ac071f2d68d251e92a234b0"} Dec 08 09:00:47 crc kubenswrapper[4776]: I1208 09:00:47.997434 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 09:00:48 crc kubenswrapper[4776]: I1208 09:00:48.021953 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" podStartSLOduration=105.021938548 podStartE2EDuration="1m45.021938548s" podCreationTimestamp="2025-12-08 08:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:00:48.020057287 +0000 UTC m=+124.283282319" watchObservedRunningTime="2025-12-08 09:00:48.021938548 +0000 UTC m=+124.285163570" Dec 08 09:00:48 crc kubenswrapper[4776]: I1208 09:00:48.068315 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-kkhjg"] Dec 08 09:00:48 crc kubenswrapper[4776]: I1208 09:00:48.068442 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 09:00:48 crc kubenswrapper[4776]: E1208 09:00:48.068538 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 09:00:48 crc kubenswrapper[4776]: I1208 09:00:48.343342 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 09:00:48 crc kubenswrapper[4776]: I1208 09:00:48.343348 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 09:00:48 crc kubenswrapper[4776]: E1208 09:00:48.343516 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 09:00:48 crc kubenswrapper[4776]: E1208 09:00:48.343588 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 09:00:49 crc kubenswrapper[4776]: I1208 09:00:49.342715 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 09:00:49 crc kubenswrapper[4776]: I1208 09:00:49.342715 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 09:00:49 crc kubenswrapper[4776]: E1208 09:00:49.342894 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 09:00:49 crc kubenswrapper[4776]: E1208 09:00:49.343042 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 09:00:49 crc kubenswrapper[4776]: E1208 09:00:49.447219 4776 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 08 09:00:50 crc kubenswrapper[4776]: I1208 09:00:50.343532 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 09:00:50 crc kubenswrapper[4776]: I1208 09:00:50.343654 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 09:00:50 crc kubenswrapper[4776]: E1208 09:00:50.343690 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 09:00:50 crc kubenswrapper[4776]: E1208 09:00:50.343857 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 09:00:51 crc kubenswrapper[4776]: I1208 09:00:51.429367 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 09:00:51 crc kubenswrapper[4776]: E1208 09:00:51.429513 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 09:00:51 crc kubenswrapper[4776]: I1208 09:00:51.429692 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 09:00:51 crc kubenswrapper[4776]: E1208 09:00:51.429742 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 09:00:52 crc kubenswrapper[4776]: I1208 09:00:52.343259 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 09:00:52 crc kubenswrapper[4776]: E1208 09:00:52.343407 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 09:00:52 crc kubenswrapper[4776]: I1208 09:00:52.343462 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 09:00:52 crc kubenswrapper[4776]: E1208 09:00:52.343655 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 09:00:53 crc kubenswrapper[4776]: I1208 09:00:53.343040 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 09:00:53 crc kubenswrapper[4776]: I1208 09:00:53.343126 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 09:00:53 crc kubenswrapper[4776]: E1208 09:00:53.343256 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 09:00:53 crc kubenswrapper[4776]: E1208 09:00:53.343348 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 09:00:54 crc kubenswrapper[4776]: I1208 09:00:54.343479 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 09:00:54 crc kubenswrapper[4776]: I1208 09:00:54.344578 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 09:00:54 crc kubenswrapper[4776]: E1208 09:00:54.344686 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 09:00:54 crc kubenswrapper[4776]: I1208 09:00:54.344787 4776 scope.go:117] "RemoveContainer" containerID="bf6eb111dbbc6dec73baadf4e88ff08f03050658b7682b28c960ecdb80973eae" Dec 08 09:00:54 crc kubenswrapper[4776]: E1208 09:00:54.344888 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 09:00:54 crc kubenswrapper[4776]: E1208 09:00:54.448154 4776 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 08 09:00:55 crc kubenswrapper[4776]: I1208 09:00:55.343538 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 09:00:55 crc kubenswrapper[4776]: I1208 09:00:55.343602 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 09:00:55 crc kubenswrapper[4776]: E1208 09:00:55.345027 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 09:00:55 crc kubenswrapper[4776]: E1208 09:00:55.345160 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 09:00:55 crc kubenswrapper[4776]: I1208 09:00:55.449705 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-555j6_775b9e97-3ad5-4003-a2c2-fc8dd58b69cc/kube-multus/1.log" Dec 08 09:00:55 crc kubenswrapper[4776]: I1208 09:00:55.449765 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-555j6" event={"ID":"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc","Type":"ContainerStarted","Data":"69d2876b5cbb01bb020eec751d903bc19a2687f73ca0e18de2aaf643d15143d7"} Dec 08 09:00:56 crc kubenswrapper[4776]: I1208 09:00:56.343000 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 09:00:56 crc kubenswrapper[4776]: I1208 09:00:56.343007 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 09:00:56 crc kubenswrapper[4776]: E1208 09:00:56.343215 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 09:00:56 crc kubenswrapper[4776]: E1208 09:00:56.343361 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 09:00:57 crc kubenswrapper[4776]: I1208 09:00:57.343103 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 09:00:57 crc kubenswrapper[4776]: I1208 09:00:57.343224 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 09:00:57 crc kubenswrapper[4776]: E1208 09:00:57.343260 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 09:00:57 crc kubenswrapper[4776]: E1208 09:00:57.343450 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 09:00:58 crc kubenswrapper[4776]: I1208 09:00:58.343701 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 09:00:58 crc kubenswrapper[4776]: I1208 09:00:58.343840 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 09:00:58 crc kubenswrapper[4776]: E1208 09:00:58.343923 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 09:00:58 crc kubenswrapper[4776]: E1208 09:00:58.343987 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 09:00:59 crc kubenswrapper[4776]: I1208 09:00:59.343660 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 09:00:59 crc kubenswrapper[4776]: I1208 09:00:59.343755 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 09:00:59 crc kubenswrapper[4776]: E1208 09:00:59.343905 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 09:00:59 crc kubenswrapper[4776]: E1208 09:00:59.344026 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kkhjg" podUID="99143b9c-a541-4c0e-8387-0dff0d557974" Dec 08 09:01:00 crc kubenswrapper[4776]: I1208 09:01:00.343195 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 09:01:00 crc kubenswrapper[4776]: I1208 09:01:00.343486 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 09:01:00 crc kubenswrapper[4776]: I1208 09:01:00.346318 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 08 09:01:00 crc kubenswrapper[4776]: I1208 09:01:00.346468 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 08 09:01:00 crc kubenswrapper[4776]: I1208 09:01:00.346848 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 08 09:01:00 crc kubenswrapper[4776]: I1208 09:01:00.347668 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 08 09:01:01 crc kubenswrapper[4776]: I1208 09:01:01.343156 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 09:01:01 crc kubenswrapper[4776]: I1208 09:01:01.343250 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 09:01:01 crc kubenswrapper[4776]: I1208 09:01:01.346264 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 08 09:01:01 crc kubenswrapper[4776]: I1208 09:01:01.347246 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.630239 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.676265 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-bk9qw"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.677564 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-bk9qw" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.677716 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ld6f6"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.679213 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-dndwl"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.679339 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ld6f6" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.679979 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-dndwl" Dec 08 09:01:08 crc kubenswrapper[4776]: W1208 09:01:08.680641 4776 reflector.go:561] object-"openshift-apiserver"/"audit-1": failed to list *v1.ConfigMap: configmaps "audit-1" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Dec 08 09:01:08 crc kubenswrapper[4776]: E1208 09:01:08.680707 4776 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"audit-1\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"audit-1\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.680758 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9ll9b"] Dec 08 09:01:08 crc kubenswrapper[4776]: W1208 09:01:08.680846 4776 reflector.go:561] object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff": failed to list *v1.Secret: secrets "openshift-apiserver-sa-dockercfg-djjff" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Dec 08 09:01:08 crc kubenswrapper[4776]: E1208 09:01:08.680894 4776 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"openshift-apiserver-sa-dockercfg-djjff\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openshift-apiserver-sa-dockercfg-djjff\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" 
logger="UnhandledError" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.681266 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9ll9b" Dec 08 09:01:08 crc kubenswrapper[4776]: W1208 09:01:08.682351 4776 reflector.go:561] object-"openshift-controller-manager"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Dec 08 09:01:08 crc kubenswrapper[4776]: E1208 09:01:08.682418 4776 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 08 09:01:08 crc kubenswrapper[4776]: W1208 09:01:08.682684 4776 reflector.go:561] object-"openshift-controller-manager"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Dec 08 09:01:08 crc kubenswrapper[4776]: E1208 09:01:08.682727 4776 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 08 09:01:08 
crc kubenswrapper[4776]: W1208 09:01:08.682852 4776 reflector.go:561] object-"openshift-console-operator"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-console-operator": no relationship found between node 'crc' and this object Dec 08 09:01:08 crc kubenswrapper[4776]: E1208 09:01:08.682885 4776 reflector.go:158] "Unhandled Error" err="object-\"openshift-console-operator\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-console-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 08 09:01:08 crc kubenswrapper[4776]: W1208 09:01:08.688660 4776 reflector.go:561] object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c": failed to list *v1.Secret: secrets "openshift-controller-manager-sa-dockercfg-msq4c" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Dec 08 09:01:08 crc kubenswrapper[4776]: E1208 09:01:08.688726 4776 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"openshift-controller-manager-sa-dockercfg-msq4c\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openshift-controller-manager-sa-dockercfg-msq4c\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 08 09:01:08 crc kubenswrapper[4776]: W1208 09:01:08.688840 4776 reflector.go:561] object-"openshift-controller-manager"/"client-ca": failed to 
list *v1.ConfigMap: configmaps "client-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Dec 08 09:01:08 crc kubenswrapper[4776]: E1208 09:01:08.688875 4776 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"client-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"client-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 08 09:01:08 crc kubenswrapper[4776]: W1208 09:01:08.689099 4776 reflector.go:561] object-"openshift-apiserver"/"trusted-ca-bundle": failed to list *v1.ConfigMap: configmaps "trusted-ca-bundle" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Dec 08 09:01:08 crc kubenswrapper[4776]: E1208 09:01:08.689134 4776 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"trusted-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"trusted-ca-bundle\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 08 09:01:08 crc kubenswrapper[4776]: W1208 09:01:08.689253 4776 reflector.go:561] object-"openshift-apiserver"/"etcd-client": failed to list *v1.Secret: secrets "etcd-client" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Dec 08 09:01:08 crc kubenswrapper[4776]: E1208 09:01:08.689288 4776 
reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"etcd-client\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"etcd-client\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 08 09:01:08 crc kubenswrapper[4776]: W1208 09:01:08.689394 4776 reflector.go:561] object-"openshift-controller-manager"/"openshift-global-ca": failed to list *v1.ConfigMap: configmaps "openshift-global-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Dec 08 09:01:08 crc kubenswrapper[4776]: E1208 09:01:08.689428 4776 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"openshift-global-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-global-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 08 09:01:08 crc kubenswrapper[4776]: W1208 09:01:08.689507 4776 reflector.go:561] object-"openshift-apiserver"/"encryption-config-1": failed to list *v1.Secret: secrets "encryption-config-1" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Dec 08 09:01:08 crc kubenswrapper[4776]: E1208 09:01:08.689539 4776 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"encryption-config-1\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"encryption-config-1\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" 
in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.690145 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-44sjg"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.691858 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-jxkd8"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.692239 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-44sjg" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.692629 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vld4h"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.692809 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-jxkd8" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.693403 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vld4h" Dec 08 09:01:08 crc kubenswrapper[4776]: W1208 09:01:08.696423 4776 reflector.go:561] object-"openshift-apiserver"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Dec 08 09:01:08 crc kubenswrapper[4776]: E1208 09:01:08.696484 4776 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 08 09:01:08 crc kubenswrapper[4776]: W1208 09:01:08.696606 4776 reflector.go:561] object-"openshift-route-controller-manager"/"client-ca": failed to list *v1.ConfigMap: configmaps "client-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object Dec 08 09:01:08 crc kubenswrapper[4776]: E1208 09:01:08.696633 4776 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"client-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"client-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 08 09:01:08 crc kubenswrapper[4776]: W1208 09:01:08.696701 4776 reflector.go:561] object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr": failed to list *v1.Secret: secrets 
"console-operator-dockercfg-4xjcr" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-console-operator": no relationship found between node 'crc' and this object Dec 08 09:01:08 crc kubenswrapper[4776]: E1208 09:01:08.696725 4776 reflector.go:158] "Unhandled Error" err="object-\"openshift-console-operator\"/\"console-operator-dockercfg-4xjcr\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"console-operator-dockercfg-4xjcr\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-console-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 08 09:01:08 crc kubenswrapper[4776]: W1208 09:01:08.696795 4776 reflector.go:561] object-"openshift-console-operator"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-console-operator": no relationship found between node 'crc' and this object Dec 08 09:01:08 crc kubenswrapper[4776]: E1208 09:01:08.696823 4776 reflector.go:158] "Unhandled Error" err="object-\"openshift-console-operator\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-console-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.697085 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.709691 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 08 09:01:08 crc kubenswrapper[4776]: W1208 
09:01:08.723762 4776 reflector.go:561] object-"openshift-controller-manager"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Dec 08 09:01:08 crc kubenswrapper[4776]: E1208 09:01:08.723862 4776 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.724007 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 08 09:01:08 crc kubenswrapper[4776]: W1208 09:01:08.724270 4776 reflector.go:561] object-"openshift-route-controller-manager"/"config": failed to list *v1.ConfigMap: configmaps "config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object Dec 08 09:01:08 crc kubenswrapper[4776]: E1208 09:01:08.724287 4776 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.726789 4776 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.726874 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.726900 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.726951 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.727085 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.727093 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.727157 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.727215 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.731606 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.734896 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.748026 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 08 
09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.748267 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.748694 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.748734 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.748735 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2hsh8"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.748817 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 08 09:01:08 crc kubenswrapper[4776]: W1208 09:01:08.748926 4776 reflector.go:561] object-"openshift-apiserver"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Dec 08 09:01:08 crc kubenswrapper[4776]: E1208 09:01:08.748955 4776 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.749027 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 08 09:01:08 crc 
kubenswrapper[4776]: I1208 09:01:08.749132 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.749197 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.749261 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.749385 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-fm86d"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.749859 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-fm86d" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.751026 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-lmfgv"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.751918 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmfgv" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.752682 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.754141 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.755611 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.755318 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.756160 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.755403 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.756788 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-8dm9l"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.759446 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-59m6v"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.759903 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-8dm9l" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.760224 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-59m6v" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.761204 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-p5sqv"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.761604 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.763021 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-559sf"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.763436 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-559sf" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.765358 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.768318 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.769150 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-txnxn"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.769606 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-txnxn" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.770038 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6zp8l"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.770634 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ld6f6"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.770643 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6zp8l" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.781605 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-mbv9b"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.782275 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-mbv9b" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.785787 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rtrng"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.791629 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.791875 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.792402 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.792784 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rtrng" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.796373 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.796536 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.796866 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.797254 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.798502 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-2jxng"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.799080 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-2jxng" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.799466 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6z2v5"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.800153 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6z2v5" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.801329 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tf65w"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.801633 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tf65w" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.806747 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e981fb43-6f44-4462-b97c-f64658cd7c97-etcd-client\") pod \"apiserver-7bbb656c7d-lmfgv\" (UID: \"e981fb43-6f44-4462-b97c-f64658cd7c97\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmfgv" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.806791 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7-trusted-ca-bundle\") pod \"console-f9d7485db-8dm9l\" (UID: \"b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7\") " pod="openshift-console/console-f9d7485db-8dm9l" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.806817 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7-console-config\") pod \"console-f9d7485db-8dm9l\" (UID: \"b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7\") " pod="openshift-console/console-f9d7485db-8dm9l" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.806850 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0da3b83e-efc3-4e6d-b876-186f430d3d77-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-ld6f6\" (UID: \"0da3b83e-efc3-4e6d-b876-186f430d3d77\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ld6f6" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.806872 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/178bd27c-e3da-4218-9785-9d7c8b1bf89a-service-ca-bundle\") pod \"authentication-operator-69f744f599-fm86d\" (UID: \"178bd27c-e3da-4218-9785-9d7c8b1bf89a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fm86d" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.806887 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c0bf1894-515b-4ae6-bcf5-148f5db59022-image-import-ca\") pod \"apiserver-76f77b778f-bk9qw\" (UID: \"c0bf1894-515b-4ae6-bcf5-148f5db59022\") " pod="openshift-apiserver/apiserver-76f77b778f-bk9qw" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.806903 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0bf1894-515b-4ae6-bcf5-148f5db59022-trusted-ca-bundle\") pod \"apiserver-76f77b778f-bk9qw\" (UID: \"c0bf1894-515b-4ae6-bcf5-148f5db59022\") " pod="openshift-apiserver/apiserver-76f77b778f-bk9qw" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.806920 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kthb8\" (UniqueName: \"kubernetes.io/projected/0da3b83e-efc3-4e6d-b876-186f430d3d77-kube-api-access-kthb8\") pod \"controller-manager-879f6c89f-ld6f6\" (UID: \"0da3b83e-efc3-4e6d-b876-186f430d3d77\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ld6f6" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.806939 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e981fb43-6f44-4462-b97c-f64658cd7c97-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-lmfgv\" (UID: \"e981fb43-6f44-4462-b97c-f64658cd7c97\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmfgv" Dec 08 09:01:08 crc 
kubenswrapper[4776]: I1208 09:01:08.806953 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0bf1894-515b-4ae6-bcf5-148f5db59022-config\") pod \"apiserver-76f77b778f-bk9qw\" (UID: \"c0bf1894-515b-4ae6-bcf5-148f5db59022\") " pod="openshift-apiserver/apiserver-76f77b778f-bk9qw" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.806968 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0da3b83e-efc3-4e6d-b876-186f430d3d77-config\") pod \"controller-manager-879f6c89f-ld6f6\" (UID: \"0da3b83e-efc3-4e6d-b876-186f430d3d77\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ld6f6" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.806982 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7-service-ca\") pod \"console-f9d7485db-8dm9l\" (UID: \"b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7\") " pod="openshift-console/console-f9d7485db-8dm9l" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.806996 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0da3b83e-efc3-4e6d-b876-186f430d3d77-serving-cert\") pod \"controller-manager-879f6c89f-ld6f6\" (UID: \"0da3b83e-efc3-4e6d-b876-186f430d3d77\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ld6f6" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.807011 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c0bf1894-515b-4ae6-bcf5-148f5db59022-etcd-serving-ca\") pod \"apiserver-76f77b778f-bk9qw\" (UID: \"c0bf1894-515b-4ae6-bcf5-148f5db59022\") " 
pod="openshift-apiserver/apiserver-76f77b778f-bk9qw" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.807025 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2wx9\" (UniqueName: \"kubernetes.io/projected/b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7-kube-api-access-x2wx9\") pod \"console-f9d7485db-8dm9l\" (UID: \"b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7\") " pod="openshift-console/console-f9d7485db-8dm9l" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.807041 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/178bd27c-e3da-4218-9785-9d7c8b1bf89a-config\") pod \"authentication-operator-69f744f599-fm86d\" (UID: \"178bd27c-e3da-4218-9785-9d7c8b1bf89a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fm86d" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.807057 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9njh\" (UniqueName: \"kubernetes.io/projected/c0bf1894-515b-4ae6-bcf5-148f5db59022-kube-api-access-f9njh\") pod \"apiserver-76f77b778f-bk9qw\" (UID: \"c0bf1894-515b-4ae6-bcf5-148f5db59022\") " pod="openshift-apiserver/apiserver-76f77b778f-bk9qw" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.807079 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c0bf1894-515b-4ae6-bcf5-148f5db59022-node-pullsecrets\") pod \"apiserver-76f77b778f-bk9qw\" (UID: \"c0bf1894-515b-4ae6-bcf5-148f5db59022\") " pod="openshift-apiserver/apiserver-76f77b778f-bk9qw" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.807096 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7-console-oauth-config\") pod \"console-f9d7485db-8dm9l\" (UID: \"b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7\") " pod="openshift-console/console-f9d7485db-8dm9l" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.807117 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e981fb43-6f44-4462-b97c-f64658cd7c97-serving-cert\") pod \"apiserver-7bbb656c7d-lmfgv\" (UID: \"e981fb43-6f44-4462-b97c-f64658cd7c97\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmfgv" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.807134 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c0bf1894-515b-4ae6-bcf5-148f5db59022-encryption-config\") pod \"apiserver-76f77b778f-bk9qw\" (UID: \"c0bf1894-515b-4ae6-bcf5-148f5db59022\") " pod="openshift-apiserver/apiserver-76f77b778f-bk9qw" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.807151 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2c04832-2cf3-4401-bf58-b2b5624e5c97-serving-cert\") pod \"route-controller-manager-6576b87f9c-9ll9b\" (UID: \"c2c04832-2cf3-4401-bf58-b2b5624e5c97\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9ll9b" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.807167 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7c51c82-c887-4c77-bfa2-cb3c5e896751-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-vld4h\" (UID: \"e7c51c82-c887-4c77-bfa2-cb3c5e896751\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vld4h" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 
09:01:08.807209 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p558p\" (UniqueName: \"kubernetes.io/projected/d6d2a3a0-669f-41c3-8a04-1a4f7f961f1b-kube-api-access-p558p\") pod \"downloads-7954f5f757-559sf\" (UID: \"d6d2a3a0-669f-41c3-8a04-1a4f7f961f1b\") " pod="openshift-console/downloads-7954f5f757-559sf" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.807225 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2c04832-2cf3-4401-bf58-b2b5624e5c97-config\") pod \"route-controller-manager-6576b87f9c-9ll9b\" (UID: \"c2c04832-2cf3-4401-bf58-b2b5624e5c97\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9ll9b" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.807239 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/178bd27c-e3da-4218-9785-9d7c8b1bf89a-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-fm86d\" (UID: \"178bd27c-e3da-4218-9785-9d7c8b1bf89a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fm86d" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.807252 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c0bf1894-515b-4ae6-bcf5-148f5db59022-audit-dir\") pod \"apiserver-76f77b778f-bk9qw\" (UID: \"c0bf1894-515b-4ae6-bcf5-148f5db59022\") " pod="openshift-apiserver/apiserver-76f77b778f-bk9qw" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.807266 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e981fb43-6f44-4462-b97c-f64658cd7c97-trusted-ca-bundle\") pod 
\"apiserver-7bbb656c7d-lmfgv\" (UID: \"e981fb43-6f44-4462-b97c-f64658cd7c97\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmfgv" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.807282 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/5ea3906e-d311-4b90-80be-7405507e135e-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-jxkd8\" (UID: \"5ea3906e-d311-4b90-80be-7405507e135e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jxkd8" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.807297 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e981fb43-6f44-4462-b97c-f64658cd7c97-encryption-config\") pod \"apiserver-7bbb656c7d-lmfgv\" (UID: \"e981fb43-6f44-4462-b97c-f64658cd7c97\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmfgv" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.807310 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0da3b83e-efc3-4e6d-b876-186f430d3d77-client-ca\") pod \"controller-manager-879f6c89f-ld6f6\" (UID: \"0da3b83e-efc3-4e6d-b876-186f430d3d77\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ld6f6" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.807329 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5ea3906e-d311-4b90-80be-7405507e135e-images\") pod \"machine-api-operator-5694c8668f-jxkd8\" (UID: \"5ea3906e-d311-4b90-80be-7405507e135e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jxkd8" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.807346 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0bf1894-515b-4ae6-bcf5-148f5db59022-serving-cert\") pod \"apiserver-76f77b778f-bk9qw\" (UID: \"c0bf1894-515b-4ae6-bcf5-148f5db59022\") " pod="openshift-apiserver/apiserver-76f77b778f-bk9qw" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.807361 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f6da64f0-d985-46de-bffa-4ae9632c0245-available-featuregates\") pod \"openshift-config-operator-7777fb866f-txnxn\" (UID: \"f6da64f0-d985-46de-bffa-4ae9632c0245\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-txnxn" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.807379 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g4tmm"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.807913 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-ckhmx"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.808293 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jf2nl"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.808671 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jf2nl" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.807386 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzrv4\" (UniqueName: \"kubernetes.io/projected/e7c51c82-c887-4c77-bfa2-cb3c5e896751-kube-api-access-tzrv4\") pod \"openshift-apiserver-operator-796bbdcf4f-vld4h\" (UID: \"e7c51c82-c887-4c77-bfa2-cb3c5e896751\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vld4h" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.810530 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thkdq\" (UniqueName: \"kubernetes.io/projected/e981fb43-6f44-4462-b97c-f64658cd7c97-kube-api-access-thkdq\") pod \"apiserver-7bbb656c7d-lmfgv\" (UID: \"e981fb43-6f44-4462-b97c-f64658cd7c97\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmfgv" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.810559 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c0bf1894-515b-4ae6-bcf5-148f5db59022-audit\") pod \"apiserver-76f77b778f-bk9qw\" (UID: \"c0bf1894-515b-4ae6-bcf5-148f5db59022\") " pod="openshift-apiserver/apiserver-76f77b778f-bk9qw" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.810576 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7-console-serving-cert\") pod \"console-f9d7485db-8dm9l\" (UID: \"b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7\") " pod="openshift-console/console-f9d7485db-8dm9l" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.810632 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c72nd\" 
(UniqueName: \"kubernetes.io/projected/5ea3906e-d311-4b90-80be-7405507e135e-kube-api-access-c72nd\") pod \"machine-api-operator-5694c8668f-jxkd8\" (UID: \"5ea3906e-d311-4b90-80be-7405507e135e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jxkd8" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.810651 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdnk7\" (UniqueName: \"kubernetes.io/projected/c2c04832-2cf3-4401-bf58-b2b5624e5c97-kube-api-access-bdnk7\") pod \"route-controller-manager-6576b87f9c-9ll9b\" (UID: \"c2c04832-2cf3-4401-bf58-b2b5624e5c97\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9ll9b" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.810665 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7c51c82-c887-4c77-bfa2-cb3c5e896751-config\") pod \"openshift-apiserver-operator-796bbdcf4f-vld4h\" (UID: \"e7c51c82-c887-4c77-bfa2-cb3c5e896751\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vld4h" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.810707 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ea3906e-d311-4b90-80be-7405507e135e-config\") pod \"machine-api-operator-5694c8668f-jxkd8\" (UID: \"5ea3906e-d311-4b90-80be-7405507e135e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jxkd8" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.816885 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g4tmm" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.817108 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ckhmx" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.818140 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-dndwl"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.818865 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-znqkr"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.819482 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-znqkr" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.819630 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-hzscb"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.820013 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-hzscb" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.820794 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvcs6\" (UniqueName: \"kubernetes.io/projected/178bd27c-e3da-4218-9785-9d7c8b1bf89a-kube-api-access-hvcs6\") pod \"authentication-operator-69f744f599-fm86d\" (UID: \"178bd27c-e3da-4218-9785-9d7c8b1bf89a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fm86d" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.820884 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6da64f0-d985-46de-bffa-4ae9632c0245-serving-cert\") pod \"openshift-config-operator-7777fb866f-txnxn\" (UID: \"f6da64f0-d985-46de-bffa-4ae9632c0245\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-txnxn" 
Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.820906 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/178bd27c-e3da-4218-9785-9d7c8b1bf89a-serving-cert\") pod \"authentication-operator-69f744f599-fm86d\" (UID: \"178bd27c-e3da-4218-9785-9d7c8b1bf89a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fm86d" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.820929 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e981fb43-6f44-4462-b97c-f64658cd7c97-audit-dir\") pod \"apiserver-7bbb656c7d-lmfgv\" (UID: \"e981fb43-6f44-4462-b97c-f64658cd7c97\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmfgv" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.820986 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2c04832-2cf3-4401-bf58-b2b5624e5c97-client-ca\") pod \"route-controller-manager-6576b87f9c-9ll9b\" (UID: \"c2c04832-2cf3-4401-bf58-b2b5624e5c97\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9ll9b" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.821002 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7-oauth-serving-cert\") pod \"console-f9d7485db-8dm9l\" (UID: \"b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7\") " pod="openshift-console/console-f9d7485db-8dm9l" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.821023 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e981fb43-6f44-4462-b97c-f64658cd7c97-audit-policies\") 
pod \"apiserver-7bbb656c7d-lmfgv\" (UID: \"e981fb43-6f44-4462-b97c-f64658cd7c97\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmfgv" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.821039 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzlzj\" (UniqueName: \"kubernetes.io/projected/f6da64f0-d985-46de-bffa-4ae9632c0245-kube-api-access-rzlzj\") pod \"openshift-config-operator-7777fb866f-txnxn\" (UID: \"f6da64f0-d985-46de-bffa-4ae9632c0245\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-txnxn" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.821052 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c0bf1894-515b-4ae6-bcf5-148f5db59022-etcd-client\") pod \"apiserver-76f77b778f-bk9qw\" (UID: \"c0bf1894-515b-4ae6-bcf5-148f5db59022\") " pod="openshift-apiserver/apiserver-76f77b778f-bk9qw" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.825526 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-szjk8"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.831427 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b9c44"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.832094 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-xh8d5"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.833018 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-xh8d5" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.833648 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-szjk8" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.834832 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b9c44" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.853725 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.853963 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.857063 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ds8zv"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.858011 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ds8zv" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.860907 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-njv2h"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.861404 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-bwbkt"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.861863 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bwbkt" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.862051 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-njv2h" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.878342 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.878382 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.878757 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.879254 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.879403 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.879536 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.879793 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.879913 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-94rjd"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.880781 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-srht8"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.880820 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-94rjd" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.881754 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-srht8" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.881845 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29419740-xnvv4"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.882430 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.890456 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t55fv"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.891112 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t55fv" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.891352 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29419740-xnvv4" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.897903 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.898461 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.898606 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.898686 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.898755 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.898827 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.898903 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.898604 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.898429 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.899067 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.899162 4776 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.899262 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.899368 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.899984 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-252w2"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.900850 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.899399 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.901255 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.899489 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.899549 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.901465 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.899632 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.899661 4776 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.899697 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.899726 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.899755 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.899789 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.898378 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.899963 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.901987 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-252w2" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.902130 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.904027 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.904537 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.904631 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rtrng"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.904789 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.905219 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.905384 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.905503 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.905696 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.906791 4776 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-etcd-operator"/"etcd-client" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.907085 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.907153 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.907287 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vld4h"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.908506 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5gpv8"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.908962 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-5gpv8" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.909160 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-559sf"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.913930 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-jxkd8"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.915911 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-2jxng"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.939471 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.940825 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.941866 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e981fb43-6f44-4462-b97c-f64658cd7c97-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-lmfgv\" (UID: \"e981fb43-6f44-4462-b97c-f64658cd7c97\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmfgv" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.944165 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.944381 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-bk9qw"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.943016 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e981fb43-6f44-4462-b97c-f64658cd7c97-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-lmfgv\" (UID: \"e981fb43-6f44-4462-b97c-f64658cd7c97\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmfgv" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.945723 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0bf1894-515b-4ae6-bcf5-148f5db59022-config\") pod \"apiserver-76f77b778f-bk9qw\" (UID: \"c0bf1894-515b-4ae6-bcf5-148f5db59022\") " pod="openshift-apiserver/apiserver-76f77b778f-bk9qw" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.944406 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.945827 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0da3b83e-efc3-4e6d-b876-186f430d3d77-config\") pod \"controller-manager-879f6c89f-ld6f6\" (UID: \"0da3b83e-efc3-4e6d-b876-186f430d3d77\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ld6f6" Dec 08 
09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.945851 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7-service-ca\") pod \"console-f9d7485db-8dm9l\" (UID: \"b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7\") " pod="openshift-console/console-f9d7485db-8dm9l" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.945877 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0da3b83e-efc3-4e6d-b876-186f430d3d77-serving-cert\") pod \"controller-manager-879f6c89f-ld6f6\" (UID: \"0da3b83e-efc3-4e6d-b876-186f430d3d77\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ld6f6" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.948221 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.948434 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c0bf1894-515b-4ae6-bcf5-148f5db59022-etcd-serving-ca\") pod \"apiserver-76f77b778f-bk9qw\" (UID: \"c0bf1894-515b-4ae6-bcf5-148f5db59022\") " pod="openshift-apiserver/apiserver-76f77b778f-bk9qw" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.948483 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2wx9\" (UniqueName: \"kubernetes.io/projected/b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7-kube-api-access-x2wx9\") pod \"console-f9d7485db-8dm9l\" (UID: \"b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7\") " pod="openshift-console/console-f9d7485db-8dm9l" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.948506 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/178bd27c-e3da-4218-9785-9d7c8b1bf89a-config\") pod \"authentication-operator-69f744f599-fm86d\" (UID: \"178bd27c-e3da-4218-9785-9d7c8b1bf89a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fm86d" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.948522 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9njh\" (UniqueName: \"kubernetes.io/projected/c0bf1894-515b-4ae6-bcf5-148f5db59022-kube-api-access-f9njh\") pod \"apiserver-76f77b778f-bk9qw\" (UID: \"c0bf1894-515b-4ae6-bcf5-148f5db59022\") " pod="openshift-apiserver/apiserver-76f77b778f-bk9qw" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.948545 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c0bf1894-515b-4ae6-bcf5-148f5db59022-node-pullsecrets\") pod \"apiserver-76f77b778f-bk9qw\" (UID: \"c0bf1894-515b-4ae6-bcf5-148f5db59022\") " pod="openshift-apiserver/apiserver-76f77b778f-bk9qw" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.948561 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7-console-oauth-config\") pod \"console-f9d7485db-8dm9l\" (UID: \"b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7\") " pod="openshift-console/console-f9d7485db-8dm9l" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.948581 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e981fb43-6f44-4462-b97c-f64658cd7c97-serving-cert\") pod \"apiserver-7bbb656c7d-lmfgv\" (UID: \"e981fb43-6f44-4462-b97c-f64658cd7c97\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmfgv" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.948595 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"encryption-config\" (UniqueName: \"kubernetes.io/secret/c0bf1894-515b-4ae6-bcf5-148f5db59022-encryption-config\") pod \"apiserver-76f77b778f-bk9qw\" (UID: \"c0bf1894-515b-4ae6-bcf5-148f5db59022\") " pod="openshift-apiserver/apiserver-76f77b778f-bk9qw" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.948614 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2c04832-2cf3-4401-bf58-b2b5624e5c97-serving-cert\") pod \"route-controller-manager-6576b87f9c-9ll9b\" (UID: \"c2c04832-2cf3-4401-bf58-b2b5624e5c97\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9ll9b" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.948631 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7c51c82-c887-4c77-bfa2-cb3c5e896751-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-vld4h\" (UID: \"e7c51c82-c887-4c77-bfa2-cb3c5e896751\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vld4h" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.948660 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p558p\" (UniqueName: \"kubernetes.io/projected/d6d2a3a0-669f-41c3-8a04-1a4f7f961f1b-kube-api-access-p558p\") pod \"downloads-7954f5f757-559sf\" (UID: \"d6d2a3a0-669f-41c3-8a04-1a4f7f961f1b\") " pod="openshift-console/downloads-7954f5f757-559sf" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.948675 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2c04832-2cf3-4401-bf58-b2b5624e5c97-config\") pod \"route-controller-manager-6576b87f9c-9ll9b\" (UID: \"c2c04832-2cf3-4401-bf58-b2b5624e5c97\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9ll9b" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 
09:01:08.948689 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/178bd27c-e3da-4218-9785-9d7c8b1bf89a-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-fm86d\" (UID: \"178bd27c-e3da-4218-9785-9d7c8b1bf89a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fm86d" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.948706 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c0bf1894-515b-4ae6-bcf5-148f5db59022-audit-dir\") pod \"apiserver-76f77b778f-bk9qw\" (UID: \"c0bf1894-515b-4ae6-bcf5-148f5db59022\") " pod="openshift-apiserver/apiserver-76f77b778f-bk9qw" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.948724 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e981fb43-6f44-4462-b97c-f64658cd7c97-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-lmfgv\" (UID: \"e981fb43-6f44-4462-b97c-f64658cd7c97\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmfgv" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.948739 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/5ea3906e-d311-4b90-80be-7405507e135e-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-jxkd8\" (UID: \"5ea3906e-d311-4b90-80be-7405507e135e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jxkd8" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.948756 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e981fb43-6f44-4462-b97c-f64658cd7c97-encryption-config\") pod \"apiserver-7bbb656c7d-lmfgv\" (UID: \"e981fb43-6f44-4462-b97c-f64658cd7c97\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmfgv" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.948771 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0da3b83e-efc3-4e6d-b876-186f430d3d77-client-ca\") pod \"controller-manager-879f6c89f-ld6f6\" (UID: \"0da3b83e-efc3-4e6d-b876-186f430d3d77\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ld6f6" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.948788 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5ea3906e-d311-4b90-80be-7405507e135e-images\") pod \"machine-api-operator-5694c8668f-jxkd8\" (UID: \"5ea3906e-d311-4b90-80be-7405507e135e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jxkd8" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.948810 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0bf1894-515b-4ae6-bcf5-148f5db59022-serving-cert\") pod \"apiserver-76f77b778f-bk9qw\" (UID: \"c0bf1894-515b-4ae6-bcf5-148f5db59022\") " pod="openshift-apiserver/apiserver-76f77b778f-bk9qw" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.948826 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f6da64f0-d985-46de-bffa-4ae9632c0245-available-featuregates\") pod \"openshift-config-operator-7777fb866f-txnxn\" (UID: \"f6da64f0-d985-46de-bffa-4ae9632c0245\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-txnxn" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.948876 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzrv4\" (UniqueName: \"kubernetes.io/projected/e7c51c82-c887-4c77-bfa2-cb3c5e896751-kube-api-access-tzrv4\") pod 
\"openshift-apiserver-operator-796bbdcf4f-vld4h\" (UID: \"e7c51c82-c887-4c77-bfa2-cb3c5e896751\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vld4h" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.948893 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thkdq\" (UniqueName: \"kubernetes.io/projected/e981fb43-6f44-4462-b97c-f64658cd7c97-kube-api-access-thkdq\") pod \"apiserver-7bbb656c7d-lmfgv\" (UID: \"e981fb43-6f44-4462-b97c-f64658cd7c97\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmfgv" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.948910 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c0bf1894-515b-4ae6-bcf5-148f5db59022-audit\") pod \"apiserver-76f77b778f-bk9qw\" (UID: \"c0bf1894-515b-4ae6-bcf5-148f5db59022\") " pod="openshift-apiserver/apiserver-76f77b778f-bk9qw" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.948929 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7-console-serving-cert\") pod \"console-f9d7485db-8dm9l\" (UID: \"b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7\") " pod="openshift-console/console-f9d7485db-8dm9l" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.948947 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c72nd\" (UniqueName: \"kubernetes.io/projected/5ea3906e-d311-4b90-80be-7405507e135e-kube-api-access-c72nd\") pod \"machine-api-operator-5694c8668f-jxkd8\" (UID: \"5ea3906e-d311-4b90-80be-7405507e135e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jxkd8" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.948963 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdnk7\" (UniqueName: 
\"kubernetes.io/projected/c2c04832-2cf3-4401-bf58-b2b5624e5c97-kube-api-access-bdnk7\") pod \"route-controller-manager-6576b87f9c-9ll9b\" (UID: \"c2c04832-2cf3-4401-bf58-b2b5624e5c97\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9ll9b" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.948977 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7c51c82-c887-4c77-bfa2-cb3c5e896751-config\") pod \"openshift-apiserver-operator-796bbdcf4f-vld4h\" (UID: \"e7c51c82-c887-4c77-bfa2-cb3c5e896751\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vld4h" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.948993 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ea3906e-d311-4b90-80be-7405507e135e-config\") pod \"machine-api-operator-5694c8668f-jxkd8\" (UID: \"5ea3906e-d311-4b90-80be-7405507e135e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jxkd8" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.949009 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvcs6\" (UniqueName: \"kubernetes.io/projected/178bd27c-e3da-4218-9785-9d7c8b1bf89a-kube-api-access-hvcs6\") pod \"authentication-operator-69f744f599-fm86d\" (UID: \"178bd27c-e3da-4218-9785-9d7c8b1bf89a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fm86d" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.949035 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6da64f0-d985-46de-bffa-4ae9632c0245-serving-cert\") pod \"openshift-config-operator-7777fb866f-txnxn\" (UID: \"f6da64f0-d985-46de-bffa-4ae9632c0245\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-txnxn" Dec 08 09:01:08 crc 
kubenswrapper[4776]: I1208 09:01:08.949050 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/178bd27c-e3da-4218-9785-9d7c8b1bf89a-serving-cert\") pod \"authentication-operator-69f744f599-fm86d\" (UID: \"178bd27c-e3da-4218-9785-9d7c8b1bf89a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fm86d" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.949070 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e981fb43-6f44-4462-b97c-f64658cd7c97-audit-dir\") pod \"apiserver-7bbb656c7d-lmfgv\" (UID: \"e981fb43-6f44-4462-b97c-f64658cd7c97\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmfgv" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.949094 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2c04832-2cf3-4401-bf58-b2b5624e5c97-client-ca\") pod \"route-controller-manager-6576b87f9c-9ll9b\" (UID: \"c2c04832-2cf3-4401-bf58-b2b5624e5c97\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9ll9b" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.949109 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7-oauth-serving-cert\") pod \"console-f9d7485db-8dm9l\" (UID: \"b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7\") " pod="openshift-console/console-f9d7485db-8dm9l" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.949128 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e981fb43-6f44-4462-b97c-f64658cd7c97-audit-policies\") pod \"apiserver-7bbb656c7d-lmfgv\" (UID: \"e981fb43-6f44-4462-b97c-f64658cd7c97\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmfgv" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.949144 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzlzj\" (UniqueName: \"kubernetes.io/projected/f6da64f0-d985-46de-bffa-4ae9632c0245-kube-api-access-rzlzj\") pod \"openshift-config-operator-7777fb866f-txnxn\" (UID: \"f6da64f0-d985-46de-bffa-4ae9632c0245\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-txnxn" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.949159 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c0bf1894-515b-4ae6-bcf5-148f5db59022-etcd-client\") pod \"apiserver-76f77b778f-bk9qw\" (UID: \"c0bf1894-515b-4ae6-bcf5-148f5db59022\") " pod="openshift-apiserver/apiserver-76f77b778f-bk9qw" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.949190 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e981fb43-6f44-4462-b97c-f64658cd7c97-etcd-client\") pod \"apiserver-7bbb656c7d-lmfgv\" (UID: \"e981fb43-6f44-4462-b97c-f64658cd7c97\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmfgv" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.949208 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7-trusted-ca-bundle\") pod \"console-f9d7485db-8dm9l\" (UID: \"b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7\") " pod="openshift-console/console-f9d7485db-8dm9l" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.949227 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7-console-config\") pod \"console-f9d7485db-8dm9l\" (UID: 
\"b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7\") " pod="openshift-console/console-f9d7485db-8dm9l" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.949259 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0da3b83e-efc3-4e6d-b876-186f430d3d77-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-ld6f6\" (UID: \"0da3b83e-efc3-4e6d-b876-186f430d3d77\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ld6f6" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.949283 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/178bd27c-e3da-4218-9785-9d7c8b1bf89a-service-ca-bundle\") pod \"authentication-operator-69f744f599-fm86d\" (UID: \"178bd27c-e3da-4218-9785-9d7c8b1bf89a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fm86d" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.949303 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.950103 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c0bf1894-515b-4ae6-bcf5-148f5db59022-image-import-ca\") pod \"apiserver-76f77b778f-bk9qw\" (UID: \"c0bf1894-515b-4ae6-bcf5-148f5db59022\") " pod="openshift-apiserver/apiserver-76f77b778f-bk9qw" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.950609 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c0bf1894-515b-4ae6-bcf5-148f5db59022-etcd-serving-ca\") pod \"apiserver-76f77b778f-bk9qw\" (UID: \"c0bf1894-515b-4ae6-bcf5-148f5db59022\") " pod="openshift-apiserver/apiserver-76f77b778f-bk9qw" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.949300 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c0bf1894-515b-4ae6-bcf5-148f5db59022-image-import-ca\") pod \"apiserver-76f77b778f-bk9qw\" (UID: \"c0bf1894-515b-4ae6-bcf5-148f5db59022\") " pod="openshift-apiserver/apiserver-76f77b778f-bk9qw" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.951185 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0bf1894-515b-4ae6-bcf5-148f5db59022-trusted-ca-bundle\") pod \"apiserver-76f77b778f-bk9qw\" (UID: \"c0bf1894-515b-4ae6-bcf5-148f5db59022\") " pod="openshift-apiserver/apiserver-76f77b778f-bk9qw" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.951221 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kthb8\" (UniqueName: \"kubernetes.io/projected/0da3b83e-efc3-4e6d-b876-186f430d3d77-kube-api-access-kthb8\") pod \"controller-manager-879f6c89f-ld6f6\" (UID: \"0da3b83e-efc3-4e6d-b876-186f430d3d77\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ld6f6" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.951267 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0bf1894-515b-4ae6-bcf5-148f5db59022-config\") pod \"apiserver-76f77b778f-bk9qw\" (UID: \"c0bf1894-515b-4ae6-bcf5-148f5db59022\") " pod="openshift-apiserver/apiserver-76f77b778f-bk9qw" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.951382 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0da3b83e-efc3-4e6d-b876-186f430d3d77-config\") pod \"controller-manager-879f6c89f-ld6f6\" (UID: \"0da3b83e-efc3-4e6d-b876-186f430d3d77\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ld6f6" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.951553 4776 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-fm86d"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.951634 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6z2v5"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.951796 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.951833 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/178bd27c-e3da-4218-9785-9d7c8b1bf89a-config\") pod \"authentication-operator-69f744f599-fm86d\" (UID: \"178bd27c-e3da-4218-9785-9d7c8b1bf89a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fm86d" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.952419 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c0bf1894-515b-4ae6-bcf5-148f5db59022-node-pullsecrets\") pod \"apiserver-76f77b778f-bk9qw\" (UID: \"c0bf1894-515b-4ae6-bcf5-148f5db59022\") " pod="openshift-apiserver/apiserver-76f77b778f-bk9qw" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.953555 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7-oauth-serving-cert\") pod \"console-f9d7485db-8dm9l\" (UID: \"b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7\") " pod="openshift-console/console-f9d7485db-8dm9l" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.953858 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7-service-ca\") pod \"console-f9d7485db-8dm9l\" (UID: 
\"b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7\") " pod="openshift-console/console-f9d7485db-8dm9l" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.953977 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e981fb43-6f44-4462-b97c-f64658cd7c97-audit-policies\") pod \"apiserver-7bbb656c7d-lmfgv\" (UID: \"e981fb43-6f44-4462-b97c-f64658cd7c97\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmfgv" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.954430 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e981fb43-6f44-4462-b97c-f64658cd7c97-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-lmfgv\" (UID: \"e981fb43-6f44-4462-b97c-f64658cd7c97\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmfgv" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.955318 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-mbv9b"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.955371 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-59m6v"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.955385 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-8dm9l"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.959563 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e981fb43-6f44-4462-b97c-f64658cd7c97-etcd-client\") pod \"apiserver-7bbb656c7d-lmfgv\" (UID: \"e981fb43-6f44-4462-b97c-f64658cd7c97\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmfgv" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.959659 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9ll9b"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.959765 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2c04832-2cf3-4401-bf58-b2b5624e5c97-serving-cert\") pod \"route-controller-manager-6576b87f9c-9ll9b\" (UID: \"c2c04832-2cf3-4401-bf58-b2b5624e5c97\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9ll9b" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.961029 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2hsh8"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.961606 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6zp8l"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.961840 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5ea3906e-d311-4b90-80be-7405507e135e-images\") pod \"machine-api-operator-5694c8668f-jxkd8\" (UID: \"5ea3906e-d311-4b90-80be-7405507e135e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jxkd8" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.962153 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7-trusted-ca-bundle\") pod \"console-f9d7485db-8dm9l\" (UID: \"b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7\") " pod="openshift-console/console-f9d7485db-8dm9l" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.962623 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g4tmm"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.962628 4776 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7c51c82-c887-4c77-bfa2-cb3c5e896751-config\") pod \"openshift-apiserver-operator-796bbdcf4f-vld4h\" (UID: \"e7c51c82-c887-4c77-bfa2-cb3c5e896751\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vld4h" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.963203 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/178bd27c-e3da-4218-9785-9d7c8b1bf89a-service-ca-bundle\") pod \"authentication-operator-69f744f599-fm86d\" (UID: \"178bd27c-e3da-4218-9785-9d7c8b1bf89a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fm86d" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.963398 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c0bf1894-515b-4ae6-bcf5-148f5db59022-audit-dir\") pod \"apiserver-76f77b778f-bk9qw\" (UID: \"c0bf1894-515b-4ae6-bcf5-148f5db59022\") " pod="openshift-apiserver/apiserver-76f77b778f-bk9qw" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.964286 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e981fb43-6f44-4462-b97c-f64658cd7c97-serving-cert\") pod \"apiserver-7bbb656c7d-lmfgv\" (UID: \"e981fb43-6f44-4462-b97c-f64658cd7c97\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmfgv" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.964355 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e981fb43-6f44-4462-b97c-f64658cd7c97-audit-dir\") pod \"apiserver-7bbb656c7d-lmfgv\" (UID: \"e981fb43-6f44-4462-b97c-f64658cd7c97\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmfgv" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.964620 4776 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f6da64f0-d985-46de-bffa-4ae9632c0245-available-featuregates\") pod \"openshift-config-operator-7777fb866f-txnxn\" (UID: \"f6da64f0-d985-46de-bffa-4ae9632c0245\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-txnxn" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.964829 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.964948 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-txnxn"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.965094 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-lmfgv"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.965225 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7-console-config\") pod \"console-f9d7485db-8dm9l\" (UID: \"b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7\") " pod="openshift-console/console-f9d7485db-8dm9l" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.965474 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7-console-oauth-config\") pod \"console-f9d7485db-8dm9l\" (UID: \"b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7\") " pod="openshift-console/console-f9d7485db-8dm9l" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.965577 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ea3906e-d311-4b90-80be-7405507e135e-config\") pod \"machine-api-operator-5694c8668f-jxkd8\" (UID: 
\"5ea3906e-d311-4b90-80be-7405507e135e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jxkd8" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.965984 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/178bd27c-e3da-4218-9785-9d7c8b1bf89a-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-fm86d\" (UID: \"178bd27c-e3da-4218-9785-9d7c8b1bf89a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fm86d" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.966930 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.968524 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7c51c82-c887-4c77-bfa2-cb3c5e896751-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-vld4h\" (UID: \"e7c51c82-c887-4c77-bfa2-cb3c5e896751\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vld4h" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.968565 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-ckhmx"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.968585 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ds8zv"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.969244 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7-console-serving-cert\") pod \"console-f9d7485db-8dm9l\" (UID: \"b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7\") " pod="openshift-console/console-f9d7485db-8dm9l" Dec 08 09:01:08 crc 
kubenswrapper[4776]: I1208 09:01:08.969593 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jf2nl"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.970068 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/5ea3906e-d311-4b90-80be-7405507e135e-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-jxkd8\" (UID: \"5ea3906e-d311-4b90-80be-7405507e135e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jxkd8" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.970570 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-njv2h"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.971846 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-bwbkt"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.972091 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e981fb43-6f44-4462-b97c-f64658cd7c97-encryption-config\") pod \"apiserver-7bbb656c7d-lmfgv\" (UID: \"e981fb43-6f44-4462-b97c-f64658cd7c97\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmfgv" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.972836 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tf65w"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.976213 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/178bd27c-e3da-4218-9785-9d7c8b1bf89a-serving-cert\") pod \"authentication-operator-69f744f599-fm86d\" (UID: \"178bd27c-e3da-4218-9785-9d7c8b1bf89a\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-fm86d" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.976977 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-p5sqv"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.978083 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-znqkr"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.978990 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b9c44"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.980010 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-252w2"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.980196 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6da64f0-d985-46de-bffa-4ae9632c0245-serving-cert\") pod \"openshift-config-operator-7777fb866f-txnxn\" (UID: \"f6da64f0-d985-46de-bffa-4ae9632c0245\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-txnxn" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.981816 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.982162 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5gpv8"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.983949 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t55fv"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.985862 4776 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-ingress-canary/ingress-canary-b59h4"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.986815 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-hzscb"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.986990 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-b59h4" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.988199 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-94rjd"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.989321 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29419740-xnvv4"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.990624 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-szjk8"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.991638 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-srht8"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.993192 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-pgq6g"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.994314 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-b59h4"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.994458 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-pgq6g" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.994941 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-b2vfk"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.996064 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-b2vfk" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.997237 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-b2vfk"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.997797 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-dzfmd"] Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.998374 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-dzfmd" Dec 08 09:01:08 crc kubenswrapper[4776]: I1208 09:01:08.998846 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-dzfmd"] Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.000492 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.019650 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.040510 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.060302 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.080590 4776 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.101313 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.119597 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.139534 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.159836 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.180027 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.199744 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.219947 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.240496 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.260448 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.280392 4776 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.299410 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.320484 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.339617 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.359948 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.380848 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.399847 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.419906 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.453209 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.459617 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.479779 4776 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.500628 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.519387 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.551842 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.559397 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.579866 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.600142 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.621102 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.640527 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.661010 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.680544 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 
08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.700090 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.720627 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.740122 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.756813 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.759535 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.780145 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.800100 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.820300 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.838935 4776 request.go:700] Waited for 1.002682584s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress/configmaps?fieldSelector=metadata.name%3Dservice-ca-bundle&limit=500&resourceVersion=0 Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.840613 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 
09:01:09.859870 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.880093 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.900565 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.921151 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.940116 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Dec 08 09:01:09 crc kubenswrapper[4776]: E1208 09:01:09.948748 4776 secret.go:188] Couldn't get secret openshift-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition
Dec 08 09:01:09 crc kubenswrapper[4776]: E1208 09:01:09.948826 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0da3b83e-efc3-4e6d-b876-186f430d3d77-serving-cert podName:0da3b83e-efc3-4e6d-b876-186f430d3d77 nodeName:}" failed. No retries permitted until 2025-12-08 09:01:10.448801994 +0000 UTC m=+146.712027016 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/0da3b83e-efc3-4e6d-b876-186f430d3d77-serving-cert") pod "controller-manager-879f6c89f-ld6f6" (UID: "0da3b83e-efc3-4e6d-b876-186f430d3d77") : failed to sync secret cache: timed out waiting for the condition
Dec 08 09:01:09 crc kubenswrapper[4776]: E1208 09:01:09.952548 4776 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition
Dec 08 09:01:09 crc kubenswrapper[4776]: E1208 09:01:09.952600 4776 configmap.go:193] Couldn't get configMap openshift-apiserver/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition
Dec 08 09:01:09 crc kubenswrapper[4776]: E1208 09:01:09.952609 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c2c04832-2cf3-4401-bf58-b2b5624e5c97-client-ca podName:c2c04832-2cf3-4401-bf58-b2b5624e5c97 nodeName:}" failed. No retries permitted until 2025-12-08 09:01:10.452594295 +0000 UTC m=+146.715819327 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/c2c04832-2cf3-4401-bf58-b2b5624e5c97-client-ca") pod "route-controller-manager-6576b87f9c-9ll9b" (UID: "c2c04832-2cf3-4401-bf58-b2b5624e5c97") : failed to sync configmap cache: timed out waiting for the condition
Dec 08 09:01:09 crc kubenswrapper[4776]: E1208 09:01:09.952647 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c0bf1894-515b-4ae6-bcf5-148f5db59022-trusted-ca-bundle podName:c0bf1894-515b-4ae6-bcf5-148f5db59022 nodeName:}" failed. No retries permitted until 2025-12-08 09:01:10.452636716 +0000 UTC m=+146.715861738 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/c0bf1894-515b-4ae6-bcf5-148f5db59022-trusted-ca-bundle") pod "apiserver-76f77b778f-bk9qw" (UID: "c0bf1894-515b-4ae6-bcf5-148f5db59022") : failed to sync configmap cache: timed out waiting for the condition
Dec 08 09:01:09 crc kubenswrapper[4776]: E1208 09:01:09.952657 4776 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-1: failed to sync configmap cache: timed out waiting for the condition
Dec 08 09:01:09 crc kubenswrapper[4776]: E1208 09:01:09.952726 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c0bf1894-515b-4ae6-bcf5-148f5db59022-audit podName:c0bf1894-515b-4ae6-bcf5-148f5db59022 nodeName:}" failed. No retries permitted until 2025-12-08 09:01:10.452707758 +0000 UTC m=+146.715932790 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/c0bf1894-515b-4ae6-bcf5-148f5db59022-audit") pod "apiserver-76f77b778f-bk9qw" (UID: "c0bf1894-515b-4ae6-bcf5-148f5db59022") : failed to sync configmap cache: timed out waiting for the condition
Dec 08 09:01:09 crc kubenswrapper[4776]: E1208 09:01:09.954976 4776 secret.go:188] Couldn't get secret openshift-apiserver/etcd-client: failed to sync secret cache: timed out waiting for the condition
Dec 08 09:01:09 crc kubenswrapper[4776]: E1208 09:01:09.955013 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0bf1894-515b-4ae6-bcf5-148f5db59022-etcd-client podName:c0bf1894-515b-4ae6-bcf5-148f5db59022 nodeName:}" failed. No retries permitted until 2025-12-08 09:01:10.455004558 +0000 UTC m=+146.718229580 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/c0bf1894-515b-4ae6-bcf5-148f5db59022-etcd-client") pod "apiserver-76f77b778f-bk9qw" (UID: "c0bf1894-515b-4ae6-bcf5-148f5db59022") : failed to sync secret cache: timed out waiting for the condition
Dec 08 09:01:09 crc kubenswrapper[4776]: E1208 09:01:09.956678 4776 secret.go:188] Couldn't get secret openshift-apiserver/encryption-config-1: failed to sync secret cache: timed out waiting for the condition
Dec 08 09:01:09 crc kubenswrapper[4776]: E1208 09:01:09.956720 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0bf1894-515b-4ae6-bcf5-148f5db59022-encryption-config podName:c0bf1894-515b-4ae6-bcf5-148f5db59022 nodeName:}" failed. No retries permitted until 2025-12-08 09:01:10.456710294 +0000 UTC m=+146.719935316 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "encryption-config" (UniqueName: "kubernetes.io/secret/c0bf1894-515b-4ae6-bcf5-148f5db59022-encryption-config") pod "apiserver-76f77b778f-bk9qw" (UID: "c0bf1894-515b-4ae6-bcf5-148f5db59022") : failed to sync secret cache: timed out waiting for the condition
Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.958747 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Dec 08 09:01:09 crc kubenswrapper[4776]: E1208 09:01:09.963632 4776 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: failed to sync configmap cache: timed out waiting for the condition
Dec 08 09:01:09 crc kubenswrapper[4776]: E1208 09:01:09.963725 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0da3b83e-efc3-4e6d-b876-186f430d3d77-proxy-ca-bundles podName:0da3b83e-efc3-4e6d-b876-186f430d3d77 nodeName:}" failed. No retries permitted until 2025-12-08 09:01:10.46370506 +0000 UTC m=+146.726930092 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/0da3b83e-efc3-4e6d-b876-186f430d3d77-proxy-ca-bundles") pod "controller-manager-879f6c89f-ld6f6" (UID: "0da3b83e-efc3-4e6d-b876-186f430d3d77") : failed to sync configmap cache: timed out waiting for the condition
Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.964317 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.964718 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 08 09:01:09 crc kubenswrapper[4776]: E1208 09:01:09.964804 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 09:03:11.964786568 +0000 UTC m=+268.228011610 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.964841 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.964904 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 08 09:01:09 crc kubenswrapper[4776]: E1208 09:01:09.965513 4776 secret.go:188] Couldn't get secret openshift-apiserver/serving-cert: failed to sync secret cache: timed out waiting for the condition
Dec 08 09:01:09 crc kubenswrapper[4776]: E1208 09:01:09.965664 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0bf1894-515b-4ae6-bcf5-148f5db59022-serving-cert podName:c0bf1894-515b-4ae6-bcf5-148f5db59022 nodeName:}" failed. No retries permitted until 2025-12-08 09:01:10.465587119 +0000 UTC m=+146.728812181 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/c0bf1894-515b-4ae6-bcf5-148f5db59022-serving-cert") pod "apiserver-76f77b778f-bk9qw" (UID: "c0bf1894-515b-4ae6-bcf5-148f5db59022") : failed to sync secret cache: timed out waiting for the condition
Dec 08 09:01:09 crc kubenswrapper[4776]: E1208 09:01:09.965696 4776 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition
Dec 08 09:01:09 crc kubenswrapper[4776]: E1208 09:01:09.965757 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c2c04832-2cf3-4401-bf58-b2b5624e5c97-config podName:c2c04832-2cf3-4401-bf58-b2b5624e5c97 nodeName:}" failed. No retries permitted until 2025-12-08 09:01:10.465739333 +0000 UTC m=+146.728964455 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/c2c04832-2cf3-4401-bf58-b2b5624e5c97-config") pod "route-controller-manager-6576b87f9c-9ll9b" (UID: "c2c04832-2cf3-4401-bf58-b2b5624e5c97") : failed to sync configmap cache: timed out waiting for the condition
Dec 08 09:01:09 crc kubenswrapper[4776]: E1208 09:01:09.965777 4776 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition
Dec 08 09:01:09 crc kubenswrapper[4776]: E1208 09:01:09.965864 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0da3b83e-efc3-4e6d-b876-186f430d3d77-client-ca podName:0da3b83e-efc3-4e6d-b876-186f430d3d77 nodeName:}" failed. No retries permitted until 2025-12-08 09:01:10.465840686 +0000 UTC m=+146.729065768 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/0da3b83e-efc3-4e6d-b876-186f430d3d77-client-ca") pod "controller-manager-879f6c89f-ld6f6" (UID: "0da3b83e-efc3-4e6d-b876-186f430d3d77") : failed to sync configmap cache: timed out waiting for the condition
Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.966526 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.971648 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.972379 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.980361 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Dec 08 09:01:09 crc kubenswrapper[4776]: I1208 09:01:09.984056 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.000088 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.020141 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.039894 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.060763 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.061432 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.066439 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.079327 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.081970 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.104570 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.126057 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.139987 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.166984 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.179764 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.199767 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.220107 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.240381 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 08 09:01:10 crc kubenswrapper[4776]: W1208 09:01:10.248630 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-4e54ccc07a4f61e8c342a9af9d0ff520a2abda69f9c6edb38893ac31f107bacb WatchSource:0}: Error finding container 4e54ccc07a4f61e8c342a9af9d0ff520a2abda69f9c6edb38893ac31f107bacb: Status 404 returned error can't find the container with id 4e54ccc07a4f61e8c342a9af9d0ff520a2abda69f9c6edb38893ac31f107bacb
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.259983 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.271272 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.299644 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.321384 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.340048 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.359699 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.369625 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e296093f-3360-4db5-a967-b61c3a5cee51-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-6zp8l\" (UID: \"e296093f-3360-4db5-a967-b61c3a5cee51\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6zp8l"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.369655 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-2hsh8\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.369695 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.369711 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-2hsh8\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.369727 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2794h\" (UniqueName: \"kubernetes.io/projected/e296093f-3360-4db5-a967-b61c3a5cee51-kube-api-access-2794h\") pod \"cluster-image-registry-operator-dc59b4c8b-6zp8l\" (UID: \"e296093f-3360-4db5-a967-b61c3a5cee51\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6zp8l"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.369744 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v627z\" (UniqueName: \"kubernetes.io/projected/10462781-68cf-4a10-b7c7-b9700465d964-kube-api-access-v627z\") pod \"cluster-samples-operator-665b6dd947-59m6v\" (UID: \"10462781-68cf-4a10-b7c7-b9700465d964\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-59m6v"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.369773 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/332b83f9-1f6d-4563-9be3-96003033621d-auth-proxy-config\") pod \"machine-approver-56656f9798-44sjg\" (UID: \"332b83f9-1f6d-4563-9be3-96003033621d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-44sjg"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.369881 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5bda5be9-3cbb-4005-8c2b-a740d8a7ab57-registry-certificates\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.369911 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-2hsh8\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.369949 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e296093f-3360-4db5-a967-b61c3a5cee51-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-6zp8l\" (UID: \"e296093f-3360-4db5-a967-b61c3a5cee51\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6zp8l"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.369973 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5bda5be9-3cbb-4005-8c2b-a740d8a7ab57-trusted-ca\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.369987 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-2hsh8\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.370003 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26659\" (UniqueName: \"kubernetes.io/projected/5bda5be9-3cbb-4005-8c2b-a740d8a7ab57-kube-api-access-26659\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.370018 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-2hsh8\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8"
Dec 08 09:01:10 crc kubenswrapper[4776]: E1208 09:01:10.370044 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 09:01:10.870023905 +0000 UTC m=+147.133249007 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5sqv" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.370135 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7c6fbdd6-0243-4372-a986-cc73d2df8a74-audit-dir\") pod \"oauth-openshift-558db77b4-2hsh8\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.370165 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-2hsh8\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.370239 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6b8fd850-14eb-419b-a1d4-e7de203c419f-trusted-ca\") pod \"console-operator-58897d9998-dndwl\" (UID: \"6b8fd850-14eb-419b-a1d4-e7de203c419f\") " pod="openshift-console-operator/console-operator-58897d9998-dndwl"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.370286 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-2hsh8\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.370312 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b8fd850-14eb-419b-a1d4-e7de203c419f-config\") pod \"console-operator-58897d9998-dndwl\" (UID: \"6b8fd850-14eb-419b-a1d4-e7de203c419f\") " pod="openshift-console-operator/console-operator-58897d9998-dndwl"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.370335 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e296093f-3360-4db5-a967-b61c3a5cee51-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-6zp8l\" (UID: \"e296093f-3360-4db5-a967-b61c3a5cee51\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6zp8l"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.370379 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5bda5be9-3cbb-4005-8c2b-a740d8a7ab57-registry-tls\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.370409 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5bda5be9-3cbb-4005-8c2b-a740d8a7ab57-installation-pull-secrets\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.370436 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/332b83f9-1f6d-4563-9be3-96003033621d-machine-approver-tls\") pod \"machine-approver-56656f9798-44sjg\" (UID: \"332b83f9-1f6d-4563-9be3-96003033621d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-44sjg"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.370458 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pclzp\" (UniqueName: \"kubernetes.io/projected/332b83f9-1f6d-4563-9be3-96003033621d-kube-api-access-pclzp\") pod \"machine-approver-56656f9798-44sjg\" (UID: \"332b83f9-1f6d-4563-9be3-96003033621d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-44sjg"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.370529 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-2hsh8\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.370584 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5bda5be9-3cbb-4005-8c2b-a740d8a7ab57-bound-sa-token\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.370605 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/332b83f9-1f6d-4563-9be3-96003033621d-config\") pod \"machine-approver-56656f9798-44sjg\" (UID: \"332b83f9-1f6d-4563-9be3-96003033621d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-44sjg"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.370650 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5bda5be9-3cbb-4005-8c2b-a740d8a7ab57-ca-trust-extracted\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.370671 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b8fd850-14eb-419b-a1d4-e7de203c419f-serving-cert\") pod \"console-operator-58897d9998-dndwl\" (UID: \"6b8fd850-14eb-419b-a1d4-e7de203c419f\") " pod="openshift-console-operator/console-operator-58897d9998-dndwl"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.370703 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2kmn\" (UniqueName: \"kubernetes.io/projected/6b8fd850-14eb-419b-a1d4-e7de203c419f-kube-api-access-x2kmn\") pod \"console-operator-58897d9998-dndwl\" (UID: \"6b8fd850-14eb-419b-a1d4-e7de203c419f\") " pod="openshift-console-operator/console-operator-58897d9998-dndwl"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.370756 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-2hsh8\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.370831 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7c6fbdd6-0243-4372-a986-cc73d2df8a74-audit-policies\") pod \"oauth-openshift-558db77b4-2hsh8\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.370858 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-2hsh8\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.370899 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh4fd\" (UniqueName: \"kubernetes.io/projected/7c6fbdd6-0243-4372-a986-cc73d2df8a74-kube-api-access-rh4fd\") pod \"oauth-openshift-558db77b4-2hsh8\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.370921 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/10462781-68cf-4a10-b7c7-b9700465d964-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-59m6v\" (UID: \"10462781-68cf-4a10-b7c7-b9700465d964\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-59m6v"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.370987 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-2hsh8\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.400135 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.419314 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Dec 08 09:01:10 crc kubenswrapper[4776]: W1208 09:01:10.431322 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-6f372228dcfd6c84b3c27f5c40ab09e20d1a68cb8ab6796849259d2797e877c6 WatchSource:0}: Error finding container 6f372228dcfd6c84b3c27f5c40ab09e20d1a68cb8ab6796849259d2797e877c6: Status 404 returned error can't find the container with id 6f372228dcfd6c84b3c27f5c40ab09e20d1a68cb8ab6796849259d2797e877c6
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.440541 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.459368 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.472075 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.472288 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9213c07d-c865-4fde-b2cd-f28d46033e74-apiservice-cert\") pod \"packageserver-d55dfcdfc-znqkr\" (UID: \"9213c07d-c865-4fde-b2cd-f28d46033e74\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-znqkr"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.472322 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5bda5be9-3cbb-4005-8c2b-a740d8a7ab57-bound-sa-token\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.472345 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/332b83f9-1f6d-4563-9be3-96003033621d-config\") pod \"machine-approver-56656f9798-44sjg\" (UID: \"332b83f9-1f6d-4563-9be3-96003033621d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-44sjg"
Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.472365 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6825925-568b-416e-910f-52e1ee27d741-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-g4tmm\" (UID: \"b6825925-568b-416e-910f-52e1ee27d741\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g4tmm" Dec 08 09:01:10 crc kubenswrapper[4776]: E1208 09:01:10.472409 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 09:01:10.972385051 +0000 UTC m=+147.235610073 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.472457 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nht9\" (UniqueName: \"kubernetes.io/projected/c736576f-aae9-4d51-a058-f4ad4d95edb4-kube-api-access-2nht9\") pod \"multus-admission-controller-857f4d67dd-szjk8\" (UID: \"c736576f-aae9-4d51-a058-f4ad4d95edb4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-szjk8" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.472501 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b8fd850-14eb-419b-a1d4-e7de203c419f-serving-cert\") pod \"console-operator-58897d9998-dndwl\" (UID: \"6b8fd850-14eb-419b-a1d4-e7de203c419f\") " pod="openshift-console-operator/console-operator-58897d9998-dndwl" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.472521 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2kmn\" 
(UniqueName: \"kubernetes.io/projected/6b8fd850-14eb-419b-a1d4-e7de203c419f-kube-api-access-x2kmn\") pod \"console-operator-58897d9998-dndwl\" (UID: \"6b8fd850-14eb-419b-a1d4-e7de203c419f\") " pod="openshift-console-operator/console-operator-58897d9998-dndwl" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.472542 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/39c1c6c8-3cb7-44eb-b0cc-f218e4dfe724-metrics-tls\") pod \"dns-operator-744455d44c-2jxng\" (UID: \"39c1c6c8-3cb7-44eb-b0cc-f218e4dfe724\") " pod="openshift-dns-operator/dns-operator-744455d44c-2jxng" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.472563 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rg8m\" (UniqueName: \"kubernetes.io/projected/06cf2358-4cba-4d69-81d1-dc02434fe460-kube-api-access-6rg8m\") pod \"collect-profiles-29419740-xnvv4\" (UID: \"06cf2358-4cba-4d69-81d1-dc02434fe460\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419740-xnvv4" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.472603 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5cb3c530-6b9c-4e5e-9e1c-bfa8281b6c45-bound-sa-token\") pod \"ingress-operator-5b745b69d9-94rjd\" (UID: \"5cb3c530-6b9c-4e5e-9e1c-bfa8281b6c45\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-94rjd" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.472621 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpp6m\" (UniqueName: \"kubernetes.io/projected/50e1cbc5-727f-42ca-881c-fdd0b07ca739-kube-api-access-rpp6m\") pod \"marketplace-operator-79b997595-jf2nl\" (UID: \"50e1cbc5-727f-42ca-881c-fdd0b07ca739\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-jf2nl" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.472638 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndlzk\" (UniqueName: \"kubernetes.io/projected/5c4d8f0e-5e42-4fd1-8c31-2cd3ffd7eeda-kube-api-access-ndlzk\") pod \"ingress-canary-b59h4\" (UID: \"5c4d8f0e-5e42-4fd1-8c31-2cd3ffd7eeda\") " pod="openshift-ingress-canary/ingress-canary-b59h4" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.472655 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn7kf\" (UniqueName: \"kubernetes.io/projected/04b6f5ef-a04c-47c5-b35d-700ed59c5ac9-kube-api-access-gn7kf\") pod \"csi-hostpathplugin-b2vfk\" (UID: \"04b6f5ef-a04c-47c5-b35d-700ed59c5ac9\") " pod="hostpath-provisioner/csi-hostpathplugin-b2vfk" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.472820 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r74b\" (UniqueName: \"kubernetes.io/projected/d519f527-d5bd-4e76-98de-4ff1e5720698-kube-api-access-9r74b\") pod \"service-ca-operator-777779d784-hzscb\" (UID: \"d519f527-d5bd-4e76-98de-4ff1e5720698\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hzscb" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.472852 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e0478469-3de4-4f1e-8853-c5e4cdfedef0-config-volume\") pod \"dns-default-dzfmd\" (UID: \"e0478469-3de4-4f1e-8853-c5e4cdfedef0\") " pod="openshift-dns/dns-default-dzfmd" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.472876 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/16448b50-7f70-4571-8150-a462b3774dfc-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-252w2\" (UID: \"16448b50-7f70-4571-8150-a462b3774dfc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-252w2" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.472901 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/4c9dc880-5c96-4a70-baa5-f4628a9a19be-signing-cabundle\") pod \"service-ca-9c57cc56f-5gpv8\" (UID: \"4c9dc880-5c96-4a70-baa5-f4628a9a19be\") " pod="openshift-service-ca/service-ca-9c57cc56f-5gpv8" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.472927 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2b292f39-b8ff-4187-ae0a-14928dc18e08-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ds8zv\" (UID: \"2b292f39-b8ff-4187-ae0a-14928dc18e08\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ds8zv" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.472957 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0bf1894-515b-4ae6-bcf5-148f5db59022-serving-cert\") pod \"apiserver-76f77b778f-bk9qw\" (UID: \"c0bf1894-515b-4ae6-bcf5-148f5db59022\") " pod="openshift-apiserver/apiserver-76f77b778f-bk9qw" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.472983 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7c6fbdd6-0243-4372-a986-cc73d2df8a74-audit-policies\") pod \"oauth-openshift-558db77b4-2hsh8\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8" Dec 08 09:01:10 crc 
kubenswrapper[4776]: I1208 09:01:10.473006 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/1a02f65b-9fb4-41a5-974a-648ba0e107eb-etcd-ca\") pod \"etcd-operator-b45778765-mbv9b\" (UID: \"1a02f65b-9fb4-41a5-974a-648ba0e107eb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mbv9b" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.473032 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rh4fd\" (UniqueName: \"kubernetes.io/projected/7c6fbdd6-0243-4372-a986-cc73d2df8a74-kube-api-access-rh4fd\") pod \"oauth-openshift-558db77b4-2hsh8\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.473056 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/10462781-68cf-4a10-b7c7-b9700465d964-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-59m6v\" (UID: \"10462781-68cf-4a10-b7c7-b9700465d964\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-59m6v" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.473085 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/2441969d-cfa3-4842-aeee-54625353b7bf-certs\") pod \"machine-config-server-pgq6g\" (UID: \"2441969d-cfa3-4842-aeee-54625353b7bf\") " pod="openshift-machine-config-operator/machine-config-server-pgq6g" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.473113 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e060d3c0-cba4-4930-a3af-76f7c3f5c9c1-profile-collector-cert\") pod \"olm-operator-6b444d44fb-b9c44\" 
(UID: \"e060d3c0-cba4-4930-a3af-76f7c3f5c9c1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b9c44" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.473135 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs9k6\" (UniqueName: \"kubernetes.io/projected/e060d3c0-cba4-4930-a3af-76f7c3f5c9c1-kube-api-access-hs9k6\") pod \"olm-operator-6b444d44fb-b9c44\" (UID: \"e060d3c0-cba4-4930-a3af-76f7c3f5c9c1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b9c44" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.473165 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6c4c5b58-d542-4ab5-8132-586b180392a0-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-rtrng\" (UID: \"6c4c5b58-d542-4ab5-8132-586b180392a0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rtrng" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.473208 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/50e1cbc5-727f-42ca-881c-fdd0b07ca739-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jf2nl\" (UID: \"50e1cbc5-727f-42ca-881c-fdd0b07ca739\") " pod="openshift-marketplace/marketplace-operator-79b997595-jf2nl" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.473247 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-2hsh8\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8" Dec 08 09:01:10 crc kubenswrapper[4776]: 
I1208 09:01:10.473276 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncgqv\" (UniqueName: \"kubernetes.io/projected/e0478469-3de4-4f1e-8853-c5e4cdfedef0-kube-api-access-ncgqv\") pod \"dns-default-dzfmd\" (UID: \"e0478469-3de4-4f1e-8853-c5e4cdfedef0\") " pod="openshift-dns/dns-default-dzfmd" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.473316 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e296093f-3360-4db5-a967-b61c3a5cee51-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-6zp8l\" (UID: \"e296093f-3360-4db5-a967-b61c3a5cee51\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6zp8l" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.473340 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/929dd3e5-a329-4291-8fd1-c998483026cc-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-tf65w\" (UID: \"929dd3e5-a329-4291-8fd1-c998483026cc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tf65w" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.473364 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-2hsh8\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.473389 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a02f65b-9fb4-41a5-974a-648ba0e107eb-serving-cert\") pod 
\"etcd-operator-b45778765-mbv9b\" (UID: \"1a02f65b-9fb4-41a5-974a-648ba0e107eb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mbv9b" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.473409 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/4c9dc880-5c96-4a70-baa5-f4628a9a19be-signing-key\") pod \"service-ca-9c57cc56f-5gpv8\" (UID: \"4c9dc880-5c96-4a70-baa5-f4628a9a19be\") " pod="openshift-service-ca/service-ca-9c57cc56f-5gpv8" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.473417 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/332b83f9-1f6d-4563-9be3-96003033621d-config\") pod \"machine-approver-56656f9798-44sjg\" (UID: \"332b83f9-1f6d-4563-9be3-96003033621d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-44sjg" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.473430 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7tm7\" (UniqueName: \"kubernetes.io/projected/4c9dc880-5c96-4a70-baa5-f4628a9a19be-kube-api-access-g7tm7\") pod \"service-ca-9c57cc56f-5gpv8\" (UID: \"4c9dc880-5c96-4a70-baa5-f4628a9a19be\") " pod="openshift-service-ca/service-ca-9c57cc56f-5gpv8" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.473470 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/69d10b7f-1714-4536-800b-e7aa5bc7b73a-stats-auth\") pod \"router-default-5444994796-xh8d5\" (UID: \"69d10b7f-1714-4536-800b-e7aa5bc7b73a\") " pod="openshift-ingress/router-default-5444994796-xh8d5" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.473516 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/d519f527-d5bd-4e76-98de-4ff1e5720698-serving-cert\") pod \"service-ca-operator-777779d784-hzscb\" (UID: \"d519f527-d5bd-4e76-98de-4ff1e5720698\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hzscb" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.473541 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcl6r\" (UniqueName: \"kubernetes.io/projected/69d10b7f-1714-4536-800b-e7aa5bc7b73a-kube-api-access-mcl6r\") pod \"router-default-5444994796-xh8d5\" (UID: \"69d10b7f-1714-4536-800b-e7aa5bc7b73a\") " pod="openshift-ingress/router-default-5444994796-xh8d5" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.473591 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.473614 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-2hsh8\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.473635 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2794h\" (UniqueName: \"kubernetes.io/projected/e296093f-3360-4db5-a967-b61c3a5cee51-kube-api-access-2794h\") pod \"cluster-image-registry-operator-dc59b4c8b-6zp8l\" (UID: \"e296093f-3360-4db5-a967-b61c3a5cee51\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6zp8l" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.473774 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7c6fbdd6-0243-4372-a986-cc73d2df8a74-audit-policies\") pod \"oauth-openshift-558db77b4-2hsh8\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8" Dec 08 09:01:10 crc kubenswrapper[4776]: E1208 09:01:10.473843 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 09:01:10.97383563 +0000 UTC m=+147.237060652 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5sqv" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.473982 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v627z\" (UniqueName: \"kubernetes.io/projected/10462781-68cf-4a10-b7c7-b9700465d964-kube-api-access-v627z\") pod \"cluster-samples-operator-665b6dd947-59m6v\" (UID: \"10462781-68cf-4a10-b7c7-b9700465d964\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-59m6v" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.474045 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxgpx\" (UniqueName: 
\"kubernetes.io/projected/9213c07d-c865-4fde-b2cd-f28d46033e74-kube-api-access-zxgpx\") pod \"packageserver-d55dfcdfc-znqkr\" (UID: \"9213c07d-c865-4fde-b2cd-f28d46033e74\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-znqkr" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.474199 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c0bf1894-515b-4ae6-bcf5-148f5db59022-etcd-client\") pod \"apiserver-76f77b778f-bk9qw\" (UID: \"c0bf1894-515b-4ae6-bcf5-148f5db59022\") " pod="openshift-apiserver/apiserver-76f77b778f-bk9qw" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.474319 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e0478469-3de4-4f1e-8853-c5e4cdfedef0-metrics-tls\") pod \"dns-default-dzfmd\" (UID: \"e0478469-3de4-4f1e-8853-c5e4cdfedef0\") " pod="openshift-dns/dns-default-dzfmd" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.474442 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5bda5be9-3cbb-4005-8c2b-a740d8a7ab57-trusted-ca\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.474569 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e296093f-3360-4db5-a967-b61c3a5cee51-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-6zp8l\" (UID: \"e296093f-3360-4db5-a967-b61c3a5cee51\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6zp8l" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.474596 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-hhx74\" (UniqueName: \"kubernetes.io/projected/cde69387-7c72-4285-a5ec-79f5626eeb96-kube-api-access-hhx74\") pod \"package-server-manager-789f6589d5-6z2v5\" (UID: \"cde69387-7c72-4285-a5ec-79f5626eeb96\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6z2v5" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.474624 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b292f39-b8ff-4187-ae0a-14928dc18e08-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ds8zv\" (UID: \"2b292f39-b8ff-4187-ae0a-14928dc18e08\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ds8zv" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.474645 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e060d3c0-cba4-4930-a3af-76f7c3f5c9c1-srv-cert\") pod \"olm-operator-6b444d44fb-b9c44\" (UID: \"e060d3c0-cba4-4930-a3af-76f7c3f5c9c1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b9c44" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.474677 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/04b6f5ef-a04c-47c5-b35d-700ed59c5ac9-registration-dir\") pod \"csi-hostpathplugin-b2vfk\" (UID: \"04b6f5ef-a04c-47c5-b35d-700ed59c5ac9\") " pod="hostpath-provisioner/csi-hostpathplugin-b2vfk" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.474693 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/13284ed8-fd88-4a11-81e5-b58cf9551c27-auth-proxy-config\") pod \"machine-config-operator-74547568cd-bwbkt\" (UID: \"13284ed8-fd88-4a11-81e5-b58cf9551c27\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bwbkt" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.474713 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9585\" (UniqueName: \"kubernetes.io/projected/6c4c5b58-d542-4ab5-8132-586b180392a0-kube-api-access-z9585\") pod \"openshift-controller-manager-operator-756b6f6bc6-rtrng\" (UID: \"6c4c5b58-d542-4ab5-8132-586b180392a0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rtrng" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.474734 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/929dd3e5-a329-4291-8fd1-c998483026cc-config\") pod \"kube-apiserver-operator-766d6c64bb-tf65w\" (UID: \"929dd3e5-a329-4291-8fd1-c998483026cc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tf65w" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.474770 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7c6fbdd6-0243-4372-a986-cc73d2df8a74-audit-dir\") pod \"oauth-openshift-558db77b4-2hsh8\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.474787 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-2hsh8\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.474812 4776 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6b8fd850-14eb-419b-a1d4-e7de203c419f-trusted-ca\") pod \"console-operator-58897d9998-dndwl\" (UID: \"6b8fd850-14eb-419b-a1d4-e7de203c419f\") " pod="openshift-console-operator/console-operator-58897d9998-dndwl" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.474833 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjdx7\" (UniqueName: \"kubernetes.io/projected/deb5eac5-1137-42a3-a5ad-e52a2d822cba-kube-api-access-cjdx7\") pod \"migrator-59844c95c7-ckhmx\" (UID: \"deb5eac5-1137-42a3-a5ad-e52a2d822cba\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ckhmx" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.474851 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbfvz\" (UniqueName: \"kubernetes.io/projected/35911247-ad00-422c-9d30-586834a80f76-kube-api-access-fbfvz\") pod \"control-plane-machine-set-operator-78cbb6b69f-njv2h\" (UID: \"35911247-ad00-422c-9d30-586834a80f76\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-njv2h" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.474883 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7c6fbdd6-0243-4372-a986-cc73d2df8a74-audit-dir\") pod \"oauth-openshift-558db77b4-2hsh8\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.474890 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b8fd850-14eb-419b-a1d4-e7de203c419f-config\") pod \"console-operator-58897d9998-dndwl\" (UID: \"6b8fd850-14eb-419b-a1d4-e7de203c419f\") " pod="openshift-console-operator/console-operator-58897d9998-dndwl" Dec 08 
09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.475136 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e296093f-3360-4db5-a967-b61c3a5cee51-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-6zp8l\" (UID: \"e296093f-3360-4db5-a967-b61c3a5cee51\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6zp8l" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.475195 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b292f39-b8ff-4187-ae0a-14928dc18e08-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ds8zv\" (UID: \"2b292f39-b8ff-4187-ae0a-14928dc18e08\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ds8zv" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.475227 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pclzp\" (UniqueName: \"kubernetes.io/projected/332b83f9-1f6d-4563-9be3-96003033621d-kube-api-access-pclzp\") pod \"machine-approver-56656f9798-44sjg\" (UID: \"332b83f9-1f6d-4563-9be3-96003033621d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-44sjg" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.475260 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c1b1a4bb-5729-4495-a10f-d5aa62f2c502-profile-collector-cert\") pod \"catalog-operator-68c6474976-t55fv\" (UID: \"c1b1a4bb-5729-4495-a10f-d5aa62f2c502\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t55fv" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.475286 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6c4c5b58-d542-4ab5-8132-586b180392a0-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-rtrng\" (UID: \"6c4c5b58-d542-4ab5-8132-586b180392a0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rtrng" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.475320 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0da3b83e-efc3-4e6d-b876-186f430d3d77-serving-cert\") pod \"controller-manager-879f6c89f-ld6f6\" (UID: \"0da3b83e-efc3-4e6d-b876-186f430d3d77\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ld6f6" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.475349 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5bda5be9-3cbb-4005-8c2b-a740d8a7ab57-registry-tls\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.475373 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5bda5be9-3cbb-4005-8c2b-a740d8a7ab57-installation-pull-secrets\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.475400 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/332b83f9-1f6d-4563-9be3-96003033621d-machine-approver-tls\") pod \"machine-approver-56656f9798-44sjg\" (UID: \"332b83f9-1f6d-4563-9be3-96003033621d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-44sjg" Dec 08 09:01:10 crc 
kubenswrapper[4776]: I1208 09:01:10.475597 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b8fd850-14eb-419b-a1d4-e7de203c419f-config\") pod \"console-operator-58897d9998-dndwl\" (UID: \"6b8fd850-14eb-419b-a1d4-e7de203c419f\") " pod="openshift-console-operator/console-operator-58897d9998-dndwl" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.476011 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5bda5be9-3cbb-4005-8c2b-a740d8a7ab57-trusted-ca\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.476339 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6b8fd850-14eb-419b-a1d4-e7de203c419f-trusted-ca\") pod \"console-operator-58897d9998-dndwl\" (UID: \"6b8fd850-14eb-419b-a1d4-e7de203c419f\") " pod="openshift-console-operator/console-operator-58897d9998-dndwl" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.476648 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-2hsh8\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.476749 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/16448b50-7f70-4571-8150-a462b3774dfc-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-252w2\" (UID: \"16448b50-7f70-4571-8150-a462b3774dfc\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-252w2" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.476827 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/04b6f5ef-a04c-47c5-b35d-700ed59c5ac9-plugins-dir\") pod \"csi-hostpathplugin-b2vfk\" (UID: \"04b6f5ef-a04c-47c5-b35d-700ed59c5ac9\") " pod="hostpath-provisioner/csi-hostpathplugin-b2vfk" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.476883 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c0bf1894-515b-4ae6-bcf5-148f5db59022-encryption-config\") pod \"apiserver-76f77b778f-bk9qw\" (UID: \"c0bf1894-515b-4ae6-bcf5-148f5db59022\") " pod="openshift-apiserver/apiserver-76f77b778f-bk9qw" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.476980 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/04b6f5ef-a04c-47c5-b35d-700ed59c5ac9-csi-data-dir\") pod \"csi-hostpathplugin-b2vfk\" (UID: \"04b6f5ef-a04c-47c5-b35d-700ed59c5ac9\") " pod="hostpath-provisioner/csi-hostpathplugin-b2vfk" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.477031 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5bda5be9-3cbb-4005-8c2b-a740d8a7ab57-ca-trust-extracted\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.477063 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29tgx\" (UniqueName: 
\"kubernetes.io/projected/5cb3c530-6b9c-4e5e-9e1c-bfa8281b6c45-kube-api-access-29tgx\") pod \"ingress-operator-5b745b69d9-94rjd\" (UID: \"5cb3c530-6b9c-4e5e-9e1c-bfa8281b6c45\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-94rjd" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.477104 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2c04832-2cf3-4401-bf58-b2b5624e5c97-config\") pod \"route-controller-manager-6576b87f9c-9ll9b\" (UID: \"c2c04832-2cf3-4401-bf58-b2b5624e5c97\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9ll9b" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.477145 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-2hsh8\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.477191 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2fv9\" (UniqueName: \"kubernetes.io/projected/13284ed8-fd88-4a11-81e5-b58cf9551c27-kube-api-access-w2fv9\") pod \"machine-config-operator-74547568cd-bwbkt\" (UID: \"13284ed8-fd88-4a11-81e5-b58cf9551c27\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bwbkt" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.477245 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0da3b83e-efc3-4e6d-b876-186f430d3d77-client-ca\") pod \"controller-manager-879f6c89f-ld6f6\" (UID: \"0da3b83e-efc3-4e6d-b876-186f430d3d77\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-ld6f6" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.477272 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/9213c07d-c865-4fde-b2cd-f28d46033e74-tmpfs\") pod \"packageserver-d55dfcdfc-znqkr\" (UID: \"9213c07d-c865-4fde-b2cd-f28d46033e74\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-znqkr" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.477297 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gclh\" (UniqueName: \"kubernetes.io/projected/39c1c6c8-3cb7-44eb-b0cc-f218e4dfe724-kube-api-access-9gclh\") pod \"dns-operator-744455d44c-2jxng\" (UID: \"39c1c6c8-3cb7-44eb-b0cc-f218e4dfe724\") " pod="openshift-dns-operator/dns-operator-744455d44c-2jxng" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.477326 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-2hsh8\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.477352 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16448b50-7f70-4571-8150-a462b3774dfc-config\") pod \"kube-controller-manager-operator-78b949d7b-252w2\" (UID: \"16448b50-7f70-4571-8150-a462b3774dfc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-252w2" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.477374 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etcd-client\" (UniqueName: \"kubernetes.io/secret/1a02f65b-9fb4-41a5-974a-648ba0e107eb-etcd-client\") pod \"etcd-operator-b45778765-mbv9b\" (UID: \"1a02f65b-9fb4-41a5-974a-648ba0e107eb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mbv9b" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.477395 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/08ac05cf-c12d-4898-a9b0-451f11b44aed-proxy-tls\") pod \"machine-config-controller-84d6567774-srht8\" (UID: \"08ac05cf-c12d-4898-a9b0-451f11b44aed\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-srht8" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.477419 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9213c07d-c865-4fde-b2cd-f28d46033e74-webhook-cert\") pod \"packageserver-d55dfcdfc-znqkr\" (UID: \"9213c07d-c865-4fde-b2cd-f28d46033e74\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-znqkr" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.477438 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5bda5be9-3cbb-4005-8c2b-a740d8a7ab57-ca-trust-extracted\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.477457 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5cb3c530-6b9c-4e5e-9e1c-bfa8281b6c45-trusted-ca\") pod \"ingress-operator-5b745b69d9-94rjd\" (UID: \"5cb3c530-6b9c-4e5e-9e1c-bfa8281b6c45\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-94rjd" Dec 08 09:01:10 crc 
kubenswrapper[4776]: I1208 09:01:10.477496 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a02f65b-9fb4-41a5-974a-648ba0e107eb-config\") pod \"etcd-operator-b45778765-mbv9b\" (UID: \"1a02f65b-9fb4-41a5-974a-648ba0e107eb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mbv9b" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.477528 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/69d10b7f-1714-4536-800b-e7aa5bc7b73a-metrics-certs\") pod \"router-default-5444994796-xh8d5\" (UID: \"69d10b7f-1714-4536-800b-e7aa5bc7b73a\") " pod="openshift-ingress/router-default-5444994796-xh8d5" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.477643 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/35911247-ad00-422c-9d30-586834a80f76-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-njv2h\" (UID: \"35911247-ad00-422c-9d30-586834a80f76\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-njv2h" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.477690 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c0bf1894-515b-4ae6-bcf5-148f5db59022-audit\") pod \"apiserver-76f77b778f-bk9qw\" (UID: \"c0bf1894-515b-4ae6-bcf5-148f5db59022\") " pod="openshift-apiserver/apiserver-76f77b778f-bk9qw" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.477717 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06cf2358-4cba-4d69-81d1-dc02434fe460-secret-volume\") pod 
\"collect-profiles-29419740-xnvv4\" (UID: \"06cf2358-4cba-4d69-81d1-dc02434fe460\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419740-xnvv4" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.477741 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69d10b7f-1714-4536-800b-e7aa5bc7b73a-service-ca-bundle\") pod \"router-default-5444994796-xh8d5\" (UID: \"69d10b7f-1714-4536-800b-e7aa5bc7b73a\") " pod="openshift-ingress/router-default-5444994796-xh8d5" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.477777 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c1b1a4bb-5729-4495-a10f-d5aa62f2c502-srv-cert\") pod \"catalog-operator-68c6474976-t55fv\" (UID: \"c1b1a4bb-5729-4495-a10f-d5aa62f2c502\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t55fv" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.477798 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c736576f-aae9-4d51-a058-f4ad4d95edb4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-szjk8\" (UID: \"c736576f-aae9-4d51-a058-f4ad4d95edb4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-szjk8" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.477820 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/1a02f65b-9fb4-41a5-974a-648ba0e107eb-etcd-service-ca\") pod \"etcd-operator-b45778765-mbv9b\" (UID: \"1a02f65b-9fb4-41a5-974a-648ba0e107eb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mbv9b" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.477885 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/50e1cbc5-727f-42ca-881c-fdd0b07ca739-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jf2nl\" (UID: \"50e1cbc5-727f-42ca-881c-fdd0b07ca739\") " pod="openshift-marketplace/marketplace-operator-79b997595-jf2nl" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.477922 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5qdw\" (UniqueName: \"kubernetes.io/projected/1a02f65b-9fb4-41a5-974a-648ba0e107eb-kube-api-access-w5qdw\") pod \"etcd-operator-b45778765-mbv9b\" (UID: \"1a02f65b-9fb4-41a5-974a-648ba0e107eb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mbv9b" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.477946 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/929dd3e5-a329-4291-8fd1-c998483026cc-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-tf65w\" (UID: \"929dd3e5-a329-4291-8fd1-c998483026cc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tf65w" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.477980 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5cb3c530-6b9c-4e5e-9e1c-bfa8281b6c45-metrics-tls\") pod \"ingress-operator-5b745b69d9-94rjd\" (UID: \"5cb3c530-6b9c-4e5e-9e1c-bfa8281b6c45\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-94rjd" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.478003 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfgd4\" (UniqueName: \"kubernetes.io/projected/b6825925-568b-416e-910f-52e1ee27d741-kube-api-access-dfgd4\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-g4tmm\" (UID: \"b6825925-568b-416e-910f-52e1ee27d741\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g4tmm" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.478042 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6825925-568b-416e-910f-52e1ee27d741-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-g4tmm\" (UID: \"b6825925-568b-416e-910f-52e1ee27d741\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g4tmm" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.478069 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2c04832-2cf3-4401-bf58-b2b5624e5c97-client-ca\") pod \"route-controller-manager-6576b87f9c-9ll9b\" (UID: \"c2c04832-2cf3-4401-bf58-b2b5624e5c97\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9ll9b" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.478319 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/10462781-68cf-4a10-b7c7-b9700465d964-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-59m6v\" (UID: \"10462781-68cf-4a10-b7c7-b9700465d964\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-59m6v" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.478847 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-2hsh8\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.479137 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/13284ed8-fd88-4a11-81e5-b58cf9551c27-proxy-tls\") pod \"machine-config-operator-74547568cd-bwbkt\" (UID: \"13284ed8-fd88-4a11-81e5-b58cf9551c27\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bwbkt" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.479192 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5c4d8f0e-5e42-4fd1-8c31-2cd3ffd7eeda-cert\") pod \"ingress-canary-b59h4\" (UID: \"5c4d8f0e-5e42-4fd1-8c31-2cd3ffd7eeda\") " pod="openshift-ingress-canary/ingress-canary-b59h4" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.479262 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/332b83f9-1f6d-4563-9be3-96003033621d-auth-proxy-config\") pod \"machine-approver-56656f9798-44sjg\" (UID: \"332b83f9-1f6d-4563-9be3-96003033621d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-44sjg" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.479269 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-2hsh8\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.479290 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/cde69387-7c72-4285-a5ec-79f5626eeb96-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-6z2v5\" (UID: \"cde69387-7c72-4285-a5ec-79f5626eeb96\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6z2v5" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.479320 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5bda5be9-3cbb-4005-8c2b-a740d8a7ab57-registry-certificates\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.479345 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-2hsh8\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.479414 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e296093f-3360-4db5-a967-b61c3a5cee51-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-6zp8l\" (UID: \"e296093f-3360-4db5-a967-b61c3a5cee51\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6zp8l" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.479445 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-2hsh8\" (UID: 
\"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.479469 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26659\" (UniqueName: \"kubernetes.io/projected/5bda5be9-3cbb-4005-8c2b-a740d8a7ab57-kube-api-access-26659\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.479534 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-2hsh8\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.479562 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vjf8\" (UniqueName: \"kubernetes.io/projected/c1b1a4bb-5729-4495-a10f-d5aa62f2c502-kube-api-access-8vjf8\") pod \"catalog-operator-68c6474976-t55fv\" (UID: \"c1b1a4bb-5729-4495-a10f-d5aa62f2c502\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t55fv" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.479586 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4mhj\" (UniqueName: \"kubernetes.io/projected/2441969d-cfa3-4842-aeee-54625353b7bf-kube-api-access-d4mhj\") pod \"machine-config-server-pgq6g\" (UID: \"2441969d-cfa3-4842-aeee-54625353b7bf\") " pod="openshift-machine-config-operator/machine-config-server-pgq6g" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.479782 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/04b6f5ef-a04c-47c5-b35d-700ed59c5ac9-socket-dir\") pod \"csi-hostpathplugin-b2vfk\" (UID: \"04b6f5ef-a04c-47c5-b35d-700ed59c5ac9\") " pod="hostpath-provisioner/csi-hostpathplugin-b2vfk" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.479792 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.479938 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-2hsh8\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.479809 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0da3b83e-efc3-4e6d-b876-186f430d3d77-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-ld6f6\" (UID: \"0da3b83e-efc3-4e6d-b876-186f430d3d77\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ld6f6" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.480001 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0bf1894-515b-4ae6-bcf5-148f5db59022-trusted-ca-bundle\") pod \"apiserver-76f77b778f-bk9qw\" (UID: \"c0bf1894-515b-4ae6-bcf5-148f5db59022\") " pod="openshift-apiserver/apiserver-76f77b778f-bk9qw" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.480034 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/06cf2358-4cba-4d69-81d1-dc02434fe460-config-volume\") pod \"collect-profiles-29419740-xnvv4\" (UID: \"06cf2358-4cba-4d69-81d1-dc02434fe460\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419740-xnvv4" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.480065 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d519f527-d5bd-4e76-98de-4ff1e5720698-config\") pod \"service-ca-operator-777779d784-hzscb\" (UID: \"d519f527-d5bd-4e76-98de-4ff1e5720698\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hzscb" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.480088 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/04b6f5ef-a04c-47c5-b35d-700ed59c5ac9-mountpoint-dir\") pod \"csi-hostpathplugin-b2vfk\" (UID: \"04b6f5ef-a04c-47c5-b35d-700ed59c5ac9\") " pod="hostpath-provisioner/csi-hostpathplugin-b2vfk" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.479966 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/332b83f9-1f6d-4563-9be3-96003033621d-auth-proxy-config\") pod \"machine-approver-56656f9798-44sjg\" (UID: \"332b83f9-1f6d-4563-9be3-96003033621d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-44sjg" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.480155 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-2hsh8\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 
09:01:10.480201 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/2441969d-cfa3-4842-aeee-54625353b7bf-node-bootstrap-token\") pod \"machine-config-server-pgq6g\" (UID: \"2441969d-cfa3-4842-aeee-54625353b7bf\") " pod="openshift-machine-config-operator/machine-config-server-pgq6g" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.480226 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/13284ed8-fd88-4a11-81e5-b58cf9551c27-images\") pod \"machine-config-operator-74547568cd-bwbkt\" (UID: \"13284ed8-fd88-4a11-81e5-b58cf9551c27\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bwbkt" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.480259 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/08ac05cf-c12d-4898-a9b0-451f11b44aed-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-srht8\" (UID: \"08ac05cf-c12d-4898-a9b0-451f11b44aed\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-srht8" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.480282 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r5qp\" (UniqueName: \"kubernetes.io/projected/08ac05cf-c12d-4898-a9b0-451f11b44aed-kube-api-access-6r5qp\") pod \"machine-config-controller-84d6567774-srht8\" (UID: \"08ac05cf-c12d-4898-a9b0-451f11b44aed\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-srht8" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.480355 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/69d10b7f-1714-4536-800b-e7aa5bc7b73a-default-certificate\") pod \"router-default-5444994796-xh8d5\" (UID: \"69d10b7f-1714-4536-800b-e7aa5bc7b73a\") " pod="openshift-ingress/router-default-5444994796-xh8d5" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.480633 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5bda5be9-3cbb-4005-8c2b-a740d8a7ab57-installation-pull-secrets\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.480870 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5bda5be9-3cbb-4005-8c2b-a740d8a7ab57-registry-certificates\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.481002 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-2hsh8\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.481274 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-2hsh8\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.481608 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-2hsh8\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.481998 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-2hsh8\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.482106 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5bda5be9-3cbb-4005-8c2b-a740d8a7ab57-registry-tls\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.482110 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-2hsh8\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.482405 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-2hsh8\" (UID: 
\"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.483293 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-2hsh8\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.484371 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/332b83f9-1f6d-4563-9be3-96003033621d-machine-approver-tls\") pod \"machine-approver-56656f9798-44sjg\" (UID: \"332b83f9-1f6d-4563-9be3-96003033621d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-44sjg" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.484826 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-2hsh8\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.485333 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e296093f-3360-4db5-a967-b61c3a5cee51-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-6zp8l\" (UID: \"e296093f-3360-4db5-a967-b61c3a5cee51\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6zp8l" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.500329 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"6f372228dcfd6c84b3c27f5c40ab09e20d1a68cb8ab6796849259d2797e877c6"} Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.501507 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"51ff08bac7fd20fad705de844b233c9f17bead0e9f07e46918bc59631144d8d8"} Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.501550 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"4e54ccc07a4f61e8c342a9af9d0ff520a2abda69f9c6edb38893ac31f107bacb"} Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.501738 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.503409 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"d884021ac9262d842c1c46109d354892e6653aa5ce7d7eaca06b6785a6aa944d"} Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.503434 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"ab23bae57de01a6796e75443568da2b65f55a21df7c11bbd65393b50a33be379"} Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.517052 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2wx9\" (UniqueName: 
\"kubernetes.io/projected/b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7-kube-api-access-x2wx9\") pod \"console-f9d7485db-8dm9l\" (UID: \"b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7\") " pod="openshift-console/console-f9d7485db-8dm9l" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.576996 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzlzj\" (UniqueName: \"kubernetes.io/projected/f6da64f0-d985-46de-bffa-4ae9632c0245-kube-api-access-rzlzj\") pod \"openshift-config-operator-7777fb866f-txnxn\" (UID: \"f6da64f0-d985-46de-bffa-4ae9632c0245\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-txnxn" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.581378 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 09:01:10 crc kubenswrapper[4776]: E1208 09:01:10.581511 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 09:01:11.081491877 +0000 UTC m=+147.344716899 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.581601 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06cf2358-4cba-4d69-81d1-dc02434fe460-config-volume\") pod \"collect-profiles-29419740-xnvv4\" (UID: \"06cf2358-4cba-4d69-81d1-dc02434fe460\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419740-xnvv4" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.581639 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/04b6f5ef-a04c-47c5-b35d-700ed59c5ac9-socket-dir\") pod \"csi-hostpathplugin-b2vfk\" (UID: \"04b6f5ef-a04c-47c5-b35d-700ed59c5ac9\") " pod="hostpath-provisioner/csi-hostpathplugin-b2vfk" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.581695 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d519f527-d5bd-4e76-98de-4ff1e5720698-config\") pod \"service-ca-operator-777779d784-hzscb\" (UID: \"d519f527-d5bd-4e76-98de-4ff1e5720698\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hzscb" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.581723 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/04b6f5ef-a04c-47c5-b35d-700ed59c5ac9-mountpoint-dir\") pod \"csi-hostpathplugin-b2vfk\" (UID: \"04b6f5ef-a04c-47c5-b35d-700ed59c5ac9\") 
" pod="hostpath-provisioner/csi-hostpathplugin-b2vfk" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.582144 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/2441969d-cfa3-4842-aeee-54625353b7bf-node-bootstrap-token\") pod \"machine-config-server-pgq6g\" (UID: \"2441969d-cfa3-4842-aeee-54625353b7bf\") " pod="openshift-machine-config-operator/machine-config-server-pgq6g" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.582033 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/04b6f5ef-a04c-47c5-b35d-700ed59c5ac9-socket-dir\") pod \"csi-hostpathplugin-b2vfk\" (UID: \"04b6f5ef-a04c-47c5-b35d-700ed59c5ac9\") " pod="hostpath-provisioner/csi-hostpathplugin-b2vfk" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.582302 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06cf2358-4cba-4d69-81d1-dc02434fe460-config-volume\") pod \"collect-profiles-29419740-xnvv4\" (UID: \"06cf2358-4cba-4d69-81d1-dc02434fe460\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419740-xnvv4" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.582322 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/13284ed8-fd88-4a11-81e5-b58cf9551c27-images\") pod \"machine-config-operator-74547568cd-bwbkt\" (UID: \"13284ed8-fd88-4a11-81e5-b58cf9551c27\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bwbkt" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.582076 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/04b6f5ef-a04c-47c5-b35d-700ed59c5ac9-mountpoint-dir\") pod \"csi-hostpathplugin-b2vfk\" (UID: 
\"04b6f5ef-a04c-47c5-b35d-700ed59c5ac9\") " pod="hostpath-provisioner/csi-hostpathplugin-b2vfk" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.583040 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d519f527-d5bd-4e76-98de-4ff1e5720698-config\") pod \"service-ca-operator-777779d784-hzscb\" (UID: \"d519f527-d5bd-4e76-98de-4ff1e5720698\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hzscb" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.583217 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/08ac05cf-c12d-4898-a9b0-451f11b44aed-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-srht8\" (UID: \"08ac05cf-c12d-4898-a9b0-451f11b44aed\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-srht8" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.583261 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6r5qp\" (UniqueName: \"kubernetes.io/projected/08ac05cf-c12d-4898-a9b0-451f11b44aed-kube-api-access-6r5qp\") pod \"machine-config-controller-84d6567774-srht8\" (UID: \"08ac05cf-c12d-4898-a9b0-451f11b44aed\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-srht8" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.583281 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/69d10b7f-1714-4536-800b-e7aa5bc7b73a-default-certificate\") pod \"router-default-5444994796-xh8d5\" (UID: \"69d10b7f-1714-4536-800b-e7aa5bc7b73a\") " pod="openshift-ingress/router-default-5444994796-xh8d5" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.583304 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/9213c07d-c865-4fde-b2cd-f28d46033e74-apiservice-cert\") pod \"packageserver-d55dfcdfc-znqkr\" (UID: \"9213c07d-c865-4fde-b2cd-f28d46033e74\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-znqkr" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.583316 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/13284ed8-fd88-4a11-81e5-b58cf9551c27-images\") pod \"machine-config-operator-74547568cd-bwbkt\" (UID: \"13284ed8-fd88-4a11-81e5-b58cf9551c27\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bwbkt" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.583322 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6825925-568b-416e-910f-52e1ee27d741-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-g4tmm\" (UID: \"b6825925-568b-416e-910f-52e1ee27d741\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g4tmm" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.583965 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nht9\" (UniqueName: \"kubernetes.io/projected/c736576f-aae9-4d51-a058-f4ad4d95edb4-kube-api-access-2nht9\") pod \"multus-admission-controller-857f4d67dd-szjk8\" (UID: \"c736576f-aae9-4d51-a058-f4ad4d95edb4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-szjk8" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.584000 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/39c1c6c8-3cb7-44eb-b0cc-f218e4dfe724-metrics-tls\") pod \"dns-operator-744455d44c-2jxng\" (UID: \"39c1c6c8-3cb7-44eb-b0cc-f218e4dfe724\") " pod="openshift-dns-operator/dns-operator-744455d44c-2jxng" Dec 08 09:01:10 crc kubenswrapper[4776]: 
I1208 09:01:10.584022 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5cb3c530-6b9c-4e5e-9e1c-bfa8281b6c45-bound-sa-token\") pod \"ingress-operator-5b745b69d9-94rjd\" (UID: \"5cb3c530-6b9c-4e5e-9e1c-bfa8281b6c45\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-94rjd" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.584045 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rg8m\" (UniqueName: \"kubernetes.io/projected/06cf2358-4cba-4d69-81d1-dc02434fe460-kube-api-access-6rg8m\") pod \"collect-profiles-29419740-xnvv4\" (UID: \"06cf2358-4cba-4d69-81d1-dc02434fe460\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419740-xnvv4" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.584061 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndlzk\" (UniqueName: \"kubernetes.io/projected/5c4d8f0e-5e42-4fd1-8c31-2cd3ffd7eeda-kube-api-access-ndlzk\") pod \"ingress-canary-b59h4\" (UID: \"5c4d8f0e-5e42-4fd1-8c31-2cd3ffd7eeda\") " pod="openshift-ingress-canary/ingress-canary-b59h4" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.584077 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gn7kf\" (UniqueName: \"kubernetes.io/projected/04b6f5ef-a04c-47c5-b35d-700ed59c5ac9-kube-api-access-gn7kf\") pod \"csi-hostpathplugin-b2vfk\" (UID: \"04b6f5ef-a04c-47c5-b35d-700ed59c5ac9\") " pod="hostpath-provisioner/csi-hostpathplugin-b2vfk" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.584101 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9r74b\" (UniqueName: \"kubernetes.io/projected/d519f527-d5bd-4e76-98de-4ff1e5720698-kube-api-access-9r74b\") pod \"service-ca-operator-777779d784-hzscb\" (UID: \"d519f527-d5bd-4e76-98de-4ff1e5720698\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-hzscb" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.584116 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e0478469-3de4-4f1e-8853-c5e4cdfedef0-config-volume\") pod \"dns-default-dzfmd\" (UID: \"e0478469-3de4-4f1e-8853-c5e4cdfedef0\") " pod="openshift-dns/dns-default-dzfmd" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.584133 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpp6m\" (UniqueName: \"kubernetes.io/projected/50e1cbc5-727f-42ca-881c-fdd0b07ca739-kube-api-access-rpp6m\") pod \"marketplace-operator-79b997595-jf2nl\" (UID: \"50e1cbc5-727f-42ca-881c-fdd0b07ca739\") " pod="openshift-marketplace/marketplace-operator-79b997595-jf2nl" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.584241 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16448b50-7f70-4571-8150-a462b3774dfc-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-252w2\" (UID: \"16448b50-7f70-4571-8150-a462b3774dfc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-252w2" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.584275 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/4c9dc880-5c96-4a70-baa5-f4628a9a19be-signing-cabundle\") pod \"service-ca-9c57cc56f-5gpv8\" (UID: \"4c9dc880-5c96-4a70-baa5-f4628a9a19be\") " pod="openshift-service-ca/service-ca-9c57cc56f-5gpv8" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.584301 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2b292f39-b8ff-4187-ae0a-14928dc18e08-kube-api-access\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-ds8zv\" (UID: \"2b292f39-b8ff-4187-ae0a-14928dc18e08\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ds8zv" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.584325 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/2441969d-cfa3-4842-aeee-54625353b7bf-certs\") pod \"machine-config-server-pgq6g\" (UID: \"2441969d-cfa3-4842-aeee-54625353b7bf\") " pod="openshift-machine-config-operator/machine-config-server-pgq6g" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.584342 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/1a02f65b-9fb4-41a5-974a-648ba0e107eb-etcd-ca\") pod \"etcd-operator-b45778765-mbv9b\" (UID: \"1a02f65b-9fb4-41a5-974a-648ba0e107eb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mbv9b" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.584360 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e060d3c0-cba4-4930-a3af-76f7c3f5c9c1-profile-collector-cert\") pod \"olm-operator-6b444d44fb-b9c44\" (UID: \"e060d3c0-cba4-4930-a3af-76f7c3f5c9c1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b9c44" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.584378 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hs9k6\" (UniqueName: \"kubernetes.io/projected/e060d3c0-cba4-4930-a3af-76f7c3f5c9c1-kube-api-access-hs9k6\") pod \"olm-operator-6b444d44fb-b9c44\" (UID: \"e060d3c0-cba4-4930-a3af-76f7c3f5c9c1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b9c44" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.584399 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/6c4c5b58-d542-4ab5-8132-586b180392a0-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-rtrng\" (UID: \"6c4c5b58-d542-4ab5-8132-586b180392a0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rtrng" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.584414 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/50e1cbc5-727f-42ca-881c-fdd0b07ca739-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jf2nl\" (UID: \"50e1cbc5-727f-42ca-881c-fdd0b07ca739\") " pod="openshift-marketplace/marketplace-operator-79b997595-jf2nl" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.584432 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncgqv\" (UniqueName: \"kubernetes.io/projected/e0478469-3de4-4f1e-8853-c5e4cdfedef0-kube-api-access-ncgqv\") pod \"dns-default-dzfmd\" (UID: \"e0478469-3de4-4f1e-8853-c5e4cdfedef0\") " pod="openshift-dns/dns-default-dzfmd" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.584460 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/929dd3e5-a329-4291-8fd1-c998483026cc-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-tf65w\" (UID: \"929dd3e5-a329-4291-8fd1-c998483026cc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tf65w" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.584476 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a02f65b-9fb4-41a5-974a-648ba0e107eb-serving-cert\") pod \"etcd-operator-b45778765-mbv9b\" (UID: \"1a02f65b-9fb4-41a5-974a-648ba0e107eb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mbv9b" Dec 08 09:01:10 
crc kubenswrapper[4776]: I1208 09:01:10.584490 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/4c9dc880-5c96-4a70-baa5-f4628a9a19be-signing-key\") pod \"service-ca-9c57cc56f-5gpv8\" (UID: \"4c9dc880-5c96-4a70-baa5-f4628a9a19be\") " pod="openshift-service-ca/service-ca-9c57cc56f-5gpv8" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.584505 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7tm7\" (UniqueName: \"kubernetes.io/projected/4c9dc880-5c96-4a70-baa5-f4628a9a19be-kube-api-access-g7tm7\") pod \"service-ca-9c57cc56f-5gpv8\" (UID: \"4c9dc880-5c96-4a70-baa5-f4628a9a19be\") " pod="openshift-service-ca/service-ca-9c57cc56f-5gpv8" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.584522 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/69d10b7f-1714-4536-800b-e7aa5bc7b73a-stats-auth\") pod \"router-default-5444994796-xh8d5\" (UID: \"69d10b7f-1714-4536-800b-e7aa5bc7b73a\") " pod="openshift-ingress/router-default-5444994796-xh8d5" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.584549 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d519f527-d5bd-4e76-98de-4ff1e5720698-serving-cert\") pod \"service-ca-operator-777779d784-hzscb\" (UID: \"d519f527-d5bd-4e76-98de-4ff1e5720698\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hzscb" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.584565 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcl6r\" (UniqueName: \"kubernetes.io/projected/69d10b7f-1714-4536-800b-e7aa5bc7b73a-kube-api-access-mcl6r\") pod \"router-default-5444994796-xh8d5\" (UID: \"69d10b7f-1714-4536-800b-e7aa5bc7b73a\") " 
pod="openshift-ingress/router-default-5444994796-xh8d5" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.584588 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.584618 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxgpx\" (UniqueName: \"kubernetes.io/projected/9213c07d-c865-4fde-b2cd-f28d46033e74-kube-api-access-zxgpx\") pod \"packageserver-d55dfcdfc-znqkr\" (UID: \"9213c07d-c865-4fde-b2cd-f28d46033e74\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-znqkr" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.584640 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e0478469-3de4-4f1e-8853-c5e4cdfedef0-metrics-tls\") pod \"dns-default-dzfmd\" (UID: \"e0478469-3de4-4f1e-8853-c5e4cdfedef0\") " pod="openshift-dns/dns-default-dzfmd" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.584662 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhx74\" (UniqueName: \"kubernetes.io/projected/cde69387-7c72-4285-a5ec-79f5626eeb96-kube-api-access-hhx74\") pod \"package-server-manager-789f6589d5-6z2v5\" (UID: \"cde69387-7c72-4285-a5ec-79f5626eeb96\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6z2v5" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.584677 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b292f39-b8ff-4187-ae0a-14928dc18e08-config\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-ds8zv\" (UID: \"2b292f39-b8ff-4187-ae0a-14928dc18e08\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ds8zv" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.584693 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e060d3c0-cba4-4930-a3af-76f7c3f5c9c1-srv-cert\") pod \"olm-operator-6b444d44fb-b9c44\" (UID: \"e060d3c0-cba4-4930-a3af-76f7c3f5c9c1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b9c44" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.584714 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/04b6f5ef-a04c-47c5-b35d-700ed59c5ac9-registration-dir\") pod \"csi-hostpathplugin-b2vfk\" (UID: \"04b6f5ef-a04c-47c5-b35d-700ed59c5ac9\") " pod="hostpath-provisioner/csi-hostpathplugin-b2vfk" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.584729 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9585\" (UniqueName: \"kubernetes.io/projected/6c4c5b58-d542-4ab5-8132-586b180392a0-kube-api-access-z9585\") pod \"openshift-controller-manager-operator-756b6f6bc6-rtrng\" (UID: \"6c4c5b58-d542-4ab5-8132-586b180392a0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rtrng" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.585076 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/929dd3e5-a329-4291-8fd1-c998483026cc-config\") pod \"kube-apiserver-operator-766d6c64bb-tf65w\" (UID: \"929dd3e5-a329-4291-8fd1-c998483026cc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tf65w" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.585105 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/13284ed8-fd88-4a11-81e5-b58cf9551c27-auth-proxy-config\") pod \"machine-config-operator-74547568cd-bwbkt\" (UID: \"13284ed8-fd88-4a11-81e5-b58cf9551c27\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bwbkt" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.585129 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjdx7\" (UniqueName: \"kubernetes.io/projected/deb5eac5-1137-42a3-a5ad-e52a2d822cba-kube-api-access-cjdx7\") pod \"migrator-59844c95c7-ckhmx\" (UID: \"deb5eac5-1137-42a3-a5ad-e52a2d822cba\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ckhmx" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.585147 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbfvz\" (UniqueName: \"kubernetes.io/projected/35911247-ad00-422c-9d30-586834a80f76-kube-api-access-fbfvz\") pod \"control-plane-machine-set-operator-78cbb6b69f-njv2h\" (UID: \"35911247-ad00-422c-9d30-586834a80f76\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-njv2h" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.585186 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b292f39-b8ff-4187-ae0a-14928dc18e08-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ds8zv\" (UID: \"2b292f39-b8ff-4187-ae0a-14928dc18e08\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ds8zv" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.585203 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c1b1a4bb-5729-4495-a10f-d5aa62f2c502-profile-collector-cert\") pod \"catalog-operator-68c6474976-t55fv\" 
(UID: \"c1b1a4bb-5729-4495-a10f-d5aa62f2c502\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t55fv" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.585220 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c4c5b58-d542-4ab5-8132-586b180392a0-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-rtrng\" (UID: \"6c4c5b58-d542-4ab5-8132-586b180392a0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rtrng" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.585268 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/16448b50-7f70-4571-8150-a462b3774dfc-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-252w2\" (UID: \"16448b50-7f70-4571-8150-a462b3774dfc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-252w2" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.585301 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/04b6f5ef-a04c-47c5-b35d-700ed59c5ac9-plugins-dir\") pod \"csi-hostpathplugin-b2vfk\" (UID: \"04b6f5ef-a04c-47c5-b35d-700ed59c5ac9\") " pod="hostpath-provisioner/csi-hostpathplugin-b2vfk" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.585324 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/04b6f5ef-a04c-47c5-b35d-700ed59c5ac9-csi-data-dir\") pod \"csi-hostpathplugin-b2vfk\" (UID: \"04b6f5ef-a04c-47c5-b35d-700ed59c5ac9\") " pod="hostpath-provisioner/csi-hostpathplugin-b2vfk" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.585340 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29tgx\" 
(UniqueName: \"kubernetes.io/projected/5cb3c530-6b9c-4e5e-9e1c-bfa8281b6c45-kube-api-access-29tgx\") pod \"ingress-operator-5b745b69d9-94rjd\" (UID: \"5cb3c530-6b9c-4e5e-9e1c-bfa8281b6c45\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-94rjd" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.585378 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/9213c07d-c865-4fde-b2cd-f28d46033e74-tmpfs\") pod \"packageserver-d55dfcdfc-znqkr\" (UID: \"9213c07d-c865-4fde-b2cd-f28d46033e74\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-znqkr" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.585395 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gclh\" (UniqueName: \"kubernetes.io/projected/39c1c6c8-3cb7-44eb-b0cc-f218e4dfe724-kube-api-access-9gclh\") pod \"dns-operator-744455d44c-2jxng\" (UID: \"39c1c6c8-3cb7-44eb-b0cc-f218e4dfe724\") " pod="openshift-dns-operator/dns-operator-744455d44c-2jxng" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.585413 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2fv9\" (UniqueName: \"kubernetes.io/projected/13284ed8-fd88-4a11-81e5-b58cf9551c27-kube-api-access-w2fv9\") pod \"machine-config-operator-74547568cd-bwbkt\" (UID: \"13284ed8-fd88-4a11-81e5-b58cf9551c27\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bwbkt" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.585431 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16448b50-7f70-4571-8150-a462b3774dfc-config\") pod \"kube-controller-manager-operator-78b949d7b-252w2\" (UID: \"16448b50-7f70-4571-8150-a462b3774dfc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-252w2" Dec 08 09:01:10 crc 
kubenswrapper[4776]: I1208 09:01:10.585447 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1a02f65b-9fb4-41a5-974a-648ba0e107eb-etcd-client\") pod \"etcd-operator-b45778765-mbv9b\" (UID: \"1a02f65b-9fb4-41a5-974a-648ba0e107eb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mbv9b" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.585461 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/08ac05cf-c12d-4898-a9b0-451f11b44aed-proxy-tls\") pod \"machine-config-controller-84d6567774-srht8\" (UID: \"08ac05cf-c12d-4898-a9b0-451f11b44aed\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-srht8" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.585478 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9213c07d-c865-4fde-b2cd-f28d46033e74-webhook-cert\") pod \"packageserver-d55dfcdfc-znqkr\" (UID: \"9213c07d-c865-4fde-b2cd-f28d46033e74\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-znqkr" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.585496 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5cb3c530-6b9c-4e5e-9e1c-bfa8281b6c45-trusted-ca\") pod \"ingress-operator-5b745b69d9-94rjd\" (UID: \"5cb3c530-6b9c-4e5e-9e1c-bfa8281b6c45\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-94rjd" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.585524 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a02f65b-9fb4-41a5-974a-648ba0e107eb-config\") pod \"etcd-operator-b45778765-mbv9b\" (UID: \"1a02f65b-9fb4-41a5-974a-648ba0e107eb\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-mbv9b" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.585541 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/69d10b7f-1714-4536-800b-e7aa5bc7b73a-metrics-certs\") pod \"router-default-5444994796-xh8d5\" (UID: \"69d10b7f-1714-4536-800b-e7aa5bc7b73a\") " pod="openshift-ingress/router-default-5444994796-xh8d5" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.585558 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/35911247-ad00-422c-9d30-586834a80f76-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-njv2h\" (UID: \"35911247-ad00-422c-9d30-586834a80f76\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-njv2h" Dec 08 09:01:10 crc kubenswrapper[4776]: E1208 09:01:10.585577 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 09:01:11.085561326 +0000 UTC m=+147.348786448 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5sqv" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.585625 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06cf2358-4cba-4d69-81d1-dc02434fe460-secret-volume\") pod \"collect-profiles-29419740-xnvv4\" (UID: \"06cf2358-4cba-4d69-81d1-dc02434fe460\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419740-xnvv4" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.585652 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c1b1a4bb-5729-4495-a10f-d5aa62f2c502-srv-cert\") pod \"catalog-operator-68c6474976-t55fv\" (UID: \"c1b1a4bb-5729-4495-a10f-d5aa62f2c502\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t55fv" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.585674 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c736576f-aae9-4d51-a058-f4ad4d95edb4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-szjk8\" (UID: \"c736576f-aae9-4d51-a058-f4ad4d95edb4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-szjk8" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.585697 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/1a02f65b-9fb4-41a5-974a-648ba0e107eb-etcd-service-ca\") pod 
\"etcd-operator-b45778765-mbv9b\" (UID: \"1a02f65b-9fb4-41a5-974a-648ba0e107eb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mbv9b" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.585719 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/50e1cbc5-727f-42ca-881c-fdd0b07ca739-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jf2nl\" (UID: \"50e1cbc5-727f-42ca-881c-fdd0b07ca739\") " pod="openshift-marketplace/marketplace-operator-79b997595-jf2nl" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.585742 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69d10b7f-1714-4536-800b-e7aa5bc7b73a-service-ca-bundle\") pod \"router-default-5444994796-xh8d5\" (UID: \"69d10b7f-1714-4536-800b-e7aa5bc7b73a\") " pod="openshift-ingress/router-default-5444994796-xh8d5" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.585776 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5qdw\" (UniqueName: \"kubernetes.io/projected/1a02f65b-9fb4-41a5-974a-648ba0e107eb-kube-api-access-w5qdw\") pod \"etcd-operator-b45778765-mbv9b\" (UID: \"1a02f65b-9fb4-41a5-974a-648ba0e107eb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mbv9b" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.585800 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/929dd3e5-a329-4291-8fd1-c998483026cc-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-tf65w\" (UID: \"929dd3e5-a329-4291-8fd1-c998483026cc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tf65w" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.585824 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/5cb3c530-6b9c-4e5e-9e1c-bfa8281b6c45-metrics-tls\") pod \"ingress-operator-5b745b69d9-94rjd\" (UID: \"5cb3c530-6b9c-4e5e-9e1c-bfa8281b6c45\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-94rjd" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.585847 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfgd4\" (UniqueName: \"kubernetes.io/projected/b6825925-568b-416e-910f-52e1ee27d741-kube-api-access-dfgd4\") pod \"kube-storage-version-migrator-operator-b67b599dd-g4tmm\" (UID: \"b6825925-568b-416e-910f-52e1ee27d741\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g4tmm" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.585869 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6825925-568b-416e-910f-52e1ee27d741-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-g4tmm\" (UID: \"b6825925-568b-416e-910f-52e1ee27d741\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g4tmm" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.585901 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/13284ed8-fd88-4a11-81e5-b58cf9551c27-proxy-tls\") pod \"machine-config-operator-74547568cd-bwbkt\" (UID: \"13284ed8-fd88-4a11-81e5-b58cf9551c27\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bwbkt" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.585924 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5c4d8f0e-5e42-4fd1-8c31-2cd3ffd7eeda-cert\") pod \"ingress-canary-b59h4\" (UID: \"5c4d8f0e-5e42-4fd1-8c31-2cd3ffd7eeda\") " 
pod="openshift-ingress-canary/ingress-canary-b59h4" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.585954 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/cde69387-7c72-4285-a5ec-79f5626eeb96-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-6z2v5\" (UID: \"cde69387-7c72-4285-a5ec-79f5626eeb96\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6z2v5" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.585997 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vjf8\" (UniqueName: \"kubernetes.io/projected/c1b1a4bb-5729-4495-a10f-d5aa62f2c502-kube-api-access-8vjf8\") pod \"catalog-operator-68c6474976-t55fv\" (UID: \"c1b1a4bb-5729-4495-a10f-d5aa62f2c502\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t55fv" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.586023 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4mhj\" (UniqueName: \"kubernetes.io/projected/2441969d-cfa3-4842-aeee-54625353b7bf-kube-api-access-d4mhj\") pod \"machine-config-server-pgq6g\" (UID: \"2441969d-cfa3-4842-aeee-54625353b7bf\") " pod="openshift-machine-config-operator/machine-config-server-pgq6g" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.586056 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/929dd3e5-a329-4291-8fd1-c998483026cc-config\") pod \"kube-apiserver-operator-766d6c64bb-tf65w\" (UID: \"929dd3e5-a329-4291-8fd1-c998483026cc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tf65w" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.586282 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: 
\"kubernetes.io/configmap/1a02f65b-9fb4-41a5-974a-648ba0e107eb-etcd-ca\") pod \"etcd-operator-b45778765-mbv9b\" (UID: \"1a02f65b-9fb4-41a5-974a-648ba0e107eb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mbv9b" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.586672 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/69d10b7f-1714-4536-800b-e7aa5bc7b73a-default-certificate\") pod \"router-default-5444994796-xh8d5\" (UID: \"69d10b7f-1714-4536-800b-e7aa5bc7b73a\") " pod="openshift-ingress/router-default-5444994796-xh8d5" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.586696 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/13284ed8-fd88-4a11-81e5-b58cf9551c27-auth-proxy-config\") pod \"machine-config-operator-74547568cd-bwbkt\" (UID: \"13284ed8-fd88-4a11-81e5-b58cf9551c27\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bwbkt" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.586987 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b292f39-b8ff-4187-ae0a-14928dc18e08-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ds8zv\" (UID: \"2b292f39-b8ff-4187-ae0a-14928dc18e08\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ds8zv" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.587148 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/08ac05cf-c12d-4898-a9b0-451f11b44aed-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-srht8\" (UID: \"08ac05cf-c12d-4898-a9b0-451f11b44aed\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-srht8" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 
09:01:10.587243 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/4c9dc880-5c96-4a70-baa5-f4628a9a19be-signing-cabundle\") pod \"service-ca-9c57cc56f-5gpv8\" (UID: \"4c9dc880-5c96-4a70-baa5-f4628a9a19be\") " pod="openshift-service-ca/service-ca-9c57cc56f-5gpv8" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.588440 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16448b50-7f70-4571-8150-a462b3774dfc-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-252w2\" (UID: \"16448b50-7f70-4571-8150-a462b3774dfc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-252w2" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.584136 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6825925-568b-416e-910f-52e1ee27d741-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-g4tmm\" (UID: \"b6825925-568b-416e-910f-52e1ee27d741\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g4tmm" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.589039 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/39c1c6c8-3cb7-44eb-b0cc-f218e4dfe724-metrics-tls\") pod \"dns-operator-744455d44c-2jxng\" (UID: \"39c1c6c8-3cb7-44eb-b0cc-f218e4dfe724\") " pod="openshift-dns-operator/dns-operator-744455d44c-2jxng" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.589226 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/50e1cbc5-727f-42ca-881c-fdd0b07ca739-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jf2nl\" (UID: 
\"50e1cbc5-727f-42ca-881c-fdd0b07ca739\") " pod="openshift-marketplace/marketplace-operator-79b997595-jf2nl" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.590347 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/929dd3e5-a329-4291-8fd1-c998483026cc-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-tf65w\" (UID: \"929dd3e5-a329-4291-8fd1-c998483026cc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tf65w" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.590382 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9213c07d-c865-4fde-b2cd-f28d46033e74-apiservice-cert\") pod \"packageserver-d55dfcdfc-znqkr\" (UID: \"9213c07d-c865-4fde-b2cd-f28d46033e74\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-znqkr" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.590563 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/1a02f65b-9fb4-41a5-974a-648ba0e107eb-etcd-service-ca\") pod \"etcd-operator-b45778765-mbv9b\" (UID: \"1a02f65b-9fb4-41a5-974a-648ba0e107eb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mbv9b" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.594139 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a02f65b-9fb4-41a5-974a-648ba0e107eb-serving-cert\") pod \"etcd-operator-b45778765-mbv9b\" (UID: \"1a02f65b-9fb4-41a5-974a-648ba0e107eb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mbv9b" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.594316 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/50e1cbc5-727f-42ca-881c-fdd0b07ca739-marketplace-trusted-ca\") pod 
\"marketplace-operator-79b997595-jf2nl\" (UID: \"50e1cbc5-727f-42ca-881c-fdd0b07ca739\") " pod="openshift-marketplace/marketplace-operator-79b997595-jf2nl" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.594162 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/69d10b7f-1714-4536-800b-e7aa5bc7b73a-stats-auth\") pod \"router-default-5444994796-xh8d5\" (UID: \"69d10b7f-1714-4536-800b-e7aa5bc7b73a\") " pod="openshift-ingress/router-default-5444994796-xh8d5" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.594601 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/13284ed8-fd88-4a11-81e5-b58cf9551c27-proxy-tls\") pod \"machine-config-operator-74547568cd-bwbkt\" (UID: \"13284ed8-fd88-4a11-81e5-b58cf9551c27\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bwbkt" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.594932 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69d10b7f-1714-4536-800b-e7aa5bc7b73a-service-ca-bundle\") pod \"router-default-5444994796-xh8d5\" (UID: \"69d10b7f-1714-4536-800b-e7aa5bc7b73a\") " pod="openshift-ingress/router-default-5444994796-xh8d5" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.595283 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b292f39-b8ff-4187-ae0a-14928dc18e08-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ds8zv\" (UID: \"2b292f39-b8ff-4187-ae0a-14928dc18e08\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ds8zv" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.585297 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/04b6f5ef-a04c-47c5-b35d-700ed59c5ac9-registration-dir\") pod \"csi-hostpathplugin-b2vfk\" (UID: \"04b6f5ef-a04c-47c5-b35d-700ed59c5ac9\") " pod="hostpath-provisioner/csi-hostpathplugin-b2vfk" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.595405 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e060d3c0-cba4-4930-a3af-76f7c3f5c9c1-srv-cert\") pod \"olm-operator-6b444d44fb-b9c44\" (UID: \"e060d3c0-cba4-4930-a3af-76f7c3f5c9c1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b9c44" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.595552 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/04b6f5ef-a04c-47c5-b35d-700ed59c5ac9-csi-data-dir\") pod \"csi-hostpathplugin-b2vfk\" (UID: \"04b6f5ef-a04c-47c5-b35d-700ed59c5ac9\") " pod="hostpath-provisioner/csi-hostpathplugin-b2vfk" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.595747 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e060d3c0-cba4-4930-a3af-76f7c3f5c9c1-profile-collector-cert\") pod \"olm-operator-6b444d44fb-b9c44\" (UID: \"e060d3c0-cba4-4930-a3af-76f7c3f5c9c1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b9c44" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.595862 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/04b6f5ef-a04c-47c5-b35d-700ed59c5ac9-plugins-dir\") pod \"csi-hostpathplugin-b2vfk\" (UID: \"04b6f5ef-a04c-47c5-b35d-700ed59c5ac9\") " pod="hostpath-provisioner/csi-hostpathplugin-b2vfk" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.596237 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1a02f65b-9fb4-41a5-974a-648ba0e107eb-config\") pod \"etcd-operator-b45778765-mbv9b\" (UID: \"1a02f65b-9fb4-41a5-974a-648ba0e107eb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mbv9b" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.596445 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c1b1a4bb-5729-4495-a10f-d5aa62f2c502-profile-collector-cert\") pod \"catalog-operator-68c6474976-t55fv\" (UID: \"c1b1a4bb-5729-4495-a10f-d5aa62f2c502\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t55fv" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.597307 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5cb3c530-6b9c-4e5e-9e1c-bfa8281b6c45-trusted-ca\") pod \"ingress-operator-5b745b69d9-94rjd\" (UID: \"5cb3c530-6b9c-4e5e-9e1c-bfa8281b6c45\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-94rjd" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.597993 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9213c07d-c865-4fde-b2cd-f28d46033e74-webhook-cert\") pod \"packageserver-d55dfcdfc-znqkr\" (UID: \"9213c07d-c865-4fde-b2cd-f28d46033e74\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-znqkr" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.598007 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6c4c5b58-d542-4ab5-8132-586b180392a0-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-rtrng\" (UID: \"6c4c5b58-d542-4ab5-8132-586b180392a0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rtrng" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.598456 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c4c5b58-d542-4ab5-8132-586b180392a0-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-rtrng\" (UID: \"6c4c5b58-d542-4ab5-8132-586b180392a0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rtrng" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.598637 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16448b50-7f70-4571-8150-a462b3774dfc-config\") pod \"kube-controller-manager-operator-78b949d7b-252w2\" (UID: \"16448b50-7f70-4571-8150-a462b3774dfc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-252w2" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.598800 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/9213c07d-c865-4fde-b2cd-f28d46033e74-tmpfs\") pod \"packageserver-d55dfcdfc-znqkr\" (UID: \"9213c07d-c865-4fde-b2cd-f28d46033e74\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-znqkr" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.598961 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/cde69387-7c72-4285-a5ec-79f5626eeb96-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-6z2v5\" (UID: \"cde69387-7c72-4285-a5ec-79f5626eeb96\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6z2v5" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.599476 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p558p\" (UniqueName: \"kubernetes.io/projected/d6d2a3a0-669f-41c3-8a04-1a4f7f961f1b-kube-api-access-p558p\") pod \"downloads-7954f5f757-559sf\" (UID: 
\"d6d2a3a0-669f-41c3-8a04-1a4f7f961f1b\") " pod="openshift-console/downloads-7954f5f757-559sf" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.599596 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5cb3c530-6b9c-4e5e-9e1c-bfa8281b6c45-metrics-tls\") pod \"ingress-operator-5b745b69d9-94rjd\" (UID: \"5cb3c530-6b9c-4e5e-9e1c-bfa8281b6c45\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-94rjd" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.599727 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6825925-568b-416e-910f-52e1ee27d741-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-g4tmm\" (UID: \"b6825925-568b-416e-910f-52e1ee27d741\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g4tmm" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.599918 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/35911247-ad00-422c-9d30-586834a80f76-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-njv2h\" (UID: \"35911247-ad00-422c-9d30-586834a80f76\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-njv2h" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.600404 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c736576f-aae9-4d51-a058-f4ad4d95edb4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-szjk8\" (UID: \"c736576f-aae9-4d51-a058-f4ad4d95edb4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-szjk8" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.600632 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etcd-client\" (UniqueName: \"kubernetes.io/secret/1a02f65b-9fb4-41a5-974a-648ba0e107eb-etcd-client\") pod \"etcd-operator-b45778765-mbv9b\" (UID: \"1a02f65b-9fb4-41a5-974a-648ba0e107eb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mbv9b" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.600977 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/69d10b7f-1714-4536-800b-e7aa5bc7b73a-metrics-certs\") pod \"router-default-5444994796-xh8d5\" (UID: \"69d10b7f-1714-4536-800b-e7aa5bc7b73a\") " pod="openshift-ingress/router-default-5444994796-xh8d5" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.603213 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06cf2358-4cba-4d69-81d1-dc02434fe460-secret-volume\") pod \"collect-profiles-29419740-xnvv4\" (UID: \"06cf2358-4cba-4d69-81d1-dc02434fe460\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419740-xnvv4" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.603688 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c1b1a4bb-5729-4495-a10f-d5aa62f2c502-srv-cert\") pod \"catalog-operator-68c6474976-t55fv\" (UID: \"c1b1a4bb-5729-4495-a10f-d5aa62f2c502\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t55fv" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.607738 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/4c9dc880-5c96-4a70-baa5-f4628a9a19be-signing-key\") pod \"service-ca-9c57cc56f-5gpv8\" (UID: \"4c9dc880-5c96-4a70-baa5-f4628a9a19be\") " pod="openshift-service-ca/service-ca-9c57cc56f-5gpv8" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.608962 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/d519f527-d5bd-4e76-98de-4ff1e5720698-serving-cert\") pod \"service-ca-operator-777779d784-hzscb\" (UID: \"d519f527-d5bd-4e76-98de-4ff1e5720698\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hzscb" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.608959 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/08ac05cf-c12d-4898-a9b0-451f11b44aed-proxy-tls\") pod \"machine-config-controller-84d6567774-srht8\" (UID: \"08ac05cf-c12d-4898-a9b0-451f11b44aed\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-srht8" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.614589 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdnk7\" (UniqueName: \"kubernetes.io/projected/c2c04832-2cf3-4401-bf58-b2b5624e5c97-kube-api-access-bdnk7\") pod \"route-controller-manager-6576b87f9c-9ll9b\" (UID: \"c2c04832-2cf3-4401-bf58-b2b5624e5c97\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9ll9b" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.639693 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvcs6\" (UniqueName: \"kubernetes.io/projected/178bd27c-e3da-4218-9785-9d7c8b1bf89a-kube-api-access-hvcs6\") pod \"authentication-operator-69f744f599-fm86d\" (UID: \"178bd27c-e3da-4218-9785-9d7c8b1bf89a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fm86d" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.657153 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-fm86d" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.658588 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c72nd\" (UniqueName: \"kubernetes.io/projected/5ea3906e-d311-4b90-80be-7405507e135e-kube-api-access-c72nd\") pod \"machine-api-operator-5694c8668f-jxkd8\" (UID: \"5ea3906e-d311-4b90-80be-7405507e135e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jxkd8" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.677284 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzrv4\" (UniqueName: \"kubernetes.io/projected/e7c51c82-c887-4c77-bfa2-cb3c5e896751-kube-api-access-tzrv4\") pod \"openshift-apiserver-operator-796bbdcf4f-vld4h\" (UID: \"e7c51c82-c887-4c77-bfa2-cb3c5e896751\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vld4h" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.686990 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 09:01:10 crc kubenswrapper[4776]: E1208 09:01:10.687127 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 09:01:11.187109651 +0000 UTC m=+147.450334673 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.687443 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:10 crc kubenswrapper[4776]: E1208 09:01:10.688079 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 09:01:11.188058766 +0000 UTC m=+147.451283798 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5sqv" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.693884 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thkdq\" (UniqueName: \"kubernetes.io/projected/e981fb43-6f44-4462-b97c-f64658cd7c97-kube-api-access-thkdq\") pod \"apiserver-7bbb656c7d-lmfgv\" (UID: \"e981fb43-6f44-4462-b97c-f64658cd7c97\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmfgv" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.699500 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.707420 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5c4d8f0e-5e42-4fd1-8c31-2cd3ffd7eeda-cert\") pod \"ingress-canary-b59h4\" (UID: \"5c4d8f0e-5e42-4fd1-8c31-2cd3ffd7eeda\") " pod="openshift-ingress-canary/ingress-canary-b59h4" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.726046 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.740709 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-8dm9l" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.740941 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.761476 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.780104 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.789025 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 09:01:10 crc kubenswrapper[4776]: E1208 09:01:10.789402 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 09:01:11.289377515 +0000 UTC m=+147.552602537 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.789792 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:10 crc kubenswrapper[4776]: E1208 09:01:10.790625 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 09:01:11.290607558 +0000 UTC m=+147.553832580 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5sqv" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.800657 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.803101 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-559sf" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.809616 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-txnxn" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.813444 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/2441969d-cfa3-4842-aeee-54625353b7bf-certs\") pod \"machine-config-server-pgq6g\" (UID: \"2441969d-cfa3-4842-aeee-54625353b7bf\") " pod="openshift-machine-config-operator/machine-config-server-pgq6g" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.814360 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-fm86d"] Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.819678 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.827607 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" 
(UniqueName: \"kubernetes.io/secret/2441969d-cfa3-4842-aeee-54625353b7bf-node-bootstrap-token\") pod \"machine-config-server-pgq6g\" (UID: \"2441969d-cfa3-4842-aeee-54625353b7bf\") " pod="openshift-machine-config-operator/machine-config-server-pgq6g" Dec 08 09:01:10 crc kubenswrapper[4776]: W1208 09:01:10.833661 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod178bd27c_e3da_4218_9785_9d7c8b1bf89a.slice/crio-90a76cc3ae97fca5b8c5462d14444bde0a147dbcfbc993daa8ba654bc4967099 WatchSource:0}: Error finding container 90a76cc3ae97fca5b8c5462d14444bde0a147dbcfbc993daa8ba654bc4967099: Status 404 returned error can't find the container with id 90a76cc3ae97fca5b8c5462d14444bde0a147dbcfbc993daa8ba654bc4967099 Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.839603 4776 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.859092 4776 request.go:700] Waited for 1.862837234s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/hostpath-provisioner/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.861136 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.880193 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.891589 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 09:01:10 crc kubenswrapper[4776]: E1208 09:01:10.892004 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 09:01:11.391981218 +0000 UTC m=+147.655206250 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.892196 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:10 crc kubenswrapper[4776]: E1208 09:01:10.892505 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 09:01:11.392498252 +0000 UTC m=+147.655723274 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5sqv" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.903100 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-jxkd8" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.905269 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.909455 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vld4h" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.910443 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e0478469-3de4-4f1e-8853-c5e4cdfedef0-config-volume\") pod \"dns-default-dzfmd\" (UID: \"e0478469-3de4-4f1e-8853-c5e4cdfedef0\") " pod="openshift-dns/dns-default-dzfmd" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.919252 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.926212 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-8dm9l"] Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.933866 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e0478469-3de4-4f1e-8853-c5e4cdfedef0-metrics-tls\") pod 
\"dns-default-dzfmd\" (UID: \"e0478469-3de4-4f1e-8853-c5e4cdfedef0\") " pod="openshift-dns/dns-default-dzfmd" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.940393 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.960244 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.969111 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0da3b83e-efc3-4e6d-b876-186f430d3d77-client-ca\") pod \"controller-manager-879f6c89f-ld6f6\" (UID: \"0da3b83e-efc3-4e6d-b876-186f430d3d77\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ld6f6" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.984809 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.991001 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmfgv" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.992513 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b8fd850-14eb-419b-a1d4-e7de203c419f-serving-cert\") pod \"console-operator-58897d9998-dndwl\" (UID: \"6b8fd850-14eb-419b-a1d4-e7de203c419f\") " pod="openshift-console-operator/console-operator-58897d9998-dndwl" Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.993082 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 09:01:10 crc kubenswrapper[4776]: E1208 09:01:10.993273 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 09:01:11.493232136 +0000 UTC m=+147.756457148 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:10 crc kubenswrapper[4776]: I1208 09:01:10.993589 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:10 crc kubenswrapper[4776]: E1208 09:01:10.994097 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 09:01:11.494074138 +0000 UTC m=+147.757299170 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5sqv" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.007380 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.022076 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.048148 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-559sf"] Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.049571 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.051089 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0da3b83e-efc3-4e6d-b876-186f430d3d77-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-ld6f6\" (UID: \"0da3b83e-efc3-4e6d-b876-186f430d3d77\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ld6f6" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.055039 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-txnxn"] Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.064476 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 
09:01:11.076942 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0bf1894-515b-4ae6-bcf5-148f5db59022-serving-cert\") pod \"apiserver-76f77b778f-bk9qw\" (UID: \"c0bf1894-515b-4ae6-bcf5-148f5db59022\") " pod="openshift-apiserver/apiserver-76f77b778f-bk9qw" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.095071 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.095860 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 09:01:11 crc kubenswrapper[4776]: E1208 09:01:11.096060 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 09:01:11.596035385 +0000 UTC m=+147.859260407 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.096608 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:11 crc kubenswrapper[4776]: E1208 09:01:11.097001 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 09:01:11.59699154 +0000 UTC m=+147.860216562 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5sqv" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.100045 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.101896 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0bf1894-515b-4ae6-bcf5-148f5db59022-trusted-ca-bundle\") pod \"apiserver-76f77b778f-bk9qw\" (UID: \"c0bf1894-515b-4ae6-bcf5-148f5db59022\") " pod="openshift-apiserver/apiserver-76f77b778f-bk9qw" Dec 08 09:01:11 crc kubenswrapper[4776]: W1208 09:01:11.116357 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6da64f0_d985_46de_bffa_4ae9632c0245.slice/crio-188d3874ea00db2761038346125b045f1fb23e22ef287d61cb2894a09ee0969a WatchSource:0}: Error finding container 188d3874ea00db2761038346125b045f1fb23e22ef287d61cb2894a09ee0969a: Status 404 returned error can't find the container with id 188d3874ea00db2761038346125b045f1fb23e22ef287d61cb2894a09ee0969a Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.126046 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.130877 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c0bf1894-515b-4ae6-bcf5-148f5db59022-audit\") pod 
\"apiserver-76f77b778f-bk9qw\" (UID: \"c0bf1894-515b-4ae6-bcf5-148f5db59022\") " pod="openshift-apiserver/apiserver-76f77b778f-bk9qw" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.142765 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.144608 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-jxkd8"] Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.150065 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0da3b83e-efc3-4e6d-b876-186f430d3d77-serving-cert\") pod \"controller-manager-879f6c89f-ld6f6\" (UID: \"0da3b83e-efc3-4e6d-b876-186f430d3d77\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ld6f6" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.159866 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.179228 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.189960 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2c04832-2cf3-4401-bf58-b2b5624e5c97-client-ca\") pod \"route-controller-manager-6576b87f9c-9ll9b\" (UID: \"c2c04832-2cf3-4401-bf58-b2b5624e5c97\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9ll9b" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.197557 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 09:01:11 crc kubenswrapper[4776]: E1208 09:01:11.197806 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 09:01:11.697782795 +0000 UTC m=+147.961007817 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.198016 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:11 crc kubenswrapper[4776]: E1208 09:01:11.198666 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 09:01:11.698658778 +0000 UTC m=+147.961883800 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5sqv" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.201314 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.209880 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2c04832-2cf3-4401-bf58-b2b5624e5c97-config\") pod \"route-controller-manager-6576b87f9c-9ll9b\" (UID: \"c2c04832-2cf3-4401-bf58-b2b5624e5c97\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9ll9b" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.214649 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vld4h"] Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.221448 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.230548 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c0bf1894-515b-4ae6-bcf5-148f5db59022-etcd-client\") pod \"apiserver-76f77b778f-bk9qw\" (UID: \"c0bf1894-515b-4ae6-bcf5-148f5db59022\") " pod="openshift-apiserver/apiserver-76f77b778f-bk9qw" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.246090 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 08 09:01:11 crc 
kubenswrapper[4776]: I1208 09:01:11.250932 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-lmfgv"] Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.260980 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.261970 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c0bf1894-515b-4ae6-bcf5-148f5db59022-encryption-config\") pod \"apiserver-76f77b778f-bk9qw\" (UID: \"c0bf1894-515b-4ae6-bcf5-148f5db59022\") " pod="openshift-apiserver/apiserver-76f77b778f-bk9qw" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.280074 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.293597 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9njh\" (UniqueName: \"kubernetes.io/projected/c0bf1894-515b-4ae6-bcf5-148f5db59022-kube-api-access-f9njh\") pod \"apiserver-76f77b778f-bk9qw\" (UID: \"c0bf1894-515b-4ae6-bcf5-148f5db59022\") " pod="openshift-apiserver/apiserver-76f77b778f-bk9qw" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.298872 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 09:01:11 crc kubenswrapper[4776]: E1208 09:01:11.299427 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-08 09:01:11.799409382 +0000 UTC m=+148.062634404 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.299572 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:11 crc kubenswrapper[4776]: E1208 09:01:11.299997 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 09:01:11.799984478 +0000 UTC m=+148.063209500 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5sqv" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.301647 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.308839 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kthb8\" (UniqueName: \"kubernetes.io/projected/0da3b83e-efc3-4e6d-b876-186f430d3d77-kube-api-access-kthb8\") pod \"controller-manager-879f6c89f-ld6f6\" (UID: \"0da3b83e-efc3-4e6d-b876-186f430d3d77\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ld6f6" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.354083 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5bda5be9-3cbb-4005-8c2b-a740d8a7ab57-bound-sa-token\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.376487 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2kmn\" (UniqueName: \"kubernetes.io/projected/6b8fd850-14eb-419b-a1d4-e7de203c419f-kube-api-access-x2kmn\") pod \"console-operator-58897d9998-dndwl\" (UID: \"6b8fd850-14eb-419b-a1d4-e7de203c419f\") " pod="openshift-console-operator/console-operator-58897d9998-dndwl" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.393806 4776 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-rh4fd\" (UniqueName: \"kubernetes.io/projected/7c6fbdd6-0243-4372-a986-cc73d2df8a74-kube-api-access-rh4fd\") pod \"oauth-openshift-558db77b4-2hsh8\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.401131 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 09:01:11 crc kubenswrapper[4776]: E1208 09:01:11.401437 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 09:01:11.90140559 +0000 UTC m=+148.164630622 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.401779 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:11 crc kubenswrapper[4776]: E1208 09:01:11.402709 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 09:01:11.902598101 +0000 UTC m=+148.165823123 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5sqv" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.407607 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.407687 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.420393 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2794h\" (UniqueName: \"kubernetes.io/projected/e296093f-3360-4db5-a967-b61c3a5cee51-kube-api-access-2794h\") pod \"cluster-image-registry-operator-dc59b4c8b-6zp8l\" (UID: \"e296093f-3360-4db5-a967-b61c3a5cee51\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6zp8l" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.424682 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-bk9qw" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.440529 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v627z\" (UniqueName: \"kubernetes.io/projected/10462781-68cf-4a10-b7c7-b9700465d964-kube-api-access-v627z\") pod \"cluster-samples-operator-665b6dd947-59m6v\" (UID: \"10462781-68cf-4a10-b7c7-b9700465d964\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-59m6v" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.452840 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e296093f-3360-4db5-a967-b61c3a5cee51-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-6zp8l\" (UID: \"e296093f-3360-4db5-a967-b61c3a5cee51\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6zp8l" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.467416 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ld6f6" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.475237 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pclzp\" (UniqueName: \"kubernetes.io/projected/332b83f9-1f6d-4563-9be3-96003033621d-kube-api-access-pclzp\") pod \"machine-approver-56656f9798-44sjg\" (UID: \"332b83f9-1f6d-4563-9be3-96003033621d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-44sjg" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.477066 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-dndwl" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.487620 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9ll9b" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.494830 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26659\" (UniqueName: \"kubernetes.io/projected/5bda5be9-3cbb-4005-8c2b-a740d8a7ab57-kube-api-access-26659\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.503013 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 09:01:11 crc kubenswrapper[4776]: E1208 09:01:11.503589 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 09:01:12.003575361 +0000 UTC m=+148.266800383 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.512207 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmfgv" event={"ID":"e981fb43-6f44-4462-b97c-f64658cd7c97","Type":"ContainerStarted","Data":"dd4893d0ddc161cfbe9e7ee1deddc3c8455f9c03aca4d2c63a6b96c3267fff7e"} Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.515686 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vld4h" event={"ID":"e7c51c82-c887-4c77-bfa2-cb3c5e896751","Type":"ContainerStarted","Data":"6cea2bd94d04a2dead7916408e7dc941f1ab9725c9953e7fff5203fa4a9aa1e7"} Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.515727 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vld4h" event={"ID":"e7c51c82-c887-4c77-bfa2-cb3c5e896751","Type":"ContainerStarted","Data":"6d9877b21409e650cb9f9834a884e06d5b838a0df1f93db0137dea4a43817821"} Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.518193 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r5qp\" (UniqueName: \"kubernetes.io/projected/08ac05cf-c12d-4898-a9b0-451f11b44aed-kube-api-access-6r5qp\") pod \"machine-config-controller-84d6567774-srht8\" (UID: \"08ac05cf-c12d-4898-a9b0-451f11b44aed\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-srht8" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.523835 4776 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-jxkd8" event={"ID":"5ea3906e-d311-4b90-80be-7405507e135e","Type":"ContainerStarted","Data":"53b5c98e062e0689e65f8af161133dce94017a5baee9763698219a14726cab59"} Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.523866 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-jxkd8" event={"ID":"5ea3906e-d311-4b90-80be-7405507e135e","Type":"ContainerStarted","Data":"261e54873c13d9115644ae354ee688c60731cc26c787bc51fb341a53f819b3e0"} Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.525013 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-8dm9l" event={"ID":"b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7","Type":"ContainerStarted","Data":"6c7eb012cb7d51a3b063f85749435e8197befa048320da5ee4fd8e3835a4e382"} Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.525034 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-8dm9l" event={"ID":"b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7","Type":"ContainerStarted","Data":"2417e37c033a85edb2c7086c7ee3a96dde2b860e19053f0a335f62b734cd3c84"} Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.536416 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rg8m\" (UniqueName: \"kubernetes.io/projected/06cf2358-4cba-4d69-81d1-dc02434fe460-kube-api-access-6rg8m\") pod \"collect-profiles-29419740-xnvv4\" (UID: \"06cf2358-4cba-4d69-81d1-dc02434fe460\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419740-xnvv4" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.537812 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-44sjg" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.547820 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.551311 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-fm86d" event={"ID":"178bd27c-e3da-4218-9785-9d7c8b1bf89a","Type":"ContainerStarted","Data":"2a944c5846cc4d9188545a35792d544fd196874f7d407a932c417ed9a8128739"} Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.551363 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-fm86d" event={"ID":"178bd27c-e3da-4218-9785-9d7c8b1bf89a","Type":"ContainerStarted","Data":"90a76cc3ae97fca5b8c5462d14444bde0a147dbcfbc993daa8ba654bc4967099"} Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.554770 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-srht8" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.556616 4776 generic.go:334] "Generic (PLEG): container finished" podID="f6da64f0-d985-46de-bffa-4ae9632c0245" containerID="27afdd26f63ac717c5f5d7aa0cd419d62dd8b7a6dc64a6ca124561acdd56a6ea" exitCode=0 Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.556965 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-txnxn" event={"ID":"f6da64f0-d985-46de-bffa-4ae9632c0245","Type":"ContainerDied","Data":"27afdd26f63ac717c5f5d7aa0cd419d62dd8b7a6dc64a6ca124561acdd56a6ea"} Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.557000 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-txnxn" event={"ID":"f6da64f0-d985-46de-bffa-4ae9632c0245","Type":"ContainerStarted","Data":"188d3874ea00db2761038346125b045f1fb23e22ef287d61cb2894a09ee0969a"} Dec 08 09:01:11 crc kubenswrapper[4776]: 
I1208 09:01:11.559559 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nht9\" (UniqueName: \"kubernetes.io/projected/c736576f-aae9-4d51-a058-f4ad4d95edb4-kube-api-access-2nht9\") pod \"multus-admission-controller-857f4d67dd-szjk8\" (UID: \"c736576f-aae9-4d51-a058-f4ad4d95edb4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-szjk8" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.570779 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29419740-xnvv4" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.577345 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-559sf" event={"ID":"d6d2a3a0-669f-41c3-8a04-1a4f7f961f1b","Type":"ContainerStarted","Data":"0e401ea5aee4d0609d3a377cf7762ec76c701665f0e0d58ff6659be84162231d"} Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.577377 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-559sf" event={"ID":"d6d2a3a0-669f-41c3-8a04-1a4f7f961f1b","Type":"ContainerStarted","Data":"a182e3836e41d40aa61ffaf551da63266582daefe4d74667e2cb000a857c247f"} Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.578205 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-559sf" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.580216 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5cb3c530-6b9c-4e5e-9e1c-bfa8281b6c45-bound-sa-token\") pod \"ingress-operator-5b745b69d9-94rjd\" (UID: \"5cb3c530-6b9c-4e5e-9e1c-bfa8281b6c45\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-94rjd" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.583724 4776 patch_prober.go:28] interesting pod/downloads-7954f5f757-559sf container/download-server 
namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.583773 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-559sf" podUID="d6d2a3a0-669f-41c3-8a04-1a4f7f961f1b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.592257 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"d680fdc42e08ca9eec21d89c65241b521a7c5d59facad0e6a519c5033b88d36c"} Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.604587 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.610256 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcl6r\" (UniqueName: \"kubernetes.io/projected/69d10b7f-1714-4536-800b-e7aa5bc7b73a-kube-api-access-mcl6r\") pod \"router-default-5444994796-xh8d5\" (UID: \"69d10b7f-1714-4536-800b-e7aa5bc7b73a\") " pod="openshift-ingress/router-default-5444994796-xh8d5" Dec 08 09:01:11 crc kubenswrapper[4776]: E1208 09:01:11.611157 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-08 09:01:12.111139607 +0000 UTC m=+148.374364629 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5sqv" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.626030 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7tm7\" (UniqueName: \"kubernetes.io/projected/4c9dc880-5c96-4a70-baa5-f4628a9a19be-kube-api-access-g7tm7\") pod \"service-ca-9c57cc56f-5gpv8\" (UID: \"4c9dc880-5c96-4a70-baa5-f4628a9a19be\") " pod="openshift-service-ca/service-ca-9c57cc56f-5gpv8" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.640837 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjdx7\" (UniqueName: \"kubernetes.io/projected/deb5eac5-1137-42a3-a5ad-e52a2d822cba-kube-api-access-cjdx7\") pod \"migrator-59844c95c7-ckhmx\" (UID: \"deb5eac5-1137-42a3-a5ad-e52a2d822cba\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ckhmx" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.657322 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-59m6v" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.668677 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9585\" (UniqueName: \"kubernetes.io/projected/6c4c5b58-d542-4ab5-8132-586b180392a0-kube-api-access-z9585\") pod \"openshift-controller-manager-operator-756b6f6bc6-rtrng\" (UID: \"6c4c5b58-d542-4ab5-8132-586b180392a0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rtrng" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.669323 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-bk9qw"] Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.688127 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxgpx\" (UniqueName: \"kubernetes.io/projected/9213c07d-c865-4fde-b2cd-f28d46033e74-kube-api-access-zxgpx\") pod \"packageserver-d55dfcdfc-znqkr\" (UID: \"9213c07d-c865-4fde-b2cd-f28d46033e74\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-znqkr" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.710061 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 09:01:11 crc kubenswrapper[4776]: E1208 09:01:11.710601 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 09:01:12.210573816 +0000 UTC m=+148.473798838 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.710948 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.710953 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhx74\" (UniqueName: \"kubernetes.io/projected/cde69387-7c72-4285-a5ec-79f5626eeb96-kube-api-access-hhx74\") pod \"package-server-manager-789f6589d5-6z2v5\" (UID: \"cde69387-7c72-4285-a5ec-79f5626eeb96\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6z2v5" Dec 08 09:01:11 crc kubenswrapper[4776]: E1208 09:01:11.711641 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 09:01:12.211623934 +0000 UTC m=+148.474848956 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5sqv" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.718147 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6zp8l" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.720128 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r74b\" (UniqueName: \"kubernetes.io/projected/d519f527-d5bd-4e76-98de-4ff1e5720698-kube-api-access-9r74b\") pod \"service-ca-operator-777779d784-hzscb\" (UID: \"d519f527-d5bd-4e76-98de-4ff1e5720698\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hzscb" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.732709 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rtrng" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.749277 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6z2v5" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.758893 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndlzk\" (UniqueName: \"kubernetes.io/projected/5c4d8f0e-5e42-4fd1-8c31-2cd3ffd7eeda-kube-api-access-ndlzk\") pod \"ingress-canary-b59h4\" (UID: \"5c4d8f0e-5e42-4fd1-8c31-2cd3ffd7eeda\") " pod="openshift-ingress-canary/ingress-canary-b59h4" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.781949 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ckhmx" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.782728 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5qdw\" (UniqueName: \"kubernetes.io/projected/1a02f65b-9fb4-41a5-974a-648ba0e107eb-kube-api-access-w5qdw\") pod \"etcd-operator-b45778765-mbv9b\" (UID: \"1a02f65b-9fb4-41a5-974a-648ba0e107eb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mbv9b" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.789502 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-znqkr" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.797480 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-hzscb" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.804521 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-xh8d5" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.820713 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-szjk8" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.822030 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 09:01:11 crc kubenswrapper[4776]: E1208 09:01:11.822597 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 09:01:12.322570108 +0000 UTC m=+148.585795140 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.824675 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/929dd3e5-a329-4291-8fd1-c998483026cc-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-tf65w\" (UID: \"929dd3e5-a329-4291-8fd1-c998483026cc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tf65w" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.828800 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn7kf\" (UniqueName: 
\"kubernetes.io/projected/04b6f5ef-a04c-47c5-b35d-700ed59c5ac9-kube-api-access-gn7kf\") pod \"csi-hostpathplugin-b2vfk\" (UID: \"04b6f5ef-a04c-47c5-b35d-700ed59c5ac9\") " pod="hostpath-provisioner/csi-hostpathplugin-b2vfk" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.828987 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncgqv\" (UniqueName: \"kubernetes.io/projected/e0478469-3de4-4f1e-8853-c5e4cdfedef0-kube-api-access-ncgqv\") pod \"dns-default-dzfmd\" (UID: \"e0478469-3de4-4f1e-8853-c5e4cdfedef0\") " pod="openshift-dns/dns-default-dzfmd" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.840607 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs9k6\" (UniqueName: \"kubernetes.io/projected/e060d3c0-cba4-4930-a3af-76f7c3f5c9c1-kube-api-access-hs9k6\") pod \"olm-operator-6b444d44fb-b9c44\" (UID: \"e060d3c0-cba4-4930-a3af-76f7c3f5c9c1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b9c44" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.860830 4776 request.go:700] Waited for 1.272122066s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/serviceaccounts/marketplace-operator/token Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.874193 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2b292f39-b8ff-4187-ae0a-14928dc18e08-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ds8zv\" (UID: \"2b292f39-b8ff-4187-ae0a-14928dc18e08\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ds8zv" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.881988 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpp6m\" (UniqueName: 
\"kubernetes.io/projected/50e1cbc5-727f-42ca-881c-fdd0b07ca739-kube-api-access-rpp6m\") pod \"marketplace-operator-79b997595-jf2nl\" (UID: \"50e1cbc5-727f-42ca-881c-fdd0b07ca739\") " pod="openshift-marketplace/marketplace-operator-79b997595-jf2nl" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.891161 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-b59h4" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.894278 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-5gpv8" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.915945 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-b2vfk" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.916292 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfgd4\" (UniqueName: \"kubernetes.io/projected/b6825925-568b-416e-910f-52e1ee27d741-kube-api-access-dfgd4\") pod \"kube-storage-version-migrator-operator-b67b599dd-g4tmm\" (UID: \"b6825925-568b-416e-910f-52e1ee27d741\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g4tmm" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.924400 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:11 crc kubenswrapper[4776]: E1208 09:01:11.924763 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2025-12-08 09:01:12.424751801 +0000 UTC m=+148.687976823 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5sqv" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.925900 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-dzfmd" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.941528 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gclh\" (UniqueName: \"kubernetes.io/projected/39c1c6c8-3cb7-44eb-b0cc-f218e4dfe724-kube-api-access-9gclh\") pod \"dns-operator-744455d44c-2jxng\" (UID: \"39c1c6c8-3cb7-44eb-b0cc-f218e4dfe724\") " pod="openshift-dns-operator/dns-operator-744455d44c-2jxng" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.947377 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbfvz\" (UniqueName: \"kubernetes.io/projected/35911247-ad00-422c-9d30-586834a80f76-kube-api-access-fbfvz\") pod \"control-plane-machine-set-operator-78cbb6b69f-njv2h\" (UID: \"35911247-ad00-422c-9d30-586834a80f76\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-njv2h" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.969479 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29tgx\" (UniqueName: \"kubernetes.io/projected/5cb3c530-6b9c-4e5e-9e1c-bfa8281b6c45-kube-api-access-29tgx\") pod \"ingress-operator-5b745b69d9-94rjd\" (UID: \"5cb3c530-6b9c-4e5e-9e1c-bfa8281b6c45\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-94rjd" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.976495 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vjf8\" (UniqueName: \"kubernetes.io/projected/c1b1a4bb-5729-4495-a10f-d5aa62f2c502-kube-api-access-8vjf8\") pod \"catalog-operator-68c6474976-t55fv\" (UID: \"c1b1a4bb-5729-4495-a10f-d5aa62f2c502\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t55fv" Dec 08 09:01:11 crc kubenswrapper[4776]: I1208 09:01:11.993032 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-dndwl"] Dec 08 09:01:12 crc kubenswrapper[4776]: I1208 09:01:12.024046 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/16448b50-7f70-4571-8150-a462b3774dfc-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-252w2\" (UID: \"16448b50-7f70-4571-8150-a462b3774dfc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-252w2" Dec 08 09:01:12 crc kubenswrapper[4776]: I1208 09:01:12.025716 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-mbv9b" Dec 08 09:01:12 crc kubenswrapper[4776]: I1208 09:01:12.026096 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 09:01:12 crc kubenswrapper[4776]: E1208 09:01:12.026267 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 09:01:12.526225984 +0000 UTC m=+148.789451006 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:12 crc kubenswrapper[4776]: I1208 09:01:12.026675 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:12 crc kubenswrapper[4776]: E1208 09:01:12.028663 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2025-12-08 09:01:12.528650199 +0000 UTC m=+148.791875221 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5sqv" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:12 crc kubenswrapper[4776]: I1208 09:01:12.036877 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4mhj\" (UniqueName: \"kubernetes.io/projected/2441969d-cfa3-4842-aeee-54625353b7bf-kube-api-access-d4mhj\") pod \"machine-config-server-pgq6g\" (UID: \"2441969d-cfa3-4842-aeee-54625353b7bf\") " pod="openshift-machine-config-operator/machine-config-server-pgq6g" Dec 08 09:01:12 crc kubenswrapper[4776]: I1208 09:01:12.040339 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-2jxng" Dec 08 09:01:12 crc kubenswrapper[4776]: I1208 09:01:12.055497 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2fv9\" (UniqueName: \"kubernetes.io/projected/13284ed8-fd88-4a11-81e5-b58cf9551c27-kube-api-access-w2fv9\") pod \"machine-config-operator-74547568cd-bwbkt\" (UID: \"13284ed8-fd88-4a11-81e5-b58cf9551c27\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bwbkt" Dec 08 09:01:12 crc kubenswrapper[4776]: I1208 09:01:12.056841 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tf65w" Dec 08 09:01:12 crc kubenswrapper[4776]: I1208 09:01:12.069584 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jf2nl" Dec 08 09:01:12 crc kubenswrapper[4776]: I1208 09:01:12.072764 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g4tmm" Dec 08 09:01:12 crc kubenswrapper[4776]: I1208 09:01:12.081279 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ld6f6"] Dec 08 09:01:12 crc kubenswrapper[4776]: I1208 09:01:12.116004 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9ll9b"] Dec 08 09:01:12 crc kubenswrapper[4776]: I1208 09:01:12.118780 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b9c44" Dec 08 09:01:12 crc kubenswrapper[4776]: I1208 09:01:12.126158 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ds8zv" Dec 08 09:01:12 crc kubenswrapper[4776]: I1208 09:01:12.127801 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 09:01:12 crc kubenswrapper[4776]: E1208 09:01:12.128025 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 09:01:12.628000235 +0000 UTC m=+148.891225257 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:12 crc kubenswrapper[4776]: I1208 09:01:12.133986 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bwbkt" Dec 08 09:01:12 crc kubenswrapper[4776]: I1208 09:01:12.142868 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-njv2h" Dec 08 09:01:12 crc kubenswrapper[4776]: W1208 09:01:12.143363 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b8fd850_14eb_419b_a1d4_e7de203c419f.slice/crio-16f963ac51c0c81c7f10a8b50833f9734ed71e6b204aef797be807fe8e07c323 WatchSource:0}: Error finding container 16f963ac51c0c81c7f10a8b50833f9734ed71e6b204aef797be807fe8e07c323: Status 404 returned error can't find the container with id 16f963ac51c0c81c7f10a8b50833f9734ed71e6b204aef797be807fe8e07c323 Dec 08 09:01:12 crc kubenswrapper[4776]: I1208 09:01:12.147361 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-94rjd" Dec 08 09:01:12 crc kubenswrapper[4776]: I1208 09:01:12.166596 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t55fv" Dec 08 09:01:12 crc kubenswrapper[4776]: I1208 09:01:12.177707 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-252w2" Dec 08 09:01:12 crc kubenswrapper[4776]: I1208 09:01:12.198918 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-pgq6g" Dec 08 09:01:12 crc kubenswrapper[4776]: I1208 09:01:12.228864 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:12 crc kubenswrapper[4776]: E1208 09:01:12.229188 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 09:01:12.72916153 +0000 UTC m=+148.992386552 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5sqv" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:12 crc kubenswrapper[4776]: I1208 09:01:12.329935 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 09:01:12 crc kubenswrapper[4776]: E1208 09:01:12.330558 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 09:01:12.83050364 +0000 UTC m=+149.093728662 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:12 crc kubenswrapper[4776]: I1208 09:01:12.431744 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:12 crc kubenswrapper[4776]: E1208 09:01:12.434732 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 09:01:12.934707726 +0000 UTC m=+149.197932738 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5sqv" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:12 crc kubenswrapper[4776]: I1208 09:01:12.542703 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 09:01:12 crc kubenswrapper[4776]: E1208 09:01:12.543104 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 09:01:13.043089253 +0000 UTC m=+149.306314275 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:12 crc kubenswrapper[4776]: I1208 09:01:12.587992 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-srht8"] Dec 08 09:01:12 crc kubenswrapper[4776]: I1208 09:01:12.612119 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-dndwl" event={"ID":"6b8fd850-14eb-419b-a1d4-e7de203c419f","Type":"ContainerStarted","Data":"16f963ac51c0c81c7f10a8b50833f9734ed71e6b204aef797be807fe8e07c323"} Dec 08 09:01:12 crc kubenswrapper[4776]: I1208 09:01:12.616285 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-xh8d5" event={"ID":"69d10b7f-1714-4536-800b-e7aa5bc7b73a","Type":"ContainerStarted","Data":"11f849046267194fe2b2432b328139d95ea632378a363afac3d5f0bdbbfb16d7"} Dec 08 09:01:12 crc kubenswrapper[4776]: I1208 09:01:12.616324 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-xh8d5" event={"ID":"69d10b7f-1714-4536-800b-e7aa5bc7b73a","Type":"ContainerStarted","Data":"7de9c368def8a4a08e4c2e9cdc807d93db8b063b0dd5c92e8d3e70465ae932f8"} Dec 08 09:01:12 crc kubenswrapper[4776]: I1208 09:01:12.617455 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ld6f6" event={"ID":"0da3b83e-efc3-4e6d-b876-186f430d3d77","Type":"ContainerStarted","Data":"0a36975cbf5d05274abe9c0b361044ed57ac60e13bdb6247efc449a2fadeab20"} Dec 08 09:01:12 crc 
kubenswrapper[4776]: I1208 09:01:12.618931 4776 generic.go:334] "Generic (PLEG): container finished" podID="e981fb43-6f44-4462-b97c-f64658cd7c97" containerID="a72fdb508997baeb7ae792b925f74fa988f2c6d68875acba53fe54b5878a9208" exitCode=0 Dec 08 09:01:12 crc kubenswrapper[4776]: I1208 09:01:12.619012 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmfgv" event={"ID":"e981fb43-6f44-4462-b97c-f64658cd7c97","Type":"ContainerDied","Data":"a72fdb508997baeb7ae792b925f74fa988f2c6d68875acba53fe54b5878a9208"} Dec 08 09:01:12 crc kubenswrapper[4776]: I1208 09:01:12.620355 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-44sjg" event={"ID":"332b83f9-1f6d-4563-9be3-96003033621d","Type":"ContainerStarted","Data":"3f388bdcc6d65a96728a9bfd946c8592d518d313e183baf55afd3f4840df3674"} Dec 08 09:01:12 crc kubenswrapper[4776]: I1208 09:01:12.620381 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-44sjg" event={"ID":"332b83f9-1f6d-4563-9be3-96003033621d","Type":"ContainerStarted","Data":"636464880cd52abc194d9f5936491982cfde92f91ac404e820cfe8894cbe66ce"} Dec 08 09:01:12 crc kubenswrapper[4776]: I1208 09:01:12.621187 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9ll9b" event={"ID":"c2c04832-2cf3-4401-bf58-b2b5624e5c97","Type":"ContainerStarted","Data":"f2fee9ed6dd97a2ab0647414da25e0599a5acb29dc562ab73b69177c19a1d7e3"} Dec 08 09:01:12 crc kubenswrapper[4776]: I1208 09:01:12.622883 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bk9qw" event={"ID":"c0bf1894-515b-4ae6-bcf5-148f5db59022","Type":"ContainerStarted","Data":"b84fc412a2214a7df612bb963c98de602c99472ab93c6c8ec8eb0120b502fa84"} Dec 08 09:01:12 crc kubenswrapper[4776]: I1208 09:01:12.622936 4776 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bk9qw" event={"ID":"c0bf1894-515b-4ae6-bcf5-148f5db59022","Type":"ContainerStarted","Data":"e891df21d77ff6cf5804957672337d4cd0c2b8725969bbb2721a087909fd3885"} Dec 08 09:01:12 crc kubenswrapper[4776]: I1208 09:01:12.627485 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-pgq6g" event={"ID":"2441969d-cfa3-4842-aeee-54625353b7bf","Type":"ContainerStarted","Data":"629ce26f9a7c5919dbd9badf7f44c36b006dc86caa0be4d77ce212c789c30071"} Dec 08 09:01:12 crc kubenswrapper[4776]: I1208 09:01:12.640022 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-txnxn" event={"ID":"f6da64f0-d985-46de-bffa-4ae9632c0245","Type":"ContainerStarted","Data":"1b37c9b2aa4a4e95bfa1d64082c72b9ffe27980128780983c3a87f3966ea8339"} Dec 08 09:01:12 crc kubenswrapper[4776]: I1208 09:01:12.640507 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-txnxn" Dec 08 09:01:12 crc kubenswrapper[4776]: I1208 09:01:12.644624 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-jxkd8" event={"ID":"5ea3906e-d311-4b90-80be-7405507e135e","Type":"ContainerStarted","Data":"02ab9905470c206369ec304020ac8db6ece287bffabc127cb589b6b8033e6a1f"} Dec 08 09:01:12 crc kubenswrapper[4776]: I1208 09:01:12.645026 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:12 crc kubenswrapper[4776]: E1208 09:01:12.645512 
4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 09:01:13.145495401 +0000 UTC m=+149.408720423 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5sqv" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:12 crc kubenswrapper[4776]: I1208 09:01:12.645982 4776 patch_prober.go:28] interesting pod/downloads-7954f5f757-559sf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Dec 08 09:01:12 crc kubenswrapper[4776]: I1208 09:01:12.646024 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-559sf" podUID="d6d2a3a0-669f-41c3-8a04-1a4f7f961f1b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Dec 08 09:01:12 crc kubenswrapper[4776]: I1208 09:01:12.745818 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 09:01:12 crc kubenswrapper[4776]: E1208 09:01:12.747958 4776 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 09:01:13.247934669 +0000 UTC m=+149.511159721 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:12 crc kubenswrapper[4776]: I1208 09:01:12.824318 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-xh8d5" Dec 08 09:01:12 crc kubenswrapper[4776]: I1208 09:01:12.850819 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:12 crc kubenswrapper[4776]: E1208 09:01:12.851663 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 09:01:13.351643102 +0000 UTC m=+149.614868134 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5sqv" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:12 crc kubenswrapper[4776]: I1208 09:01:12.952192 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 09:01:12 crc kubenswrapper[4776]: E1208 09:01:12.952730 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 09:01:13.452710316 +0000 UTC m=+149.715935338 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:13 crc kubenswrapper[4776]: I1208 09:01:13.022604 4776 patch_prober.go:28] interesting pod/router-default-5444994796-xh8d5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 08 09:01:13 crc kubenswrapper[4776]: [-]has-synced failed: reason withheld Dec 08 09:01:13 crc kubenswrapper[4776]: [+]process-running ok Dec 08 09:01:13 crc kubenswrapper[4776]: healthz check failed Dec 08 09:01:13 crc kubenswrapper[4776]: I1208 09:01:13.022665 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xh8d5" podUID="69d10b7f-1714-4536-800b-e7aa5bc7b73a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 08 09:01:13 crc kubenswrapper[4776]: W1208 09:01:13.027081 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08ac05cf_c12d_4898_a9b0_451f11b44aed.slice/crio-b77b970db15ad205b70c6cf73c49dd3d3578786132ac6c363c75b73846f4217a WatchSource:0}: Error finding container b77b970db15ad205b70c6cf73c49dd3d3578786132ac6c363c75b73846f4217a: Status 404 returned error can't find the container with id b77b970db15ad205b70c6cf73c49dd3d3578786132ac6c363c75b73846f4217a Dec 08 09:01:13 crc kubenswrapper[4776]: I1208 09:01:13.054947 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:13 crc kubenswrapper[4776]: E1208 09:01:13.056450 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 09:01:13.556431689 +0000 UTC m=+149.819656711 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5sqv" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:13 crc kubenswrapper[4776]: I1208 09:01:13.158224 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 09:01:13 crc kubenswrapper[4776]: E1208 09:01:13.158656 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 09:01:13.658626511 +0000 UTC m=+149.921851533 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:13 crc kubenswrapper[4776]: I1208 09:01:13.159132 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:13 crc kubenswrapper[4776]: E1208 09:01:13.159592 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 09:01:13.659575065 +0000 UTC m=+149.922800087 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5sqv" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:13 crc kubenswrapper[4776]: I1208 09:01:13.263166 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 09:01:13 crc kubenswrapper[4776]: E1208 09:01:13.263524 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 09:01:13.763503915 +0000 UTC m=+150.026728927 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:13 crc kubenswrapper[4776]: I1208 09:01:13.364672 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:13 crc kubenswrapper[4776]: E1208 09:01:13.365868 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 09:01:13.865848141 +0000 UTC m=+150.129073163 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5sqv" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:13 crc kubenswrapper[4776]: I1208 09:01:13.466370 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 09:01:13 crc kubenswrapper[4776]: E1208 09:01:13.466598 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 09:01:13.966541663 +0000 UTC m=+150.229766685 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:13 crc kubenswrapper[4776]: I1208 09:01:13.466736 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:13 crc kubenswrapper[4776]: E1208 09:01:13.467137 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 09:01:13.967116568 +0000 UTC m=+150.230341590 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5sqv" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:13 crc kubenswrapper[4776]: I1208 09:01:13.516899 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-xh8d5" podStartSLOduration=130.516872049 podStartE2EDuration="2m10.516872049s" podCreationTimestamp="2025-12-08 08:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:01:13.513296455 +0000 UTC m=+149.776521477" watchObservedRunningTime="2025-12-08 09:01:13.516872049 +0000 UTC m=+149.780097071" Dec 08 09:01:13 crc kubenswrapper[4776]: I1208 09:01:13.569991 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 09:01:13 crc kubenswrapper[4776]: E1208 09:01:13.570661 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 09:01:14.070645866 +0000 UTC m=+150.333870888 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:13 crc kubenswrapper[4776]: I1208 09:01:13.616733 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vld4h" podStartSLOduration=130.61671859 podStartE2EDuration="2m10.61671859s" podCreationTimestamp="2025-12-08 08:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:01:13.56924016 +0000 UTC m=+149.832465182" watchObservedRunningTime="2025-12-08 09:01:13.61671859 +0000 UTC m=+149.879943612" Dec 08 09:01:13 crc kubenswrapper[4776]: I1208 09:01:13.645189 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-fm86d" podStartSLOduration=130.645161945 podStartE2EDuration="2m10.645161945s" podCreationTimestamp="2025-12-08 08:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:01:13.61747482 +0000 UTC m=+149.880699842" watchObservedRunningTime="2025-12-08 09:01:13.645161945 +0000 UTC m=+149.908386967" Dec 08 09:01:13 crc kubenswrapper[4776]: I1208 09:01:13.675226 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:13 crc kubenswrapper[4776]: E1208 09:01:13.675603 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 09:01:14.175591643 +0000 UTC m=+150.438816665 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5sqv" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:13 crc kubenswrapper[4776]: I1208 09:01:13.679299 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2hsh8"] Dec 08 09:01:13 crc kubenswrapper[4776]: I1208 09:01:13.688759 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-srht8" event={"ID":"08ac05cf-c12d-4898-a9b0-451f11b44aed","Type":"ContainerStarted","Data":"a966ea5413892eb725e32f4c6e3cea1d6bd9114fc72e44ecfaf47629d48e5a84"} Dec 08 09:01:13 crc kubenswrapper[4776]: I1208 09:01:13.688801 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-srht8" event={"ID":"08ac05cf-c12d-4898-a9b0-451f11b44aed","Type":"ContainerStarted","Data":"b77b970db15ad205b70c6cf73c49dd3d3578786132ac6c363c75b73846f4217a"} Dec 08 09:01:13 crc kubenswrapper[4776]: I1208 09:01:13.696860 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29419740-xnvv4"] Dec 08 09:01:13 crc kubenswrapper[4776]: I1208 09:01:13.712948 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ld6f6" event={"ID":"0da3b83e-efc3-4e6d-b876-186f430d3d77","Type":"ContainerStarted","Data":"f647a946cfc0edec4d9245ad0d9fe8df27b9dcede1b57c82239afb3122c6392f"} Dec 08 09:01:13 crc kubenswrapper[4776]: I1208 09:01:13.713464 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-ld6f6" Dec 08 09:01:13 crc kubenswrapper[4776]: I1208 09:01:13.727663 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-ld6f6" Dec 08 09:01:13 crc kubenswrapper[4776]: I1208 09:01:13.729206 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmfgv" event={"ID":"e981fb43-6f44-4462-b97c-f64658cd7c97","Type":"ContainerStarted","Data":"1d51ea15676f45451f559bbd64ed3f5bc48e13acc60dc738e2adf410d157f155"} Dec 08 09:01:13 crc kubenswrapper[4776]: I1208 09:01:13.738921 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9ll9b" event={"ID":"c2c04832-2cf3-4401-bf58-b2b5624e5c97","Type":"ContainerStarted","Data":"e1fd7654e1547b696356eb9e25f3eaef31623c4a66478f356828e257f4725f4f"} Dec 08 09:01:13 crc kubenswrapper[4776]: I1208 09:01:13.739414 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9ll9b" Dec 08 09:01:13 crc kubenswrapper[4776]: I1208 09:01:13.753872 4776 generic.go:334] "Generic (PLEG): container finished" podID="c0bf1894-515b-4ae6-bcf5-148f5db59022" containerID="b84fc412a2214a7df612bb963c98de602c99472ab93c6c8ec8eb0120b502fa84" exitCode=0 Dec 08 09:01:13 crc 
kubenswrapper[4776]: I1208 09:01:13.753926 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bk9qw" event={"ID":"c0bf1894-515b-4ae6-bcf5-148f5db59022","Type":"ContainerDied","Data":"b84fc412a2214a7df612bb963c98de602c99472ab93c6c8ec8eb0120b502fa84"} Dec 08 09:01:13 crc kubenswrapper[4776]: I1208 09:01:13.753952 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bk9qw" event={"ID":"c0bf1894-515b-4ae6-bcf5-148f5db59022","Type":"ContainerStarted","Data":"9d1e839a5886243c98c59a71e84453ee8c6e35e59646dbbc3fcca8bd3fc06f7b"} Dec 08 09:01:13 crc kubenswrapper[4776]: I1208 09:01:13.753961 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bk9qw" event={"ID":"c0bf1894-515b-4ae6-bcf5-148f5db59022","Type":"ContainerStarted","Data":"da32dccdd703652d14dc92c13e22db38585c1bbe02788bf46aca974996b24bf6"} Dec 08 09:01:13 crc kubenswrapper[4776]: I1208 09:01:13.772111 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-pgq6g" event={"ID":"2441969d-cfa3-4842-aeee-54625353b7bf","Type":"ContainerStarted","Data":"acdffc542edc77b0445e68bc5476a1ba25d1d1badc0cc9568c05aa3621847751"} Dec 08 09:01:13 crc kubenswrapper[4776]: I1208 09:01:13.778896 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-44sjg" event={"ID":"332b83f9-1f6d-4563-9be3-96003033621d","Type":"ContainerStarted","Data":"d0f01ef36c27c3ace7d0461d16733146460cf4e206c813b5e33438b1750885f5"} Dec 08 09:01:13 crc kubenswrapper[4776]: I1208 09:01:13.788959 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-dndwl" event={"ID":"6b8fd850-14eb-419b-a1d4-e7de203c419f","Type":"ContainerStarted","Data":"fa2b28ba6be61f3efc4cd46789c054ca99ae2435f0299ac370b390b3566b21eb"} Dec 08 09:01:13 crc 
kubenswrapper[4776]: I1208 09:01:13.795993 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-dndwl" Dec 08 09:01:13 crc kubenswrapper[4776]: I1208 09:01:13.808498 4776 patch_prober.go:28] interesting pod/downloads-7954f5f757-559sf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Dec 08 09:01:13 crc kubenswrapper[4776]: I1208 09:01:13.838817 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 09:01:13 crc kubenswrapper[4776]: E1208 09:01:13.839761 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 09:01:14.339736369 +0000 UTC m=+150.602961391 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:13 crc kubenswrapper[4776]: I1208 09:01:13.839924 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:13 crc kubenswrapper[4776]: I1208 09:01:13.808577 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-559sf" podUID="d6d2a3a0-669f-41c3-8a04-1a4f7f961f1b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Dec 08 09:01:13 crc kubenswrapper[4776]: E1208 09:01:13.852606 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 09:01:14.35258325 +0000 UTC m=+150.615808272 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5sqv" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:13 crc kubenswrapper[4776]: I1208 09:01:13.854005 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6zp8l"] Dec 08 09:01:13 crc kubenswrapper[4776]: I1208 09:01:13.860568 4776 patch_prober.go:28] interesting pod/router-default-5444994796-xh8d5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 08 09:01:13 crc kubenswrapper[4776]: [-]has-synced failed: reason withheld Dec 08 09:01:13 crc kubenswrapper[4776]: [+]process-running ok Dec 08 09:01:13 crc kubenswrapper[4776]: healthz check failed Dec 08 09:01:13 crc kubenswrapper[4776]: I1208 09:01:13.860606 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xh8d5" podUID="69d10b7f-1714-4536-800b-e7aa5bc7b73a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 08 09:01:13 crc kubenswrapper[4776]: I1208 09:01:13.865201 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-ckhmx"] Dec 08 09:01:13 crc kubenswrapper[4776]: I1208 09:01:13.865969 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6z2v5"] Dec 08 09:01:13 crc kubenswrapper[4776]: I1208 09:01:13.880887 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-console-operator/console-operator-58897d9998-dndwl" Dec 08 09:01:13 crc kubenswrapper[4776]: I1208 09:01:13.940320 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-59m6v"] Dec 08 09:01:13 crc kubenswrapper[4776]: I1208 09:01:13.940751 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 09:01:13 crc kubenswrapper[4776]: E1208 09:01:13.941544 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 09:01:14.441528451 +0000 UTC m=+150.704753473 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:13 crc kubenswrapper[4776]: I1208 09:01:13.941755 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:13 crc kubenswrapper[4776]: E1208 09:01:13.952904 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 09:01:14.452889703 +0000 UTC m=+150.716114725 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5sqv" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:14 crc kubenswrapper[4776]: I1208 09:01:14.058112 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 09:01:14 crc kubenswrapper[4776]: E1208 09:01:14.058410 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 09:01:14.558396352 +0000 UTC m=+150.821621374 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:14 crc kubenswrapper[4776]: I1208 09:01:14.065719 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-jxkd8" podStartSLOduration=131.065694287 podStartE2EDuration="2m11.065694287s" podCreationTimestamp="2025-12-08 08:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:01:13.998617326 +0000 UTC m=+150.261842348" watchObservedRunningTime="2025-12-08 09:01:14.065694287 +0000 UTC m=+150.328919309" Dec 08 09:01:14 crc kubenswrapper[4776]: I1208 09:01:14.090701 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9ll9b" Dec 08 09:01:14 crc kubenswrapper[4776]: I1208 09:01:14.152630 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-8dm9l" podStartSLOduration=131.152614554 podStartE2EDuration="2m11.152614554s" podCreationTimestamp="2025-12-08 08:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:01:14.109381766 +0000 UTC m=+150.372606798" watchObservedRunningTime="2025-12-08 09:01:14.152614554 +0000 UTC m=+150.415839576" Dec 08 09:01:14 crc kubenswrapper[4776]: I1208 09:01:14.153373 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-txnxn" podStartSLOduration=131.153368923 podStartE2EDuration="2m11.153368923s" podCreationTimestamp="2025-12-08 08:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:01:14.151544936 +0000 UTC m=+150.414769958" watchObservedRunningTime="2025-12-08 09:01:14.153368923 +0000 UTC m=+150.416593935" Dec 08 09:01:14 crc kubenswrapper[4776]: I1208 09:01:14.162453 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:14 crc kubenswrapper[4776]: E1208 09:01:14.162798 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 09:01:14.662787354 +0000 UTC m=+150.926012376 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5sqv" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:14 crc kubenswrapper[4776]: I1208 09:01:14.245579 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-559sf" podStartSLOduration=131.24555715 podStartE2EDuration="2m11.24555715s" podCreationTimestamp="2025-12-08 08:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:01:14.206883304 +0000 UTC m=+150.470108326" watchObservedRunningTime="2025-12-08 09:01:14.24555715 +0000 UTC m=+150.508782172" Dec 08 09:01:14 crc kubenswrapper[4776]: I1208 09:01:14.273758 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 09:01:14 crc kubenswrapper[4776]: E1208 09:01:14.274117 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 09:01:14.774102528 +0000 UTC m=+151.037327550 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:14 crc kubenswrapper[4776]: I1208 09:01:14.376308 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:14 crc kubenswrapper[4776]: E1208 09:01:14.376779 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 09:01:14.876765013 +0000 UTC m=+151.139990035 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5sqv" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:14 crc kubenswrapper[4776]: I1208 09:01:14.378400 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-44sjg" podStartSLOduration=131.378378355 podStartE2EDuration="2m11.378378355s" podCreationTimestamp="2025-12-08 08:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:01:14.377113592 +0000 UTC m=+150.640338614" watchObservedRunningTime="2025-12-08 09:01:14.378378355 +0000 UTC m=+150.641603377" Dec 08 09:01:14 crc kubenswrapper[4776]: I1208 09:01:14.454589 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-dzfmd"] Dec 08 09:01:14 crc kubenswrapper[4776]: I1208 09:01:14.460648 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9ll9b" podStartSLOduration=130.460625549 podStartE2EDuration="2m10.460625549s" podCreationTimestamp="2025-12-08 08:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:01:14.446922695 +0000 UTC m=+150.710147727" watchObservedRunningTime="2025-12-08 09:01:14.460625549 +0000 UTC m=+150.723850571" Dec 08 09:01:14 crc kubenswrapper[4776]: I1208 09:01:14.477559 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 09:01:14 crc kubenswrapper[4776]: E1208 09:01:14.477792 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 09:01:14.977777874 +0000 UTC m=+151.241002896 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:14 crc kubenswrapper[4776]: I1208 09:01:14.591643 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:14 crc kubenswrapper[4776]: E1208 09:01:14.592037 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 09:01:15.092022237 +0000 UTC m=+151.355247269 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5sqv" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:14 crc kubenswrapper[4776]: I1208 09:01:14.606108 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jf2nl"] Dec 08 09:01:14 crc kubenswrapper[4776]: I1208 09:01:14.624899 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-pgq6g" podStartSLOduration=6.624877558 podStartE2EDuration="6.624877558s" podCreationTimestamp="2025-12-08 09:01:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:01:14.598391645 +0000 UTC m=+150.861616667" watchObservedRunningTime="2025-12-08 09:01:14.624877558 +0000 UTC m=+150.888102570" Dec 08 09:01:14 crc kubenswrapper[4776]: W1208 09:01:14.667967 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c4d8f0e_5e42_4fd1_8c31_2cd3ffd7eeda.slice/crio-4144490ba1f419888bd6e164a0a2641dc63250784b7d0aecf2ebc4f5de3caa5e WatchSource:0}: Error finding container 4144490ba1f419888bd6e164a0a2641dc63250784b7d0aecf2ebc4f5de3caa5e: Status 404 returned error can't find the container with id 4144490ba1f419888bd6e164a0a2641dc63250784b7d0aecf2ebc4f5de3caa5e Dec 08 09:01:14 crc kubenswrapper[4776]: I1208 09:01:14.670220 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-b59h4"] Dec 08 09:01:14 crc kubenswrapper[4776]: I1208 
09:01:14.689217 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b9c44"] Dec 08 09:01:14 crc kubenswrapper[4776]: I1208 09:01:14.693015 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 09:01:14 crc kubenswrapper[4776]: E1208 09:01:14.693462 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 09:01:15.193438388 +0000 UTC m=+151.456663410 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:14 crc kubenswrapper[4776]: I1208 09:01:14.714222 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rtrng"] Dec 08 09:01:14 crc kubenswrapper[4776]: I1208 09:01:14.738460 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-znqkr"] Dec 08 09:01:14 crc kubenswrapper[4776]: I1208 09:01:14.756905 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-hzscb"] Dec 08 
09:01:14 crc kubenswrapper[4776]: I1208 09:01:14.766234 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t55fv"] Dec 08 09:01:14 crc kubenswrapper[4776]: I1208 09:01:14.783280 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-2jxng"] Dec 08 09:01:14 crc kubenswrapper[4776]: I1208 09:01:14.792450 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-mbv9b"] Dec 08 09:01:14 crc kubenswrapper[4776]: I1208 09:01:14.801247 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:14 crc kubenswrapper[4776]: E1208 09:01:14.803231 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 09:01:15.303217262 +0000 UTC m=+151.566442274 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5sqv" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:14 crc kubenswrapper[4776]: I1208 09:01:14.805741 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-94rjd"] Dec 08 09:01:14 crc kubenswrapper[4776]: I1208 09:01:14.806469 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-bk9qw" podStartSLOduration=131.806450238 podStartE2EDuration="2m11.806450238s" podCreationTimestamp="2025-12-08 08:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:01:14.727822611 +0000 UTC m=+150.991047643" watchObservedRunningTime="2025-12-08 09:01:14.806450238 +0000 UTC m=+151.069675260" Dec 08 09:01:14 crc kubenswrapper[4776]: I1208 09:01:14.816040 4776 patch_prober.go:28] interesting pod/router-default-5444994796-xh8d5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 08 09:01:14 crc kubenswrapper[4776]: [-]has-synced failed: reason withheld Dec 08 09:01:14 crc kubenswrapper[4776]: [+]process-running ok Dec 08 09:01:14 crc kubenswrapper[4776]: healthz check failed Dec 08 09:01:14 crc kubenswrapper[4776]: I1208 09:01:14.816080 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xh8d5" podUID="69d10b7f-1714-4536-800b-e7aa5bc7b73a" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 08 09:01:14 crc kubenswrapper[4776]: I1208 09:01:14.821215 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-dndwl" podStartSLOduration=131.82119345 podStartE2EDuration="2m11.82119345s" podCreationTimestamp="2025-12-08 08:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:01:14.77565748 +0000 UTC m=+151.038882512" watchObservedRunningTime="2025-12-08 09:01:14.82119345 +0000 UTC m=+151.084418472" Dec 08 09:01:14 crc kubenswrapper[4776]: I1208 09:01:14.825344 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g4tmm"] Dec 08 09:01:14 crc kubenswrapper[4776]: I1208 09:01:14.832900 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8" event={"ID":"7c6fbdd6-0243-4372-a986-cc73d2df8a74","Type":"ContainerStarted","Data":"e5b9823870b0724c69f69dbb71118b122bd7b086be17c88bfdcb3ca21531d179"} Dec 08 09:01:14 crc kubenswrapper[4776]: I1208 09:01:14.832935 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8" event={"ID":"7c6fbdd6-0243-4372-a986-cc73d2df8a74","Type":"ContainerStarted","Data":"c704cd6bbc80e498fc518112b77621974ee6cb8ba5c965c7c86eb9ff3cfb3abb"} Dec 08 09:01:14 crc kubenswrapper[4776]: I1208 09:01:14.833756 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8" Dec 08 09:01:14 crc kubenswrapper[4776]: I1208 09:01:14.835290 4776 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-2hsh8 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get 
\"https://10.217.0.18:6443/healthz\": dial tcp 10.217.0.18:6443: connect: connection refused" start-of-body= Dec 08 09:01:14 crc kubenswrapper[4776]: I1208 09:01:14.835351 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8" podUID="7c6fbdd6-0243-4372-a986-cc73d2df8a74" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.18:6443/healthz\": dial tcp 10.217.0.18:6443: connect: connection refused" Dec 08 09:01:14 crc kubenswrapper[4776]: I1208 09:01:14.848261 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-59m6v" event={"ID":"10462781-68cf-4a10-b7c7-b9700465d964","Type":"ContainerStarted","Data":"a6d878c8fbc260a5a65029c4ee38e694f9644c2d856e2dabc80689f3095808ed"} Dec 08 09:01:14 crc kubenswrapper[4776]: I1208 09:01:14.848296 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-59m6v" event={"ID":"10462781-68cf-4a10-b7c7-b9700465d964","Type":"ContainerStarted","Data":"65fea3378345ad54e9dba50e1f0bea703842d71a83bd774f60dc376cc3118bda"} Dec 08 09:01:14 crc kubenswrapper[4776]: I1208 09:01:14.857632 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-szjk8"] Dec 08 09:01:14 crc kubenswrapper[4776]: I1208 09:01:14.864519 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-njv2h"] Dec 08 09:01:14 crc kubenswrapper[4776]: I1208 09:01:14.879585 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-mbv9b" event={"ID":"1a02f65b-9fb4-41a5-974a-648ba0e107eb","Type":"ContainerStarted","Data":"2b8e1105beeb4697b344733c67f63390e89784a223a63a8f8c3820f09c0be553"} Dec 08 09:01:14 crc kubenswrapper[4776]: I1208 09:01:14.905159 4776 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tf65w"] Dec 08 09:01:14 crc kubenswrapper[4776]: I1208 09:01:14.905228 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5gpv8"] Dec 08 09:01:14 crc kubenswrapper[4776]: I1208 09:01:14.905238 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ds8zv"] Dec 08 09:01:14 crc kubenswrapper[4776]: I1208 09:01:14.905639 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmfgv" podStartSLOduration=130.905629721 podStartE2EDuration="2m10.905629721s" podCreationTimestamp="2025-12-08 08:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:01:14.880126304 +0000 UTC m=+151.143351326" watchObservedRunningTime="2025-12-08 09:01:14.905629721 +0000 UTC m=+151.168854743" Dec 08 09:01:14 crc kubenswrapper[4776]: I1208 09:01:14.914335 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 09:01:14 crc kubenswrapper[4776]: E1208 09:01:14.915722 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 09:01:15.415706298 +0000 UTC m=+151.678931320 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:14 crc kubenswrapper[4776]: I1208 09:01:14.917151 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-252w2"] Dec 08 09:01:14 crc kubenswrapper[4776]: I1208 09:01:14.922206 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-bwbkt"] Dec 08 09:01:14 crc kubenswrapper[4776]: I1208 09:01:14.922271 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-b2vfk"] Dec 08 09:01:14 crc kubenswrapper[4776]: I1208 09:01:14.939454 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jf2nl" event={"ID":"50e1cbc5-727f-42ca-881c-fdd0b07ca739","Type":"ContainerStarted","Data":"387f6c13d8ddc52c4b6f49d3f003b41a4e0e766c93075bd4c99a1c80a6d9fd5a"} Dec 08 09:01:14 crc kubenswrapper[4776]: I1208 09:01:14.985383 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8" podStartSLOduration=131.985367227 podStartE2EDuration="2m11.985367227s" podCreationTimestamp="2025-12-08 08:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:01:14.972645709 +0000 UTC m=+151.235870761" watchObservedRunningTime="2025-12-08 09:01:14.985367227 +0000 UTC m=+151.248592239" Dec 08 09:01:14 crc 
kubenswrapper[4776]: I1208 09:01:14.986950 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-ld6f6" podStartSLOduration=131.986941969 podStartE2EDuration="2m11.986941969s" podCreationTimestamp="2025-12-08 08:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:01:14.930097129 +0000 UTC m=+151.193322161" watchObservedRunningTime="2025-12-08 09:01:14.986941969 +0000 UTC m=+151.250166991" Dec 08 09:01:15 crc kubenswrapper[4776]: I1208 09:01:15.003262 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dzfmd" event={"ID":"e0478469-3de4-4f1e-8853-c5e4cdfedef0","Type":"ContainerStarted","Data":"e99d174ced6513cbfe9ab770b3af135c9afba8a000cd4ab8b785f007f39d41c9"} Dec 08 09:01:15 crc kubenswrapper[4776]: I1208 09:01:15.014736 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-b59h4" event={"ID":"5c4d8f0e-5e42-4fd1-8c31-2cd3ffd7eeda","Type":"ContainerStarted","Data":"4144490ba1f419888bd6e164a0a2641dc63250784b7d0aecf2ebc4f5de3caa5e"} Dec 08 09:01:15 crc kubenswrapper[4776]: I1208 09:01:15.016075 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:15 crc kubenswrapper[4776]: E1208 09:01:15.016366 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-08 09:01:15.516353269 +0000 UTC m=+151.779578291 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5sqv" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:15 crc kubenswrapper[4776]: I1208 09:01:15.017999 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t55fv" event={"ID":"c1b1a4bb-5729-4495-a10f-d5aa62f2c502","Type":"ContainerStarted","Data":"e02465121ebc2d18f777ea62f80b9fdbb2a2bf04db90233f2aa6cc3f291ebb54"} Dec 08 09:01:15 crc kubenswrapper[4776]: I1208 09:01:15.021029 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6zp8l" event={"ID":"e296093f-3360-4db5-a967-b61c3a5cee51","Type":"ContainerStarted","Data":"300b23f79f4fb66babaa0af520a59ab05ab322db9ffa97bce72594cbeaae7768"} Dec 08 09:01:15 crc kubenswrapper[4776]: I1208 09:01:15.021057 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6zp8l" event={"ID":"e296093f-3360-4db5-a967-b61c3a5cee51","Type":"ContainerStarted","Data":"b2ae3c1d92fc1ab3210c9d004de282639b0fbf71dc9b6ad02dba6d981192d3e3"} Dec 08 09:01:15 crc kubenswrapper[4776]: I1208 09:01:15.036437 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-2jxng" event={"ID":"39c1c6c8-3cb7-44eb-b0cc-f218e4dfe724","Type":"ContainerStarted","Data":"1ac623fbc514f537e7ac69c7f035a6ff9e4cce0f0b1878bcfe0f86beb00a4242"} Dec 08 09:01:15 crc kubenswrapper[4776]: I1208 
09:01:15.041119 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-znqkr" event={"ID":"9213c07d-c865-4fde-b2cd-f28d46033e74","Type":"ContainerStarted","Data":"99f89934cdfb22aaf2a8d324b051a381f90b950e3c5ec09eb28ae78d5756f5bd"} Dec 08 09:01:15 crc kubenswrapper[4776]: I1208 09:01:15.107224 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29419740-xnvv4" event={"ID":"06cf2358-4cba-4d69-81d1-dc02434fe460","Type":"ContainerStarted","Data":"14510ab17084a5d159b27749b9a50aff2bde80a03fc6bf1ac4cb8f9804f42f25"} Dec 08 09:01:15 crc kubenswrapper[4776]: I1208 09:01:15.107350 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29419740-xnvv4" event={"ID":"06cf2358-4cba-4d69-81d1-dc02434fe460","Type":"ContainerStarted","Data":"cc268b93a598f2b03c7028451835c5e72cf5121158eac57d4aeb4c221fa75059"} Dec 08 09:01:15 crc kubenswrapper[4776]: I1208 09:01:15.119253 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 09:01:15 crc kubenswrapper[4776]: E1208 09:01:15.125210 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 09:01:15.625185368 +0000 UTC m=+151.888410390 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:15 crc kubenswrapper[4776]: I1208 09:01:15.149688 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-srht8" event={"ID":"08ac05cf-c12d-4898-a9b0-451f11b44aed","Type":"ContainerStarted","Data":"756a602d4433b37fcf875769a598f2eb53a9e94c05cc0dfc269aaf71ec2a2ebb"} Dec 08 09:01:15 crc kubenswrapper[4776]: I1208 09:01:15.170614 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b9c44" event={"ID":"e060d3c0-cba4-4930-a3af-76f7c3f5c9c1","Type":"ContainerStarted","Data":"2526099be0b98edc7bd91125ef5755a9c5852ad243db839b820503de0563f9d6"} Dec 08 09:01:15 crc kubenswrapper[4776]: I1208 09:01:15.187849 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-hzscb" event={"ID":"d519f527-d5bd-4e76-98de-4ff1e5720698","Type":"ContainerStarted","Data":"c87495d10886cb6d3b513e5851da9162eb9bfaaa9882d93194b3ccdf62ac2499"} Dec 08 09:01:15 crc kubenswrapper[4776]: I1208 09:01:15.189916 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ckhmx" event={"ID":"deb5eac5-1137-42a3-a5ad-e52a2d822cba","Type":"ContainerStarted","Data":"eb33b8e8210067e7b366a422c73b69420e87369c58265918d39387ff62912f86"} Dec 08 09:01:15 crc kubenswrapper[4776]: I1208 09:01:15.189960 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ckhmx" event={"ID":"deb5eac5-1137-42a3-a5ad-e52a2d822cba","Type":"ContainerStarted","Data":"51de1f3b2e17cd1ae8b34ed5c871a4a11cc0892b5b50f2be592f5abfac01aacc"} Dec 08 09:01:15 crc kubenswrapper[4776]: I1208 09:01:15.189972 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ckhmx" event={"ID":"deb5eac5-1137-42a3-a5ad-e52a2d822cba","Type":"ContainerStarted","Data":"5466286b6d9f65158fb4ae987c1f32f583e11d4b4d2a0b40e89bd69e7c917021"} Dec 08 09:01:15 crc kubenswrapper[4776]: I1208 09:01:15.199234 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6z2v5" event={"ID":"cde69387-7c72-4285-a5ec-79f5626eeb96","Type":"ContainerStarted","Data":"85d779b8dd0f644984525b0e6a2a1129ce7706b253d9c13f80149447a5f81bc5"} Dec 08 09:01:15 crc kubenswrapper[4776]: I1208 09:01:15.199286 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6z2v5" event={"ID":"cde69387-7c72-4285-a5ec-79f5626eeb96","Type":"ContainerStarted","Data":"3339c76b80dbe19d8d37ac360e2f85e17e3c3dad4fa5fcecadf67d37acacf237"} Dec 08 09:01:15 crc kubenswrapper[4776]: I1208 09:01:15.199297 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6z2v5" event={"ID":"cde69387-7c72-4285-a5ec-79f5626eeb96","Type":"ContainerStarted","Data":"84337a6e563aa8d1608de9ca51b28747c0dacfe638f36c47b748a3f538f164ba"} Dec 08 09:01:15 crc kubenswrapper[4776]: I1208 09:01:15.222033 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: 
\"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:15 crc kubenswrapper[4776]: E1208 09:01:15.229588 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 09:01:15.729572738 +0000 UTC m=+151.992797760 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5sqv" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:15 crc kubenswrapper[4776]: I1208 09:01:15.331033 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 09:01:15 crc kubenswrapper[4776]: E1208 09:01:15.331629 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 09:01:15.831608847 +0000 UTC m=+152.094833879 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:15 crc kubenswrapper[4776]: I1208 09:01:15.434318 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:15 crc kubenswrapper[4776]: E1208 09:01:15.434616 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 09:01:15.934605351 +0000 UTC m=+152.197830373 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5sqv" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:15 crc kubenswrapper[4776]: I1208 09:01:15.539757 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 09:01:15 crc kubenswrapper[4776]: E1208 09:01:15.540006 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 09:01:16.039963767 +0000 UTC m=+152.303188779 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:15 crc kubenswrapper[4776]: I1208 09:01:15.540067 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:15 crc kubenswrapper[4776]: E1208 09:01:15.540593 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 09:01:16.040581813 +0000 UTC m=+152.303806835 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5sqv" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:15 crc kubenswrapper[4776]: I1208 09:01:15.641274 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 09:01:15 crc kubenswrapper[4776]: E1208 09:01:15.642039 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 09:01:16.142017706 +0000 UTC m=+152.405242728 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:15 crc kubenswrapper[4776]: I1208 09:01:15.743273 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:15 crc kubenswrapper[4776]: E1208 09:01:15.743657 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 09:01:16.243639433 +0000 UTC m=+152.506864455 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5sqv" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:15 crc kubenswrapper[4776]: I1208 09:01:15.827420 4776 patch_prober.go:28] interesting pod/router-default-5444994796-xh8d5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 08 09:01:15 crc kubenswrapper[4776]: [-]has-synced failed: reason withheld Dec 08 09:01:15 crc kubenswrapper[4776]: [+]process-running ok Dec 08 09:01:15 crc kubenswrapper[4776]: healthz check failed Dec 08 09:01:15 crc kubenswrapper[4776]: I1208 09:01:15.827481 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xh8d5" podUID="69d10b7f-1714-4536-800b-e7aa5bc7b73a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 08 09:01:15 crc kubenswrapper[4776]: I1208 09:01:15.844946 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 09:01:15 crc kubenswrapper[4776]: E1208 09:01:15.845251 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-08 09:01:16.34523579 +0000 UTC m=+152.608460802 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:15 crc kubenswrapper[4776]: I1208 09:01:15.947498 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:15 crc kubenswrapper[4776]: E1208 09:01:15.948513 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 09:01:16.44846599 +0000 UTC m=+152.711691012 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5sqv" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:15 crc kubenswrapper[4776]: I1208 09:01:15.993411 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmfgv" Dec 08 09:01:15 crc kubenswrapper[4776]: I1208 09:01:15.993845 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmfgv" Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.012562 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmfgv" Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.048597 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 09:01:16 crc kubenswrapper[4776]: E1208 09:01:16.048792 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 09:01:16.548760472 +0000 UTC m=+152.811985494 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.048820 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-srht8" podStartSLOduration=133.048801873 podStartE2EDuration="2m13.048801873s" podCreationTimestamp="2025-12-08 08:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:01:16.04606671 +0000 UTC m=+152.309291732" watchObservedRunningTime="2025-12-08 09:01:16.048801873 +0000 UTC m=+152.312026895" Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.078968 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ckhmx" podStartSLOduration=133.078944933 podStartE2EDuration="2m13.078944933s" podCreationTimestamp="2025-12-08 08:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:01:16.078508351 +0000 UTC m=+152.341733373" watchObservedRunningTime="2025-12-08 09:01:16.078944933 +0000 UTC m=+152.342169955" Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.118783 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6zp8l" podStartSLOduration=133.11876137 podStartE2EDuration="2m13.11876137s" podCreationTimestamp="2025-12-08 08:59:03 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:01:16.117451115 +0000 UTC m=+152.380676137" watchObservedRunningTime="2025-12-08 09:01:16.11876137 +0000 UTC m=+152.381986392" Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.153259 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:16 crc kubenswrapper[4776]: E1208 09:01:16.153552 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 09:01:16.653539313 +0000 UTC m=+152.916764335 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5sqv" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.221048 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6z2v5" podStartSLOduration=133.221022574 podStartE2EDuration="2m13.221022574s" podCreationTimestamp="2025-12-08 08:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:01:16.219150095 +0000 UTC m=+152.482375117" watchObservedRunningTime="2025-12-08 09:01:16.221022574 +0000 UTC m=+152.484247596" Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.221828 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29419740-xnvv4" podStartSLOduration=76.221823686 podStartE2EDuration="1m16.221823686s" podCreationTimestamp="2025-12-08 09:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:01:16.19752178 +0000 UTC m=+152.460746802" watchObservedRunningTime="2025-12-08 09:01:16.221823686 +0000 UTC m=+152.485048708" Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.258509 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 09:01:16 crc kubenswrapper[4776]: E1208 09:01:16.258775 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 09:01:16.758758906 +0000 UTC m=+153.021983928 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.306609 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-252w2" event={"ID":"16448b50-7f70-4571-8150-a462b3774dfc","Type":"ContainerStarted","Data":"43b8576fd13ce4fd8ac2eac237be182adaa6a26a0444d45f2ff73bdac52ddaff"} Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.327340 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g4tmm" event={"ID":"b6825925-568b-416e-910f-52e1ee27d741","Type":"ContainerStarted","Data":"fb7ed9e5477c6f21bae9d593a66555b69f8ab36133eb8ccbc3f9bb5e46c4c9d5"} Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.327387 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g4tmm" 
event={"ID":"b6825925-568b-416e-910f-52e1ee27d741","Type":"ContainerStarted","Data":"aaadb052f196f714abd80d90dc1d8864ce00edddb13c85acf59f57f6ebf0fea4"} Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.362234 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:16 crc kubenswrapper[4776]: E1208 09:01:16.362568 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 09:01:16.862556631 +0000 UTC m=+153.125781653 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5sqv" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.392784 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dzfmd" event={"ID":"e0478469-3de4-4f1e-8853-c5e4cdfedef0","Type":"ContainerStarted","Data":"6382b6d02953fd1fa854d18ceb0688306dd76f8728f91afbc49fad7cfc07aa2f"} Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.404384 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ds8zv" 
event={"ID":"2b292f39-b8ff-4187-ae0a-14928dc18e08","Type":"ContainerStarted","Data":"cdad39cd70606effcccb4b0726f3cbebebd250cf46d42f117d9999ce41a52acb"} Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.428018 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-bk9qw" Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.428075 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-bk9qw" Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.443314 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-njv2h" event={"ID":"35911247-ad00-422c-9d30-586834a80f76","Type":"ContainerStarted","Data":"80c7e9b1ec17c1e2301fc20ac66a3c0e9a5d3fb26610dea2f94294d5a1f43415"} Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.443758 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-njv2h" event={"ID":"35911247-ad00-422c-9d30-586834a80f76","Type":"ContainerStarted","Data":"22361d6d3ab1596e11f663d28a4e6d177e2a478d7e1e503ce34256125d650de1"} Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.462260 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-szjk8" event={"ID":"c736576f-aae9-4d51-a058-f4ad4d95edb4","Type":"ContainerStarted","Data":"3710c5091f212ea784125f9a59d50b8891dcc9d3a15629c8357e1869b2c06591"} Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.462838 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 09:01:16 crc 
kubenswrapper[4776]: E1208 09:01:16.463148 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 09:01:16.96312854 +0000 UTC m=+153.226353562 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.476565 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g4tmm" podStartSLOduration=133.476549946 podStartE2EDuration="2m13.476549946s" podCreationTimestamp="2025-12-08 08:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:01:16.376580943 +0000 UTC m=+152.639805965" watchObservedRunningTime="2025-12-08 09:01:16.476549946 +0000 UTC m=+152.739774968" Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.478258 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-njv2h" podStartSLOduration=133.478252291 podStartE2EDuration="2m13.478252291s" podCreationTimestamp="2025-12-08 08:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:01:16.475367425 +0000 UTC m=+152.738592447" 
watchObservedRunningTime="2025-12-08 09:01:16.478252291 +0000 UTC m=+152.741477313" Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.479276 4776 patch_prober.go:28] interesting pod/apiserver-76f77b778f-bk9qw container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 08 09:01:16 crc kubenswrapper[4776]: [+]log ok Dec 08 09:01:16 crc kubenswrapper[4776]: [+]etcd ok Dec 08 09:01:16 crc kubenswrapper[4776]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 08 09:01:16 crc kubenswrapper[4776]: [+]poststarthook/generic-apiserver-start-informers ok Dec 08 09:01:16 crc kubenswrapper[4776]: [+]poststarthook/max-in-flight-filter ok Dec 08 09:01:16 crc kubenswrapper[4776]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 08 09:01:16 crc kubenswrapper[4776]: [+]poststarthook/image.openshift.io-apiserver-caches ok Dec 08 09:01:16 crc kubenswrapper[4776]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Dec 08 09:01:16 crc kubenswrapper[4776]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Dec 08 09:01:16 crc kubenswrapper[4776]: [+]poststarthook/project.openshift.io-projectcache ok Dec 08 09:01:16 crc kubenswrapper[4776]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Dec 08 09:01:16 crc kubenswrapper[4776]: [+]poststarthook/openshift.io-startinformers ok Dec 08 09:01:16 crc kubenswrapper[4776]: [+]poststarthook/openshift.io-restmapperupdater ok Dec 08 09:01:16 crc kubenswrapper[4776]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 08 09:01:16 crc kubenswrapper[4776]: livez check failed Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.479339 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-bk9qw" podUID="c0bf1894-515b-4ae6-bcf5-148f5db59022" 
containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.490697 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-hzscb" event={"ID":"d519f527-d5bd-4e76-98de-4ff1e5720698","Type":"ContainerStarted","Data":"3dd75538eff46158dcff0b36dabf72fd0f2ad752b05687be20ab611aa117bd1d"} Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.516041 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-b59h4" event={"ID":"5c4d8f0e-5e42-4fd1-8c31-2cd3ffd7eeda","Type":"ContainerStarted","Data":"d4d3ed2e07372fa502402a2a0aa995c2f6f861247501ea72f48df17e0cf3d5dd"} Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.524906 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-hzscb" podStartSLOduration=132.524884479 podStartE2EDuration="2m12.524884479s" podCreationTimestamp="2025-12-08 08:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:01:16.524028556 +0000 UTC m=+152.787253578" watchObservedRunningTime="2025-12-08 09:01:16.524884479 +0000 UTC m=+152.788109501" Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.528548 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rtrng" event={"ID":"6c4c5b58-d542-4ab5-8132-586b180392a0","Type":"ContainerStarted","Data":"667a9bc999bf7b344dcbf7b8a47fa9e2197e97fd0ae83045c02aecf4c0e3f080"} Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.528592 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rtrng" 
event={"ID":"6c4c5b58-d542-4ab5-8132-586b180392a0","Type":"ContainerStarted","Data":"ca31d25295e560fc8449e2e6a299cf53b71c842d7086d2dcf196f47e5b1b6706"} Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.543419 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bwbkt" event={"ID":"13284ed8-fd88-4a11-81e5-b58cf9551c27","Type":"ContainerStarted","Data":"0e36f5c8a0193f696090e8f060559f9a81c1996bdd38e3b40d6f3866ae55921d"} Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.543463 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bwbkt" event={"ID":"13284ed8-fd88-4a11-81e5-b58cf9551c27","Type":"ContainerStarted","Data":"2299d25f9b4756c9c85ab7cd4795d1c6eb25772e59bad783c96da4459f7a8b40"} Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.561745 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t55fv" event={"ID":"c1b1a4bb-5729-4495-a10f-d5aa62f2c502","Type":"ContainerStarted","Data":"44324b514ab78d5d7947e0c277c5b4f144e7b37056ae812474faf0cdfb436c0e"} Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.562654 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t55fv" Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.563032 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-b59h4" podStartSLOduration=8.563010451 podStartE2EDuration="8.563010451s" podCreationTimestamp="2025-12-08 09:01:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:01:16.556617971 +0000 UTC m=+152.819843003" watchObservedRunningTime="2025-12-08 09:01:16.563010451 +0000 UTC m=+152.826235473" Dec 08 
09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.566531 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:16 crc kubenswrapper[4776]: E1208 09:01:16.566833 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 09:01:17.066823373 +0000 UTC m=+153.330048395 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5sqv" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.571398 4776 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-t55fv container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.571443 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t55fv" podUID="c1b1a4bb-5729-4495-a10f-d5aa62f2c502" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 
10.217.0.35:8443: connect: connection refused" Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.571674 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jf2nl" event={"ID":"50e1cbc5-727f-42ca-881c-fdd0b07ca739","Type":"ContainerStarted","Data":"726cc14fd07cdccd06b06e7db795eb13bf7b9ce8069a7c9f2f0bcdefd1c5a77c"} Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.572504 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-jf2nl" Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.588523 4776 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-jf2nl container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.588572 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-jf2nl" podUID="50e1cbc5-727f-42ca-881c-fdd0b07ca739" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.592565 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-5gpv8" event={"ID":"4c9dc880-5c96-4a70-baa5-f4628a9a19be","Type":"ContainerStarted","Data":"eba061578176c276fc8e01e7fd8cf6d923e75a142648fdf29e9332e83cd13554"} Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.592640 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-5gpv8" event={"ID":"4c9dc880-5c96-4a70-baa5-f4628a9a19be","Type":"ContainerStarted","Data":"3282ba25368c556e3e113f969c6c3b334479a94d4b9e8ba780fccfa53a528574"} Dec 08 09:01:16 crc 
kubenswrapper[4776]: I1208 09:01:16.605624 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b9c44" event={"ID":"e060d3c0-cba4-4930-a3af-76f7c3f5c9c1","Type":"ContainerStarted","Data":"3213d91a2d322a9340e15d886d550bbc2185704feb9182f79000f1d31b0acb40"} Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.606578 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b9c44" Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.609896 4776 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-b9c44 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.609935 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b9c44" podUID="e060d3c0-cba4-4930-a3af-76f7c3f5c9c1" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.610645 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-94rjd" event={"ID":"5cb3c530-6b9c-4e5e-9e1c-bfa8281b6c45","Type":"ContainerStarted","Data":"eab96145944fbda41893569fc0fbf7dee60eac774a9af01d7b211c24e0bb21ee"} Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.612638 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-znqkr" event={"ID":"9213c07d-c865-4fde-b2cd-f28d46033e74","Type":"ContainerStarted","Data":"034410b2a8db1bf52f2cd2b3d5c33a2adb68e8b604d9be7ff20034a79076ef97"} Dec 08 09:01:16 crc 
kubenswrapper[4776]: I1208 09:01:16.613806 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-znqkr" Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.628002 4776 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-znqkr container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.26:5443/healthz\": dial tcp 10.217.0.26:5443: connect: connection refused" start-of-body= Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.628057 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-znqkr" podUID="9213c07d-c865-4fde-b2cd-f28d46033e74" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.26:5443/healthz\": dial tcp 10.217.0.26:5443: connect: connection refused" Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.633996 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rtrng" podStartSLOduration=133.633981595 podStartE2EDuration="2m13.633981595s" podCreationTimestamp="2025-12-08 08:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:01:16.632635829 +0000 UTC m=+152.895860851" watchObservedRunningTime="2025-12-08 09:01:16.633981595 +0000 UTC m=+152.897206617" Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.636687 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tf65w" event={"ID":"929dd3e5-a329-4291-8fd1-c998483026cc","Type":"ContainerStarted","Data":"bba4ee2fef1b809995281060c1a446451fcfa5e5b08c2ed4bf3520ff747e206f"} Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.667713 4776 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.667829 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-59m6v" event={"ID":"10462781-68cf-4a10-b7c7-b9700465d964","Type":"ContainerStarted","Data":"017d26059362967fee3fbd588a5cb2c6a96ed9156bc0eace77d0a108c97c3376"} Dec 08 09:01:16 crc kubenswrapper[4776]: E1208 09:01:16.668491 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 09:01:17.1684514 +0000 UTC m=+153.431676422 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.668792 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:16 crc kubenswrapper[4776]: E1208 09:01:16.669254 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 09:01:17.16923254 +0000 UTC m=+153.432457562 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5sqv" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.675262 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-b2vfk" event={"ID":"04b6f5ef-a04c-47c5-b35d-700ed59c5ac9","Type":"ContainerStarted","Data":"30181c9da4710026d7271cc8f86bc6263b3402c2febf65ce7b53b4f72c3d9e64"} Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.683365 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-jf2nl" podStartSLOduration=133.683331294 podStartE2EDuration="2m13.683331294s" podCreationTimestamp="2025-12-08 08:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:01:16.675944709 +0000 UTC m=+152.939169741" watchObservedRunningTime="2025-12-08 09:01:16.683331294 +0000 UTC m=+152.946556316" Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.693409 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-mbv9b" event={"ID":"1a02f65b-9fb4-41a5-974a-648ba0e107eb","Type":"ContainerStarted","Data":"5c7ac8dafdb93d043002153f08b3deb73c538cc680c6bed0c21837b4697c9968"} Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.694941 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6z2v5" Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.702696 4776 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lmfgv" Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.703744 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b9c44" podStartSLOduration=133.703719066 podStartE2EDuration="2m13.703719066s" podCreationTimestamp="2025-12-08 08:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:01:16.695430416 +0000 UTC m=+152.958655438" watchObservedRunningTime="2025-12-08 09:01:16.703719066 +0000 UTC m=+152.966944088" Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.733830 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t55fv" podStartSLOduration=133.733805684 podStartE2EDuration="2m13.733805684s" podCreationTimestamp="2025-12-08 08:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:01:16.711155003 +0000 UTC m=+152.974380025" watchObservedRunningTime="2025-12-08 09:01:16.733805684 +0000 UTC m=+152.997030706" Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.770094 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 09:01:16 crc kubenswrapper[4776]: E1208 09:01:16.771263 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2025-12-08 09:01:17.271244489 +0000 UTC m=+153.534469511 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.775324 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-znqkr" podStartSLOduration=133.775302306 podStartE2EDuration="2m13.775302306s" podCreationTimestamp="2025-12-08 08:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:01:16.753738333 +0000 UTC m=+153.016963355" watchObservedRunningTime="2025-12-08 09:01:16.775302306 +0000 UTC m=+153.038527328" Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.801917 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-5gpv8" podStartSLOduration=132.801886932 podStartE2EDuration="2m12.801886932s" podCreationTimestamp="2025-12-08 08:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:01:16.779748404 +0000 UTC m=+153.042973426" watchObservedRunningTime="2025-12-08 09:01:16.801886932 +0000 UTC m=+153.065111954" Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.815398 4776 patch_prober.go:28] interesting pod/router-default-5444994796-xh8d5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe 
failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 08 09:01:16 crc kubenswrapper[4776]: [-]has-synced failed: reason withheld Dec 08 09:01:16 crc kubenswrapper[4776]: [+]process-running ok Dec 08 09:01:16 crc kubenswrapper[4776]: healthz check failed Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.815467 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xh8d5" podUID="69d10b7f-1714-4536-800b-e7aa5bc7b73a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.848103 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-txnxn" Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.869975 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-59m6v" podStartSLOduration=133.869955698 podStartE2EDuration="2m13.869955698s" podCreationTimestamp="2025-12-08 08:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:01:16.834386534 +0000 UTC m=+153.097611576" watchObservedRunningTime="2025-12-08 09:01:16.869955698 +0000 UTC m=+153.133180720" Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.880583 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:16 crc kubenswrapper[4776]: E1208 09:01:16.891447 4776 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 09:01:17.391427849 +0000 UTC m=+153.654652871 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5sqv" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.892009 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8" Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.956117 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-mbv9b" podStartSLOduration=133.956097875 podStartE2EDuration="2m13.956097875s" podCreationTimestamp="2025-12-08 08:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:01:16.871038857 +0000 UTC m=+153.134263879" watchObservedRunningTime="2025-12-08 09:01:16.956097875 +0000 UTC m=+153.219322897" Dec 08 09:01:16 crc kubenswrapper[4776]: I1208 09:01:16.982752 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 09:01:16 crc kubenswrapper[4776]: E1208 09:01:16.983865 4776 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 09:01:17.483847192 +0000 UTC m=+153.747072204 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:17 crc kubenswrapper[4776]: I1208 09:01:17.090539 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:17 crc kubenswrapper[4776]: E1208 09:01:17.091025 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 09:01:17.591010056 +0000 UTC m=+153.854235068 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5sqv" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:17 crc kubenswrapper[4776]: I1208 09:01:17.191466 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 09:01:17 crc kubenswrapper[4776]: E1208 09:01:17.191730 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 09:01:17.691716499 +0000 UTC m=+153.954941521 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:17 crc kubenswrapper[4776]: I1208 09:01:17.292596 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:17 crc kubenswrapper[4776]: E1208 09:01:17.292905 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 09:01:17.792894254 +0000 UTC m=+154.056119276 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5sqv" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:17 crc kubenswrapper[4776]: I1208 09:01:17.400890 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 09:01:17 crc kubenswrapper[4776]: E1208 09:01:17.401786 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 09:01:17.901764484 +0000 UTC m=+154.164989496 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:17 crc kubenswrapper[4776]: I1208 09:01:17.502672 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:17 crc kubenswrapper[4776]: E1208 09:01:17.503017 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 09:01:18.003001181 +0000 UTC m=+154.266226203 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5sqv" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:17 crc kubenswrapper[4776]: I1208 09:01:17.603638 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 09:01:17 crc kubenswrapper[4776]: E1208 09:01:17.603990 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 09:01:18.10394035 +0000 UTC m=+154.367165382 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:17 crc kubenswrapper[4776]: I1208 09:01:17.702572 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-94rjd" event={"ID":"5cb3c530-6b9c-4e5e-9e1c-bfa8281b6c45","Type":"ContainerStarted","Data":"7a562de244409e98254ea3c9fd9b1e5443d4cc02da0752bdf212676ee2c4cc57"} Dec 08 09:01:17 crc kubenswrapper[4776]: I1208 09:01:17.702664 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-94rjd" event={"ID":"5cb3c530-6b9c-4e5e-9e1c-bfa8281b6c45","Type":"ContainerStarted","Data":"a7c0b9947217d319b571a9a2d558f2f874a7f96dff4a014aa58b7ab6e82f38c8"} Dec 08 09:01:17 crc kubenswrapper[4776]: I1208 09:01:17.704639 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tf65w" event={"ID":"929dd3e5-a329-4291-8fd1-c998483026cc","Type":"ContainerStarted","Data":"c89f58ad06d0c5302a6b085e4b090d0be5c05b03918bfe06a1f932baa0e410c9"} Dec 08 09:01:17 crc kubenswrapper[4776]: I1208 09:01:17.704820 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:17 crc kubenswrapper[4776]: E1208 09:01:17.705149 4776 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 09:01:18.205132186 +0000 UTC m=+154.468357208 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5sqv" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:17 crc kubenswrapper[4776]: I1208 09:01:17.705873 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-2jxng" event={"ID":"39c1c6c8-3cb7-44eb-b0cc-f218e4dfe724","Type":"ContainerStarted","Data":"073d34be4bc1c1f32c09a4ff4de4e1e8446930d292815efb890901e327b12857"} Dec 08 09:01:17 crc kubenswrapper[4776]: I1208 09:01:17.705904 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-2jxng" event={"ID":"39c1c6c8-3cb7-44eb-b0cc-f218e4dfe724","Type":"ContainerStarted","Data":"e21ed6e4e0e3b676f59ce885b7a26e5de196a123816fb60fea02fb9edd93df4c"} Dec 08 09:01:17 crc kubenswrapper[4776]: I1208 09:01:17.707122 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dzfmd" event={"ID":"e0478469-3de4-4f1e-8853-c5e4cdfedef0","Type":"ContainerStarted","Data":"7197a49c7613b364000fb73fb9831674667b6e8ed23fbee731f70b380d466b45"} Dec 08 09:01:17 crc kubenswrapper[4776]: I1208 09:01:17.707232 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-dzfmd" Dec 08 09:01:17 crc kubenswrapper[4776]: I1208 09:01:17.708114 4776 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="hostpath-provisioner/csi-hostpathplugin-b2vfk" event={"ID":"04b6f5ef-a04c-47c5-b35d-700ed59c5ac9","Type":"ContainerStarted","Data":"b1420741202b59923b16c72171762e84de8d6d349c88ef2b3e35a8d1028ed9cb"} Dec 08 09:01:17 crc kubenswrapper[4776]: I1208 09:01:17.709197 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-szjk8" event={"ID":"c736576f-aae9-4d51-a058-f4ad4d95edb4","Type":"ContainerStarted","Data":"57b25111209c7ec2996032bc23d7def44f863e1c32b15d6447dd16eeae18012f"} Dec 08 09:01:17 crc kubenswrapper[4776]: I1208 09:01:17.709221 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-szjk8" event={"ID":"c736576f-aae9-4d51-a058-f4ad4d95edb4","Type":"ContainerStarted","Data":"74240072531dd9d3b3e5509990959bcfb41281207681d64cb0e989c00510bde0"} Dec 08 09:01:17 crc kubenswrapper[4776]: I1208 09:01:17.710504 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ds8zv" event={"ID":"2b292f39-b8ff-4187-ae0a-14928dc18e08","Type":"ContainerStarted","Data":"311d8a7091e6aaa82edc2973e302737878cfe34d32d92a1da49befd166fc3e25"} Dec 08 09:01:17 crc kubenswrapper[4776]: I1208 09:01:17.711857 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-252w2" event={"ID":"16448b50-7f70-4571-8150-a462b3774dfc","Type":"ContainerStarted","Data":"6f8c24e74fcb7905cdd09dbd72980ec6f20bcfc666b1364f902e8d60ef7d6260"} Dec 08 09:01:17 crc kubenswrapper[4776]: I1208 09:01:17.714720 4776 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-jf2nl container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Dec 08 09:01:17 crc 
kubenswrapper[4776]: I1208 09:01:17.714768 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-jf2nl" podUID="50e1cbc5-727f-42ca-881c-fdd0b07ca739" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" Dec 08 09:01:17 crc kubenswrapper[4776]: I1208 09:01:17.715156 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bwbkt" event={"ID":"13284ed8-fd88-4a11-81e5-b58cf9551c27","Type":"ContainerStarted","Data":"7f0100348ed292b7df9937007549f6a79f77d32fd1d8e08192b46bd74c9997ad"} Dec 08 09:01:17 crc kubenswrapper[4776]: I1208 09:01:17.721345 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b9c44" Dec 08 09:01:17 crc kubenswrapper[4776]: I1208 09:01:17.726905 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t55fv" Dec 08 09:01:17 crc kubenswrapper[4776]: I1208 09:01:17.806166 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 09:01:17 crc kubenswrapper[4776]: E1208 09:01:17.806266 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 09:01:18.30624773 +0000 UTC m=+154.569472752 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:17 crc kubenswrapper[4776]: I1208 09:01:17.807259 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:17 crc kubenswrapper[4776]: I1208 09:01:17.817100 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-94rjd" podStartSLOduration=134.817079427 podStartE2EDuration="2m14.817079427s" podCreationTimestamp="2025-12-08 08:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:01:17.762131728 +0000 UTC m=+154.025356750" watchObservedRunningTime="2025-12-08 09:01:17.817079427 +0000 UTC m=+154.080304449" Dec 08 09:01:17 crc kubenswrapper[4776]: E1208 09:01:17.817546 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 09:01:18.317520699 +0000 UTC m=+154.580745721 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5sqv" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:17 crc kubenswrapper[4776]: I1208 09:01:17.830745 4776 patch_prober.go:28] interesting pod/router-default-5444994796-xh8d5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 08 09:01:17 crc kubenswrapper[4776]: [-]has-synced failed: reason withheld Dec 08 09:01:17 crc kubenswrapper[4776]: [+]process-running ok Dec 08 09:01:17 crc kubenswrapper[4776]: healthz check failed Dec 08 09:01:17 crc kubenswrapper[4776]: I1208 09:01:17.830844 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xh8d5" podUID="69d10b7f-1714-4536-800b-e7aa5bc7b73a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 08 09:01:17 crc kubenswrapper[4776]: I1208 09:01:17.845040 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-2jxng" podStartSLOduration=134.845015929 podStartE2EDuration="2m14.845015929s" podCreationTimestamp="2025-12-08 08:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:01:17.828944262 +0000 UTC m=+154.092169274" watchObservedRunningTime="2025-12-08 09:01:17.845015929 +0000 UTC m=+154.108240951" Dec 08 09:01:17 crc kubenswrapper[4776]: I1208 09:01:17.910290 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 09:01:17 crc kubenswrapper[4776]: E1208 09:01:17.910785 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 09:01:18.410765083 +0000 UTC m=+154.673990105 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:18 crc kubenswrapper[4776]: I1208 09:01:18.000102 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-252w2" podStartSLOduration=135.000085704 podStartE2EDuration="2m15.000085704s" podCreationTimestamp="2025-12-08 08:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:01:17.943198275 +0000 UTC m=+154.206423297" watchObservedRunningTime="2025-12-08 09:01:18.000085704 +0000 UTC m=+154.263310736" Dec 08 09:01:18 crc kubenswrapper[4776]: I1208 09:01:18.004416 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ds8zv" 
podStartSLOduration=135.004395979 podStartE2EDuration="2m15.004395979s" podCreationTimestamp="2025-12-08 08:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:01:17.999109079 +0000 UTC m=+154.262334101" watchObservedRunningTime="2025-12-08 09:01:18.004395979 +0000 UTC m=+154.267621001" Dec 08 09:01:18 crc kubenswrapper[4776]: I1208 09:01:18.014218 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:18 crc kubenswrapper[4776]: E1208 09:01:18.014685 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 09:01:18.514656162 +0000 UTC m=+154.777881184 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5sqv" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:18 crc kubenswrapper[4776]: I1208 09:01:18.029148 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-znqkr" Dec 08 09:01:18 crc kubenswrapper[4776]: I1208 09:01:18.040254 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-szjk8" podStartSLOduration=135.04022739 podStartE2EDuration="2m15.04022739s" podCreationTimestamp="2025-12-08 08:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:01:18.04021934 +0000 UTC m=+154.303444362" watchObservedRunningTime="2025-12-08 09:01:18.04022739 +0000 UTC m=+154.303452412" Dec 08 09:01:18 crc kubenswrapper[4776]: I1208 09:01:18.073815 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-dzfmd" podStartSLOduration=10.073790411 podStartE2EDuration="10.073790411s" podCreationTimestamp="2025-12-08 09:01:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:01:18.071416338 +0000 UTC m=+154.334641360" watchObservedRunningTime="2025-12-08 09:01:18.073790411 +0000 UTC m=+154.337015433" Dec 08 09:01:18 crc kubenswrapper[4776]: I1208 09:01:18.107377 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tf65w" podStartSLOduration=135.107355282 podStartE2EDuration="2m15.107355282s" podCreationTimestamp="2025-12-08 08:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:01:18.103799077 +0000 UTC m=+154.367024099" watchObservedRunningTime="2025-12-08 09:01:18.107355282 +0000 UTC m=+154.370580304" Dec 08 09:01:18 crc kubenswrapper[4776]: I1208 09:01:18.116106 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 09:01:18 crc kubenswrapper[4776]: E1208 09:01:18.116402 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 09:01:18.616383712 +0000 UTC m=+154.879608724 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:18 crc kubenswrapper[4776]: I1208 09:01:18.116464 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:18 crc kubenswrapper[4776]: E1208 09:01:18.116765 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 09:01:18.616758212 +0000 UTC m=+154.879983234 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5sqv" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:18 crc kubenswrapper[4776]: I1208 09:01:18.164548 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bwbkt" podStartSLOduration=135.164527799 podStartE2EDuration="2m15.164527799s" podCreationTimestamp="2025-12-08 08:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:01:18.141589011 +0000 UTC m=+154.404814033" watchObservedRunningTime="2025-12-08 09:01:18.164527799 +0000 UTC m=+154.427752821" Dec 08 09:01:18 crc kubenswrapper[4776]: I1208 09:01:18.218866 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 09:01:18 crc kubenswrapper[4776]: E1208 09:01:18.221166 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 09:01:18.721143922 +0000 UTC m=+154.984368944 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:18 crc kubenswrapper[4776]: I1208 09:01:18.320867 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:18 crc kubenswrapper[4776]: E1208 09:01:18.321275 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 09:01:18.821252899 +0000 UTC m=+155.084477931 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5sqv" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:18 crc kubenswrapper[4776]: I1208 09:01:18.425141 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 09:01:18 crc kubenswrapper[4776]: E1208 09:01:18.425437 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 09:01:18.925406334 +0000 UTC m=+155.188631356 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:18 crc kubenswrapper[4776]: I1208 09:01:18.425673 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:18 crc kubenswrapper[4776]: E1208 09:01:18.426248 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 09:01:18.926206254 +0000 UTC m=+155.189431276 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5sqv" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 09:01:18 crc kubenswrapper[4776]: I1208 09:01:18.489817 4776 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 08 09:01:18 crc kubenswrapper[4776]: I1208 09:01:18.503779 4776 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-08T09:01:18.489844054Z","Handler":null,"Name":""} Dec 08 09:01:18 crc kubenswrapper[4776]: I1208 09:01:18.517140 4776 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 08 09:01:18 crc kubenswrapper[4776]: I1208 09:01:18.517194 4776 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 08 09:01:18 crc kubenswrapper[4776]: I1208 09:01:18.526311 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 09:01:18 crc kubenswrapper[4776]: I1208 09:01:18.583952 4776 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 08 09:01:18 crc kubenswrapper[4776]: I1208 09:01:18.627929 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:18 crc kubenswrapper[4776]: I1208 09:01:18.672865 4776 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 08 09:01:18 crc kubenswrapper[4776]: I1208 09:01:18.672924 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:18 crc kubenswrapper[4776]: I1208 09:01:18.676096 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-z2qsw"] Dec 08 09:01:18 crc kubenswrapper[4776]: I1208 09:01:18.677124 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-z2qsw" Dec 08 09:01:18 crc kubenswrapper[4776]: I1208 09:01:18.679382 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 08 09:01:18 crc kubenswrapper[4776]: I1208 09:01:18.688885 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-z2qsw"] Dec 08 09:01:18 crc kubenswrapper[4776]: I1208 09:01:18.722133 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-b2vfk" event={"ID":"04b6f5ef-a04c-47c5-b35d-700ed59c5ac9","Type":"ContainerStarted","Data":"6c0d12f6ca1fad74ec7e5f25e09e0b86bf9f7ab9cd835a9c88d302c360545e22"} Dec 08 09:01:18 crc kubenswrapper[4776]: I1208 09:01:18.722272 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-b2vfk" event={"ID":"04b6f5ef-a04c-47c5-b35d-700ed59c5ac9","Type":"ContainerStarted","Data":"657b66eb95b512792623f71b57a393e6c57ca7b0cbf326a9d2f87defce92dca1"} Dec 08 09:01:18 crc kubenswrapper[4776]: I1208 09:01:18.729638 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-jf2nl" Dec 08 09:01:18 crc kubenswrapper[4776]: I1208 09:01:18.811465 4776 patch_prober.go:28] interesting pod/router-default-5444994796-xh8d5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 08 09:01:18 crc kubenswrapper[4776]: [-]has-synced failed: reason withheld Dec 08 09:01:18 crc kubenswrapper[4776]: [+]process-running ok Dec 08 09:01:18 crc kubenswrapper[4776]: healthz check failed Dec 08 09:01:18 crc kubenswrapper[4776]: I1208 09:01:18.811533 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xh8d5" 
podUID="69d10b7f-1714-4536-800b-e7aa5bc7b73a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 08 09:01:18 crc kubenswrapper[4776]: I1208 09:01:18.833439 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj4sg\" (UniqueName: \"kubernetes.io/projected/9f1cf0fc-eed0-4fba-8b89-b29bf78cadac-kube-api-access-mj4sg\") pod \"community-operators-z2qsw\" (UID: \"9f1cf0fc-eed0-4fba-8b89-b29bf78cadac\") " pod="openshift-marketplace/community-operators-z2qsw" Dec 08 09:01:18 crc kubenswrapper[4776]: I1208 09:01:18.834384 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f1cf0fc-eed0-4fba-8b89-b29bf78cadac-utilities\") pod \"community-operators-z2qsw\" (UID: \"9f1cf0fc-eed0-4fba-8b89-b29bf78cadac\") " pod="openshift-marketplace/community-operators-z2qsw" Dec 08 09:01:18 crc kubenswrapper[4776]: I1208 09:01:18.834874 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f1cf0fc-eed0-4fba-8b89-b29bf78cadac-catalog-content\") pod \"community-operators-z2qsw\" (UID: \"9f1cf0fc-eed0-4fba-8b89-b29bf78cadac\") " pod="openshift-marketplace/community-operators-z2qsw" Dec 08 09:01:18 crc kubenswrapper[4776]: I1208 09:01:18.859590 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5sqv\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:18 crc kubenswrapper[4776]: I1208 09:01:18.878772 4776 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-swlvb"] Dec 08 09:01:18 crc kubenswrapper[4776]: I1208 09:01:18.880119 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-swlvb" Dec 08 09:01:18 crc kubenswrapper[4776]: I1208 09:01:18.885382 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 08 09:01:18 crc kubenswrapper[4776]: I1208 09:01:18.886765 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-swlvb"] Dec 08 09:01:18 crc kubenswrapper[4776]: I1208 09:01:18.895936 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:18 crc kubenswrapper[4776]: I1208 09:01:18.938226 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6a339c1-f955-4a08-bba1-0df39a886324-catalog-content\") pod \"certified-operators-swlvb\" (UID: \"b6a339c1-f955-4a08-bba1-0df39a886324\") " pod="openshift-marketplace/certified-operators-swlvb" Dec 08 09:01:18 crc kubenswrapper[4776]: I1208 09:01:18.938281 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f1cf0fc-eed0-4fba-8b89-b29bf78cadac-catalog-content\") pod \"community-operators-z2qsw\" (UID: \"9f1cf0fc-eed0-4fba-8b89-b29bf78cadac\") " pod="openshift-marketplace/community-operators-z2qsw" Dec 08 09:01:18 crc kubenswrapper[4776]: I1208 09:01:18.938340 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqkrn\" (UniqueName: \"kubernetes.io/projected/b6a339c1-f955-4a08-bba1-0df39a886324-kube-api-access-kqkrn\") pod \"certified-operators-swlvb\" (UID: 
\"b6a339c1-f955-4a08-bba1-0df39a886324\") " pod="openshift-marketplace/certified-operators-swlvb" Dec 08 09:01:18 crc kubenswrapper[4776]: I1208 09:01:18.938364 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj4sg\" (UniqueName: \"kubernetes.io/projected/9f1cf0fc-eed0-4fba-8b89-b29bf78cadac-kube-api-access-mj4sg\") pod \"community-operators-z2qsw\" (UID: \"9f1cf0fc-eed0-4fba-8b89-b29bf78cadac\") " pod="openshift-marketplace/community-operators-z2qsw" Dec 08 09:01:18 crc kubenswrapper[4776]: I1208 09:01:18.938383 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6a339c1-f955-4a08-bba1-0df39a886324-utilities\") pod \"certified-operators-swlvb\" (UID: \"b6a339c1-f955-4a08-bba1-0df39a886324\") " pod="openshift-marketplace/certified-operators-swlvb" Dec 08 09:01:18 crc kubenswrapper[4776]: I1208 09:01:18.938407 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f1cf0fc-eed0-4fba-8b89-b29bf78cadac-utilities\") pod \"community-operators-z2qsw\" (UID: \"9f1cf0fc-eed0-4fba-8b89-b29bf78cadac\") " pod="openshift-marketplace/community-operators-z2qsw" Dec 08 09:01:18 crc kubenswrapper[4776]: I1208 09:01:18.938809 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f1cf0fc-eed0-4fba-8b89-b29bf78cadac-catalog-content\") pod \"community-operators-z2qsw\" (UID: \"9f1cf0fc-eed0-4fba-8b89-b29bf78cadac\") " pod="openshift-marketplace/community-operators-z2qsw" Dec 08 09:01:18 crc kubenswrapper[4776]: I1208 09:01:18.938903 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f1cf0fc-eed0-4fba-8b89-b29bf78cadac-utilities\") pod \"community-operators-z2qsw\" (UID: 
\"9f1cf0fc-eed0-4fba-8b89-b29bf78cadac\") " pod="openshift-marketplace/community-operators-z2qsw" Dec 08 09:01:18 crc kubenswrapper[4776]: I1208 09:01:18.972143 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj4sg\" (UniqueName: \"kubernetes.io/projected/9f1cf0fc-eed0-4fba-8b89-b29bf78cadac-kube-api-access-mj4sg\") pod \"community-operators-z2qsw\" (UID: \"9f1cf0fc-eed0-4fba-8b89-b29bf78cadac\") " pod="openshift-marketplace/community-operators-z2qsw" Dec 08 09:01:18 crc kubenswrapper[4776]: I1208 09:01:18.990470 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-z2qsw" Dec 08 09:01:19 crc kubenswrapper[4776]: I1208 09:01:19.045559 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqkrn\" (UniqueName: \"kubernetes.io/projected/b6a339c1-f955-4a08-bba1-0df39a886324-kube-api-access-kqkrn\") pod \"certified-operators-swlvb\" (UID: \"b6a339c1-f955-4a08-bba1-0df39a886324\") " pod="openshift-marketplace/certified-operators-swlvb" Dec 08 09:01:19 crc kubenswrapper[4776]: I1208 09:01:19.045939 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6a339c1-f955-4a08-bba1-0df39a886324-utilities\") pod \"certified-operators-swlvb\" (UID: \"b6a339c1-f955-4a08-bba1-0df39a886324\") " pod="openshift-marketplace/certified-operators-swlvb" Dec 08 09:01:19 crc kubenswrapper[4776]: I1208 09:01:19.045986 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6a339c1-f955-4a08-bba1-0df39a886324-catalog-content\") pod \"certified-operators-swlvb\" (UID: \"b6a339c1-f955-4a08-bba1-0df39a886324\") " pod="openshift-marketplace/certified-operators-swlvb" Dec 08 09:01:19 crc kubenswrapper[4776]: I1208 09:01:19.046731 4776 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6a339c1-f955-4a08-bba1-0df39a886324-catalog-content\") pod \"certified-operators-swlvb\" (UID: \"b6a339c1-f955-4a08-bba1-0df39a886324\") " pod="openshift-marketplace/certified-operators-swlvb" Dec 08 09:01:19 crc kubenswrapper[4776]: I1208 09:01:19.047280 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6a339c1-f955-4a08-bba1-0df39a886324-utilities\") pod \"certified-operators-swlvb\" (UID: \"b6a339c1-f955-4a08-bba1-0df39a886324\") " pod="openshift-marketplace/certified-operators-swlvb" Dec 08 09:01:19 crc kubenswrapper[4776]: I1208 09:01:19.064898 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7th2w"] Dec 08 09:01:19 crc kubenswrapper[4776]: I1208 09:01:19.069262 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7th2w" Dec 08 09:01:19 crc kubenswrapper[4776]: I1208 09:01:19.081336 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqkrn\" (UniqueName: \"kubernetes.io/projected/b6a339c1-f955-4a08-bba1-0df39a886324-kube-api-access-kqkrn\") pod \"certified-operators-swlvb\" (UID: \"b6a339c1-f955-4a08-bba1-0df39a886324\") " pod="openshift-marketplace/certified-operators-swlvb" Dec 08 09:01:19 crc kubenswrapper[4776]: I1208 09:01:19.082441 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7th2w"] Dec 08 09:01:19 crc kubenswrapper[4776]: I1208 09:01:19.147127 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed9b52d1-9f5b-4d4b-aece-23ced00e6737-utilities\") pod \"community-operators-7th2w\" (UID: \"ed9b52d1-9f5b-4d4b-aece-23ced00e6737\") " pod="openshift-marketplace/community-operators-7th2w" Dec 08 
09:01:19 crc kubenswrapper[4776]: I1208 09:01:19.147206 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xntmr\" (UniqueName: \"kubernetes.io/projected/ed9b52d1-9f5b-4d4b-aece-23ced00e6737-kube-api-access-xntmr\") pod \"community-operators-7th2w\" (UID: \"ed9b52d1-9f5b-4d4b-aece-23ced00e6737\") " pod="openshift-marketplace/community-operators-7th2w" Dec 08 09:01:19 crc kubenswrapper[4776]: I1208 09:01:19.147235 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed9b52d1-9f5b-4d4b-aece-23ced00e6737-catalog-content\") pod \"community-operators-7th2w\" (UID: \"ed9b52d1-9f5b-4d4b-aece-23ced00e6737\") " pod="openshift-marketplace/community-operators-7th2w" Dec 08 09:01:19 crc kubenswrapper[4776]: I1208 09:01:19.209868 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-swlvb" Dec 08 09:01:19 crc kubenswrapper[4776]: I1208 09:01:19.251568 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed9b52d1-9f5b-4d4b-aece-23ced00e6737-catalog-content\") pod \"community-operators-7th2w\" (UID: \"ed9b52d1-9f5b-4d4b-aece-23ced00e6737\") " pod="openshift-marketplace/community-operators-7th2w" Dec 08 09:01:19 crc kubenswrapper[4776]: I1208 09:01:19.251685 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed9b52d1-9f5b-4d4b-aece-23ced00e6737-utilities\") pod \"community-operators-7th2w\" (UID: \"ed9b52d1-9f5b-4d4b-aece-23ced00e6737\") " pod="openshift-marketplace/community-operators-7th2w" Dec 08 09:01:19 crc kubenswrapper[4776]: I1208 09:01:19.251742 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xntmr\" (UniqueName: 
\"kubernetes.io/projected/ed9b52d1-9f5b-4d4b-aece-23ced00e6737-kube-api-access-xntmr\") pod \"community-operators-7th2w\" (UID: \"ed9b52d1-9f5b-4d4b-aece-23ced00e6737\") " pod="openshift-marketplace/community-operators-7th2w" Dec 08 09:01:19 crc kubenswrapper[4776]: I1208 09:01:19.252527 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed9b52d1-9f5b-4d4b-aece-23ced00e6737-catalog-content\") pod \"community-operators-7th2w\" (UID: \"ed9b52d1-9f5b-4d4b-aece-23ced00e6737\") " pod="openshift-marketplace/community-operators-7th2w" Dec 08 09:01:19 crc kubenswrapper[4776]: I1208 09:01:19.252797 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed9b52d1-9f5b-4d4b-aece-23ced00e6737-utilities\") pod \"community-operators-7th2w\" (UID: \"ed9b52d1-9f5b-4d4b-aece-23ced00e6737\") " pod="openshift-marketplace/community-operators-7th2w" Dec 08 09:01:19 crc kubenswrapper[4776]: I1208 09:01:19.277488 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8kcmf"] Dec 08 09:01:19 crc kubenswrapper[4776]: I1208 09:01:19.278717 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8kcmf" Dec 08 09:01:19 crc kubenswrapper[4776]: I1208 09:01:19.279919 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xntmr\" (UniqueName: \"kubernetes.io/projected/ed9b52d1-9f5b-4d4b-aece-23ced00e6737-kube-api-access-xntmr\") pod \"community-operators-7th2w\" (UID: \"ed9b52d1-9f5b-4d4b-aece-23ced00e6737\") " pod="openshift-marketplace/community-operators-7th2w" Dec 08 09:01:19 crc kubenswrapper[4776]: I1208 09:01:19.287407 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8kcmf"] Dec 08 09:01:19 crc kubenswrapper[4776]: I1208 09:01:19.294626 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-z2qsw"] Dec 08 09:01:19 crc kubenswrapper[4776]: I1208 09:01:19.352475 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1122c2d9-2ef0-4527-be3f-5617003d2bc0-catalog-content\") pod \"certified-operators-8kcmf\" (UID: \"1122c2d9-2ef0-4527-be3f-5617003d2bc0\") " pod="openshift-marketplace/certified-operators-8kcmf" Dec 08 09:01:19 crc kubenswrapper[4776]: I1208 09:01:19.352535 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5v44b\" (UniqueName: \"kubernetes.io/projected/1122c2d9-2ef0-4527-be3f-5617003d2bc0-kube-api-access-5v44b\") pod \"certified-operators-8kcmf\" (UID: \"1122c2d9-2ef0-4527-be3f-5617003d2bc0\") " pod="openshift-marketplace/certified-operators-8kcmf" Dec 08 09:01:19 crc kubenswrapper[4776]: I1208 09:01:19.352563 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1122c2d9-2ef0-4527-be3f-5617003d2bc0-utilities\") pod \"certified-operators-8kcmf\" (UID: 
\"1122c2d9-2ef0-4527-be3f-5617003d2bc0\") " pod="openshift-marketplace/certified-operators-8kcmf" Dec 08 09:01:19 crc kubenswrapper[4776]: I1208 09:01:19.391951 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7th2w" Dec 08 09:01:19 crc kubenswrapper[4776]: W1208 09:01:19.418350 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f1cf0fc_eed0_4fba_8b89_b29bf78cadac.slice/crio-360703b853412245904335459f733e558000226ff4569205758fc692bbdcc6c3 WatchSource:0}: Error finding container 360703b853412245904335459f733e558000226ff4569205758fc692bbdcc6c3: Status 404 returned error can't find the container with id 360703b853412245904335459f733e558000226ff4569205758fc692bbdcc6c3 Dec 08 09:01:19 crc kubenswrapper[4776]: I1208 09:01:19.436688 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-p5sqv"] Dec 08 09:01:19 crc kubenswrapper[4776]: I1208 09:01:19.457985 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1122c2d9-2ef0-4527-be3f-5617003d2bc0-catalog-content\") pod \"certified-operators-8kcmf\" (UID: \"1122c2d9-2ef0-4527-be3f-5617003d2bc0\") " pod="openshift-marketplace/certified-operators-8kcmf" Dec 08 09:01:19 crc kubenswrapper[4776]: I1208 09:01:19.458061 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5v44b\" (UniqueName: \"kubernetes.io/projected/1122c2d9-2ef0-4527-be3f-5617003d2bc0-kube-api-access-5v44b\") pod \"certified-operators-8kcmf\" (UID: \"1122c2d9-2ef0-4527-be3f-5617003d2bc0\") " pod="openshift-marketplace/certified-operators-8kcmf" Dec 08 09:01:19 crc kubenswrapper[4776]: I1208 09:01:19.458095 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/1122c2d9-2ef0-4527-be3f-5617003d2bc0-utilities\") pod \"certified-operators-8kcmf\" (UID: \"1122c2d9-2ef0-4527-be3f-5617003d2bc0\") " pod="openshift-marketplace/certified-operators-8kcmf" Dec 08 09:01:19 crc kubenswrapper[4776]: I1208 09:01:19.458724 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1122c2d9-2ef0-4527-be3f-5617003d2bc0-utilities\") pod \"certified-operators-8kcmf\" (UID: \"1122c2d9-2ef0-4527-be3f-5617003d2bc0\") " pod="openshift-marketplace/certified-operators-8kcmf" Dec 08 09:01:19 crc kubenswrapper[4776]: I1208 09:01:19.458975 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1122c2d9-2ef0-4527-be3f-5617003d2bc0-catalog-content\") pod \"certified-operators-8kcmf\" (UID: \"1122c2d9-2ef0-4527-be3f-5617003d2bc0\") " pod="openshift-marketplace/certified-operators-8kcmf" Dec 08 09:01:19 crc kubenswrapper[4776]: I1208 09:01:19.506736 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5v44b\" (UniqueName: \"kubernetes.io/projected/1122c2d9-2ef0-4527-be3f-5617003d2bc0-kube-api-access-5v44b\") pod \"certified-operators-8kcmf\" (UID: \"1122c2d9-2ef0-4527-be3f-5617003d2bc0\") " pod="openshift-marketplace/certified-operators-8kcmf" Dec 08 09:01:19 crc kubenswrapper[4776]: I1208 09:01:19.609593 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8kcmf" Dec 08 09:01:19 crc kubenswrapper[4776]: I1208 09:01:19.635367 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-swlvb"] Dec 08 09:01:19 crc kubenswrapper[4776]: I1208 09:01:19.767499 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-b2vfk" event={"ID":"04b6f5ef-a04c-47c5-b35d-700ed59c5ac9","Type":"ContainerStarted","Data":"0920e0573bd1bbea8e5e4ff804c2f2633c159f50d8a6c6b9138b05b2636d77f7"} Dec 08 09:01:19 crc kubenswrapper[4776]: I1208 09:01:19.783484 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" event={"ID":"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57","Type":"ContainerStarted","Data":"462f722382159c5e98a210082c413621f3698f47b0506247320635e57459a17a"} Dec 08 09:01:19 crc kubenswrapper[4776]: I1208 09:01:19.785467 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z2qsw" event={"ID":"9f1cf0fc-eed0-4fba-8b89-b29bf78cadac","Type":"ContainerStarted","Data":"360703b853412245904335459f733e558000226ff4569205758fc692bbdcc6c3"} Dec 08 09:01:19 crc kubenswrapper[4776]: I1208 09:01:19.786344 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-swlvb" event={"ID":"b6a339c1-f955-4a08-bba1-0df39a886324","Type":"ContainerStarted","Data":"66e39df1ffbfbfdd34b59edf64ae391cf2aab8f8cd7f272946f40878b8617774"} Dec 08 09:01:19 crc kubenswrapper[4776]: I1208 09:01:19.788588 4776 generic.go:334] "Generic (PLEG): container finished" podID="06cf2358-4cba-4d69-81d1-dc02434fe460" containerID="14510ab17084a5d159b27749b9a50aff2bde80a03fc6bf1ac4cb8f9804f42f25" exitCode=0 Dec 08 09:01:19 crc kubenswrapper[4776]: I1208 09:01:19.789141 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29419740-xnvv4" event={"ID":"06cf2358-4cba-4d69-81d1-dc02434fe460","Type":"ContainerDied","Data":"14510ab17084a5d159b27749b9a50aff2bde80a03fc6bf1ac4cb8f9804f42f25"} Dec 08 09:01:19 crc kubenswrapper[4776]: I1208 09:01:19.799931 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-b2vfk" podStartSLOduration=11.799903316 podStartE2EDuration="11.799903316s" podCreationTimestamp="2025-12-08 09:01:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:01:19.799015922 +0000 UTC m=+156.062240944" watchObservedRunningTime="2025-12-08 09:01:19.799903316 +0000 UTC m=+156.063128338" Dec 08 09:01:19 crc kubenswrapper[4776]: I1208 09:01:19.824345 4776 patch_prober.go:28] interesting pod/router-default-5444994796-xh8d5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 08 09:01:19 crc kubenswrapper[4776]: [-]has-synced failed: reason withheld Dec 08 09:01:19 crc kubenswrapper[4776]: [+]process-running ok Dec 08 09:01:19 crc kubenswrapper[4776]: healthz check failed Dec 08 09:01:19 crc kubenswrapper[4776]: I1208 09:01:19.824398 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xh8d5" podUID="69d10b7f-1714-4536-800b-e7aa5bc7b73a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 08 09:01:19 crc kubenswrapper[4776]: I1208 09:01:19.856629 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7th2w"] Dec 08 09:01:19 crc kubenswrapper[4776]: I1208 09:01:19.913090 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8kcmf"] Dec 08 09:01:19 crc 
kubenswrapper[4776]: W1208 09:01:19.959356 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1122c2d9_2ef0_4527_be3f_5617003d2bc0.slice/crio-b03d54ba0075762765111878e99480832f463d0d232529616791d8aee2c46c44 WatchSource:0}: Error finding container b03d54ba0075762765111878e99480832f463d0d232529616791d8aee2c46c44: Status 404 returned error can't find the container with id b03d54ba0075762765111878e99480832f463d0d232529616791d8aee2c46c44 Dec 08 09:01:20 crc kubenswrapper[4776]: I1208 09:01:20.353279 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 08 09:01:20 crc kubenswrapper[4776]: I1208 09:01:20.741296 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-8dm9l" Dec 08 09:01:20 crc kubenswrapper[4776]: I1208 09:01:20.741351 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-8dm9l" Dec 08 09:01:20 crc kubenswrapper[4776]: I1208 09:01:20.743603 4776 patch_prober.go:28] interesting pod/console-f9d7485db-8dm9l container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Dec 08 09:01:20 crc kubenswrapper[4776]: I1208 09:01:20.743671 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-8dm9l" podUID="b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" Dec 08 09:01:20 crc kubenswrapper[4776]: I1208 09:01:20.794518 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8kcmf" 
event={"ID":"1122c2d9-2ef0-4527-be3f-5617003d2bc0","Type":"ContainerStarted","Data":"b03d54ba0075762765111878e99480832f463d0d232529616791d8aee2c46c44"} Dec 08 09:01:20 crc kubenswrapper[4776]: I1208 09:01:20.796286 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7th2w" event={"ID":"ed9b52d1-9f5b-4d4b-aece-23ced00e6737","Type":"ContainerStarted","Data":"daafb80f6925fef37cd042bceab0c4d5c14e1f27dab755f1738347ee179b1dd8"} Dec 08 09:01:20 crc kubenswrapper[4776]: I1208 09:01:20.804707 4776 patch_prober.go:28] interesting pod/downloads-7954f5f757-559sf container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Dec 08 09:01:20 crc kubenswrapper[4776]: I1208 09:01:20.804785 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-559sf" podUID="d6d2a3a0-669f-41c3-8a04-1a4f7f961f1b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Dec 08 09:01:20 crc kubenswrapper[4776]: I1208 09:01:20.804731 4776 patch_prober.go:28] interesting pod/downloads-7954f5f757-559sf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Dec 08 09:01:20 crc kubenswrapper[4776]: I1208 09:01:20.804874 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-559sf" podUID="d6d2a3a0-669f-41c3-8a04-1a4f7f961f1b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Dec 08 09:01:20 crc kubenswrapper[4776]: I1208 09:01:20.811395 4776 patch_prober.go:28] interesting 
pod/router-default-5444994796-xh8d5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 08 09:01:20 crc kubenswrapper[4776]: [-]has-synced failed: reason withheld Dec 08 09:01:20 crc kubenswrapper[4776]: [+]process-running ok Dec 08 09:01:20 crc kubenswrapper[4776]: healthz check failed Dec 08 09:01:20 crc kubenswrapper[4776]: I1208 09:01:20.811458 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xh8d5" podUID="69d10b7f-1714-4536-800b-e7aa5bc7b73a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 08 09:01:20 crc kubenswrapper[4776]: I1208 09:01:20.864750 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-82j27"] Dec 08 09:01:20 crc kubenswrapper[4776]: I1208 09:01:20.866294 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-82j27" Dec 08 09:01:20 crc kubenswrapper[4776]: I1208 09:01:20.868476 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 08 09:01:20 crc kubenswrapper[4776]: I1208 09:01:20.881349 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-82j27"] Dec 08 09:01:20 crc kubenswrapper[4776]: I1208 09:01:20.983625 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9-catalog-content\") pod \"redhat-marketplace-82j27\" (UID: \"0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9\") " pod="openshift-marketplace/redhat-marketplace-82j27" Dec 08 09:01:20 crc kubenswrapper[4776]: I1208 09:01:20.984302 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqkcs\" (UniqueName: \"kubernetes.io/projected/0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9-kube-api-access-fqkcs\") pod \"redhat-marketplace-82j27\" (UID: \"0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9\") " pod="openshift-marketplace/redhat-marketplace-82j27" Dec 08 09:01:20 crc kubenswrapper[4776]: I1208 09:01:20.984332 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9-utilities\") pod \"redhat-marketplace-82j27\" (UID: \"0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9\") " pod="openshift-marketplace/redhat-marketplace-82j27" Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.085403 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9-catalog-content\") pod \"redhat-marketplace-82j27\" (UID: 
\"0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9\") " pod="openshift-marketplace/redhat-marketplace-82j27" Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.085477 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqkcs\" (UniqueName: \"kubernetes.io/projected/0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9-kube-api-access-fqkcs\") pod \"redhat-marketplace-82j27\" (UID: \"0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9\") " pod="openshift-marketplace/redhat-marketplace-82j27" Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.085502 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9-utilities\") pod \"redhat-marketplace-82j27\" (UID: \"0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9\") " pod="openshift-marketplace/redhat-marketplace-82j27" Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.085941 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9-utilities\") pod \"redhat-marketplace-82j27\" (UID: \"0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9\") " pod="openshift-marketplace/redhat-marketplace-82j27" Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.086187 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9-catalog-content\") pod \"redhat-marketplace-82j27\" (UID: \"0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9\") " pod="openshift-marketplace/redhat-marketplace-82j27" Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.089267 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29419740-xnvv4" Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.115323 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqkcs\" (UniqueName: \"kubernetes.io/projected/0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9-kube-api-access-fqkcs\") pod \"redhat-marketplace-82j27\" (UID: \"0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9\") " pod="openshift-marketplace/redhat-marketplace-82j27" Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.186538 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-82j27" Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.261666 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ghlj2"] Dec 08 09:01:21 crc kubenswrapper[4776]: E1208 09:01:21.261851 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06cf2358-4cba-4d69-81d1-dc02434fe460" containerName="collect-profiles" Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.261862 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="06cf2358-4cba-4d69-81d1-dc02434fe460" containerName="collect-profiles" Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.261959 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="06cf2358-4cba-4d69-81d1-dc02434fe460" containerName="collect-profiles" Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.262888 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ghlj2" Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.282125 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ghlj2"] Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.287039 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06cf2358-4cba-4d69-81d1-dc02434fe460-secret-volume\") pod \"06cf2358-4cba-4d69-81d1-dc02434fe460\" (UID: \"06cf2358-4cba-4d69-81d1-dc02434fe460\") " Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.287077 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rg8m\" (UniqueName: \"kubernetes.io/projected/06cf2358-4cba-4d69-81d1-dc02434fe460-kube-api-access-6rg8m\") pod \"06cf2358-4cba-4d69-81d1-dc02434fe460\" (UID: \"06cf2358-4cba-4d69-81d1-dc02434fe460\") " Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.287107 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06cf2358-4cba-4d69-81d1-dc02434fe460-config-volume\") pod \"06cf2358-4cba-4d69-81d1-dc02434fe460\" (UID: \"06cf2358-4cba-4d69-81d1-dc02434fe460\") " Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.288149 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06cf2358-4cba-4d69-81d1-dc02434fe460-config-volume" (OuterVolumeSpecName: "config-volume") pod "06cf2358-4cba-4d69-81d1-dc02434fe460" (UID: "06cf2358-4cba-4d69-81d1-dc02434fe460"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.297509 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06cf2358-4cba-4d69-81d1-dc02434fe460-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "06cf2358-4cba-4d69-81d1-dc02434fe460" (UID: "06cf2358-4cba-4d69-81d1-dc02434fe460"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.299458 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06cf2358-4cba-4d69-81d1-dc02434fe460-kube-api-access-6rg8m" (OuterVolumeSpecName: "kube-api-access-6rg8m") pod "06cf2358-4cba-4d69-81d1-dc02434fe460" (UID: "06cf2358-4cba-4d69-81d1-dc02434fe460"). InnerVolumeSpecName "kube-api-access-6rg8m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.372934 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-82j27"] Dec 08 09:01:21 crc kubenswrapper[4776]: W1208 09:01:21.380499 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f76ec41_a3f1_4fdc_96d1_1a1b2dd5f4f9.slice/crio-71b68a8eb858fd1b3e75bef1902efb78815ffec85a19de9d56cb17c0050ed8dd WatchSource:0}: Error finding container 71b68a8eb858fd1b3e75bef1902efb78815ffec85a19de9d56cb17c0050ed8dd: Status 404 returned error can't find the container with id 71b68a8eb858fd1b3e75bef1902efb78815ffec85a19de9d56cb17c0050ed8dd Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.388688 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddw2s\" (UniqueName: \"kubernetes.io/projected/58f8ec13-d004-4d1f-8a44-f325c07c25fd-kube-api-access-ddw2s\") pod \"redhat-marketplace-ghlj2\" (UID: 
\"58f8ec13-d004-4d1f-8a44-f325c07c25fd\") " pod="openshift-marketplace/redhat-marketplace-ghlj2" Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.388725 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58f8ec13-d004-4d1f-8a44-f325c07c25fd-utilities\") pod \"redhat-marketplace-ghlj2\" (UID: \"58f8ec13-d004-4d1f-8a44-f325c07c25fd\") " pod="openshift-marketplace/redhat-marketplace-ghlj2" Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.388789 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58f8ec13-d004-4d1f-8a44-f325c07c25fd-catalog-content\") pod \"redhat-marketplace-ghlj2\" (UID: \"58f8ec13-d004-4d1f-8a44-f325c07c25fd\") " pod="openshift-marketplace/redhat-marketplace-ghlj2" Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.388875 4776 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06cf2358-4cba-4d69-81d1-dc02434fe460-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.388894 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rg8m\" (UniqueName: \"kubernetes.io/projected/06cf2358-4cba-4d69-81d1-dc02434fe460-kube-api-access-6rg8m\") on node \"crc\" DevicePath \"\"" Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.388903 4776 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06cf2358-4cba-4d69-81d1-dc02434fe460-config-volume\") on node \"crc\" DevicePath \"\"" Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.447547 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-bk9qw" Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.451979 4776 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-bk9qw" Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.489699 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddw2s\" (UniqueName: \"kubernetes.io/projected/58f8ec13-d004-4d1f-8a44-f325c07c25fd-kube-api-access-ddw2s\") pod \"redhat-marketplace-ghlj2\" (UID: \"58f8ec13-d004-4d1f-8a44-f325c07c25fd\") " pod="openshift-marketplace/redhat-marketplace-ghlj2" Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.489754 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58f8ec13-d004-4d1f-8a44-f325c07c25fd-utilities\") pod \"redhat-marketplace-ghlj2\" (UID: \"58f8ec13-d004-4d1f-8a44-f325c07c25fd\") " pod="openshift-marketplace/redhat-marketplace-ghlj2" Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.489802 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58f8ec13-d004-4d1f-8a44-f325c07c25fd-catalog-content\") pod \"redhat-marketplace-ghlj2\" (UID: \"58f8ec13-d004-4d1f-8a44-f325c07c25fd\") " pod="openshift-marketplace/redhat-marketplace-ghlj2" Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.490235 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58f8ec13-d004-4d1f-8a44-f325c07c25fd-utilities\") pod \"redhat-marketplace-ghlj2\" (UID: \"58f8ec13-d004-4d1f-8a44-f325c07c25fd\") " pod="openshift-marketplace/redhat-marketplace-ghlj2" Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.490289 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58f8ec13-d004-4d1f-8a44-f325c07c25fd-catalog-content\") pod \"redhat-marketplace-ghlj2\" (UID: \"58f8ec13-d004-4d1f-8a44-f325c07c25fd\") " 
pod="openshift-marketplace/redhat-marketplace-ghlj2" Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.525348 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddw2s\" (UniqueName: \"kubernetes.io/projected/58f8ec13-d004-4d1f-8a44-f325c07c25fd-kube-api-access-ddw2s\") pod \"redhat-marketplace-ghlj2\" (UID: \"58f8ec13-d004-4d1f-8a44-f325c07c25fd\") " pod="openshift-marketplace/redhat-marketplace-ghlj2" Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.585126 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ghlj2" Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.804499 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" event={"ID":"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57","Type":"ContainerStarted","Data":"c5841f37747c1c44c8558f8149dbb1dab60f47777cd6528d332a6505ec664a1c"} Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.804973 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.805095 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-xh8d5" Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.808777 4776 generic.go:334] "Generic (PLEG): container finished" podID="ed9b52d1-9f5b-4d4b-aece-23ced00e6737" containerID="85b27af181d7ca69be29673dee0432ee4c99fb6a26722f7c1100ad43fd5922d9" exitCode=0 Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.808844 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7th2w" event={"ID":"ed9b52d1-9f5b-4d4b-aece-23ced00e6737","Type":"ContainerDied","Data":"85b27af181d7ca69be29673dee0432ee4c99fb6a26722f7c1100ad43fd5922d9"} Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.811261 
4776 patch_prober.go:28] interesting pod/router-default-5444994796-xh8d5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 08 09:01:21 crc kubenswrapper[4776]: [-]has-synced failed: reason withheld Dec 08 09:01:21 crc kubenswrapper[4776]: [+]process-running ok Dec 08 09:01:21 crc kubenswrapper[4776]: healthz check failed Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.811296 4776 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.811325 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xh8d5" podUID="69d10b7f-1714-4536-800b-e7aa5bc7b73a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.817613 4776 generic.go:334] "Generic (PLEG): container finished" podID="9f1cf0fc-eed0-4fba-8b89-b29bf78cadac" containerID="b76859968451e9c9aeb9184231ad6cbb61121a8dbd4a0b425070c0da82e3a55a" exitCode=0 Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.817671 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z2qsw" event={"ID":"9f1cf0fc-eed0-4fba-8b89-b29bf78cadac","Type":"ContainerDied","Data":"b76859968451e9c9aeb9184231ad6cbb61121a8dbd4a0b425070c0da82e3a55a"} Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.821773 4776 generic.go:334] "Generic (PLEG): container finished" podID="b6a339c1-f955-4a08-bba1-0df39a886324" containerID="fbcb29d00241802beedb3901d3bbfa7f9e3d28561312424366078ac538370e4a" exitCode=0 Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.821857 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-swlvb" 
event={"ID":"b6a339c1-f955-4a08-bba1-0df39a886324","Type":"ContainerDied","Data":"fbcb29d00241802beedb3901d3bbfa7f9e3d28561312424366078ac538370e4a"} Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.825141 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" podStartSLOduration=138.825123491 podStartE2EDuration="2m18.825123491s" podCreationTimestamp="2025-12-08 08:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:01:21.824331959 +0000 UTC m=+158.087556981" watchObservedRunningTime="2025-12-08 09:01:21.825123491 +0000 UTC m=+158.088348523" Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.861131 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29419740-xnvv4" event={"ID":"06cf2358-4cba-4d69-81d1-dc02434fe460","Type":"ContainerDied","Data":"cc268b93a598f2b03c7028451835c5e72cf5121158eac57d4aeb4c221fa75059"} Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.861194 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc268b93a598f2b03c7028451835c5e72cf5121158eac57d4aeb4c221fa75059" Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.861319 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29419740-xnvv4" Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.864029 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ghlj2"] Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.869354 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fbmf6"] Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.870882 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fbmf6" Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.873117 4776 generic.go:334] "Generic (PLEG): container finished" podID="0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9" containerID="7f4b22d9011a80a4e8ad069237a38f0c65307ae8d30af7804a1f3569464b229d" exitCode=0 Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.873919 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-82j27" event={"ID":"0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9","Type":"ContainerDied","Data":"7f4b22d9011a80a4e8ad069237a38f0c65307ae8d30af7804a1f3569464b229d"} Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.873952 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-82j27" event={"ID":"0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9","Type":"ContainerStarted","Data":"71b68a8eb858fd1b3e75bef1902efb78815ffec85a19de9d56cb17c0050ed8dd"} Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.874410 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.876898 4776 generic.go:334] "Generic (PLEG): container finished" podID="1122c2d9-2ef0-4527-be3f-5617003d2bc0" containerID="3fba4c7e859ac4f2fa12d87d3a1a5ca36e5495dd91e991f29e391c8acf541069" exitCode=0 Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.877877 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8kcmf" event={"ID":"1122c2d9-2ef0-4527-be3f-5617003d2bc0","Type":"ContainerDied","Data":"3fba4c7e859ac4f2fa12d87d3a1a5ca36e5495dd91e991f29e391c8acf541069"} Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.902906 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fbmf6"] Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.908552 4776 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bde03b49-eb1e-4941-b49e-e361cb8d83f4-catalog-content\") pod \"redhat-operators-fbmf6\" (UID: \"bde03b49-eb1e-4941-b49e-e361cb8d83f4\") " pod="openshift-marketplace/redhat-operators-fbmf6" Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.908744 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bde03b49-eb1e-4941-b49e-e361cb8d83f4-utilities\") pod \"redhat-operators-fbmf6\" (UID: \"bde03b49-eb1e-4941-b49e-e361cb8d83f4\") " pod="openshift-marketplace/redhat-operators-fbmf6" Dec 08 09:01:21 crc kubenswrapper[4776]: I1208 09:01:21.911065 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbt94\" (UniqueName: \"kubernetes.io/projected/bde03b49-eb1e-4941-b49e-e361cb8d83f4-kube-api-access-mbt94\") pod \"redhat-operators-fbmf6\" (UID: \"bde03b49-eb1e-4941-b49e-e361cb8d83f4\") " pod="openshift-marketplace/redhat-operators-fbmf6" Dec 08 09:01:22 crc kubenswrapper[4776]: I1208 09:01:22.012214 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bde03b49-eb1e-4941-b49e-e361cb8d83f4-catalog-content\") pod \"redhat-operators-fbmf6\" (UID: \"bde03b49-eb1e-4941-b49e-e361cb8d83f4\") " pod="openshift-marketplace/redhat-operators-fbmf6" Dec 08 09:01:22 crc kubenswrapper[4776]: I1208 09:01:22.012394 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bde03b49-eb1e-4941-b49e-e361cb8d83f4-utilities\") pod \"redhat-operators-fbmf6\" (UID: \"bde03b49-eb1e-4941-b49e-e361cb8d83f4\") " pod="openshift-marketplace/redhat-operators-fbmf6" Dec 08 09:01:22 crc kubenswrapper[4776]: I1208 09:01:22.012442 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbt94\" (UniqueName: \"kubernetes.io/projected/bde03b49-eb1e-4941-b49e-e361cb8d83f4-kube-api-access-mbt94\") pod \"redhat-operators-fbmf6\" (UID: \"bde03b49-eb1e-4941-b49e-e361cb8d83f4\") " pod="openshift-marketplace/redhat-operators-fbmf6" Dec 08 09:01:22 crc kubenswrapper[4776]: I1208 09:01:22.013855 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bde03b49-eb1e-4941-b49e-e361cb8d83f4-catalog-content\") pod \"redhat-operators-fbmf6\" (UID: \"bde03b49-eb1e-4941-b49e-e361cb8d83f4\") " pod="openshift-marketplace/redhat-operators-fbmf6" Dec 08 09:01:22 crc kubenswrapper[4776]: I1208 09:01:22.016460 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bde03b49-eb1e-4941-b49e-e361cb8d83f4-utilities\") pod \"redhat-operators-fbmf6\" (UID: \"bde03b49-eb1e-4941-b49e-e361cb8d83f4\") " pod="openshift-marketplace/redhat-operators-fbmf6" Dec 08 09:01:22 crc kubenswrapper[4776]: I1208 09:01:22.051525 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbt94\" (UniqueName: \"kubernetes.io/projected/bde03b49-eb1e-4941-b49e-e361cb8d83f4-kube-api-access-mbt94\") pod \"redhat-operators-fbmf6\" (UID: \"bde03b49-eb1e-4941-b49e-e361cb8d83f4\") " pod="openshift-marketplace/redhat-operators-fbmf6" Dec 08 09:01:22 crc kubenswrapper[4776]: I1208 09:01:22.263897 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mx2gt"] Dec 08 09:01:22 crc kubenswrapper[4776]: I1208 09:01:22.265239 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mx2gt" Dec 08 09:01:22 crc kubenswrapper[4776]: I1208 09:01:22.282492 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mx2gt"] Dec 08 09:01:22 crc kubenswrapper[4776]: I1208 09:01:22.323837 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fbmf6" Dec 08 09:01:22 crc kubenswrapper[4776]: I1208 09:01:22.418094 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3540eb34-736e-422d-b860-99cc44778fad-catalog-content\") pod \"redhat-operators-mx2gt\" (UID: \"3540eb34-736e-422d-b860-99cc44778fad\") " pod="openshift-marketplace/redhat-operators-mx2gt" Dec 08 09:01:22 crc kubenswrapper[4776]: I1208 09:01:22.418194 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9tfc\" (UniqueName: \"kubernetes.io/projected/3540eb34-736e-422d-b860-99cc44778fad-kube-api-access-j9tfc\") pod \"redhat-operators-mx2gt\" (UID: \"3540eb34-736e-422d-b860-99cc44778fad\") " pod="openshift-marketplace/redhat-operators-mx2gt" Dec 08 09:01:22 crc kubenswrapper[4776]: I1208 09:01:22.418219 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3540eb34-736e-422d-b860-99cc44778fad-utilities\") pod \"redhat-operators-mx2gt\" (UID: \"3540eb34-736e-422d-b860-99cc44778fad\") " pod="openshift-marketplace/redhat-operators-mx2gt" Dec 08 09:01:22 crc kubenswrapper[4776]: I1208 09:01:22.511033 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 08 09:01:22 crc kubenswrapper[4776]: I1208 09:01:22.512683 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 08 09:01:22 crc kubenswrapper[4776]: I1208 09:01:22.518740 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 08 09:01:22 crc kubenswrapper[4776]: I1208 09:01:22.519122 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 08 09:01:22 crc kubenswrapper[4776]: I1208 09:01:22.521031 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3540eb34-736e-422d-b860-99cc44778fad-catalog-content\") pod \"redhat-operators-mx2gt\" (UID: \"3540eb34-736e-422d-b860-99cc44778fad\") " pod="openshift-marketplace/redhat-operators-mx2gt" Dec 08 09:01:22 crc kubenswrapper[4776]: I1208 09:01:22.521116 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9tfc\" (UniqueName: \"kubernetes.io/projected/3540eb34-736e-422d-b860-99cc44778fad-kube-api-access-j9tfc\") pod \"redhat-operators-mx2gt\" (UID: \"3540eb34-736e-422d-b860-99cc44778fad\") " pod="openshift-marketplace/redhat-operators-mx2gt" Dec 08 09:01:22 crc kubenswrapper[4776]: I1208 09:01:22.521143 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3540eb34-736e-422d-b860-99cc44778fad-utilities\") pod \"redhat-operators-mx2gt\" (UID: \"3540eb34-736e-422d-b860-99cc44778fad\") " pod="openshift-marketplace/redhat-operators-mx2gt" Dec 08 09:01:22 crc kubenswrapper[4776]: I1208 09:01:22.521901 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3540eb34-736e-422d-b860-99cc44778fad-utilities\") pod \"redhat-operators-mx2gt\" (UID: \"3540eb34-736e-422d-b860-99cc44778fad\") " 
pod="openshift-marketplace/redhat-operators-mx2gt" Dec 08 09:01:22 crc kubenswrapper[4776]: I1208 09:01:22.522311 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3540eb34-736e-422d-b860-99cc44778fad-catalog-content\") pod \"redhat-operators-mx2gt\" (UID: \"3540eb34-736e-422d-b860-99cc44778fad\") " pod="openshift-marketplace/redhat-operators-mx2gt" Dec 08 09:01:22 crc kubenswrapper[4776]: I1208 09:01:22.523697 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 08 09:01:22 crc kubenswrapper[4776]: I1208 09:01:22.581140 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9tfc\" (UniqueName: \"kubernetes.io/projected/3540eb34-736e-422d-b860-99cc44778fad-kube-api-access-j9tfc\") pod \"redhat-operators-mx2gt\" (UID: \"3540eb34-736e-422d-b860-99cc44778fad\") " pod="openshift-marketplace/redhat-operators-mx2gt" Dec 08 09:01:22 crc kubenswrapper[4776]: I1208 09:01:22.586102 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mx2gt" Dec 08 09:01:22 crc kubenswrapper[4776]: I1208 09:01:22.632894 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b23b4e27-5963-43e9-b9db-e7062e38b241-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b23b4e27-5963-43e9-b9db-e7062e38b241\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 08 09:01:22 crc kubenswrapper[4776]: I1208 09:01:22.632998 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b23b4e27-5963-43e9-b9db-e7062e38b241-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b23b4e27-5963-43e9-b9db-e7062e38b241\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 08 09:01:22 crc kubenswrapper[4776]: I1208 09:01:22.688650 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fbmf6"] Dec 08 09:01:22 crc kubenswrapper[4776]: I1208 09:01:22.734243 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b23b4e27-5963-43e9-b9db-e7062e38b241-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b23b4e27-5963-43e9-b9db-e7062e38b241\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 08 09:01:22 crc kubenswrapper[4776]: I1208 09:01:22.734327 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b23b4e27-5963-43e9-b9db-e7062e38b241-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b23b4e27-5963-43e9-b9db-e7062e38b241\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 08 09:01:22 crc kubenswrapper[4776]: I1208 09:01:22.734417 4776 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b23b4e27-5963-43e9-b9db-e7062e38b241-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b23b4e27-5963-43e9-b9db-e7062e38b241\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 08 09:01:22 crc kubenswrapper[4776]: I1208 09:01:22.752423 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b23b4e27-5963-43e9-b9db-e7062e38b241-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b23b4e27-5963-43e9-b9db-e7062e38b241\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 08 09:01:22 crc kubenswrapper[4776]: I1208 09:01:22.811509 4776 patch_prober.go:28] interesting pod/router-default-5444994796-xh8d5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 08 09:01:22 crc kubenswrapper[4776]: [-]has-synced failed: reason withheld Dec 08 09:01:22 crc kubenswrapper[4776]: [+]process-running ok Dec 08 09:01:22 crc kubenswrapper[4776]: healthz check failed Dec 08 09:01:22 crc kubenswrapper[4776]: I1208 09:01:22.811610 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xh8d5" podUID="69d10b7f-1714-4536-800b-e7aa5bc7b73a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 08 09:01:22 crc kubenswrapper[4776]: I1208 09:01:22.851198 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mx2gt"] Dec 08 09:01:22 crc kubenswrapper[4776]: I1208 09:01:22.864796 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 08 09:01:22 crc kubenswrapper[4776]: I1208 09:01:22.911112 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ghlj2" event={"ID":"58f8ec13-d004-4d1f-8a44-f325c07c25fd","Type":"ContainerStarted","Data":"aea0d8c2c31b696ad870686b9fca8ef8ba874325e3ed7193a3e6febc0e3111c3"} Dec 08 09:01:22 crc kubenswrapper[4776]: I1208 09:01:22.913227 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbmf6" event={"ID":"bde03b49-eb1e-4941-b49e-e361cb8d83f4","Type":"ContainerStarted","Data":"8f9a0df32342894c6eff6c89a36d5563930510cf300e754f0ca9afb490a9e75c"} Dec 08 09:01:23 crc kubenswrapper[4776]: I1208 09:01:23.122829 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 08 09:01:23 crc kubenswrapper[4776]: W1208 09:01:23.241301 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb23b4e27_5963_43e9_b9db_e7062e38b241.slice/crio-0bd351d2e7fd9845f9199b1e5839b0e2d46bed5a1bbc510866d31febb2113202 WatchSource:0}: Error finding container 0bd351d2e7fd9845f9199b1e5839b0e2d46bed5a1bbc510866d31febb2113202: Status 404 returned error can't find the container with id 0bd351d2e7fd9845f9199b1e5839b0e2d46bed5a1bbc510866d31febb2113202 Dec 08 09:01:23 crc kubenswrapper[4776]: I1208 09:01:23.810531 4776 patch_prober.go:28] interesting pod/router-default-5444994796-xh8d5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 08 09:01:23 crc kubenswrapper[4776]: [-]has-synced failed: reason withheld Dec 08 09:01:23 crc kubenswrapper[4776]: [+]process-running ok Dec 08 09:01:23 crc kubenswrapper[4776]: healthz check failed Dec 08 09:01:23 crc kubenswrapper[4776]: I1208 09:01:23.811075 
4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xh8d5" podUID="69d10b7f-1714-4536-800b-e7aa5bc7b73a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 08 09:01:23 crc kubenswrapper[4776]: I1208 09:01:23.919198 4776 generic.go:334] "Generic (PLEG): container finished" podID="3540eb34-736e-422d-b860-99cc44778fad" containerID="e8cd980527a695530adcca26eda2c33a0aef44e5405ca49af61bcfcd8a806555" exitCode=0 Dec 08 09:01:23 crc kubenswrapper[4776]: I1208 09:01:23.919246 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mx2gt" event={"ID":"3540eb34-736e-422d-b860-99cc44778fad","Type":"ContainerDied","Data":"e8cd980527a695530adcca26eda2c33a0aef44e5405ca49af61bcfcd8a806555"} Dec 08 09:01:23 crc kubenswrapper[4776]: I1208 09:01:23.919287 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mx2gt" event={"ID":"3540eb34-736e-422d-b860-99cc44778fad","Type":"ContainerStarted","Data":"b905e23469e1eaf9fdbea3d3b1246bc6659634643af0c05285dc26b9e40fae78"} Dec 08 09:01:23 crc kubenswrapper[4776]: I1208 09:01:23.920962 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"b23b4e27-5963-43e9-b9db-e7062e38b241","Type":"ContainerStarted","Data":"0bd351d2e7fd9845f9199b1e5839b0e2d46bed5a1bbc510866d31febb2113202"} Dec 08 09:01:23 crc kubenswrapper[4776]: I1208 09:01:23.925319 4776 generic.go:334] "Generic (PLEG): container finished" podID="58f8ec13-d004-4d1f-8a44-f325c07c25fd" containerID="ce50275cd50f4268a44d85f21becbf48fb418d6143c57e75d3493d3d1c2ecd4d" exitCode=0 Dec 08 09:01:23 crc kubenswrapper[4776]: I1208 09:01:23.926157 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ghlj2" 
event={"ID":"58f8ec13-d004-4d1f-8a44-f325c07c25fd","Type":"ContainerDied","Data":"ce50275cd50f4268a44d85f21becbf48fb418d6143c57e75d3493d3d1c2ecd4d"} Dec 08 09:01:23 crc kubenswrapper[4776]: I1208 09:01:23.928124 4776 generic.go:334] "Generic (PLEG): container finished" podID="bde03b49-eb1e-4941-b49e-e361cb8d83f4" containerID="e8e8a636749d16442469a262b654d77373c822f9ce11dff6319fa5b27aec9a27" exitCode=0 Dec 08 09:01:23 crc kubenswrapper[4776]: I1208 09:01:23.928187 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbmf6" event={"ID":"bde03b49-eb1e-4941-b49e-e361cb8d83f4","Type":"ContainerDied","Data":"e8e8a636749d16442469a262b654d77373c822f9ce11dff6319fa5b27aec9a27"} Dec 08 09:01:24 crc kubenswrapper[4776]: I1208 09:01:24.808694 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-xh8d5" Dec 08 09:01:24 crc kubenswrapper[4776]: I1208 09:01:24.812339 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-xh8d5" Dec 08 09:01:25 crc kubenswrapper[4776]: I1208 09:01:25.419399 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 08 09:01:25 crc kubenswrapper[4776]: I1208 09:01:25.420536 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 08 09:01:25 crc kubenswrapper[4776]: I1208 09:01:25.424304 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 08 09:01:25 crc kubenswrapper[4776]: I1208 09:01:25.424382 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 08 09:01:25 crc kubenswrapper[4776]: I1208 09:01:25.437788 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 08 09:01:25 crc kubenswrapper[4776]: I1208 09:01:25.475116 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/590b145f-0421-4422-bf55-ed2a57c3ead0-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"590b145f-0421-4422-bf55-ed2a57c3ead0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 08 09:01:25 crc kubenswrapper[4776]: I1208 09:01:25.475206 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/590b145f-0421-4422-bf55-ed2a57c3ead0-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"590b145f-0421-4422-bf55-ed2a57c3ead0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 08 09:01:25 crc kubenswrapper[4776]: I1208 09:01:25.576380 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/590b145f-0421-4422-bf55-ed2a57c3ead0-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"590b145f-0421-4422-bf55-ed2a57c3ead0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 08 09:01:25 crc kubenswrapper[4776]: I1208 09:01:25.576459 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/590b145f-0421-4422-bf55-ed2a57c3ead0-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"590b145f-0421-4422-bf55-ed2a57c3ead0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 08 09:01:25 crc kubenswrapper[4776]: I1208 09:01:25.576553 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/590b145f-0421-4422-bf55-ed2a57c3ead0-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"590b145f-0421-4422-bf55-ed2a57c3ead0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 08 09:01:25 crc kubenswrapper[4776]: I1208 09:01:25.593759 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/590b145f-0421-4422-bf55-ed2a57c3ead0-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"590b145f-0421-4422-bf55-ed2a57c3ead0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 08 09:01:25 crc kubenswrapper[4776]: I1208 09:01:25.749577 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 08 09:01:25 crc kubenswrapper[4776]: I1208 09:01:25.879692 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/99143b9c-a541-4c0e-8387-0dff0d557974-metrics-certs\") pod \"network-metrics-daemon-kkhjg\" (UID: \"99143b9c-a541-4c0e-8387-0dff0d557974\") " pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 09:01:25 crc kubenswrapper[4776]: I1208 09:01:25.883613 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/99143b9c-a541-4c0e-8387-0dff0d557974-metrics-certs\") pod \"network-metrics-daemon-kkhjg\" (UID: \"99143b9c-a541-4c0e-8387-0dff0d557974\") " pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 09:01:25 crc kubenswrapper[4776]: I1208 09:01:25.927672 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 08 09:01:25 crc kubenswrapper[4776]: W1208 09:01:25.936226 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod590b145f_0421_4422_bf55_ed2a57c3ead0.slice/crio-dd92932d76efb5ed214ae8d7db2ce945c3e0aaf10dbe6ac05b74fe0c9a6eca20 WatchSource:0}: Error finding container dd92932d76efb5ed214ae8d7db2ce945c3e0aaf10dbe6ac05b74fe0c9a6eca20: Status 404 returned error can't find the container with id dd92932d76efb5ed214ae8d7db2ce945c3e0aaf10dbe6ac05b74fe0c9a6eca20 Dec 08 09:01:25 crc kubenswrapper[4776]: I1208 09:01:25.943787 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"b23b4e27-5963-43e9-b9db-e7062e38b241","Type":"ContainerStarted","Data":"a8ba689115a73b5748a8c9ff139eb83cef0d5934746d418c57bba889333017ce"} Dec 08 09:01:25 crc kubenswrapper[4776]: I1208 09:01:25.969814 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kkhjg" Dec 08 09:01:26 crc kubenswrapper[4776]: I1208 09:01:26.136527 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-kkhjg"] Dec 08 09:01:26 crc kubenswrapper[4776]: W1208 09:01:26.144403 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99143b9c_a541_4c0e_8387_0dff0d557974.slice/crio-c9a8e7c36ab397a9a3bcc443e2ee817d874a0921121cdcedae4c0e7828933ed8 WatchSource:0}: Error finding container c9a8e7c36ab397a9a3bcc443e2ee817d874a0921121cdcedae4c0e7828933ed8: Status 404 returned error can't find the container with id c9a8e7c36ab397a9a3bcc443e2ee817d874a0921121cdcedae4c0e7828933ed8 Dec 08 09:01:26 crc kubenswrapper[4776]: I1208 09:01:26.928078 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-dzfmd" Dec 08 09:01:26 crc kubenswrapper[4776]: I1208 09:01:26.969474 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"590b145f-0421-4422-bf55-ed2a57c3ead0","Type":"ContainerStarted","Data":"dd92932d76efb5ed214ae8d7db2ce945c3e0aaf10dbe6ac05b74fe0c9a6eca20"} Dec 08 09:01:26 crc kubenswrapper[4776]: I1208 09:01:26.971266 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kkhjg" event={"ID":"99143b9c-a541-4c0e-8387-0dff0d557974","Type":"ContainerStarted","Data":"c9a8e7c36ab397a9a3bcc443e2ee817d874a0921121cdcedae4c0e7828933ed8"} Dec 08 09:01:27 crc kubenswrapper[4776]: I1208 09:01:27.979339 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"590b145f-0421-4422-bf55-ed2a57c3ead0","Type":"ContainerStarted","Data":"2857fdb7aadcc3aabc9fd388062f06c5dd253e84f54ab33bda2596b811c83646"} Dec 08 09:01:27 crc kubenswrapper[4776]: I1208 09:01:27.982327 4776 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kkhjg" event={"ID":"99143b9c-a541-4c0e-8387-0dff0d557974","Type":"ContainerStarted","Data":"811814d8b2396edeb9047aecb908b909f28193a86e5ed7887e2d81559d9732a6"} Dec 08 09:01:27 crc kubenswrapper[4776]: I1208 09:01:27.984760 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-665b6dd947-59m6v_10462781-68cf-4a10-b7c7-b9700465d964/cluster-samples-operator/0.log" Dec 08 09:01:27 crc kubenswrapper[4776]: I1208 09:01:27.984797 4776 generic.go:334] "Generic (PLEG): container finished" podID="10462781-68cf-4a10-b7c7-b9700465d964" containerID="a6d878c8fbc260a5a65029c4ee38e694f9644c2d856e2dabc80689f3095808ed" exitCode=2 Dec 08 09:01:27 crc kubenswrapper[4776]: I1208 09:01:27.984826 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-59m6v" event={"ID":"10462781-68cf-4a10-b7c7-b9700465d964","Type":"ContainerDied","Data":"a6d878c8fbc260a5a65029c4ee38e694f9644c2d856e2dabc80689f3095808ed"} Dec 08 09:01:27 crc kubenswrapper[4776]: I1208 09:01:27.985162 4776 scope.go:117] "RemoveContainer" containerID="a6d878c8fbc260a5a65029c4ee38e694f9644c2d856e2dabc80689f3095808ed" Dec 08 09:01:27 crc kubenswrapper[4776]: I1208 09:01:27.986334 4776 generic.go:334] "Generic (PLEG): container finished" podID="b23b4e27-5963-43e9-b9db-e7062e38b241" containerID="a8ba689115a73b5748a8c9ff139eb83cef0d5934746d418c57bba889333017ce" exitCode=0 Dec 08 09:01:27 crc kubenswrapper[4776]: I1208 09:01:27.986361 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"b23b4e27-5963-43e9-b9db-e7062e38b241","Type":"ContainerDied","Data":"a8ba689115a73b5748a8c9ff139eb83cef0d5934746d418c57bba889333017ce"} Dec 08 09:01:28 crc kubenswrapper[4776]: I1208 09:01:28.003944 4776 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.00392797 podStartE2EDuration="3.00392797s" podCreationTimestamp="2025-12-08 09:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:01:27.998096475 +0000 UTC m=+164.261321507" watchObservedRunningTime="2025-12-08 09:01:28.00392797 +0000 UTC m=+164.267152992"
Dec 08 09:01:29 crc kubenswrapper[4776]: I1208 09:01:29.020966 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kkhjg" event={"ID":"99143b9c-a541-4c0e-8387-0dff0d557974","Type":"ContainerStarted","Data":"f6ec3c1bee87db537af43009ddcf219c40d96ff2a847d2e434cdee09472c72c3"}
Dec 08 09:01:29 crc kubenswrapper[4776]: I1208 09:01:29.027021 4776 generic.go:334] "Generic (PLEG): container finished" podID="590b145f-0421-4422-bf55-ed2a57c3ead0" containerID="2857fdb7aadcc3aabc9fd388062f06c5dd253e84f54ab33bda2596b811c83646" exitCode=0
Dec 08 09:01:29 crc kubenswrapper[4776]: I1208 09:01:29.027108 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"590b145f-0421-4422-bf55-ed2a57c3ead0","Type":"ContainerDied","Data":"2857fdb7aadcc3aabc9fd388062f06c5dd253e84f54ab33bda2596b811c83646"}
Dec 08 09:01:29 crc kubenswrapper[4776]: I1208 09:01:29.318048 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 08 09:01:29 crc kubenswrapper[4776]: I1208 09:01:29.324220 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b23b4e27-5963-43e9-b9db-e7062e38b241-kube-api-access\") pod \"b23b4e27-5963-43e9-b9db-e7062e38b241\" (UID: \"b23b4e27-5963-43e9-b9db-e7062e38b241\") "
Dec 08 09:01:29 crc kubenswrapper[4776]: I1208 09:01:29.324323 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b23b4e27-5963-43e9-b9db-e7062e38b241-kubelet-dir\") pod \"b23b4e27-5963-43e9-b9db-e7062e38b241\" (UID: \"b23b4e27-5963-43e9-b9db-e7062e38b241\") "
Dec 08 09:01:29 crc kubenswrapper[4776]: I1208 09:01:29.324494 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b23b4e27-5963-43e9-b9db-e7062e38b241-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b23b4e27-5963-43e9-b9db-e7062e38b241" (UID: "b23b4e27-5963-43e9-b9db-e7062e38b241"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 08 09:01:29 crc kubenswrapper[4776]: I1208 09:01:29.334879 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b23b4e27-5963-43e9-b9db-e7062e38b241-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b23b4e27-5963-43e9-b9db-e7062e38b241" (UID: "b23b4e27-5963-43e9-b9db-e7062e38b241"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 08 09:01:29 crc kubenswrapper[4776]: I1208 09:01:29.426976 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b23b4e27-5963-43e9-b9db-e7062e38b241-kube-api-access\") on node \"crc\" DevicePath \"\""
Dec 08 09:01:29 crc kubenswrapper[4776]: I1208 09:01:29.427016 4776 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b23b4e27-5963-43e9-b9db-e7062e38b241-kubelet-dir\") on node \"crc\" DevicePath \"\""
Dec 08 09:01:30 crc kubenswrapper[4776]: I1208 09:01:30.060930 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-665b6dd947-59m6v_10462781-68cf-4a10-b7c7-b9700465d964/cluster-samples-operator/0.log"
Dec 08 09:01:30 crc kubenswrapper[4776]: I1208 09:01:30.061046 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-59m6v" event={"ID":"10462781-68cf-4a10-b7c7-b9700465d964","Type":"ContainerStarted","Data":"db72813d3696379c705262c6899aeee2b08693a35d800307729e5975b5bffcc0"}
Dec 08 09:01:30 crc kubenswrapper[4776]: I1208 09:01:30.065253 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 08 09:01:30 crc kubenswrapper[4776]: I1208 09:01:30.065299 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"b23b4e27-5963-43e9-b9db-e7062e38b241","Type":"ContainerDied","Data":"0bd351d2e7fd9845f9199b1e5839b0e2d46bed5a1bbc510866d31febb2113202"}
Dec 08 09:01:30 crc kubenswrapper[4776]: I1208 09:01:30.065336 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0bd351d2e7fd9845f9199b1e5839b0e2d46bed5a1bbc510866d31febb2113202"
Dec 08 09:01:30 crc kubenswrapper[4776]: I1208 09:01:30.173332 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-kkhjg" podStartSLOduration=147.173314281 podStartE2EDuration="2m27.173314281s" podCreationTimestamp="2025-12-08 08:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:01:30.168876913 +0000 UTC m=+166.432101945" watchObservedRunningTime="2025-12-08 09:01:30.173314281 +0000 UTC m=+166.436539303"
Dec 08 09:01:30 crc kubenswrapper[4776]: I1208 09:01:30.437230 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 08 09:01:30 crc kubenswrapper[4776]: I1208 09:01:30.545053 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/590b145f-0421-4422-bf55-ed2a57c3ead0-kube-api-access\") pod \"590b145f-0421-4422-bf55-ed2a57c3ead0\" (UID: \"590b145f-0421-4422-bf55-ed2a57c3ead0\") "
Dec 08 09:01:30 crc kubenswrapper[4776]: I1208 09:01:30.545110 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/590b145f-0421-4422-bf55-ed2a57c3ead0-kubelet-dir\") pod \"590b145f-0421-4422-bf55-ed2a57c3ead0\" (UID: \"590b145f-0421-4422-bf55-ed2a57c3ead0\") "
Dec 08 09:01:30 crc kubenswrapper[4776]: I1208 09:01:30.545502 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/590b145f-0421-4422-bf55-ed2a57c3ead0-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "590b145f-0421-4422-bf55-ed2a57c3ead0" (UID: "590b145f-0421-4422-bf55-ed2a57c3ead0"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 08 09:01:30 crc kubenswrapper[4776]: I1208 09:01:30.551320 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/590b145f-0421-4422-bf55-ed2a57c3ead0-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "590b145f-0421-4422-bf55-ed2a57c3ead0" (UID: "590b145f-0421-4422-bf55-ed2a57c3ead0"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 08 09:01:30 crc kubenswrapper[4776]: I1208 09:01:30.647257 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/590b145f-0421-4422-bf55-ed2a57c3ead0-kube-api-access\") on node \"crc\" DevicePath \"\""
Dec 08 09:01:30 crc kubenswrapper[4776]: I1208 09:01:30.647307 4776 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/590b145f-0421-4422-bf55-ed2a57c3ead0-kubelet-dir\") on node \"crc\" DevicePath \"\""
Dec 08 09:01:30 crc kubenswrapper[4776]: I1208 09:01:30.741834 4776 patch_prober.go:28] interesting pod/console-f9d7485db-8dm9l container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body=
Dec 08 09:01:30 crc kubenswrapper[4776]: I1208 09:01:30.741895 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-8dm9l" podUID="b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused"
Dec 08 09:01:30 crc kubenswrapper[4776]: I1208 09:01:30.824476 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-559sf"
Dec 08 09:01:31 crc kubenswrapper[4776]: I1208 09:01:31.092737 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"590b145f-0421-4422-bf55-ed2a57c3ead0","Type":"ContainerDied","Data":"dd92932d76efb5ed214ae8d7db2ce945c3e0aaf10dbe6ac05b74fe0c9a6eca20"}
Dec 08 09:01:31 crc kubenswrapper[4776]: I1208 09:01:31.092782 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd92932d76efb5ed214ae8d7db2ce945c3e0aaf10dbe6ac05b74fe0c9a6eca20"
Dec 08 09:01:31 crc kubenswrapper[4776]: I1208 09:01:31.092831 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 08 09:01:38 crc kubenswrapper[4776]: I1208 09:01:38.984529 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv"
Dec 08 09:01:40 crc kubenswrapper[4776]: I1208 09:01:40.066493 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 08 09:01:40 crc kubenswrapper[4776]: I1208 09:01:40.747411 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-8dm9l"
Dec 08 09:01:40 crc kubenswrapper[4776]: I1208 09:01:40.750669 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-8dm9l"
Dec 08 09:01:41 crc kubenswrapper[4776]: I1208 09:01:41.398658 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 08 09:01:41 crc kubenswrapper[4776]: I1208 09:01:41.398725 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 08 09:01:51 crc kubenswrapper[4776]: I1208 09:01:51.759040 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6z2v5"
Dec 08 09:01:58 crc kubenswrapper[4776]: I1208 09:01:58.836712 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Dec 08 09:01:58 crc kubenswrapper[4776]: E1208 09:01:58.837292 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b23b4e27-5963-43e9-b9db-e7062e38b241" containerName="pruner"
Dec 08 09:01:58 crc kubenswrapper[4776]: I1208 09:01:58.837307 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="b23b4e27-5963-43e9-b9db-e7062e38b241" containerName="pruner"
Dec 08 09:01:58 crc kubenswrapper[4776]: E1208 09:01:58.837324 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="590b145f-0421-4422-bf55-ed2a57c3ead0" containerName="pruner"
Dec 08 09:01:58 crc kubenswrapper[4776]: I1208 09:01:58.837340 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="590b145f-0421-4422-bf55-ed2a57c3ead0" containerName="pruner"
Dec 08 09:01:58 crc kubenswrapper[4776]: I1208 09:01:58.837457 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="590b145f-0421-4422-bf55-ed2a57c3ead0" containerName="pruner"
Dec 08 09:01:58 crc kubenswrapper[4776]: I1208 09:01:58.837479 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="b23b4e27-5963-43e9-b9db-e7062e38b241" containerName="pruner"
Dec 08 09:01:58 crc kubenswrapper[4776]: I1208 09:01:58.837898 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 08 09:01:58 crc kubenswrapper[4776]: I1208 09:01:58.840338 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Dec 08 09:01:58 crc kubenswrapper[4776]: I1208 09:01:58.840911 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Dec 08 09:01:58 crc kubenswrapper[4776]: I1208 09:01:58.843826 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Dec 08 09:01:58 crc kubenswrapper[4776]: I1208 09:01:58.986635 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2e4d9ac1-0fc3-4c58-8c6a-99c9653e4a90-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2e4d9ac1-0fc3-4c58-8c6a-99c9653e4a90\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 08 09:01:58 crc kubenswrapper[4776]: I1208 09:01:58.986679 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2e4d9ac1-0fc3-4c58-8c6a-99c9653e4a90-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2e4d9ac1-0fc3-4c58-8c6a-99c9653e4a90\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 08 09:01:59 crc kubenswrapper[4776]: I1208 09:01:59.087800 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2e4d9ac1-0fc3-4c58-8c6a-99c9653e4a90-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2e4d9ac1-0fc3-4c58-8c6a-99c9653e4a90\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 08 09:01:59 crc kubenswrapper[4776]: I1208 09:01:59.087852 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2e4d9ac1-0fc3-4c58-8c6a-99c9653e4a90-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2e4d9ac1-0fc3-4c58-8c6a-99c9653e4a90\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 08 09:01:59 crc kubenswrapper[4776]: I1208 09:01:59.087982 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2e4d9ac1-0fc3-4c58-8c6a-99c9653e4a90-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2e4d9ac1-0fc3-4c58-8c6a-99c9653e4a90\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 08 09:02:01 crc kubenswrapper[4776]: I1208 09:02:01.505023 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2e4d9ac1-0fc3-4c58-8c6a-99c9653e4a90-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2e4d9ac1-0fc3-4c58-8c6a-99c9653e4a90\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 08 09:02:01 crc kubenswrapper[4776]: I1208 09:02:01.558389 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 08 09:02:04 crc kubenswrapper[4776]: I1208 09:02:04.423310 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Dec 08 09:02:04 crc kubenswrapper[4776]: I1208 09:02:04.424774 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Dec 08 09:02:04 crc kubenswrapper[4776]: I1208 09:02:04.424872 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Dec 08 09:02:04 crc kubenswrapper[4776]: I1208 09:02:04.556581 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2345c2a2-29d5-4542-b83f-7e57fcd16d77-kube-api-access\") pod \"installer-9-crc\" (UID: \"2345c2a2-29d5-4542-b83f-7e57fcd16d77\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 08 09:02:04 crc kubenswrapper[4776]: I1208 09:02:04.557385 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2345c2a2-29d5-4542-b83f-7e57fcd16d77-kubelet-dir\") pod \"installer-9-crc\" (UID: \"2345c2a2-29d5-4542-b83f-7e57fcd16d77\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 08 09:02:04 crc kubenswrapper[4776]: I1208 09:02:04.557473 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2345c2a2-29d5-4542-b83f-7e57fcd16d77-var-lock\") pod \"installer-9-crc\" (UID: \"2345c2a2-29d5-4542-b83f-7e57fcd16d77\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 08 09:02:04 crc kubenswrapper[4776]: I1208 09:02:04.658698 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2345c2a2-29d5-4542-b83f-7e57fcd16d77-kube-api-access\") pod \"installer-9-crc\" (UID: \"2345c2a2-29d5-4542-b83f-7e57fcd16d77\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 08 09:02:04 crc kubenswrapper[4776]: I1208 09:02:04.658806 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2345c2a2-29d5-4542-b83f-7e57fcd16d77-kubelet-dir\") pod \"installer-9-crc\" (UID: \"2345c2a2-29d5-4542-b83f-7e57fcd16d77\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 08 09:02:04 crc kubenswrapper[4776]: I1208 09:02:04.658838 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2345c2a2-29d5-4542-b83f-7e57fcd16d77-var-lock\") pod \"installer-9-crc\" (UID: \"2345c2a2-29d5-4542-b83f-7e57fcd16d77\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 08 09:02:04 crc kubenswrapper[4776]: I1208 09:02:04.658905 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2345c2a2-29d5-4542-b83f-7e57fcd16d77-var-lock\") pod \"installer-9-crc\" (UID: \"2345c2a2-29d5-4542-b83f-7e57fcd16d77\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 08 09:02:04 crc kubenswrapper[4776]: I1208 09:02:04.658939 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2345c2a2-29d5-4542-b83f-7e57fcd16d77-kubelet-dir\") pod \"installer-9-crc\" (UID: \"2345c2a2-29d5-4542-b83f-7e57fcd16d77\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 08 09:02:04 crc kubenswrapper[4776]: I1208 09:02:04.681307 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2345c2a2-29d5-4542-b83f-7e57fcd16d77-kube-api-access\") pod \"installer-9-crc\" (UID: \"2345c2a2-29d5-4542-b83f-7e57fcd16d77\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 08 09:02:04 crc kubenswrapper[4776]: I1208 09:02:04.753668 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Dec 08 09:02:11 crc kubenswrapper[4776]: E1208 09:02:11.076573 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Dec 08 09:02:11 crc kubenswrapper[4776]: E1208 09:02:11.077441 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mbt94,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-fbmf6_openshift-marketplace(bde03b49-eb1e-4941-b49e-e361cb8d83f4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Dec 08 09:02:11 crc kubenswrapper[4776]: E1208 09:02:11.079540 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-fbmf6" podUID="bde03b49-eb1e-4941-b49e-e361cb8d83f4"
Dec 08 09:02:11 crc kubenswrapper[4776]: I1208 09:02:11.399625 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 08 09:02:11 crc kubenswrapper[4776]: I1208 09:02:11.399696 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 08 09:02:11 crc kubenswrapper[4776]: I1208 09:02:11.399779 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn"
Dec 08 09:02:11 crc kubenswrapper[4776]: I1208 09:02:11.400432 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"860aab951ce65f5efd9e17e4d93c383ea508088d26c2f758bf8d7176d2fb96f8"} pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 08 09:02:11 crc kubenswrapper[4776]: I1208 09:02:11.400545 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" containerID="cri-o://860aab951ce65f5efd9e17e4d93c383ea508088d26c2f758bf8d7176d2fb96f8" gracePeriod=600
Dec 08 09:02:12 crc kubenswrapper[4776]: I1208 09:02:12.479561 4776 generic.go:334] "Generic (PLEG): container finished" podID="c9788ab1-1031-4103-a769-a4b3177c7268" containerID="860aab951ce65f5efd9e17e4d93c383ea508088d26c2f758bf8d7176d2fb96f8" exitCode=0
Dec 08 09:02:12 crc kubenswrapper[4776]: I1208 09:02:12.479813 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" event={"ID":"c9788ab1-1031-4103-a769-a4b3177c7268","Type":"ContainerDied","Data":"860aab951ce65f5efd9e17e4d93c383ea508088d26c2f758bf8d7176d2fb96f8"}
Dec 08 09:02:12 crc kubenswrapper[4776]: E1208 09:02:12.865078 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Dec 08 09:02:12 crc kubenswrapper[4776]: E1208 09:02:12.865384 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j9tfc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-mx2gt_openshift-marketplace(3540eb34-736e-422d-b860-99cc44778fad): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Dec 08 09:02:12 crc kubenswrapper[4776]: E1208 09:02:12.867042 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-mx2gt" podUID="3540eb34-736e-422d-b860-99cc44778fad"
Dec 08 09:02:13 crc kubenswrapper[4776]: E1208 09:02:13.116067 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-fbmf6" podUID="bde03b49-eb1e-4941-b49e-e361cb8d83f4"
Dec 08 09:02:13 crc kubenswrapper[4776]: E1208 09:02:13.257800 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Dec 08 09:02:13 crc kubenswrapper[4776]: E1208 09:02:13.258009 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5v44b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-8kcmf_openshift-marketplace(1122c2d9-2ef0-4527-be3f-5617003d2bc0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Dec 08 09:02:13 crc kubenswrapper[4776]: E1208 09:02:13.259220 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-8kcmf" podUID="1122c2d9-2ef0-4527-be3f-5617003d2bc0"
Dec 08 09:02:14 crc kubenswrapper[4776]: E1208 09:02:14.655389 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-mx2gt" podUID="3540eb34-736e-422d-b860-99cc44778fad"
Dec 08 09:02:14 crc kubenswrapper[4776]: E1208 09:02:14.655460 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-8kcmf" podUID="1122c2d9-2ef0-4527-be3f-5617003d2bc0"
Dec 08 09:02:14 crc kubenswrapper[4776]: E1208 09:02:14.730367 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Dec 08 09:02:14 crc kubenswrapper[4776]: E1208 09:02:14.730704 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xntmr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-7th2w_openshift-marketplace(ed9b52d1-9f5b-4d4b-aece-23ced00e6737): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Dec 08 09:02:14 crc kubenswrapper[4776]: E1208 09:02:14.731921 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-7th2w" podUID="ed9b52d1-9f5b-4d4b-aece-23ced00e6737"
Dec 08 09:02:14 crc kubenswrapper[4776]: E1208 09:02:14.776270 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Dec 08 09:02:14 crc kubenswrapper[4776]: E1208 09:02:14.776431 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kqkrn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-swlvb_openshift-marketplace(b6a339c1-f955-4a08-bba1-0df39a886324): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Dec 08 09:02:14 crc kubenswrapper[4776]: E1208 09:02:14.777608 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-swlvb" podUID="b6a339c1-f955-4a08-bba1-0df39a886324"
Dec 08 09:02:15 crc kubenswrapper[4776]: E1208 09:02:15.738352 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-7th2w" podUID="ed9b52d1-9f5b-4d4b-aece-23ced00e6737"
Dec 08 09:02:15 crc kubenswrapper[4776]: E1208 09:02:15.739089 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-swlvb" podUID="b6a339c1-f955-4a08-bba1-0df39a886324"
Dec 08 09:02:16 crc kubenswrapper[4776]: I1208 09:02:16.179232 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Dec 08 09:02:16 crc kubenswrapper[4776]: W1208 09:02:16.194659 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod2e4d9ac1_0fc3_4c58_8c6a_99c9653e4a90.slice/crio-0a24d6f5fabf3773f98cbf7b1a4f92c3338f52aa7342f98a13ca425ee1262a6d WatchSource:0}: Error finding container 0a24d6f5fabf3773f98cbf7b1a4f92c3338f52aa7342f98a13ca425ee1262a6d: Status 404 returned error can't find the container with id 0a24d6f5fabf3773f98cbf7b1a4f92c3338f52aa7342f98a13ca425ee1262a6d
Dec 08 09:02:16 crc kubenswrapper[4776]: I1208 09:02:16.231591 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Dec 08 09:02:16 crc kubenswrapper[4776]: W1208 09:02:16.414127 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod2345c2a2_29d5_4542_b83f_7e57fcd16d77.slice/crio-f89572d1c0cea5e655c60e6f168e1bb2b119f9c95224adf03c6a42540f0c9977 WatchSource:0}: Error finding container f89572d1c0cea5e655c60e6f168e1bb2b119f9c95224adf03c6a42540f0c9977: Status 404 returned error can't find the container with id f89572d1c0cea5e655c60e6f168e1bb2b119f9c95224adf03c6a42540f0c9977
Dec 08 09:02:16 crc kubenswrapper[4776]: I1208 09:02:16.499245 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"2345c2a2-29d5-4542-b83f-7e57fcd16d77","Type":"ContainerStarted","Data":"f89572d1c0cea5e655c60e6f168e1bb2b119f9c95224adf03c6a42540f0c9977"}
Dec 08 09:02:16 crc kubenswrapper[4776]: I1208 09:02:16.500390 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"2e4d9ac1-0fc3-4c58-8c6a-99c9653e4a90","Type":"ContainerStarted","Data":"0a24d6f5fabf3773f98cbf7b1a4f92c3338f52aa7342f98a13ca425ee1262a6d"}
Dec 08 09:02:17 crc kubenswrapper[4776]: E1208 09:02:17.982497 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Dec 08 09:02:17 crc kubenswrapper[4776]: E1208 09:02:17.982863 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="init container
&Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ddw2s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-ghlj2_openshift-marketplace(58f8ec13-d004-4d1f-8a44-f325c07c25fd): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 08 09:02:17 crc kubenswrapper[4776]: E1208 09:02:17.984098 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code 
= Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-ghlj2" podUID="58f8ec13-d004-4d1f-8a44-f325c07c25fd" Dec 08 09:02:18 crc kubenswrapper[4776]: I1208 09:02:18.514893 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" event={"ID":"c9788ab1-1031-4103-a769-a4b3177c7268","Type":"ContainerStarted","Data":"3686f95c2750ae2f6fecf0ef1b9e49c85b6866553ae81497ae9ec17dd913386b"} Dec 08 09:02:18 crc kubenswrapper[4776]: I1208 09:02:18.516838 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"2e4d9ac1-0fc3-4c58-8c6a-99c9653e4a90","Type":"ContainerStarted","Data":"85dfe5be31785031a65cac7cbce3217a1817ff755617cafb97874290bd4a674c"} Dec 08 09:02:18 crc kubenswrapper[4776]: I1208 09:02:18.517994 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"2345c2a2-29d5-4542-b83f-7e57fcd16d77","Type":"ContainerStarted","Data":"742fe1e7633bc3057c42bc9e6c5cb47c25d53af64951671d9b99674680e1972a"} Dec 08 09:02:18 crc kubenswrapper[4776]: E1208 09:02:18.519069 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-ghlj2" podUID="58f8ec13-d004-4d1f-8a44-f325c07c25fd" Dec 08 09:02:18 crc kubenswrapper[4776]: I1208 09:02:18.553875 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=20.553855279 podStartE2EDuration="20.553855279s" podCreationTimestamp="2025-12-08 09:01:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-08 09:02:18.553074198 +0000 UTC m=+214.816299240" watchObservedRunningTime="2025-12-08 09:02:18.553855279 +0000 UTC m=+214.817080321" Dec 08 09:02:18 crc kubenswrapper[4776]: I1208 09:02:18.590151 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=14.590131619 podStartE2EDuration="14.590131619s" podCreationTimestamp="2025-12-08 09:02:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:02:18.589138633 +0000 UTC m=+214.852363655" watchObservedRunningTime="2025-12-08 09:02:18.590131619 +0000 UTC m=+214.853356641" Dec 08 09:02:18 crc kubenswrapper[4776]: E1208 09:02:18.591232 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 08 09:02:18 crc kubenswrapper[4776]: E1208 09:02:18.591379 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mj4sg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-z2qsw_openshift-marketplace(9f1cf0fc-eed0-4fba-8b89-b29bf78cadac): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 08 09:02:18 crc kubenswrapper[4776]: E1208 09:02:18.592695 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-z2qsw" podUID="9f1cf0fc-eed0-4fba-8b89-b29bf78cadac" Dec 08 09:02:18 crc 
kubenswrapper[4776]: E1208 09:02:18.650895 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 08 09:02:18 crc kubenswrapper[4776]: E1208 09:02:18.651078 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fqkcs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-82j27_openshift-marketplace(0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 08 09:02:18 crc kubenswrapper[4776]: E1208 09:02:18.652543 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-82j27" podUID="0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9" Dec 08 09:02:19 crc kubenswrapper[4776]: I1208 09:02:19.528812 4776 generic.go:334] "Generic (PLEG): container finished" podID="2e4d9ac1-0fc3-4c58-8c6a-99c9653e4a90" containerID="85dfe5be31785031a65cac7cbce3217a1817ff755617cafb97874290bd4a674c" exitCode=0 Dec 08 09:02:19 crc kubenswrapper[4776]: I1208 09:02:19.528941 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"2e4d9ac1-0fc3-4c58-8c6a-99c9653e4a90","Type":"ContainerDied","Data":"85dfe5be31785031a65cac7cbce3217a1817ff755617cafb97874290bd4a674c"} Dec 08 09:02:19 crc kubenswrapper[4776]: E1208 09:02:19.533657 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-z2qsw" podUID="9f1cf0fc-eed0-4fba-8b89-b29bf78cadac" Dec 08 09:02:19 crc kubenswrapper[4776]: E1208 09:02:19.533904 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-82j27" 
podUID="0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9" Dec 08 09:02:20 crc kubenswrapper[4776]: I1208 09:02:20.833674 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 08 09:02:21 crc kubenswrapper[4776]: I1208 09:02:21.026391 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2e4d9ac1-0fc3-4c58-8c6a-99c9653e4a90-kubelet-dir\") pod \"2e4d9ac1-0fc3-4c58-8c6a-99c9653e4a90\" (UID: \"2e4d9ac1-0fc3-4c58-8c6a-99c9653e4a90\") " Dec 08 09:02:21 crc kubenswrapper[4776]: I1208 09:02:21.026493 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2e4d9ac1-0fc3-4c58-8c6a-99c9653e4a90-kube-api-access\") pod \"2e4d9ac1-0fc3-4c58-8c6a-99c9653e4a90\" (UID: \"2e4d9ac1-0fc3-4c58-8c6a-99c9653e4a90\") " Dec 08 09:02:21 crc kubenswrapper[4776]: I1208 09:02:21.026531 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2e4d9ac1-0fc3-4c58-8c6a-99c9653e4a90-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "2e4d9ac1-0fc3-4c58-8c6a-99c9653e4a90" (UID: "2e4d9ac1-0fc3-4c58-8c6a-99c9653e4a90"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 09:02:21 crc kubenswrapper[4776]: I1208 09:02:21.026984 4776 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2e4d9ac1-0fc3-4c58-8c6a-99c9653e4a90-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 08 09:02:21 crc kubenswrapper[4776]: I1208 09:02:21.031973 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e4d9ac1-0fc3-4c58-8c6a-99c9653e4a90-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "2e4d9ac1-0fc3-4c58-8c6a-99c9653e4a90" (UID: "2e4d9ac1-0fc3-4c58-8c6a-99c9653e4a90"). 
InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:02:21 crc kubenswrapper[4776]: I1208 09:02:21.128465 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2e4d9ac1-0fc3-4c58-8c6a-99c9653e4a90-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 08 09:02:21 crc kubenswrapper[4776]: I1208 09:02:21.549700 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"2e4d9ac1-0fc3-4c58-8c6a-99c9653e4a90","Type":"ContainerDied","Data":"0a24d6f5fabf3773f98cbf7b1a4f92c3338f52aa7342f98a13ca425ee1262a6d"} Dec 08 09:02:21 crc kubenswrapper[4776]: I1208 09:02:21.550062 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a24d6f5fabf3773f98cbf7b1a4f92c3338f52aa7342f98a13ca425ee1262a6d" Dec 08 09:02:21 crc kubenswrapper[4776]: I1208 09:02:21.550114 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 08 09:02:31 crc kubenswrapper[4776]: I1208 09:02:31.596639 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbmf6" event={"ID":"bde03b49-eb1e-4941-b49e-e361cb8d83f4","Type":"ContainerStarted","Data":"dcd98ab358772c6b5be8ed5aa6d89ab06dee840684094ed9ee2af6352362d64a"} Dec 08 09:02:31 crc kubenswrapper[4776]: I1208 09:02:31.598623 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mx2gt" event={"ID":"3540eb34-736e-422d-b860-99cc44778fad","Type":"ContainerStarted","Data":"a2c6295513762e4a6e96af0ec2951edc9bd3119283cea6a7c339c3da508f1e91"} Dec 08 09:02:31 crc kubenswrapper[4776]: I1208 09:02:31.601230 4776 generic.go:334] "Generic (PLEG): container finished" podID="1122c2d9-2ef0-4527-be3f-5617003d2bc0" containerID="a74de1a7cfdcb0fe7a05e5b6882920042f87c165ea1352b894123e0aa72f9f84" exitCode=0 Dec 08 09:02:31 crc kubenswrapper[4776]: I1208 09:02:31.601256 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8kcmf" event={"ID":"1122c2d9-2ef0-4527-be3f-5617003d2bc0","Type":"ContainerDied","Data":"a74de1a7cfdcb0fe7a05e5b6882920042f87c165ea1352b894123e0aa72f9f84"} Dec 08 09:02:32 crc kubenswrapper[4776]: I1208 09:02:32.609791 4776 generic.go:334] "Generic (PLEG): container finished" podID="3540eb34-736e-422d-b860-99cc44778fad" containerID="a2c6295513762e4a6e96af0ec2951edc9bd3119283cea6a7c339c3da508f1e91" exitCode=0 Dec 08 09:02:32 crc kubenswrapper[4776]: I1208 09:02:32.609994 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mx2gt" event={"ID":"3540eb34-736e-422d-b860-99cc44778fad","Type":"ContainerDied","Data":"a2c6295513762e4a6e96af0ec2951edc9bd3119283cea6a7c339c3da508f1e91"} Dec 08 09:02:32 crc kubenswrapper[4776]: I1208 09:02:32.612595 4776 generic.go:334] "Generic (PLEG): container finished" 
podID="bde03b49-eb1e-4941-b49e-e361cb8d83f4" containerID="dcd98ab358772c6b5be8ed5aa6d89ab06dee840684094ed9ee2af6352362d64a" exitCode=0 Dec 08 09:02:32 crc kubenswrapper[4776]: I1208 09:02:32.612643 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbmf6" event={"ID":"bde03b49-eb1e-4941-b49e-e361cb8d83f4","Type":"ContainerDied","Data":"dcd98ab358772c6b5be8ed5aa6d89ab06dee840684094ed9ee2af6352362d64a"} Dec 08 09:02:35 crc kubenswrapper[4776]: I1208 09:02:35.629892 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbmf6" event={"ID":"bde03b49-eb1e-4941-b49e-e361cb8d83f4","Type":"ContainerStarted","Data":"840533b00ccbf019c1b1ad45d63acac9dc28cdd233141ef50008afcdc65278f7"} Dec 08 09:02:35 crc kubenswrapper[4776]: I1208 09:02:35.646639 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fbmf6" podStartSLOduration=3.4852213389999998 podStartE2EDuration="1m14.646620491s" podCreationTimestamp="2025-12-08 09:01:21 +0000 UTC" firstStartedPulling="2025-12-08 09:01:23.929921977 +0000 UTC m=+160.193146999" lastFinishedPulling="2025-12-08 09:02:35.091321129 +0000 UTC m=+231.354546151" observedRunningTime="2025-12-08 09:02:35.645259315 +0000 UTC m=+231.908484357" watchObservedRunningTime="2025-12-08 09:02:35.646620491 +0000 UTC m=+231.909845543" Dec 08 09:02:36 crc kubenswrapper[4776]: I1208 09:02:36.637615 4776 generic.go:334] "Generic (PLEG): container finished" podID="ed9b52d1-9f5b-4d4b-aece-23ced00e6737" containerID="dd28cfde6d1b8d2ee66ab579db16a1b36bdfd9b038b6317e9e23789e5d252ed8" exitCode=0 Dec 08 09:02:36 crc kubenswrapper[4776]: I1208 09:02:36.639058 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7th2w" event={"ID":"ed9b52d1-9f5b-4d4b-aece-23ced00e6737","Type":"ContainerDied","Data":"dd28cfde6d1b8d2ee66ab579db16a1b36bdfd9b038b6317e9e23789e5d252ed8"} Dec 08 
09:02:36 crc kubenswrapper[4776]: I1208 09:02:36.644248 4776 generic.go:334] "Generic (PLEG): container finished" podID="9f1cf0fc-eed0-4fba-8b89-b29bf78cadac" containerID="bf0e289111ef12669e0e64d39ade41f882e72a0190c4ea452586a7c9eeb711cb" exitCode=0 Dec 08 09:02:36 crc kubenswrapper[4776]: I1208 09:02:36.644295 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z2qsw" event={"ID":"9f1cf0fc-eed0-4fba-8b89-b29bf78cadac","Type":"ContainerDied","Data":"bf0e289111ef12669e0e64d39ade41f882e72a0190c4ea452586a7c9eeb711cb"} Dec 08 09:02:36 crc kubenswrapper[4776]: I1208 09:02:36.646464 4776 generic.go:334] "Generic (PLEG): container finished" podID="b6a339c1-f955-4a08-bba1-0df39a886324" containerID="f4302c6b15eecdb8e2b42d69c6614264c5c70265656ef272d30960e4ae20e4e3" exitCode=0 Dec 08 09:02:36 crc kubenswrapper[4776]: I1208 09:02:36.646536 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-swlvb" event={"ID":"b6a339c1-f955-4a08-bba1-0df39a886324","Type":"ContainerDied","Data":"f4302c6b15eecdb8e2b42d69c6614264c5c70265656ef272d30960e4ae20e4e3"} Dec 08 09:02:36 crc kubenswrapper[4776]: I1208 09:02:36.651092 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mx2gt" event={"ID":"3540eb34-736e-422d-b860-99cc44778fad","Type":"ContainerStarted","Data":"70dba1b90c674fde958b046bed17d12d734113018efa824c329604bd2f4ff579"} Dec 08 09:02:36 crc kubenswrapper[4776]: I1208 09:02:36.653036 4776 generic.go:334] "Generic (PLEG): container finished" podID="58f8ec13-d004-4d1f-8a44-f325c07c25fd" containerID="dab5da70f4c9237ad708897be0beb47cafc2bdfd4d5ee3c0ca304caa6f3729df" exitCode=0 Dec 08 09:02:36 crc kubenswrapper[4776]: I1208 09:02:36.653248 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ghlj2" 
event={"ID":"58f8ec13-d004-4d1f-8a44-f325c07c25fd","Type":"ContainerDied","Data":"dab5da70f4c9237ad708897be0beb47cafc2bdfd4d5ee3c0ca304caa6f3729df"} Dec 08 09:02:36 crc kubenswrapper[4776]: I1208 09:02:36.655778 4776 generic.go:334] "Generic (PLEG): container finished" podID="0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9" containerID="7e3e905b36fd827d475beae9e48acd14ce7d0d60978ee57c529179a7bad0646d" exitCode=0 Dec 08 09:02:36 crc kubenswrapper[4776]: I1208 09:02:36.655840 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-82j27" event={"ID":"0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9","Type":"ContainerDied","Data":"7e3e905b36fd827d475beae9e48acd14ce7d0d60978ee57c529179a7bad0646d"} Dec 08 09:02:36 crc kubenswrapper[4776]: I1208 09:02:36.673322 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8kcmf" event={"ID":"1122c2d9-2ef0-4527-be3f-5617003d2bc0","Type":"ContainerStarted","Data":"3c3aef931e2786ce259ef07fb205acc6d62c8d53e708f7b3ff8d3a8e5fa0ca13"} Dec 08 09:02:36 crc kubenswrapper[4776]: I1208 09:02:36.769965 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mx2gt" podStartSLOduration=3.454114663 podStartE2EDuration="1m14.769940285s" podCreationTimestamp="2025-12-08 09:01:22 +0000 UTC" firstStartedPulling="2025-12-08 09:01:23.922280183 +0000 UTC m=+160.185505205" lastFinishedPulling="2025-12-08 09:02:35.238105805 +0000 UTC m=+231.501330827" observedRunningTime="2025-12-08 09:02:36.735586456 +0000 UTC m=+232.998811478" watchObservedRunningTime="2025-12-08 09:02:36.769940285 +0000 UTC m=+233.033165307" Dec 08 09:02:36 crc kubenswrapper[4776]: I1208 09:02:36.828316 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8kcmf" podStartSLOduration=4.557217274 podStartE2EDuration="1m17.828297975s" podCreationTimestamp="2025-12-08 09:01:19 +0000 UTC" 
firstStartedPulling="2025-12-08 09:01:21.881583309 +0000 UTC m=+158.144808321" lastFinishedPulling="2025-12-08 09:02:35.152664 +0000 UTC m=+231.415889022" observedRunningTime="2025-12-08 09:02:36.824533825 +0000 UTC m=+233.087758847" watchObservedRunningTime="2025-12-08 09:02:36.828297975 +0000 UTC m=+233.091522997" Dec 08 09:02:39 crc kubenswrapper[4776]: I1208 09:02:39.610925 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8kcmf" Dec 08 09:02:39 crc kubenswrapper[4776]: I1208 09:02:39.611425 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8kcmf" Dec 08 09:02:39 crc kubenswrapper[4776]: I1208 09:02:39.690396 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7th2w" event={"ID":"ed9b52d1-9f5b-4d4b-aece-23ced00e6737","Type":"ContainerStarted","Data":"fe1c981c3d9e1c02f9263c76ca5911b9a28ddf7d08c9433462e45932d21b3037"} Dec 08 09:02:39 crc kubenswrapper[4776]: I1208 09:02:39.692963 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z2qsw" event={"ID":"9f1cf0fc-eed0-4fba-8b89-b29bf78cadac","Type":"ContainerStarted","Data":"9636a56e89e398e2143e9da6b6d1d18b4704c2e92b280289711701f9f8cf8881"} Dec 08 09:02:39 crc kubenswrapper[4776]: I1208 09:02:39.696079 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-swlvb" event={"ID":"b6a339c1-f955-4a08-bba1-0df39a886324","Type":"ContainerStarted","Data":"baaa60cadffadcf80c904850964a4f0d880a28ef8278ad5161844973525e5b09"} Dec 08 09:02:39 crc kubenswrapper[4776]: I1208 09:02:39.698137 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ghlj2" event={"ID":"58f8ec13-d004-4d1f-8a44-f325c07c25fd","Type":"ContainerStarted","Data":"f6ab0a6bea35942ed629a9a0dc6dfcd0ad8958adebafdc6674279893f89dbb47"} Dec 08 
09:02:39 crc kubenswrapper[4776]: I1208 09:02:39.700662 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-82j27" event={"ID":"0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9","Type":"ContainerStarted","Data":"920717a9fa8482ea3eb51645c7938b76c2f77c2ff318e53fd9fa1d5fcc81ff83"} Dec 08 09:02:39 crc kubenswrapper[4776]: I1208 09:02:39.711559 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7th2w" podStartSLOduration=3.740702231 podStartE2EDuration="1m20.71153848s" podCreationTimestamp="2025-12-08 09:01:19 +0000 UTC" firstStartedPulling="2025-12-08 09:01:21.811072148 +0000 UTC m=+158.074297170" lastFinishedPulling="2025-12-08 09:02:38.781908387 +0000 UTC m=+235.045133419" observedRunningTime="2025-12-08 09:02:39.710707017 +0000 UTC m=+235.973932049" watchObservedRunningTime="2025-12-08 09:02:39.71153848 +0000 UTC m=+235.974763502" Dec 08 09:02:39 crc kubenswrapper[4776]: I1208 09:02:39.735800 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-swlvb" podStartSLOduration=5.052294675 podStartE2EDuration="1m21.735779268s" podCreationTimestamp="2025-12-08 09:01:18 +0000 UTC" firstStartedPulling="2025-12-08 09:01:21.859498443 +0000 UTC m=+158.122723475" lastFinishedPulling="2025-12-08 09:02:38.542983046 +0000 UTC m=+234.806208068" observedRunningTime="2025-12-08 09:02:39.733123437 +0000 UTC m=+235.996348459" watchObservedRunningTime="2025-12-08 09:02:39.735779268 +0000 UTC m=+235.999004290" Dec 08 09:02:39 crc kubenswrapper[4776]: I1208 09:02:39.736953 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8kcmf" Dec 08 09:02:39 crc kubenswrapper[4776]: I1208 09:02:39.754288 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-82j27" podStartSLOduration=3.081979946 
podStartE2EDuration="1m19.754259273s" podCreationTimestamp="2025-12-08 09:01:20 +0000 UTC" firstStartedPulling="2025-12-08 09:01:21.874591803 +0000 UTC m=+158.137816825" lastFinishedPulling="2025-12-08 09:02:38.54687114 +0000 UTC m=+234.810096152" observedRunningTime="2025-12-08 09:02:39.752225028 +0000 UTC m=+236.015450050" watchObservedRunningTime="2025-12-08 09:02:39.754259273 +0000 UTC m=+236.017484295" Dec 08 09:02:39 crc kubenswrapper[4776]: I1208 09:02:39.784878 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-z2qsw" podStartSLOduration=5.01699751 podStartE2EDuration="1m21.784862661s" podCreationTimestamp="2025-12-08 09:01:18 +0000 UTC" firstStartedPulling="2025-12-08 09:01:21.819973134 +0000 UTC m=+158.083198156" lastFinishedPulling="2025-12-08 09:02:38.587838285 +0000 UTC m=+234.851063307" observedRunningTime="2025-12-08 09:02:39.783574546 +0000 UTC m=+236.046799568" watchObservedRunningTime="2025-12-08 09:02:39.784862661 +0000 UTC m=+236.048087683" Dec 08 09:02:39 crc kubenswrapper[4776]: I1208 09:02:39.805883 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ghlj2" podStartSLOduration=4.104258565 podStartE2EDuration="1m18.805863872s" podCreationTimestamp="2025-12-08 09:01:21 +0000 UTC" firstStartedPulling="2025-12-08 09:01:23.927089761 +0000 UTC m=+160.190314783" lastFinishedPulling="2025-12-08 09:02:38.628695068 +0000 UTC m=+234.891920090" observedRunningTime="2025-12-08 09:02:39.804398594 +0000 UTC m=+236.067623616" watchObservedRunningTime="2025-12-08 09:02:39.805863872 +0000 UTC m=+236.069088894" Dec 08 09:02:41 crc kubenswrapper[4776]: I1208 09:02:41.186853 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-82j27" Dec 08 09:02:41 crc kubenswrapper[4776]: I1208 09:02:41.186908 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-82j27" Dec 08 09:02:41 crc kubenswrapper[4776]: I1208 09:02:41.223549 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-82j27" Dec 08 09:02:41 crc kubenswrapper[4776]: I1208 09:02:41.585764 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ghlj2" Dec 08 09:02:41 crc kubenswrapper[4776]: I1208 09:02:41.586707 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ghlj2" Dec 08 09:02:41 crc kubenswrapper[4776]: I1208 09:02:41.628958 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ghlj2" Dec 08 09:02:42 crc kubenswrapper[4776]: I1208 09:02:42.324766 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fbmf6" Dec 08 09:02:42 crc kubenswrapper[4776]: I1208 09:02:42.325520 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fbmf6" Dec 08 09:02:42 crc kubenswrapper[4776]: I1208 09:02:42.365454 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fbmf6" Dec 08 09:02:42 crc kubenswrapper[4776]: I1208 09:02:42.586573 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mx2gt" Dec 08 09:02:42 crc kubenswrapper[4776]: I1208 09:02:42.586855 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mx2gt" Dec 08 09:02:42 crc kubenswrapper[4776]: I1208 09:02:42.620552 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mx2gt" Dec 08 09:02:42 crc kubenswrapper[4776]: I1208 
09:02:42.758351 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fbmf6" Dec 08 09:02:42 crc kubenswrapper[4776]: I1208 09:02:42.760458 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mx2gt" Dec 08 09:02:45 crc kubenswrapper[4776]: I1208 09:02:45.579400 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mx2gt"] Dec 08 09:02:45 crc kubenswrapper[4776]: I1208 09:02:45.734070 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mx2gt" podUID="3540eb34-736e-422d-b860-99cc44778fad" containerName="registry-server" containerID="cri-o://70dba1b90c674fde958b046bed17d12d734113018efa824c329604bd2f4ff579" gracePeriod=2 Dec 08 09:02:48 crc kubenswrapper[4776]: I1208 09:02:48.991417 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-z2qsw" Dec 08 09:02:48 crc kubenswrapper[4776]: I1208 09:02:48.991969 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-z2qsw" Dec 08 09:02:49 crc kubenswrapper[4776]: I1208 09:02:49.050164 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-z2qsw" Dec 08 09:02:49 crc kubenswrapper[4776]: I1208 09:02:49.212088 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-swlvb" Dec 08 09:02:49 crc kubenswrapper[4776]: I1208 09:02:49.212130 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-swlvb" Dec 08 09:02:49 crc kubenswrapper[4776]: I1208 09:02:49.254922 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-swlvb" Dec 08 09:02:49 crc kubenswrapper[4776]: I1208 09:02:49.392771 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7th2w" Dec 08 09:02:49 crc kubenswrapper[4776]: I1208 09:02:49.392817 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7th2w" Dec 08 09:02:49 crc kubenswrapper[4776]: I1208 09:02:49.454663 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7th2w" Dec 08 09:02:49 crc kubenswrapper[4776]: I1208 09:02:49.646326 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8kcmf" Dec 08 09:02:49 crc kubenswrapper[4776]: I1208 09:02:49.788882 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7th2w" Dec 08 09:02:49 crc kubenswrapper[4776]: I1208 09:02:49.790485 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-z2qsw" Dec 08 09:02:49 crc kubenswrapper[4776]: I1208 09:02:49.794447 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-swlvb" Dec 08 09:02:51 crc kubenswrapper[4776]: I1208 09:02:51.077800 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8kcmf"] Dec 08 09:02:51 crc kubenswrapper[4776]: I1208 09:02:51.078256 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8kcmf" podUID="1122c2d9-2ef0-4527-be3f-5617003d2bc0" containerName="registry-server" containerID="cri-o://3c3aef931e2786ce259ef07fb205acc6d62c8d53e708f7b3ff8d3a8e5fa0ca13" gracePeriod=2 Dec 08 09:02:51 crc kubenswrapper[4776]: I1208 09:02:51.517461 4776 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-82j27" Dec 08 09:02:51 crc kubenswrapper[4776]: I1208 09:02:51.637985 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2hsh8"] Dec 08 09:02:51 crc kubenswrapper[4776]: I1208 09:02:51.667209 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ghlj2" Dec 08 09:02:51 crc kubenswrapper[4776]: I1208 09:02:51.683556 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7th2w"] Dec 08 09:02:51 crc kubenswrapper[4776]: I1208 09:02:51.993063 4776 generic.go:334] "Generic (PLEG): container finished" podID="3540eb34-736e-422d-b860-99cc44778fad" containerID="70dba1b90c674fde958b046bed17d12d734113018efa824c329604bd2f4ff579" exitCode=0 Dec 08 09:02:51 crc kubenswrapper[4776]: I1208 09:02:51.993270 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mx2gt" event={"ID":"3540eb34-736e-422d-b860-99cc44778fad","Type":"ContainerDied","Data":"70dba1b90c674fde958b046bed17d12d734113018efa824c329604bd2f4ff579"} Dec 08 09:02:52 crc kubenswrapper[4776]: E1208 09:02:52.592580 4776 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 70dba1b90c674fde958b046bed17d12d734113018efa824c329604bd2f4ff579 is running failed: container process not found" containerID="70dba1b90c674fde958b046bed17d12d734113018efa824c329604bd2f4ff579" cmd=["grpc_health_probe","-addr=:50051"] Dec 08 09:02:52 crc kubenswrapper[4776]: E1208 09:02:52.593131 4776 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 70dba1b90c674fde958b046bed17d12d734113018efa824c329604bd2f4ff579 is running failed: 
container process not found" containerID="70dba1b90c674fde958b046bed17d12d734113018efa824c329604bd2f4ff579" cmd=["grpc_health_probe","-addr=:50051"] Dec 08 09:02:52 crc kubenswrapper[4776]: E1208 09:02:52.593401 4776 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 70dba1b90c674fde958b046bed17d12d734113018efa824c329604bd2f4ff579 is running failed: container process not found" containerID="70dba1b90c674fde958b046bed17d12d734113018efa824c329604bd2f4ff579" cmd=["grpc_health_probe","-addr=:50051"] Dec 08 09:02:52 crc kubenswrapper[4776]: E1208 09:02:52.593476 4776 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 70dba1b90c674fde958b046bed17d12d734113018efa824c329604bd2f4ff579 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-mx2gt" podUID="3540eb34-736e-422d-b860-99cc44778fad" containerName="registry-server" Dec 08 09:02:52 crc kubenswrapper[4776]: I1208 09:02:52.623325 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mx2gt" Dec 08 09:02:52 crc kubenswrapper[4776]: I1208 09:02:52.773565 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9tfc\" (UniqueName: \"kubernetes.io/projected/3540eb34-736e-422d-b860-99cc44778fad-kube-api-access-j9tfc\") pod \"3540eb34-736e-422d-b860-99cc44778fad\" (UID: \"3540eb34-736e-422d-b860-99cc44778fad\") " Dec 08 09:02:52 crc kubenswrapper[4776]: I1208 09:02:52.773616 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3540eb34-736e-422d-b860-99cc44778fad-utilities\") pod \"3540eb34-736e-422d-b860-99cc44778fad\" (UID: \"3540eb34-736e-422d-b860-99cc44778fad\") " Dec 08 09:02:52 crc kubenswrapper[4776]: I1208 09:02:52.773725 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3540eb34-736e-422d-b860-99cc44778fad-catalog-content\") pod \"3540eb34-736e-422d-b860-99cc44778fad\" (UID: \"3540eb34-736e-422d-b860-99cc44778fad\") " Dec 08 09:02:52 crc kubenswrapper[4776]: I1208 09:02:52.774610 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3540eb34-736e-422d-b860-99cc44778fad-utilities" (OuterVolumeSpecName: "utilities") pod "3540eb34-736e-422d-b860-99cc44778fad" (UID: "3540eb34-736e-422d-b860-99cc44778fad"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:02:52 crc kubenswrapper[4776]: I1208 09:02:52.781257 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3540eb34-736e-422d-b860-99cc44778fad-kube-api-access-j9tfc" (OuterVolumeSpecName: "kube-api-access-j9tfc") pod "3540eb34-736e-422d-b860-99cc44778fad" (UID: "3540eb34-736e-422d-b860-99cc44778fad"). InnerVolumeSpecName "kube-api-access-j9tfc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:02:52 crc kubenswrapper[4776]: I1208 09:02:52.877905 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9tfc\" (UniqueName: \"kubernetes.io/projected/3540eb34-736e-422d-b860-99cc44778fad-kube-api-access-j9tfc\") on node \"crc\" DevicePath \"\"" Dec 08 09:02:52 crc kubenswrapper[4776]: I1208 09:02:52.877949 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3540eb34-736e-422d-b860-99cc44778fad-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 09:02:52 crc kubenswrapper[4776]: I1208 09:02:52.910200 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3540eb34-736e-422d-b860-99cc44778fad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3540eb34-736e-422d-b860-99cc44778fad" (UID: "3540eb34-736e-422d-b860-99cc44778fad"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:02:52 crc kubenswrapper[4776]: I1208 09:02:52.979418 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3540eb34-736e-422d-b860-99cc44778fad-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 09:02:52 crc kubenswrapper[4776]: I1208 09:02:52.999435 4776 generic.go:334] "Generic (PLEG): container finished" podID="1122c2d9-2ef0-4527-be3f-5617003d2bc0" containerID="3c3aef931e2786ce259ef07fb205acc6d62c8d53e708f7b3ff8d3a8e5fa0ca13" exitCode=0 Dec 08 09:02:52 crc kubenswrapper[4776]: I1208 09:02:52.999494 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8kcmf" event={"ID":"1122c2d9-2ef0-4527-be3f-5617003d2bc0","Type":"ContainerDied","Data":"3c3aef931e2786ce259ef07fb205acc6d62c8d53e708f7b3ff8d3a8e5fa0ca13"} Dec 08 09:02:52 crc kubenswrapper[4776]: I1208 09:02:52.999522 4776 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-8kcmf" event={"ID":"1122c2d9-2ef0-4527-be3f-5617003d2bc0","Type":"ContainerDied","Data":"b03d54ba0075762765111878e99480832f463d0d232529616791d8aee2c46c44"} Dec 08 09:02:52 crc kubenswrapper[4776]: I1208 09:02:52.999535 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b03d54ba0075762765111878e99480832f463d0d232529616791d8aee2c46c44" Dec 08 09:02:53 crc kubenswrapper[4776]: I1208 09:02:53.001641 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7th2w" podUID="ed9b52d1-9f5b-4d4b-aece-23ced00e6737" containerName="registry-server" containerID="cri-o://fe1c981c3d9e1c02f9263c76ca5911b9a28ddf7d08c9433462e45932d21b3037" gracePeriod=2 Dec 08 09:02:53 crc kubenswrapper[4776]: I1208 09:02:53.001948 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mx2gt" Dec 08 09:02:53 crc kubenswrapper[4776]: I1208 09:02:53.003054 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mx2gt" event={"ID":"3540eb34-736e-422d-b860-99cc44778fad","Type":"ContainerDied","Data":"b905e23469e1eaf9fdbea3d3b1246bc6659634643af0c05285dc26b9e40fae78"} Dec 08 09:02:53 crc kubenswrapper[4776]: I1208 09:02:53.003313 4776 scope.go:117] "RemoveContainer" containerID="70dba1b90c674fde958b046bed17d12d734113018efa824c329604bd2f4ff579" Dec 08 09:02:53 crc kubenswrapper[4776]: I1208 09:02:53.014446 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8kcmf" Dec 08 09:02:53 crc kubenswrapper[4776]: I1208 09:02:53.021145 4776 scope.go:117] "RemoveContainer" containerID="a2c6295513762e4a6e96af0ec2951edc9bd3119283cea6a7c339c3da508f1e91" Dec 08 09:02:53 crc kubenswrapper[4776]: I1208 09:02:53.047451 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mx2gt"] Dec 08 09:02:53 crc kubenswrapper[4776]: I1208 09:02:53.053075 4776 scope.go:117] "RemoveContainer" containerID="e8cd980527a695530adcca26eda2c33a0aef44e5405ca49af61bcfcd8a806555" Dec 08 09:02:53 crc kubenswrapper[4776]: I1208 09:02:53.055757 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mx2gt"] Dec 08 09:02:53 crc kubenswrapper[4776]: I1208 09:02:53.080785 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1122c2d9-2ef0-4527-be3f-5617003d2bc0-utilities\") pod \"1122c2d9-2ef0-4527-be3f-5617003d2bc0\" (UID: \"1122c2d9-2ef0-4527-be3f-5617003d2bc0\") " Dec 08 09:02:53 crc kubenswrapper[4776]: I1208 09:02:53.080901 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1122c2d9-2ef0-4527-be3f-5617003d2bc0-catalog-content\") pod \"1122c2d9-2ef0-4527-be3f-5617003d2bc0\" (UID: \"1122c2d9-2ef0-4527-be3f-5617003d2bc0\") " Dec 08 09:02:53 crc kubenswrapper[4776]: I1208 09:02:53.080954 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5v44b\" (UniqueName: \"kubernetes.io/projected/1122c2d9-2ef0-4527-be3f-5617003d2bc0-kube-api-access-5v44b\") pod \"1122c2d9-2ef0-4527-be3f-5617003d2bc0\" (UID: \"1122c2d9-2ef0-4527-be3f-5617003d2bc0\") " Dec 08 09:02:53 crc kubenswrapper[4776]: I1208 09:02:53.084647 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/1122c2d9-2ef0-4527-be3f-5617003d2bc0-utilities" (OuterVolumeSpecName: "utilities") pod "1122c2d9-2ef0-4527-be3f-5617003d2bc0" (UID: "1122c2d9-2ef0-4527-be3f-5617003d2bc0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:02:53 crc kubenswrapper[4776]: I1208 09:02:53.101665 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1122c2d9-2ef0-4527-be3f-5617003d2bc0-kube-api-access-5v44b" (OuterVolumeSpecName: "kube-api-access-5v44b") pod "1122c2d9-2ef0-4527-be3f-5617003d2bc0" (UID: "1122c2d9-2ef0-4527-be3f-5617003d2bc0"). InnerVolumeSpecName "kube-api-access-5v44b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:02:53 crc kubenswrapper[4776]: I1208 09:02:53.150216 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1122c2d9-2ef0-4527-be3f-5617003d2bc0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1122c2d9-2ef0-4527-be3f-5617003d2bc0" (UID: "1122c2d9-2ef0-4527-be3f-5617003d2bc0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:02:53 crc kubenswrapper[4776]: I1208 09:02:53.183039 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1122c2d9-2ef0-4527-be3f-5617003d2bc0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 09:02:53 crc kubenswrapper[4776]: I1208 09:02:53.183074 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5v44b\" (UniqueName: \"kubernetes.io/projected/1122c2d9-2ef0-4527-be3f-5617003d2bc0-kube-api-access-5v44b\") on node \"crc\" DevicePath \"\"" Dec 08 09:02:53 crc kubenswrapper[4776]: I1208 09:02:53.183088 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1122c2d9-2ef0-4527-be3f-5617003d2bc0-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 09:02:54 crc kubenswrapper[4776]: I1208 09:02:54.007393 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8kcmf" Dec 08 09:02:54 crc kubenswrapper[4776]: I1208 09:02:54.032634 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8kcmf"] Dec 08 09:02:54 crc kubenswrapper[4776]: I1208 09:02:54.034292 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8kcmf"] Dec 08 09:02:54 crc kubenswrapper[4776]: I1208 09:02:54.352846 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1122c2d9-2ef0-4527-be3f-5617003d2bc0" path="/var/lib/kubelet/pods/1122c2d9-2ef0-4527-be3f-5617003d2bc0/volumes" Dec 08 09:02:54 crc kubenswrapper[4776]: I1208 09:02:54.353689 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3540eb34-736e-422d-b860-99cc44778fad" path="/var/lib/kubelet/pods/3540eb34-736e-422d-b860-99cc44778fad/volumes" Dec 08 09:02:54 crc kubenswrapper[4776]: I1208 09:02:54.481948 4776 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ghlj2"] Dec 08 09:02:54 crc kubenswrapper[4776]: I1208 09:02:54.482240 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ghlj2" podUID="58f8ec13-d004-4d1f-8a44-f325c07c25fd" containerName="registry-server" containerID="cri-o://f6ab0a6bea35942ed629a9a0dc6dfcd0ad8958adebafdc6674279893f89dbb47" gracePeriod=2 Dec 08 09:02:54 crc kubenswrapper[4776]: I1208 09:02:54.921008 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7th2w" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.004808 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xntmr\" (UniqueName: \"kubernetes.io/projected/ed9b52d1-9f5b-4d4b-aece-23ced00e6737-kube-api-access-xntmr\") pod \"ed9b52d1-9f5b-4d4b-aece-23ced00e6737\" (UID: \"ed9b52d1-9f5b-4d4b-aece-23ced00e6737\") " Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.004867 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed9b52d1-9f5b-4d4b-aece-23ced00e6737-catalog-content\") pod \"ed9b52d1-9f5b-4d4b-aece-23ced00e6737\" (UID: \"ed9b52d1-9f5b-4d4b-aece-23ced00e6737\") " Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.004905 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed9b52d1-9f5b-4d4b-aece-23ced00e6737-utilities\") pod \"ed9b52d1-9f5b-4d4b-aece-23ced00e6737\" (UID: \"ed9b52d1-9f5b-4d4b-aece-23ced00e6737\") " Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.005792 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed9b52d1-9f5b-4d4b-aece-23ced00e6737-utilities" (OuterVolumeSpecName: "utilities") pod 
"ed9b52d1-9f5b-4d4b-aece-23ced00e6737" (UID: "ed9b52d1-9f5b-4d4b-aece-23ced00e6737"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.008074 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed9b52d1-9f5b-4d4b-aece-23ced00e6737-kube-api-access-xntmr" (OuterVolumeSpecName: "kube-api-access-xntmr") pod "ed9b52d1-9f5b-4d4b-aece-23ced00e6737" (UID: "ed9b52d1-9f5b-4d4b-aece-23ced00e6737"). InnerVolumeSpecName "kube-api-access-xntmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.015458 4776 generic.go:334] "Generic (PLEG): container finished" podID="ed9b52d1-9f5b-4d4b-aece-23ced00e6737" containerID="fe1c981c3d9e1c02f9263c76ca5911b9a28ddf7d08c9433462e45932d21b3037" exitCode=0 Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.015585 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7th2w" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.015934 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7th2w" event={"ID":"ed9b52d1-9f5b-4d4b-aece-23ced00e6737","Type":"ContainerDied","Data":"fe1c981c3d9e1c02f9263c76ca5911b9a28ddf7d08c9433462e45932d21b3037"} Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.015962 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7th2w" event={"ID":"ed9b52d1-9f5b-4d4b-aece-23ced00e6737","Type":"ContainerDied","Data":"daafb80f6925fef37cd042bceab0c4d5c14e1f27dab755f1738347ee179b1dd8"} Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.015978 4776 scope.go:117] "RemoveContainer" containerID="fe1c981c3d9e1c02f9263c76ca5911b9a28ddf7d08c9433462e45932d21b3037" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.018910 4776 generic.go:334] "Generic (PLEG): container finished" podID="58f8ec13-d004-4d1f-8a44-f325c07c25fd" containerID="f6ab0a6bea35942ed629a9a0dc6dfcd0ad8958adebafdc6674279893f89dbb47" exitCode=0 Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.018950 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ghlj2" event={"ID":"58f8ec13-d004-4d1f-8a44-f325c07c25fd","Type":"ContainerDied","Data":"f6ab0a6bea35942ed629a9a0dc6dfcd0ad8958adebafdc6674279893f89dbb47"} Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.031863 4776 scope.go:117] "RemoveContainer" containerID="dd28cfde6d1b8d2ee66ab579db16a1b36bdfd9b038b6317e9e23789e5d252ed8" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.047617 4776 scope.go:117] "RemoveContainer" containerID="85b27af181d7ca69be29673dee0432ee4c99fb6a26722f7c1100ad43fd5922d9" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.051226 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/ed9b52d1-9f5b-4d4b-aece-23ced00e6737-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ed9b52d1-9f5b-4d4b-aece-23ced00e6737" (UID: "ed9b52d1-9f5b-4d4b-aece-23ced00e6737"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.065296 4776 scope.go:117] "RemoveContainer" containerID="fe1c981c3d9e1c02f9263c76ca5911b9a28ddf7d08c9433462e45932d21b3037" Dec 08 09:02:55 crc kubenswrapper[4776]: E1208 09:02:55.066016 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe1c981c3d9e1c02f9263c76ca5911b9a28ddf7d08c9433462e45932d21b3037\": container with ID starting with fe1c981c3d9e1c02f9263c76ca5911b9a28ddf7d08c9433462e45932d21b3037 not found: ID does not exist" containerID="fe1c981c3d9e1c02f9263c76ca5911b9a28ddf7d08c9433462e45932d21b3037" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.066053 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe1c981c3d9e1c02f9263c76ca5911b9a28ddf7d08c9433462e45932d21b3037"} err="failed to get container status \"fe1c981c3d9e1c02f9263c76ca5911b9a28ddf7d08c9433462e45932d21b3037\": rpc error: code = NotFound desc = could not find container \"fe1c981c3d9e1c02f9263c76ca5911b9a28ddf7d08c9433462e45932d21b3037\": container with ID starting with fe1c981c3d9e1c02f9263c76ca5911b9a28ddf7d08c9433462e45932d21b3037 not found: ID does not exist" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.066091 4776 scope.go:117] "RemoveContainer" containerID="dd28cfde6d1b8d2ee66ab579db16a1b36bdfd9b038b6317e9e23789e5d252ed8" Dec 08 09:02:55 crc kubenswrapper[4776]: E1208 09:02:55.066500 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd28cfde6d1b8d2ee66ab579db16a1b36bdfd9b038b6317e9e23789e5d252ed8\": container with ID 
starting with dd28cfde6d1b8d2ee66ab579db16a1b36bdfd9b038b6317e9e23789e5d252ed8 not found: ID does not exist" containerID="dd28cfde6d1b8d2ee66ab579db16a1b36bdfd9b038b6317e9e23789e5d252ed8" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.066548 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd28cfde6d1b8d2ee66ab579db16a1b36bdfd9b038b6317e9e23789e5d252ed8"} err="failed to get container status \"dd28cfde6d1b8d2ee66ab579db16a1b36bdfd9b038b6317e9e23789e5d252ed8\": rpc error: code = NotFound desc = could not find container \"dd28cfde6d1b8d2ee66ab579db16a1b36bdfd9b038b6317e9e23789e5d252ed8\": container with ID starting with dd28cfde6d1b8d2ee66ab579db16a1b36bdfd9b038b6317e9e23789e5d252ed8 not found: ID does not exist" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.066578 4776 scope.go:117] "RemoveContainer" containerID="85b27af181d7ca69be29673dee0432ee4c99fb6a26722f7c1100ad43fd5922d9" Dec 08 09:02:55 crc kubenswrapper[4776]: E1208 09:02:55.067067 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85b27af181d7ca69be29673dee0432ee4c99fb6a26722f7c1100ad43fd5922d9\": container with ID starting with 85b27af181d7ca69be29673dee0432ee4c99fb6a26722f7c1100ad43fd5922d9 not found: ID does not exist" containerID="85b27af181d7ca69be29673dee0432ee4c99fb6a26722f7c1100ad43fd5922d9" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.067098 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85b27af181d7ca69be29673dee0432ee4c99fb6a26722f7c1100ad43fd5922d9"} err="failed to get container status \"85b27af181d7ca69be29673dee0432ee4c99fb6a26722f7c1100ad43fd5922d9\": rpc error: code = NotFound desc = could not find container \"85b27af181d7ca69be29673dee0432ee4c99fb6a26722f7c1100ad43fd5922d9\": container with ID starting with 85b27af181d7ca69be29673dee0432ee4c99fb6a26722f7c1100ad43fd5922d9 not found: 
ID does not exist" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.105780 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed9b52d1-9f5b-4d4b-aece-23ced00e6737-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.105818 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed9b52d1-9f5b-4d4b-aece-23ced00e6737-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.105830 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xntmr\" (UniqueName: \"kubernetes.io/projected/ed9b52d1-9f5b-4d4b-aece-23ced00e6737-kube-api-access-xntmr\") on node \"crc\" DevicePath \"\"" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.343206 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7th2w"] Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.345624 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7th2w"] Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.464224 4776 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 08 09:02:55 crc kubenswrapper[4776]: E1208 09:02:55.464439 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed9b52d1-9f5b-4d4b-aece-23ced00e6737" containerName="registry-server" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.464451 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed9b52d1-9f5b-4d4b-aece-23ced00e6737" containerName="registry-server" Dec 08 09:02:55 crc kubenswrapper[4776]: E1208 09:02:55.464464 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1122c2d9-2ef0-4527-be3f-5617003d2bc0" containerName="registry-server" Dec 08 09:02:55 crc 
kubenswrapper[4776]: I1208 09:02:55.464470 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="1122c2d9-2ef0-4527-be3f-5617003d2bc0" containerName="registry-server" Dec 08 09:02:55 crc kubenswrapper[4776]: E1208 09:02:55.464479 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1122c2d9-2ef0-4527-be3f-5617003d2bc0" containerName="extract-utilities" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.464485 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="1122c2d9-2ef0-4527-be3f-5617003d2bc0" containerName="extract-utilities" Dec 08 09:02:55 crc kubenswrapper[4776]: E1208 09:02:55.464502 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3540eb34-736e-422d-b860-99cc44778fad" containerName="registry-server" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.464507 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="3540eb34-736e-422d-b860-99cc44778fad" containerName="registry-server" Dec 08 09:02:55 crc kubenswrapper[4776]: E1208 09:02:55.464515 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed9b52d1-9f5b-4d4b-aece-23ced00e6737" containerName="extract-utilities" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.464521 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed9b52d1-9f5b-4d4b-aece-23ced00e6737" containerName="extract-utilities" Dec 08 09:02:55 crc kubenswrapper[4776]: E1208 09:02:55.464530 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e4d9ac1-0fc3-4c58-8c6a-99c9653e4a90" containerName="pruner" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.464536 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e4d9ac1-0fc3-4c58-8c6a-99c9653e4a90" containerName="pruner" Dec 08 09:02:55 crc kubenswrapper[4776]: E1208 09:02:55.464545 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3540eb34-736e-422d-b860-99cc44778fad" containerName="extract-content" Dec 08 09:02:55 crc kubenswrapper[4776]: 
I1208 09:02:55.464550 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="3540eb34-736e-422d-b860-99cc44778fad" containerName="extract-content" Dec 08 09:02:55 crc kubenswrapper[4776]: E1208 09:02:55.464559 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1122c2d9-2ef0-4527-be3f-5617003d2bc0" containerName="extract-content" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.464564 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="1122c2d9-2ef0-4527-be3f-5617003d2bc0" containerName="extract-content" Dec 08 09:02:55 crc kubenswrapper[4776]: E1208 09:02:55.464572 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3540eb34-736e-422d-b860-99cc44778fad" containerName="extract-utilities" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.464577 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="3540eb34-736e-422d-b860-99cc44778fad" containerName="extract-utilities" Dec 08 09:02:55 crc kubenswrapper[4776]: E1208 09:02:55.464586 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed9b52d1-9f5b-4d4b-aece-23ced00e6737" containerName="extract-content" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.464591 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed9b52d1-9f5b-4d4b-aece-23ced00e6737" containerName="extract-content" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.464680 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e4d9ac1-0fc3-4c58-8c6a-99c9653e4a90" containerName="pruner" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.464691 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="3540eb34-736e-422d-b860-99cc44778fad" containerName="registry-server" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.464702 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="1122c2d9-2ef0-4527-be3f-5617003d2bc0" containerName="registry-server" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 
09:02:55.464712 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed9b52d1-9f5b-4d4b-aece-23ced00e6737" containerName="registry-server" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.465026 4776 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.465277 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144" gracePeriod=15 Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.465417 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.465727 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008" gracePeriod=15 Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.465770 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://ee00c7d6720b72fb9ecc27cc1ad2f762db9ac30bc3e108e37e2d4e2a96639cf3" gracePeriod=15 Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.465803 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" 
containerID="cri-o://779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe" gracePeriod=15 Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.465835 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608" gracePeriod=15 Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.466313 4776 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 08 09:02:55 crc kubenswrapper[4776]: E1208 09:02:55.466765 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.466790 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 08 09:02:55 crc kubenswrapper[4776]: E1208 09:02:55.466807 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.466816 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 08 09:02:55 crc kubenswrapper[4776]: E1208 09:02:55.466829 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.466838 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 08 09:02:55 crc kubenswrapper[4776]: E1208 09:02:55.466850 4776 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.466857 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 08 09:02:55 crc kubenswrapper[4776]: E1208 09:02:55.466869 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.466876 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 08 09:02:55 crc kubenswrapper[4776]: E1208 09:02:55.466888 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.466895 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.467010 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.467025 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.467034 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.467048 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.467059 
4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 08 09:02:55 crc kubenswrapper[4776]: E1208 09:02:55.552610 4776 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.82:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.563387 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ghlj2" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.563980 4776 status_manager.go:851] "Failed to get status for pod" podUID="58f8ec13-d004-4d1f-8a44-f325c07c25fd" pod="openshift-marketplace/redhat-marketplace-ghlj2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-ghlj2\": dial tcp 38.102.83.82:6443: connect: connection refused" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.564396 4776 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.82:6443: connect: connection refused" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.610770 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.610817 4776 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.610843 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.610860 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.610877 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.610897 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 
09:02:55.610933 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.610954 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.712225 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddw2s\" (UniqueName: \"kubernetes.io/projected/58f8ec13-d004-4d1f-8a44-f325c07c25fd-kube-api-access-ddw2s\") pod \"58f8ec13-d004-4d1f-8a44-f325c07c25fd\" (UID: \"58f8ec13-d004-4d1f-8a44-f325c07c25fd\") " Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.712263 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58f8ec13-d004-4d1f-8a44-f325c07c25fd-utilities\") pod \"58f8ec13-d004-4d1f-8a44-f325c07c25fd\" (UID: \"58f8ec13-d004-4d1f-8a44-f325c07c25fd\") " Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.712328 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58f8ec13-d004-4d1f-8a44-f325c07c25fd-catalog-content\") pod \"58f8ec13-d004-4d1f-8a44-f325c07c25fd\" (UID: \"58f8ec13-d004-4d1f-8a44-f325c07c25fd\") " Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.712516 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" 
(UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.712541 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.712562 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.712580 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.712595 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.712617 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.712652 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.712668 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.712736 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.713291 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.713319 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.713361 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.713389 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.713412 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.713414 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.713475 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.713809 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58f8ec13-d004-4d1f-8a44-f325c07c25fd-utilities" (OuterVolumeSpecName: "utilities") pod "58f8ec13-d004-4d1f-8a44-f325c07c25fd" (UID: "58f8ec13-d004-4d1f-8a44-f325c07c25fd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.720647 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58f8ec13-d004-4d1f-8a44-f325c07c25fd-kube-api-access-ddw2s" (OuterVolumeSpecName: "kube-api-access-ddw2s") pod "58f8ec13-d004-4d1f-8a44-f325c07c25fd" (UID: "58f8ec13-d004-4d1f-8a44-f325c07c25fd"). InnerVolumeSpecName "kube-api-access-ddw2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.731571 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58f8ec13-d004-4d1f-8a44-f325c07c25fd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "58f8ec13-d004-4d1f-8a44-f325c07c25fd" (UID: "58f8ec13-d004-4d1f-8a44-f325c07c25fd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.813292 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddw2s\" (UniqueName: \"kubernetes.io/projected/58f8ec13-d004-4d1f-8a44-f325c07c25fd-kube-api-access-ddw2s\") on node \"crc\" DevicePath \"\"" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.813322 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58f8ec13-d004-4d1f-8a44-f325c07c25fd-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.813331 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58f8ec13-d004-4d1f-8a44-f325c07c25fd-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 09:02:55 crc kubenswrapper[4776]: E1208 09:02:55.821871 4776 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.82:6443: connect: connection refused" Dec 08 09:02:55 crc kubenswrapper[4776]: E1208 09:02:55.822327 4776 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.82:6443: connect: connection refused" Dec 08 09:02:55 crc kubenswrapper[4776]: E1208 09:02:55.822734 4776 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.82:6443: connect: connection refused" Dec 08 09:02:55 crc kubenswrapper[4776]: E1208 09:02:55.822979 4776 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 
38.102.83.82:6443: connect: connection refused" Dec 08 09:02:55 crc kubenswrapper[4776]: E1208 09:02:55.823318 4776 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.82:6443: connect: connection refused" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.823346 4776 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 08 09:02:55 crc kubenswrapper[4776]: E1208 09:02:55.823783 4776 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.82:6443: connect: connection refused" interval="200ms" Dec 08 09:02:55 crc kubenswrapper[4776]: I1208 09:02:55.854158 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 08 09:02:55 crc kubenswrapper[4776]: W1208 09:02:55.872504 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-3e565029b53848971569ccff0937baf0f3b696a08f9514564c28408192105d22 WatchSource:0}: Error finding container 3e565029b53848971569ccff0937baf0f3b696a08f9514564c28408192105d22: Status 404 returned error can't find the container with id 3e565029b53848971569ccff0937baf0f3b696a08f9514564c28408192105d22 Dec 08 09:02:55 crc kubenswrapper[4776]: E1208 09:02:55.875095 4776 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.82:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187f3204f8039c54 openshift-kube-apiserver 
0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-08 09:02:55.874767956 +0000 UTC m=+252.137992978,LastTimestamp:2025-12-08 09:02:55.874767956 +0000 UTC m=+252.137992978,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 08 09:02:56 crc kubenswrapper[4776]: E1208 09:02:56.024376 4776 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.82:6443: connect: connection refused" interval="400ms" Dec 08 09:02:56 crc kubenswrapper[4776]: I1208 09:02:56.027625 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 08 09:02:56 crc kubenswrapper[4776]: I1208 09:02:56.028219 4776 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008" exitCode=0 Dec 08 09:02:56 crc kubenswrapper[4776]: I1208 09:02:56.028244 4776 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ee00c7d6720b72fb9ecc27cc1ad2f762db9ac30bc3e108e37e2d4e2a96639cf3" exitCode=0 Dec 08 09:02:56 crc kubenswrapper[4776]: I1208 09:02:56.028255 4776 generic.go:334] "Generic (PLEG): 
container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe" exitCode=0 Dec 08 09:02:56 crc kubenswrapper[4776]: I1208 09:02:56.028266 4776 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608" exitCode=2 Dec 08 09:02:56 crc kubenswrapper[4776]: I1208 09:02:56.032956 4776 generic.go:334] "Generic (PLEG): container finished" podID="2345c2a2-29d5-4542-b83f-7e57fcd16d77" containerID="742fe1e7633bc3057c42bc9e6c5cb47c25d53af64951671d9b99674680e1972a" exitCode=0 Dec 08 09:02:56 crc kubenswrapper[4776]: I1208 09:02:56.033041 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"2345c2a2-29d5-4542-b83f-7e57fcd16d77","Type":"ContainerDied","Data":"742fe1e7633bc3057c42bc9e6c5cb47c25d53af64951671d9b99674680e1972a"} Dec 08 09:02:56 crc kubenswrapper[4776]: I1208 09:02:56.033834 4776 status_manager.go:851] "Failed to get status for pod" podUID="58f8ec13-d004-4d1f-8a44-f325c07c25fd" pod="openshift-marketplace/redhat-marketplace-ghlj2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-ghlj2\": dial tcp 38.102.83.82:6443: connect: connection refused" Dec 08 09:02:56 crc kubenswrapper[4776]: I1208 09:02:56.034138 4776 status_manager.go:851] "Failed to get status for pod" podUID="2345c2a2-29d5-4542-b83f-7e57fcd16d77" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.82:6443: connect: connection refused" Dec 08 09:02:56 crc kubenswrapper[4776]: I1208 09:02:56.034438 4776 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.82:6443: connect: connection refused" Dec 08 09:02:56 crc kubenswrapper[4776]: I1208 09:02:56.035596 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ghlj2" event={"ID":"58f8ec13-d004-4d1f-8a44-f325c07c25fd","Type":"ContainerDied","Data":"aea0d8c2c31b696ad870686b9fca8ef8ba874325e3ed7193a3e6febc0e3111c3"} Dec 08 09:02:56 crc kubenswrapper[4776]: I1208 09:02:56.035634 4776 scope.go:117] "RemoveContainer" containerID="f6ab0a6bea35942ed629a9a0dc6dfcd0ad8958adebafdc6674279893f89dbb47" Dec 08 09:02:56 crc kubenswrapper[4776]: I1208 09:02:56.035752 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ghlj2" Dec 08 09:02:56 crc kubenswrapper[4776]: I1208 09:02:56.036440 4776 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.82:6443: connect: connection refused" Dec 08 09:02:56 crc kubenswrapper[4776]: I1208 09:02:56.036733 4776 status_manager.go:851] "Failed to get status for pod" podUID="58f8ec13-d004-4d1f-8a44-f325c07c25fd" pod="openshift-marketplace/redhat-marketplace-ghlj2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-ghlj2\": dial tcp 38.102.83.82:6443: connect: connection refused" Dec 08 09:02:56 crc kubenswrapper[4776]: I1208 09:02:56.036991 4776 status_manager.go:851] "Failed to get status for pod" podUID="2345c2a2-29d5-4542-b83f-7e57fcd16d77" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.82:6443: 
connect: connection refused" Dec 08 09:02:56 crc kubenswrapper[4776]: I1208 09:02:56.037144 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"3e565029b53848971569ccff0937baf0f3b696a08f9514564c28408192105d22"} Dec 08 09:02:56 crc kubenswrapper[4776]: I1208 09:02:56.059735 4776 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.82:6443: connect: connection refused" Dec 08 09:02:56 crc kubenswrapper[4776]: I1208 09:02:56.059894 4776 scope.go:117] "RemoveContainer" containerID="dab5da70f4c9237ad708897be0beb47cafc2bdfd4d5ee3c0ca304caa6f3729df" Dec 08 09:02:56 crc kubenswrapper[4776]: I1208 09:02:56.060311 4776 status_manager.go:851] "Failed to get status for pod" podUID="58f8ec13-d004-4d1f-8a44-f325c07c25fd" pod="openshift-marketplace/redhat-marketplace-ghlj2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-ghlj2\": dial tcp 38.102.83.82:6443: connect: connection refused" Dec 08 09:02:56 crc kubenswrapper[4776]: I1208 09:02:56.060748 4776 status_manager.go:851] "Failed to get status for pod" podUID="2345c2a2-29d5-4542-b83f-7e57fcd16d77" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.82:6443: connect: connection refused" Dec 08 09:02:56 crc kubenswrapper[4776]: I1208 09:02:56.107093 4776 scope.go:117] "RemoveContainer" containerID="ce50275cd50f4268a44d85f21becbf48fb418d6143c57e75d3493d3d1c2ecd4d" Dec 08 09:02:56 crc kubenswrapper[4776]: I1208 09:02:56.354119 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="ed9b52d1-9f5b-4d4b-aece-23ced00e6737" path="/var/lib/kubelet/pods/ed9b52d1-9f5b-4d4b-aece-23ced00e6737/volumes" Dec 08 09:02:56 crc kubenswrapper[4776]: E1208 09:02:56.425242 4776 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.82:6443: connect: connection refused" interval="800ms" Dec 08 09:02:57 crc kubenswrapper[4776]: I1208 09:02:57.046114 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"dd7963498bcd26877a738bfe0ae1d4b0053ed40071a38f5da4036ddc12a0d168"} Dec 08 09:02:57 crc kubenswrapper[4776]: I1208 09:02:57.047424 4776 status_manager.go:851] "Failed to get status for pod" podUID="58f8ec13-d004-4d1f-8a44-f325c07c25fd" pod="openshift-marketplace/redhat-marketplace-ghlj2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-ghlj2\": dial tcp 38.102.83.82:6443: connect: connection refused" Dec 08 09:02:57 crc kubenswrapper[4776]: E1208 09:02:57.047429 4776 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.82:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 08 09:02:57 crc kubenswrapper[4776]: I1208 09:02:57.047894 4776 status_manager.go:851] "Failed to get status for pod" podUID="2345c2a2-29d5-4542-b83f-7e57fcd16d77" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.82:6443: connect: connection refused" Dec 08 09:02:57 crc kubenswrapper[4776]: E1208 09:02:57.225949 4776 controller.go:145] 
"Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.82:6443: connect: connection refused" interval="1.6s" Dec 08 09:02:57 crc kubenswrapper[4776]: I1208 09:02:57.316664 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 08 09:02:57 crc kubenswrapper[4776]: I1208 09:02:57.317401 4776 status_manager.go:851] "Failed to get status for pod" podUID="58f8ec13-d004-4d1f-8a44-f325c07c25fd" pod="openshift-marketplace/redhat-marketplace-ghlj2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-ghlj2\": dial tcp 38.102.83.82:6443: connect: connection refused" Dec 08 09:02:57 crc kubenswrapper[4776]: I1208 09:02:57.317870 4776 status_manager.go:851] "Failed to get status for pod" podUID="2345c2a2-29d5-4542-b83f-7e57fcd16d77" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.82:6443: connect: connection refused" Dec 08 09:02:57 crc kubenswrapper[4776]: I1208 09:02:57.434004 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2345c2a2-29d5-4542-b83f-7e57fcd16d77-kubelet-dir\") pod \"2345c2a2-29d5-4542-b83f-7e57fcd16d77\" (UID: \"2345c2a2-29d5-4542-b83f-7e57fcd16d77\") " Dec 08 09:02:57 crc kubenswrapper[4776]: I1208 09:02:57.434155 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2345c2a2-29d5-4542-b83f-7e57fcd16d77-kube-api-access\") pod \"2345c2a2-29d5-4542-b83f-7e57fcd16d77\" (UID: \"2345c2a2-29d5-4542-b83f-7e57fcd16d77\") " Dec 08 09:02:57 crc kubenswrapper[4776]: I1208 09:02:57.434219 4776 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2345c2a2-29d5-4542-b83f-7e57fcd16d77-var-lock\") pod \"2345c2a2-29d5-4542-b83f-7e57fcd16d77\" (UID: \"2345c2a2-29d5-4542-b83f-7e57fcd16d77\") " Dec 08 09:02:57 crc kubenswrapper[4776]: I1208 09:02:57.434163 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2345c2a2-29d5-4542-b83f-7e57fcd16d77-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "2345c2a2-29d5-4542-b83f-7e57fcd16d77" (UID: "2345c2a2-29d5-4542-b83f-7e57fcd16d77"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 09:02:57 crc kubenswrapper[4776]: I1208 09:02:57.434349 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2345c2a2-29d5-4542-b83f-7e57fcd16d77-var-lock" (OuterVolumeSpecName: "var-lock") pod "2345c2a2-29d5-4542-b83f-7e57fcd16d77" (UID: "2345c2a2-29d5-4542-b83f-7e57fcd16d77"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 09:02:57 crc kubenswrapper[4776]: I1208 09:02:57.434565 4776 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2345c2a2-29d5-4542-b83f-7e57fcd16d77-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 08 09:02:57 crc kubenswrapper[4776]: I1208 09:02:57.434579 4776 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2345c2a2-29d5-4542-b83f-7e57fcd16d77-var-lock\") on node \"crc\" DevicePath \"\"" Dec 08 09:02:57 crc kubenswrapper[4776]: I1208 09:02:57.439553 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2345c2a2-29d5-4542-b83f-7e57fcd16d77-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "2345c2a2-29d5-4542-b83f-7e57fcd16d77" (UID: "2345c2a2-29d5-4542-b83f-7e57fcd16d77"). 
InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:02:57 crc kubenswrapper[4776]: I1208 09:02:57.548896 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2345c2a2-29d5-4542-b83f-7e57fcd16d77-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 08 09:02:57 crc kubenswrapper[4776]: I1208 09:02:57.866513 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 08 09:02:57 crc kubenswrapper[4776]: I1208 09:02:57.867692 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 09:02:57 crc kubenswrapper[4776]: I1208 09:02:57.868310 4776 status_manager.go:851] "Failed to get status for pod" podUID="58f8ec13-d004-4d1f-8a44-f325c07c25fd" pod="openshift-marketplace/redhat-marketplace-ghlj2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-ghlj2\": dial tcp 38.102.83.82:6443: connect: connection refused" Dec 08 09:02:57 crc kubenswrapper[4776]: I1208 09:02:57.868678 4776 status_manager.go:851] "Failed to get status for pod" podUID="2345c2a2-29d5-4542-b83f-7e57fcd16d77" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.82:6443: connect: connection refused" Dec 08 09:02:57 crc kubenswrapper[4776]: I1208 09:02:57.869095 4776 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.82:6443: connect: connection refused" Dec 08 09:02:57 crc 
kubenswrapper[4776]: I1208 09:02:57.953907 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 08 09:02:57 crc kubenswrapper[4776]: I1208 09:02:57.954043 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 08 09:02:57 crc kubenswrapper[4776]: I1208 09:02:57.954067 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 09:02:57 crc kubenswrapper[4776]: I1208 09:02:57.954105 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 08 09:02:57 crc kubenswrapper[4776]: I1208 09:02:57.954108 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 09:02:57 crc kubenswrapper[4776]: I1208 09:02:57.954121 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 09:02:57 crc kubenswrapper[4776]: I1208 09:02:57.954445 4776 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 08 09:02:57 crc kubenswrapper[4776]: I1208 09:02:57.954470 4776 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 08 09:02:57 crc kubenswrapper[4776]: I1208 09:02:57.954481 4776 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 08 09:02:58 crc kubenswrapper[4776]: I1208 09:02:58.059333 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 08 09:02:58 crc kubenswrapper[4776]: I1208 09:02:58.060156 4776 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144" exitCode=0 Dec 08 09:02:58 crc kubenswrapper[4776]: I1208 09:02:58.060237 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 09:02:58 crc kubenswrapper[4776]: I1208 09:02:58.060339 4776 scope.go:117] "RemoveContainer" containerID="3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008" Dec 08 09:02:58 crc kubenswrapper[4776]: I1208 09:02:58.062642 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 08 09:02:58 crc kubenswrapper[4776]: I1208 09:02:58.062640 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"2345c2a2-29d5-4542-b83f-7e57fcd16d77","Type":"ContainerDied","Data":"f89572d1c0cea5e655c60e6f168e1bb2b119f9c95224adf03c6a42540f0c9977"} Dec 08 09:02:58 crc kubenswrapper[4776]: I1208 09:02:58.062685 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f89572d1c0cea5e655c60e6f168e1bb2b119f9c95224adf03c6a42540f0c9977" Dec 08 09:02:58 crc kubenswrapper[4776]: E1208 09:02:58.063549 4776 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.82:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 08 09:02:58 crc kubenswrapper[4776]: I1208 09:02:58.083784 4776 scope.go:117] "RemoveContainer" containerID="ee00c7d6720b72fb9ecc27cc1ad2f762db9ac30bc3e108e37e2d4e2a96639cf3" Dec 08 09:02:58 crc kubenswrapper[4776]: I1208 09:02:58.086229 4776 status_manager.go:851] "Failed to get status for pod" podUID="58f8ec13-d004-4d1f-8a44-f325c07c25fd" pod="openshift-marketplace/redhat-marketplace-ghlj2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-ghlj2\": dial tcp 38.102.83.82:6443: connect: connection refused" Dec 08 09:02:58 crc kubenswrapper[4776]: I1208 09:02:58.086581 4776 status_manager.go:851] "Failed to get 
status for pod" podUID="2345c2a2-29d5-4542-b83f-7e57fcd16d77" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.82:6443: connect: connection refused" Dec 08 09:02:58 crc kubenswrapper[4776]: I1208 09:02:58.087017 4776 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.82:6443: connect: connection refused" Dec 08 09:02:58 crc kubenswrapper[4776]: I1208 09:02:58.088146 4776 status_manager.go:851] "Failed to get status for pod" podUID="58f8ec13-d004-4d1f-8a44-f325c07c25fd" pod="openshift-marketplace/redhat-marketplace-ghlj2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-ghlj2\": dial tcp 38.102.83.82:6443: connect: connection refused" Dec 08 09:02:58 crc kubenswrapper[4776]: I1208 09:02:58.088352 4776 status_manager.go:851] "Failed to get status for pod" podUID="2345c2a2-29d5-4542-b83f-7e57fcd16d77" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.82:6443: connect: connection refused" Dec 08 09:02:58 crc kubenswrapper[4776]: I1208 09:02:58.088503 4776 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.82:6443: connect: connection refused" Dec 08 09:02:58 crc kubenswrapper[4776]: I1208 09:02:58.096963 4776 scope.go:117] "RemoveContainer" 
containerID="779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe" Dec 08 09:02:58 crc kubenswrapper[4776]: I1208 09:02:58.110908 4776 scope.go:117] "RemoveContainer" containerID="7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608" Dec 08 09:02:58 crc kubenswrapper[4776]: I1208 09:02:58.124118 4776 scope.go:117] "RemoveContainer" containerID="b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144" Dec 08 09:02:58 crc kubenswrapper[4776]: I1208 09:02:58.141030 4776 scope.go:117] "RemoveContainer" containerID="cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b" Dec 08 09:02:58 crc kubenswrapper[4776]: I1208 09:02:58.157512 4776 scope.go:117] "RemoveContainer" containerID="3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008" Dec 08 09:02:58 crc kubenswrapper[4776]: E1208 09:02:58.157824 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008\": container with ID starting with 3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008 not found: ID does not exist" containerID="3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008" Dec 08 09:02:58 crc kubenswrapper[4776]: I1208 09:02:58.157855 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008"} err="failed to get container status \"3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008\": rpc error: code = NotFound desc = could not find container \"3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008\": container with ID starting with 3e207fa48d50c86c3b8d11f24077632a94829bd2d8608a89135e89b42df26008 not found: ID does not exist" Dec 08 09:02:58 crc kubenswrapper[4776]: I1208 09:02:58.157878 4776 scope.go:117] "RemoveContainer" 
containerID="ee00c7d6720b72fb9ecc27cc1ad2f762db9ac30bc3e108e37e2d4e2a96639cf3" Dec 08 09:02:58 crc kubenswrapper[4776]: E1208 09:02:58.158091 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee00c7d6720b72fb9ecc27cc1ad2f762db9ac30bc3e108e37e2d4e2a96639cf3\": container with ID starting with ee00c7d6720b72fb9ecc27cc1ad2f762db9ac30bc3e108e37e2d4e2a96639cf3 not found: ID does not exist" containerID="ee00c7d6720b72fb9ecc27cc1ad2f762db9ac30bc3e108e37e2d4e2a96639cf3" Dec 08 09:02:58 crc kubenswrapper[4776]: I1208 09:02:58.158123 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee00c7d6720b72fb9ecc27cc1ad2f762db9ac30bc3e108e37e2d4e2a96639cf3"} err="failed to get container status \"ee00c7d6720b72fb9ecc27cc1ad2f762db9ac30bc3e108e37e2d4e2a96639cf3\": rpc error: code = NotFound desc = could not find container \"ee00c7d6720b72fb9ecc27cc1ad2f762db9ac30bc3e108e37e2d4e2a96639cf3\": container with ID starting with ee00c7d6720b72fb9ecc27cc1ad2f762db9ac30bc3e108e37e2d4e2a96639cf3 not found: ID does not exist" Dec 08 09:02:58 crc kubenswrapper[4776]: I1208 09:02:58.158142 4776 scope.go:117] "RemoveContainer" containerID="779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe" Dec 08 09:02:58 crc kubenswrapper[4776]: E1208 09:02:58.158398 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe\": container with ID starting with 779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe not found: ID does not exist" containerID="779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe" Dec 08 09:02:58 crc kubenswrapper[4776]: I1208 09:02:58.158430 4776 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe"} err="failed to get container status \"779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe\": rpc error: code = NotFound desc = could not find container \"779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe\": container with ID starting with 779b9d427d3724323240befbe48f8fc66450fb51f1a16fd58266c25d7e5058fe not found: ID does not exist" Dec 08 09:02:58 crc kubenswrapper[4776]: I1208 09:02:58.158452 4776 scope.go:117] "RemoveContainer" containerID="7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608" Dec 08 09:02:58 crc kubenswrapper[4776]: E1208 09:02:58.158959 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608\": container with ID starting with 7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608 not found: ID does not exist" containerID="7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608" Dec 08 09:02:58 crc kubenswrapper[4776]: I1208 09:02:58.158987 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608"} err="failed to get container status \"7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608\": rpc error: code = NotFound desc = could not find container \"7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608\": container with ID starting with 7daf3b23e3f49ba8a9f184de8397ed743dd4403b759e7118278cd00db4a75608 not found: ID does not exist" Dec 08 09:02:58 crc kubenswrapper[4776]: I1208 09:02:58.159004 4776 scope.go:117] "RemoveContainer" containerID="b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144" Dec 08 09:02:58 crc kubenswrapper[4776]: E1208 09:02:58.159352 4776 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144\": container with ID starting with b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144 not found: ID does not exist" containerID="b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144" Dec 08 09:02:58 crc kubenswrapper[4776]: I1208 09:02:58.159377 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144"} err="failed to get container status \"b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144\": rpc error: code = NotFound desc = could not find container \"b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144\": container with ID starting with b520a87c4f97c5bd7172ace791fe2225927dd55d9e31d22e24ec238d546ac144 not found: ID does not exist" Dec 08 09:02:58 crc kubenswrapper[4776]: I1208 09:02:58.159391 4776 scope.go:117] "RemoveContainer" containerID="cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b" Dec 08 09:02:58 crc kubenswrapper[4776]: E1208 09:02:58.159698 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\": container with ID starting with cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b not found: ID does not exist" containerID="cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b" Dec 08 09:02:58 crc kubenswrapper[4776]: I1208 09:02:58.159719 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b"} err="failed to get container status \"cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\": rpc error: code = NotFound desc = could not find container 
\"cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b\": container with ID starting with cac5ce0dc731c732023e076b7fe69ca5d986b3770b52bb42d391e27955e97a9b not found: ID does not exist" Dec 08 09:02:58 crc kubenswrapper[4776]: I1208 09:02:58.349906 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 08 09:02:58 crc kubenswrapper[4776]: E1208 09:02:58.827506 4776 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.82:6443: connect: connection refused" interval="3.2s" Dec 08 09:03:01 crc kubenswrapper[4776]: E1208 09:03:01.712060 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T09:03:01Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T09:03:01Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T09:03:01Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T09:03:01Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.82:6443: 
connect: connection refused" Dec 08 09:03:01 crc kubenswrapper[4776]: E1208 09:03:01.712846 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.82:6443: connect: connection refused" Dec 08 09:03:01 crc kubenswrapper[4776]: E1208 09:03:01.713108 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.82:6443: connect: connection refused" Dec 08 09:03:01 crc kubenswrapper[4776]: E1208 09:03:01.713385 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.82:6443: connect: connection refused" Dec 08 09:03:01 crc kubenswrapper[4776]: E1208 09:03:01.713676 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.82:6443: connect: connection refused" Dec 08 09:03:01 crc kubenswrapper[4776]: E1208 09:03:01.713699 4776 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 08 09:03:02 crc kubenswrapper[4776]: E1208 09:03:02.028785 4776 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.82:6443: connect: connection refused" interval="6.4s" Dec 08 09:03:03 crc kubenswrapper[4776]: E1208 09:03:03.265697 4776 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.82:6443: connect: 
connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187f3204f8039c54 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-08 09:02:55.874767956 +0000 UTC m=+252.137992978,LastTimestamp:2025-12-08 09:02:55.874767956 +0000 UTC m=+252.137992978,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 08 09:03:04 crc kubenswrapper[4776]: I1208 09:03:04.346281 4776 status_manager.go:851] "Failed to get status for pod" podUID="58f8ec13-d004-4d1f-8a44-f325c07c25fd" pod="openshift-marketplace/redhat-marketplace-ghlj2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-ghlj2\": dial tcp 38.102.83.82:6443: connect: connection refused" Dec 08 09:03:04 crc kubenswrapper[4776]: I1208 09:03:04.346853 4776 status_manager.go:851] "Failed to get status for pod" podUID="2345c2a2-29d5-4542-b83f-7e57fcd16d77" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.82:6443: connect: connection refused" Dec 08 09:03:08 crc kubenswrapper[4776]: I1208 09:03:08.343684 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 09:03:08 crc kubenswrapper[4776]: I1208 09:03:08.344628 4776 status_manager.go:851] "Failed to get status for pod" podUID="58f8ec13-d004-4d1f-8a44-f325c07c25fd" pod="openshift-marketplace/redhat-marketplace-ghlj2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-ghlj2\": dial tcp 38.102.83.82:6443: connect: connection refused" Dec 08 09:03:08 crc kubenswrapper[4776]: I1208 09:03:08.344903 4776 status_manager.go:851] "Failed to get status for pod" podUID="2345c2a2-29d5-4542-b83f-7e57fcd16d77" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.82:6443: connect: connection refused" Dec 08 09:03:08 crc kubenswrapper[4776]: I1208 09:03:08.356476 4776 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="343d6e00-54f7-4228-a8da-a43041894b26" Dec 08 09:03:08 crc kubenswrapper[4776]: I1208 09:03:08.356505 4776 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="343d6e00-54f7-4228-a8da-a43041894b26" Dec 08 09:03:08 crc kubenswrapper[4776]: E1208 09:03:08.356850 4776 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.82:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 09:03:08 crc kubenswrapper[4776]: I1208 09:03:08.357242 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 09:03:08 crc kubenswrapper[4776]: W1208 09:03:08.379282 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-57fdf553184c831746b423daf6faa35a2b134fbbd02443d95339747c0ef2bca7 WatchSource:0}: Error finding container 57fdf553184c831746b423daf6faa35a2b134fbbd02443d95339747c0ef2bca7: Status 404 returned error can't find the container with id 57fdf553184c831746b423daf6faa35a2b134fbbd02443d95339747c0ef2bca7 Dec 08 09:03:08 crc kubenswrapper[4776]: E1208 09:03:08.380636 4776 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.82:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" volumeName="registry-storage" Dec 08 09:03:08 crc kubenswrapper[4776]: E1208 09:03:08.429750 4776 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.82:6443: connect: connection refused" interval="7s" Dec 08 09:03:09 crc kubenswrapper[4776]: I1208 09:03:09.129361 4776 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="65ccb250522ea370bfab0ed309cfe879e9dccb588523e0b03f501e76c5937c75" exitCode=0 Dec 08 09:03:09 crc kubenswrapper[4776]: I1208 09:03:09.129403 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"65ccb250522ea370bfab0ed309cfe879e9dccb588523e0b03f501e76c5937c75"} Dec 08 09:03:09 crc kubenswrapper[4776]: I1208 09:03:09.129428 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"57fdf553184c831746b423daf6faa35a2b134fbbd02443d95339747c0ef2bca7"} Dec 08 09:03:09 crc kubenswrapper[4776]: I1208 09:03:09.129638 4776 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="343d6e00-54f7-4228-a8da-a43041894b26" Dec 08 09:03:09 crc kubenswrapper[4776]: I1208 09:03:09.129652 4776 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="343d6e00-54f7-4228-a8da-a43041894b26" Dec 08 09:03:09 crc kubenswrapper[4776]: I1208 09:03:09.130151 4776 status_manager.go:851] "Failed to get status for pod" podUID="58f8ec13-d004-4d1f-8a44-f325c07c25fd" pod="openshift-marketplace/redhat-marketplace-ghlj2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-ghlj2\": dial tcp 38.102.83.82:6443: connect: connection refused" Dec 08 09:03:09 crc kubenswrapper[4776]: E1208 09:03:09.130243 4776 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.82:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 09:03:09 crc kubenswrapper[4776]: I1208 09:03:09.130331 4776 status_manager.go:851] "Failed to get status for pod" podUID="2345c2a2-29d5-4542-b83f-7e57fcd16d77" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.82:6443: connect: connection refused" 
Dec 08 09:03:11 crc kubenswrapper[4776]: I1208 09:03:11.145538 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"555b4d95e2d7cd82fa324acda59961623be29e85358cc1da082956aa9a623ef8"} Dec 08 09:03:11 crc kubenswrapper[4776]: I1208 09:03:11.145787 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e1947182fbfa94eb35d8c7feca8843382638c93df92decd701ede4ad498da436"} Dec 08 09:03:11 crc kubenswrapper[4776]: I1208 09:03:11.148371 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 08 09:03:11 crc kubenswrapper[4776]: I1208 09:03:11.148418 4776 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="fb3d5082006dae792f107dc632bc00d5004be69f95f6d76150d816fb542ce9a5" exitCode=1 Dec 08 09:03:11 crc kubenswrapper[4776]: I1208 09:03:11.148442 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"fb3d5082006dae792f107dc632bc00d5004be69f95f6d76150d816fb542ce9a5"} Dec 08 09:03:11 crc kubenswrapper[4776]: I1208 09:03:11.148807 4776 scope.go:117] "RemoveContainer" containerID="fb3d5082006dae792f107dc632bc00d5004be69f95f6d76150d816fb542ce9a5" Dec 08 09:03:12 crc kubenswrapper[4776]: I1208 09:03:12.155717 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"98783352d7bc3cbab079956dffad9a613c9086536b7f3e4d9b1b8fbc8dae06eb"} Dec 08 09:03:12 crc 
kubenswrapper[4776]: I1208 09:03:12.156040 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b3674167f04cbed69051774136abace8a8b8ae08f68feb0f6f9d279f44051b17"} Dec 08 09:03:12 crc kubenswrapper[4776]: I1208 09:03:12.156067 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 09:03:12 crc kubenswrapper[4776]: I1208 09:03:12.156078 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"17b44cdca611e46b2b62e47585bd688de444e0322b9f36adb2d33be8baaeec5b"} Dec 08 09:03:12 crc kubenswrapper[4776]: I1208 09:03:12.155907 4776 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="343d6e00-54f7-4228-a8da-a43041894b26" Dec 08 09:03:12 crc kubenswrapper[4776]: I1208 09:03:12.156099 4776 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="343d6e00-54f7-4228-a8da-a43041894b26" Dec 08 09:03:12 crc kubenswrapper[4776]: I1208 09:03:12.159632 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 08 09:03:12 crc kubenswrapper[4776]: I1208 09:03:12.159682 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7579f5dfa333741d353850b67185d35a950936935b123c870c1378b1aff4feb6"} Dec 08 09:03:13 crc kubenswrapper[4776]: I1208 09:03:13.357554 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 09:03:13 crc 
kubenswrapper[4776]: I1208 09:03:13.357606 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 09:03:13 crc kubenswrapper[4776]: I1208 09:03:13.362816 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 09:03:15 crc kubenswrapper[4776]: I1208 09:03:15.274482 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 08 09:03:16 crc kubenswrapper[4776]: I1208 09:03:16.683277 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8" podUID="7c6fbdd6-0243-4372-a986-cc73d2df8a74" containerName="oauth-openshift" containerID="cri-o://e5b9823870b0724c69f69dbb71118b122bd7b086be17c88bfdcb3ca21531d179" gracePeriod=15 Dec 08 09:03:17 crc kubenswrapper[4776]: I1208 09:03:17.165748 4776 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 09:03:17 crc kubenswrapper[4776]: I1208 09:03:17.474650 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 08 09:03:17 crc kubenswrapper[4776]: I1208 09:03:17.474859 4776 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 08 09:03:17 crc kubenswrapper[4776]: I1208 09:03:17.476082 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get 
\"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 08 09:03:17 crc kubenswrapper[4776]: I1208 09:03:17.580947 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8" Dec 08 09:03:17 crc kubenswrapper[4776]: I1208 09:03:17.627546 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rh4fd\" (UniqueName: \"kubernetes.io/projected/7c6fbdd6-0243-4372-a986-cc73d2df8a74-kube-api-access-rh4fd\") pod \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " Dec 08 09:03:17 crc kubenswrapper[4776]: I1208 09:03:17.627588 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7c6fbdd6-0243-4372-a986-cc73d2df8a74-audit-dir\") pod \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " Dec 08 09:03:17 crc kubenswrapper[4776]: I1208 09:03:17.627609 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-system-service-ca\") pod \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " Dec 08 09:03:17 crc kubenswrapper[4776]: I1208 09:03:17.627682 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7c6fbdd6-0243-4372-a986-cc73d2df8a74-audit-policies\") pod \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " Dec 08 09:03:17 crc kubenswrapper[4776]: I1208 09:03:17.627695 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7c6fbdd6-0243-4372-a986-cc73d2df8a74-audit-dir" (OuterVolumeSpecName: 
"audit-dir") pod "7c6fbdd6-0243-4372-a986-cc73d2df8a74" (UID: "7c6fbdd6-0243-4372-a986-cc73d2df8a74"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 09:03:17 crc kubenswrapper[4776]: I1208 09:03:17.627709 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-system-session\") pod \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " Dec 08 09:03:17 crc kubenswrapper[4776]: I1208 09:03:17.627782 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-user-template-provider-selection\") pod \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " Dec 08 09:03:17 crc kubenswrapper[4776]: I1208 09:03:17.627815 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-system-serving-cert\") pod \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " Dec 08 09:03:17 crc kubenswrapper[4776]: I1208 09:03:17.627855 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-system-router-certs\") pod \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " Dec 08 09:03:17 crc kubenswrapper[4776]: I1208 09:03:17.627873 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-user-template-login\") pod \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " Dec 08 09:03:17 crc kubenswrapper[4776]: I1208 09:03:17.627898 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-user-template-error\") pod \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " Dec 08 09:03:17 crc kubenswrapper[4776]: I1208 09:03:17.627936 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-system-ocp-branding-template\") pod \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " Dec 08 09:03:17 crc kubenswrapper[4776]: I1208 09:03:17.627956 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-user-idp-0-file-data\") pod \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " Dec 08 09:03:17 crc kubenswrapper[4776]: I1208 09:03:17.627979 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-system-cliconfig\") pod \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " Dec 08 09:03:17 crc kubenswrapper[4776]: I1208 09:03:17.627997 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-system-trusted-ca-bundle\") pod \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\" (UID: \"7c6fbdd6-0243-4372-a986-cc73d2df8a74\") " Dec 08 09:03:17 crc kubenswrapper[4776]: I1208 09:03:17.628351 4776 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7c6fbdd6-0243-4372-a986-cc73d2df8a74-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 08 09:03:17 crc kubenswrapper[4776]: I1208 09:03:17.628627 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "7c6fbdd6-0243-4372-a986-cc73d2df8a74" (UID: "7c6fbdd6-0243-4372-a986-cc73d2df8a74"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:03:17 crc kubenswrapper[4776]: I1208 09:03:17.629046 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "7c6fbdd6-0243-4372-a986-cc73d2df8a74" (UID: "7c6fbdd6-0243-4372-a986-cc73d2df8a74"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:03:17 crc kubenswrapper[4776]: I1208 09:03:17.629260 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c6fbdd6-0243-4372-a986-cc73d2df8a74-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "7c6fbdd6-0243-4372-a986-cc73d2df8a74" (UID: "7c6fbdd6-0243-4372-a986-cc73d2df8a74"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:03:17 crc kubenswrapper[4776]: I1208 09:03:17.629318 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "7c6fbdd6-0243-4372-a986-cc73d2df8a74" (UID: "7c6fbdd6-0243-4372-a986-cc73d2df8a74"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:03:17 crc kubenswrapper[4776]: I1208 09:03:17.634572 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "7c6fbdd6-0243-4372-a986-cc73d2df8a74" (UID: "7c6fbdd6-0243-4372-a986-cc73d2df8a74"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:03:17 crc kubenswrapper[4776]: I1208 09:03:17.635217 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "7c6fbdd6-0243-4372-a986-cc73d2df8a74" (UID: "7c6fbdd6-0243-4372-a986-cc73d2df8a74"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:03:17 crc kubenswrapper[4776]: I1208 09:03:17.643120 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c6fbdd6-0243-4372-a986-cc73d2df8a74-kube-api-access-rh4fd" (OuterVolumeSpecName: "kube-api-access-rh4fd") pod "7c6fbdd6-0243-4372-a986-cc73d2df8a74" (UID: "7c6fbdd6-0243-4372-a986-cc73d2df8a74"). InnerVolumeSpecName "kube-api-access-rh4fd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:03:17 crc kubenswrapper[4776]: I1208 09:03:17.643508 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "7c6fbdd6-0243-4372-a986-cc73d2df8a74" (UID: "7c6fbdd6-0243-4372-a986-cc73d2df8a74"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:03:17 crc kubenswrapper[4776]: I1208 09:03:17.643798 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "7c6fbdd6-0243-4372-a986-cc73d2df8a74" (UID: "7c6fbdd6-0243-4372-a986-cc73d2df8a74"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:03:17 crc kubenswrapper[4776]: I1208 09:03:17.644397 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "7c6fbdd6-0243-4372-a986-cc73d2df8a74" (UID: "7c6fbdd6-0243-4372-a986-cc73d2df8a74"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:03:17 crc kubenswrapper[4776]: I1208 09:03:17.644622 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "7c6fbdd6-0243-4372-a986-cc73d2df8a74" (UID: "7c6fbdd6-0243-4372-a986-cc73d2df8a74"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:03:17 crc kubenswrapper[4776]: I1208 09:03:17.644808 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "7c6fbdd6-0243-4372-a986-cc73d2df8a74" (UID: "7c6fbdd6-0243-4372-a986-cc73d2df8a74"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:03:17 crc kubenswrapper[4776]: I1208 09:03:17.646612 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "7c6fbdd6-0243-4372-a986-cc73d2df8a74" (UID: "7c6fbdd6-0243-4372-a986-cc73d2df8a74"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:03:17 crc kubenswrapper[4776]: I1208 09:03:17.729461 4776 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7c6fbdd6-0243-4372-a986-cc73d2df8a74-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 08 09:03:17 crc kubenswrapper[4776]: I1208 09:03:17.729494 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 08 09:03:17 crc kubenswrapper[4776]: I1208 09:03:17.729505 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 08 09:03:17 crc kubenswrapper[4776]: I1208 09:03:17.729515 4776 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 08 09:03:17 crc kubenswrapper[4776]: I1208 09:03:17.729525 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 08 09:03:17 crc kubenswrapper[4776]: I1208 09:03:17.729534 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 08 09:03:17 crc kubenswrapper[4776]: I1208 09:03:17.729542 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 08 09:03:17 crc kubenswrapper[4776]: I1208 09:03:17.729550 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 08 09:03:17 crc kubenswrapper[4776]: I1208 09:03:17.729560 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 08 09:03:17 crc kubenswrapper[4776]: I1208 09:03:17.729568 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 08 09:03:17 crc kubenswrapper[4776]: I1208 09:03:17.729577 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:03:17 crc kubenswrapper[4776]: I1208 09:03:17.729585 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rh4fd\" (UniqueName: \"kubernetes.io/projected/7c6fbdd6-0243-4372-a986-cc73d2df8a74-kube-api-access-rh4fd\") on node \"crc\" DevicePath \"\"" Dec 08 09:03:17 crc kubenswrapper[4776]: I1208 09:03:17.729593 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7c6fbdd6-0243-4372-a986-cc73d2df8a74-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 08 09:03:18 crc kubenswrapper[4776]: I1208 09:03:18.190352 4776 generic.go:334] "Generic (PLEG): container finished" podID="7c6fbdd6-0243-4372-a986-cc73d2df8a74" containerID="e5b9823870b0724c69f69dbb71118b122bd7b086be17c88bfdcb3ca21531d179" exitCode=0 Dec 08 09:03:18 crc kubenswrapper[4776]: I1208 09:03:18.190396 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8" Dec 08 09:03:18 crc kubenswrapper[4776]: I1208 09:03:18.190396 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8" event={"ID":"7c6fbdd6-0243-4372-a986-cc73d2df8a74","Type":"ContainerDied","Data":"e5b9823870b0724c69f69dbb71118b122bd7b086be17c88bfdcb3ca21531d179"} Dec 08 09:03:18 crc kubenswrapper[4776]: I1208 09:03:18.190565 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2hsh8" event={"ID":"7c6fbdd6-0243-4372-a986-cc73d2df8a74","Type":"ContainerDied","Data":"c704cd6bbc80e498fc518112b77621974ee6cb8ba5c965c7c86eb9ff3cfb3abb"} Dec 08 09:03:18 crc kubenswrapper[4776]: I1208 09:03:18.190605 4776 scope.go:117] "RemoveContainer" containerID="e5b9823870b0724c69f69dbb71118b122bd7b086be17c88bfdcb3ca21531d179" Dec 08 09:03:18 crc kubenswrapper[4776]: I1208 09:03:18.191097 4776 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="343d6e00-54f7-4228-a8da-a43041894b26" Dec 08 09:03:18 crc kubenswrapper[4776]: I1208 09:03:18.191112 4776 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="343d6e00-54f7-4228-a8da-a43041894b26" Dec 08 09:03:18 crc kubenswrapper[4776]: I1208 09:03:18.196034 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 09:03:18 crc kubenswrapper[4776]: I1208 09:03:18.201293 4776 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="6839de96-f7a9-47a1-8ec3-2e849c74c2a5" Dec 08 09:03:18 crc kubenswrapper[4776]: I1208 09:03:18.221501 4776 scope.go:117] "RemoveContainer" containerID="e5b9823870b0724c69f69dbb71118b122bd7b086be17c88bfdcb3ca21531d179" 
Dec 08 09:03:18 crc kubenswrapper[4776]: E1208 09:03:18.221865 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5b9823870b0724c69f69dbb71118b122bd7b086be17c88bfdcb3ca21531d179\": container with ID starting with e5b9823870b0724c69f69dbb71118b122bd7b086be17c88bfdcb3ca21531d179 not found: ID does not exist" containerID="e5b9823870b0724c69f69dbb71118b122bd7b086be17c88bfdcb3ca21531d179" Dec 08 09:03:18 crc kubenswrapper[4776]: I1208 09:03:18.221896 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5b9823870b0724c69f69dbb71118b122bd7b086be17c88bfdcb3ca21531d179"} err="failed to get container status \"e5b9823870b0724c69f69dbb71118b122bd7b086be17c88bfdcb3ca21531d179\": rpc error: code = NotFound desc = could not find container \"e5b9823870b0724c69f69dbb71118b122bd7b086be17c88bfdcb3ca21531d179\": container with ID starting with e5b9823870b0724c69f69dbb71118b122bd7b086be17c88bfdcb3ca21531d179 not found: ID does not exist" Dec 08 09:03:19 crc kubenswrapper[4776]: I1208 09:03:19.196624 4776 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="343d6e00-54f7-4228-a8da-a43041894b26" Dec 08 09:03:19 crc kubenswrapper[4776]: I1208 09:03:19.196928 4776 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="343d6e00-54f7-4228-a8da-a43041894b26" Dec 08 09:03:24 crc kubenswrapper[4776]: I1208 09:03:24.358801 4776 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="6839de96-f7a9-47a1-8ec3-2e849c74c2a5" Dec 08 09:03:26 crc kubenswrapper[4776]: I1208 09:03:26.086351 4776 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 08 09:03:27 crc kubenswrapper[4776]: I1208 09:03:27.008430 4776 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 08 09:03:27 crc kubenswrapper[4776]: I1208 09:03:27.201466 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 08 09:03:27 crc kubenswrapper[4776]: I1208 09:03:27.477494 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 08 09:03:27 crc kubenswrapper[4776]: I1208 09:03:27.481504 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 08 09:03:27 crc kubenswrapper[4776]: I1208 09:03:27.675485 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 08 09:03:27 crc kubenswrapper[4776]: I1208 09:03:27.756285 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 08 09:03:27 crc kubenswrapper[4776]: I1208 09:03:27.912585 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 08 09:03:28 crc kubenswrapper[4776]: I1208 09:03:28.129431 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 08 09:03:28 crc kubenswrapper[4776]: I1208 09:03:28.155865 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 08 09:03:28 crc kubenswrapper[4776]: I1208 09:03:28.569228 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 08 09:03:28 crc kubenswrapper[4776]: I1208 09:03:28.597752 4776 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 08 09:03:28 crc kubenswrapper[4776]: I1208 09:03:28.606480 4776 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 08 09:03:28 crc kubenswrapper[4776]: I1208 09:03:28.753474 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 08 09:03:28 crc kubenswrapper[4776]: I1208 09:03:28.775337 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 08 09:03:29 crc kubenswrapper[4776]: I1208 09:03:29.029088 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 08 09:03:29 crc kubenswrapper[4776]: I1208 09:03:29.058334 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 08 09:03:29 crc kubenswrapper[4776]: I1208 09:03:29.375875 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 08 09:03:29 crc kubenswrapper[4776]: I1208 09:03:29.689554 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 08 09:03:29 crc kubenswrapper[4776]: I1208 09:03:29.728911 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 08 09:03:29 crc kubenswrapper[4776]: I1208 09:03:29.730340 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 08 09:03:29 crc kubenswrapper[4776]: I1208 09:03:29.873260 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 08 09:03:29 crc kubenswrapper[4776]: I1208 
09:03:29.908581 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 08 09:03:29 crc kubenswrapper[4776]: I1208 09:03:29.909543 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 08 09:03:29 crc kubenswrapper[4776]: I1208 09:03:29.925234 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 08 09:03:29 crc kubenswrapper[4776]: I1208 09:03:29.989979 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 08 09:03:30 crc kubenswrapper[4776]: I1208 09:03:30.053235 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 08 09:03:30 crc kubenswrapper[4776]: I1208 09:03:30.177431 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 08 09:03:30 crc kubenswrapper[4776]: I1208 09:03:30.211949 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 08 09:03:30 crc kubenswrapper[4776]: I1208 09:03:30.224137 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 08 09:03:30 crc kubenswrapper[4776]: I1208 09:03:30.351608 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 08 09:03:30 crc kubenswrapper[4776]: I1208 09:03:30.499840 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 08 09:03:30 crc kubenswrapper[4776]: I1208 09:03:30.637455 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 08 09:03:30 crc kubenswrapper[4776]: I1208 09:03:30.705369 4776 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 08 09:03:30 crc kubenswrapper[4776]: I1208 09:03:30.778021 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 08 09:03:30 crc kubenswrapper[4776]: I1208 09:03:30.827579 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 08 09:03:30 crc kubenswrapper[4776]: I1208 09:03:30.841433 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.009079 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.071905 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.126711 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.149395 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.187337 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.221438 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.362588 4776 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.367262 4776 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.369812 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.396155 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.549791 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.563215 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.573643 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.573749 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.593524 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.724403 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.784123 4776 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.789267 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-ghlj2","openshift-authentication/oauth-openshift-558db77b4-2hsh8","openshift-kube-apiserver/kube-apiserver-crc"] Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.789341 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-ccc74cc7-92gqt","openshift-kube-apiserver/kube-apiserver-crc"] Dec 08 09:03:31 crc kubenswrapper[4776]: E1208 09:03:31.789551 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58f8ec13-d004-4d1f-8a44-f325c07c25fd" containerName="extract-content" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.789576 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="58f8ec13-d004-4d1f-8a44-f325c07c25fd" containerName="extract-content" Dec 08 09:03:31 crc kubenswrapper[4776]: E1208 09:03:31.789588 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58f8ec13-d004-4d1f-8a44-f325c07c25fd" containerName="registry-server" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.789597 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="58f8ec13-d004-4d1f-8a44-f325c07c25fd" containerName="registry-server" Dec 08 09:03:31 crc kubenswrapper[4776]: E1208 09:03:31.789613 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58f8ec13-d004-4d1f-8a44-f325c07c25fd" containerName="extract-utilities" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.789623 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="58f8ec13-d004-4d1f-8a44-f325c07c25fd" containerName="extract-utilities" Dec 08 09:03:31 crc kubenswrapper[4776]: E1208 09:03:31.789638 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2345c2a2-29d5-4542-b83f-7e57fcd16d77" containerName="installer" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.789646 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="2345c2a2-29d5-4542-b83f-7e57fcd16d77" containerName="installer" Dec 08 09:03:31 crc 
kubenswrapper[4776]: E1208 09:03:31.789659 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c6fbdd6-0243-4372-a986-cc73d2df8a74" containerName="oauth-openshift" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.789668 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c6fbdd6-0243-4372-a986-cc73d2df8a74" containerName="oauth-openshift" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.789911 4776 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="343d6e00-54f7-4228-a8da-a43041894b26" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.789943 4776 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="343d6e00-54f7-4228-a8da-a43041894b26" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.790141 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c6fbdd6-0243-4372-a986-cc73d2df8a74" containerName="oauth-openshift" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.790168 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="58f8ec13-d004-4d1f-8a44-f325c07c25fd" containerName="registry-server" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.790246 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="2345c2a2-29d5-4542-b83f-7e57fcd16d77" containerName="installer" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.790855 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.795069 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.795924 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.796275 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.796932 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.797500 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.798130 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.798299 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.798484 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.801094 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.801160 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 08 
09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.801095 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.801401 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.801643 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.802300 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.808911 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.812800 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.833702 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=14.833683569 podStartE2EDuration="14.833683569s" podCreationTimestamp="2025-12-08 09:03:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:03:31.817891669 +0000 UTC m=+288.081116691" watchObservedRunningTime="2025-12-08 09:03:31.833683569 +0000 UTC m=+288.096908591" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.880470 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.902028 4776 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3f53218b-ed10-46c4-8c16-3cfd12a1c9f4-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-ccc74cc7-92gqt\" (UID: \"3f53218b-ed10-46c4-8c16-3cfd12a1c9f4\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.902075 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3f53218b-ed10-46c4-8c16-3cfd12a1c9f4-v4-0-config-system-cliconfig\") pod \"oauth-openshift-ccc74cc7-92gqt\" (UID: \"3f53218b-ed10-46c4-8c16-3cfd12a1c9f4\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.902226 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdhp7\" (UniqueName: \"kubernetes.io/projected/3f53218b-ed10-46c4-8c16-3cfd12a1c9f4-kube-api-access-wdhp7\") pod \"oauth-openshift-ccc74cc7-92gqt\" (UID: \"3f53218b-ed10-46c4-8c16-3cfd12a1c9f4\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.902279 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3f53218b-ed10-46c4-8c16-3cfd12a1c9f4-v4-0-config-system-service-ca\") pod \"oauth-openshift-ccc74cc7-92gqt\" (UID: \"3f53218b-ed10-46c4-8c16-3cfd12a1c9f4\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.902425 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/3f53218b-ed10-46c4-8c16-3cfd12a1c9f4-v4-0-config-user-template-error\") pod \"oauth-openshift-ccc74cc7-92gqt\" (UID: \"3f53218b-ed10-46c4-8c16-3cfd12a1c9f4\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.902477 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3f53218b-ed10-46c4-8c16-3cfd12a1c9f4-v4-0-config-system-serving-cert\") pod \"oauth-openshift-ccc74cc7-92gqt\" (UID: \"3f53218b-ed10-46c4-8c16-3cfd12a1c9f4\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.902504 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f53218b-ed10-46c4-8c16-3cfd12a1c9f4-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-ccc74cc7-92gqt\" (UID: \"3f53218b-ed10-46c4-8c16-3cfd12a1c9f4\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.902556 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3f53218b-ed10-46c4-8c16-3cfd12a1c9f4-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-ccc74cc7-92gqt\" (UID: \"3f53218b-ed10-46c4-8c16-3cfd12a1c9f4\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.902577 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3f53218b-ed10-46c4-8c16-3cfd12a1c9f4-audit-policies\") pod \"oauth-openshift-ccc74cc7-92gqt\" (UID: 
\"3f53218b-ed10-46c4-8c16-3cfd12a1c9f4\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.902619 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3f53218b-ed10-46c4-8c16-3cfd12a1c9f4-v4-0-config-user-template-login\") pod \"oauth-openshift-ccc74cc7-92gqt\" (UID: \"3f53218b-ed10-46c4-8c16-3cfd12a1c9f4\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.902646 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3f53218b-ed10-46c4-8c16-3cfd12a1c9f4-v4-0-config-system-session\") pod \"oauth-openshift-ccc74cc7-92gqt\" (UID: \"3f53218b-ed10-46c4-8c16-3cfd12a1c9f4\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.902668 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3f53218b-ed10-46c4-8c16-3cfd12a1c9f4-audit-dir\") pod \"oauth-openshift-ccc74cc7-92gqt\" (UID: \"3f53218b-ed10-46c4-8c16-3cfd12a1c9f4\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.902691 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3f53218b-ed10-46c4-8c16-3cfd12a1c9f4-v4-0-config-system-router-certs\") pod \"oauth-openshift-ccc74cc7-92gqt\" (UID: \"3f53218b-ed10-46c4-8c16-3cfd12a1c9f4\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.902715 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3f53218b-ed10-46c4-8c16-3cfd12a1c9f4-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-ccc74cc7-92gqt\" (UID: \"3f53218b-ed10-46c4-8c16-3cfd12a1c9f4\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt" Dec 08 09:03:31 crc kubenswrapper[4776]: I1208 09:03:31.985743 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 08 09:03:32 crc kubenswrapper[4776]: I1208 09:03:32.004490 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3f53218b-ed10-46c4-8c16-3cfd12a1c9f4-v4-0-config-system-session\") pod \"oauth-openshift-ccc74cc7-92gqt\" (UID: \"3f53218b-ed10-46c4-8c16-3cfd12a1c9f4\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt" Dec 08 09:03:32 crc kubenswrapper[4776]: I1208 09:03:32.004553 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3f53218b-ed10-46c4-8c16-3cfd12a1c9f4-audit-dir\") pod \"oauth-openshift-ccc74cc7-92gqt\" (UID: \"3f53218b-ed10-46c4-8c16-3cfd12a1c9f4\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt" Dec 08 09:03:32 crc kubenswrapper[4776]: I1208 09:03:32.004583 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3f53218b-ed10-46c4-8c16-3cfd12a1c9f4-v4-0-config-system-router-certs\") pod \"oauth-openshift-ccc74cc7-92gqt\" (UID: \"3f53218b-ed10-46c4-8c16-3cfd12a1c9f4\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt" Dec 08 09:03:32 crc kubenswrapper[4776]: I1208 09:03:32.004604 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3f53218b-ed10-46c4-8c16-3cfd12a1c9f4-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-ccc74cc7-92gqt\" (UID: \"3f53218b-ed10-46c4-8c16-3cfd12a1c9f4\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt" Dec 08 09:03:32 crc kubenswrapper[4776]: I1208 09:03:32.004658 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3f53218b-ed10-46c4-8c16-3cfd12a1c9f4-audit-dir\") pod \"oauth-openshift-ccc74cc7-92gqt\" (UID: \"3f53218b-ed10-46c4-8c16-3cfd12a1c9f4\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt" Dec 08 09:03:32 crc kubenswrapper[4776]: I1208 09:03:32.004661 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3f53218b-ed10-46c4-8c16-3cfd12a1c9f4-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-ccc74cc7-92gqt\" (UID: \"3f53218b-ed10-46c4-8c16-3cfd12a1c9f4\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt" Dec 08 09:03:32 crc kubenswrapper[4776]: I1208 09:03:32.005532 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3f53218b-ed10-46c4-8c16-3cfd12a1c9f4-v4-0-config-system-cliconfig\") pod \"oauth-openshift-ccc74cc7-92gqt\" (UID: \"3f53218b-ed10-46c4-8c16-3cfd12a1c9f4\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt" Dec 08 09:03:32 crc kubenswrapper[4776]: I1208 09:03:32.005616 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdhp7\" (UniqueName: \"kubernetes.io/projected/3f53218b-ed10-46c4-8c16-3cfd12a1c9f4-kube-api-access-wdhp7\") pod \"oauth-openshift-ccc74cc7-92gqt\" (UID: \"3f53218b-ed10-46c4-8c16-3cfd12a1c9f4\") " 
pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt" Dec 08 09:03:32 crc kubenswrapper[4776]: I1208 09:03:32.005659 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3f53218b-ed10-46c4-8c16-3cfd12a1c9f4-v4-0-config-system-service-ca\") pod \"oauth-openshift-ccc74cc7-92gqt\" (UID: \"3f53218b-ed10-46c4-8c16-3cfd12a1c9f4\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt" Dec 08 09:03:32 crc kubenswrapper[4776]: I1208 09:03:32.005728 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3f53218b-ed10-46c4-8c16-3cfd12a1c9f4-v4-0-config-user-template-error\") pod \"oauth-openshift-ccc74cc7-92gqt\" (UID: \"3f53218b-ed10-46c4-8c16-3cfd12a1c9f4\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt" Dec 08 09:03:32 crc kubenswrapper[4776]: I1208 09:03:32.005763 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3f53218b-ed10-46c4-8c16-3cfd12a1c9f4-v4-0-config-system-serving-cert\") pod \"oauth-openshift-ccc74cc7-92gqt\" (UID: \"3f53218b-ed10-46c4-8c16-3cfd12a1c9f4\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt" Dec 08 09:03:32 crc kubenswrapper[4776]: I1208 09:03:32.005794 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f53218b-ed10-46c4-8c16-3cfd12a1c9f4-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-ccc74cc7-92gqt\" (UID: \"3f53218b-ed10-46c4-8c16-3cfd12a1c9f4\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt" Dec 08 09:03:32 crc kubenswrapper[4776]: I1208 09:03:32.005848 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3f53218b-ed10-46c4-8c16-3cfd12a1c9f4-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-ccc74cc7-92gqt\" (UID: \"3f53218b-ed10-46c4-8c16-3cfd12a1c9f4\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt" Dec 08 09:03:32 crc kubenswrapper[4776]: I1208 09:03:32.005886 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3f53218b-ed10-46c4-8c16-3cfd12a1c9f4-audit-policies\") pod \"oauth-openshift-ccc74cc7-92gqt\" (UID: \"3f53218b-ed10-46c4-8c16-3cfd12a1c9f4\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt" Dec 08 09:03:32 crc kubenswrapper[4776]: I1208 09:03:32.005925 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3f53218b-ed10-46c4-8c16-3cfd12a1c9f4-v4-0-config-user-template-login\") pod \"oauth-openshift-ccc74cc7-92gqt\" (UID: \"3f53218b-ed10-46c4-8c16-3cfd12a1c9f4\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt" Dec 08 09:03:32 crc kubenswrapper[4776]: I1208 09:03:32.006578 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3f53218b-ed10-46c4-8c16-3cfd12a1c9f4-v4-0-config-system-service-ca\") pod \"oauth-openshift-ccc74cc7-92gqt\" (UID: \"3f53218b-ed10-46c4-8c16-3cfd12a1c9f4\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt" Dec 08 09:03:32 crc kubenswrapper[4776]: I1208 09:03:32.006576 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3f53218b-ed10-46c4-8c16-3cfd12a1c9f4-v4-0-config-system-cliconfig\") pod \"oauth-openshift-ccc74cc7-92gqt\" (UID: \"3f53218b-ed10-46c4-8c16-3cfd12a1c9f4\") " 
pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt" Dec 08 09:03:32 crc kubenswrapper[4776]: I1208 09:03:32.007369 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f53218b-ed10-46c4-8c16-3cfd12a1c9f4-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-ccc74cc7-92gqt\" (UID: \"3f53218b-ed10-46c4-8c16-3cfd12a1c9f4\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt" Dec 08 09:03:32 crc kubenswrapper[4776]: I1208 09:03:32.007727 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3f53218b-ed10-46c4-8c16-3cfd12a1c9f4-audit-policies\") pod \"oauth-openshift-ccc74cc7-92gqt\" (UID: \"3f53218b-ed10-46c4-8c16-3cfd12a1c9f4\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt" Dec 08 09:03:32 crc kubenswrapper[4776]: I1208 09:03:32.010552 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3f53218b-ed10-46c4-8c16-3cfd12a1c9f4-v4-0-config-user-template-error\") pod \"oauth-openshift-ccc74cc7-92gqt\" (UID: \"3f53218b-ed10-46c4-8c16-3cfd12a1c9f4\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt" Dec 08 09:03:32 crc kubenswrapper[4776]: I1208 09:03:32.010666 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3f53218b-ed10-46c4-8c16-3cfd12a1c9f4-v4-0-config-system-serving-cert\") pod \"oauth-openshift-ccc74cc7-92gqt\" (UID: \"3f53218b-ed10-46c4-8c16-3cfd12a1c9f4\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt" Dec 08 09:03:32 crc kubenswrapper[4776]: I1208 09:03:32.011283 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/3f53218b-ed10-46c4-8c16-3cfd12a1c9f4-v4-0-config-user-template-login\") pod \"oauth-openshift-ccc74cc7-92gqt\" (UID: \"3f53218b-ed10-46c4-8c16-3cfd12a1c9f4\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt" Dec 08 09:03:32 crc kubenswrapper[4776]: I1208 09:03:32.011320 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3f53218b-ed10-46c4-8c16-3cfd12a1c9f4-v4-0-config-system-router-certs\") pod \"oauth-openshift-ccc74cc7-92gqt\" (UID: \"3f53218b-ed10-46c4-8c16-3cfd12a1c9f4\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt" Dec 08 09:03:32 crc kubenswrapper[4776]: I1208 09:03:32.011516 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3f53218b-ed10-46c4-8c16-3cfd12a1c9f4-v4-0-config-system-session\") pod \"oauth-openshift-ccc74cc7-92gqt\" (UID: \"3f53218b-ed10-46c4-8c16-3cfd12a1c9f4\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt" Dec 08 09:03:32 crc kubenswrapper[4776]: I1208 09:03:32.011703 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3f53218b-ed10-46c4-8c16-3cfd12a1c9f4-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-ccc74cc7-92gqt\" (UID: \"3f53218b-ed10-46c4-8c16-3cfd12a1c9f4\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt" Dec 08 09:03:32 crc kubenswrapper[4776]: I1208 09:03:32.011927 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3f53218b-ed10-46c4-8c16-3cfd12a1c9f4-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-ccc74cc7-92gqt\" (UID: \"3f53218b-ed10-46c4-8c16-3cfd12a1c9f4\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt" Dec 08 
09:03:32 crc kubenswrapper[4776]: I1208 09:03:32.021127 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3f53218b-ed10-46c4-8c16-3cfd12a1c9f4-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-ccc74cc7-92gqt\" (UID: \"3f53218b-ed10-46c4-8c16-3cfd12a1c9f4\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt" Dec 08 09:03:32 crc kubenswrapper[4776]: I1208 09:03:32.022829 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdhp7\" (UniqueName: \"kubernetes.io/projected/3f53218b-ed10-46c4-8c16-3cfd12a1c9f4-kube-api-access-wdhp7\") pod \"oauth-openshift-ccc74cc7-92gqt\" (UID: \"3f53218b-ed10-46c4-8c16-3cfd12a1c9f4\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt" Dec 08 09:03:32 crc kubenswrapper[4776]: I1208 09:03:32.095667 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 08 09:03:32 crc kubenswrapper[4776]: I1208 09:03:32.097808 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 08 09:03:32 crc kubenswrapper[4776]: I1208 09:03:32.112268 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt" Dec 08 09:03:32 crc kubenswrapper[4776]: I1208 09:03:32.113605 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 08 09:03:32 crc kubenswrapper[4776]: I1208 09:03:32.153713 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 08 09:03:32 crc kubenswrapper[4776]: I1208 09:03:32.181319 4776 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 08 09:03:32 crc kubenswrapper[4776]: I1208 09:03:32.205865 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 08 09:03:32 crc kubenswrapper[4776]: I1208 09:03:32.350712 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 08 09:03:32 crc kubenswrapper[4776]: I1208 09:03:32.351231 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58f8ec13-d004-4d1f-8a44-f325c07c25fd" path="/var/lib/kubelet/pods/58f8ec13-d004-4d1f-8a44-f325c07c25fd/volumes" Dec 08 09:03:32 crc kubenswrapper[4776]: I1208 09:03:32.352432 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c6fbdd6-0243-4372-a986-cc73d2df8a74" path="/var/lib/kubelet/pods/7c6fbdd6-0243-4372-a986-cc73d2df8a74/volumes" Dec 08 09:03:32 crc kubenswrapper[4776]: I1208 09:03:32.420106 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 08 09:03:32 crc kubenswrapper[4776]: I1208 09:03:32.505848 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 08 09:03:32 crc kubenswrapper[4776]: I1208 09:03:32.560659 4776 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 08 09:03:32 crc kubenswrapper[4776]: I1208 09:03:32.610270 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 08 09:03:32 crc kubenswrapper[4776]: I1208 09:03:32.833960 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 08 09:03:32 crc kubenswrapper[4776]: I1208 09:03:32.944891 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 08 09:03:33 crc kubenswrapper[4776]: I1208 09:03:33.012846 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 08 09:03:33 crc kubenswrapper[4776]: I1208 09:03:33.172382 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 08 09:03:33 crc kubenswrapper[4776]: I1208 09:03:33.220285 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 08 09:03:33 crc kubenswrapper[4776]: I1208 09:03:33.222071 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 08 09:03:33 crc kubenswrapper[4776]: I1208 09:03:33.277799 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 08 09:03:33 crc kubenswrapper[4776]: I1208 09:03:33.431847 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 08 09:03:33 crc kubenswrapper[4776]: I1208 09:03:33.486777 4776 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 08 09:03:33 crc kubenswrapper[4776]: I1208 09:03:33.522776 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 08 09:03:33 crc kubenswrapper[4776]: I1208 09:03:33.574858 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 08 09:03:33 crc kubenswrapper[4776]: I1208 09:03:33.583430 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 08 09:03:33 crc kubenswrapper[4776]: I1208 09:03:33.645648 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 08 09:03:33 crc kubenswrapper[4776]: I1208 09:03:33.652829 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 08 09:03:33 crc kubenswrapper[4776]: I1208 09:03:33.675185 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-ccc74cc7-92gqt"] Dec 08 09:03:33 crc kubenswrapper[4776]: I1208 09:03:33.690765 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 08 09:03:33 crc kubenswrapper[4776]: I1208 09:03:33.735300 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 08 09:03:33 crc kubenswrapper[4776]: I1208 09:03:33.740915 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 08 09:03:33 crc kubenswrapper[4776]: I1208 09:03:33.764804 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 08 09:03:33 crc kubenswrapper[4776]: I1208 
09:03:33.782707 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 08 09:03:33 crc kubenswrapper[4776]: I1208 09:03:33.807826 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 08 09:03:33 crc kubenswrapper[4776]: I1208 09:03:33.840377 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 08 09:03:33 crc kubenswrapper[4776]: I1208 09:03:33.961874 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 08 09:03:34 crc kubenswrapper[4776]: I1208 09:03:34.037196 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 08 09:03:34 crc kubenswrapper[4776]: I1208 09:03:34.061065 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 08 09:03:34 crc kubenswrapper[4776]: I1208 09:03:34.115382 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 08 09:03:34 crc kubenswrapper[4776]: I1208 09:03:34.127163 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 08 09:03:34 crc kubenswrapper[4776]: I1208 09:03:34.151675 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 08 09:03:34 crc kubenswrapper[4776]: I1208 09:03:34.202005 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 08 09:03:34 crc kubenswrapper[4776]: I1208 09:03:34.251957 4776 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"proxy-tls" Dec 08 09:03:34 crc kubenswrapper[4776]: I1208 09:03:34.325391 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 08 09:03:34 crc kubenswrapper[4776]: I1208 09:03:34.336478 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 08 09:03:34 crc kubenswrapper[4776]: I1208 09:03:34.344723 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 08 09:03:34 crc kubenswrapper[4776]: I1208 09:03:34.397439 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 08 09:03:34 crc kubenswrapper[4776]: I1208 09:03:34.432699 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 08 09:03:34 crc kubenswrapper[4776]: I1208 09:03:34.548016 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 08 09:03:34 crc kubenswrapper[4776]: I1208 09:03:34.597304 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 08 09:03:34 crc kubenswrapper[4776]: I1208 09:03:34.727535 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 08 09:03:34 crc kubenswrapper[4776]: I1208 09:03:34.822430 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 08 09:03:34 crc kubenswrapper[4776]: I1208 09:03:34.866598 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 08 09:03:34 crc kubenswrapper[4776]: I1208 09:03:34.897911 4776 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 08 09:03:34 crc kubenswrapper[4776]: I1208 09:03:34.930977 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 08 09:03:34 crc kubenswrapper[4776]: I1208 09:03:34.940456 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 08 09:03:35 crc kubenswrapper[4776]: I1208 09:03:35.016809 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 08 09:03:35 crc kubenswrapper[4776]: E1208 09:03:35.035079 4776 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 08 09:03:35 crc kubenswrapper[4776]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-ccc74cc7-92gqt_openshift-authentication_3f53218b-ed10-46c4-8c16-3cfd12a1c9f4_0(0bf3b68d74c13885640038031b8a92a30f83b23e382cf3a31d75481a9e916afb): error adding pod openshift-authentication_oauth-openshift-ccc74cc7-92gqt to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"0bf3b68d74c13885640038031b8a92a30f83b23e382cf3a31d75481a9e916afb" Netns:"/var/run/netns/4862916d-a76c-4a2c-ba9a-34320fe2e088" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-ccc74cc7-92gqt;K8S_POD_INFRA_CONTAINER_ID=0bf3b68d74c13885640038031b8a92a30f83b23e382cf3a31d75481a9e916afb;K8S_POD_UID=3f53218b-ed10-46c4-8c16-3cfd12a1c9f4" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-ccc74cc7-92gqt] networking: Multus: [openshift-authentication/oauth-openshift-ccc74cc7-92gqt/3f53218b-ed10-46c4-8c16-3cfd12a1c9f4]: error setting the networks status, pod was already deleted: 
SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-ccc74cc7-92gqt in out of cluster comm: pod "oauth-openshift-ccc74cc7-92gqt" not found Dec 08 09:03:35 crc kubenswrapper[4776]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 08 09:03:35 crc kubenswrapper[4776]: > Dec 08 09:03:35 crc kubenswrapper[4776]: E1208 09:03:35.035164 4776 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 08 09:03:35 crc kubenswrapper[4776]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-ccc74cc7-92gqt_openshift-authentication_3f53218b-ed10-46c4-8c16-3cfd12a1c9f4_0(0bf3b68d74c13885640038031b8a92a30f83b23e382cf3a31d75481a9e916afb): error adding pod openshift-authentication_oauth-openshift-ccc74cc7-92gqt to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"0bf3b68d74c13885640038031b8a92a30f83b23e382cf3a31d75481a9e916afb" Netns:"/var/run/netns/4862916d-a76c-4a2c-ba9a-34320fe2e088" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-ccc74cc7-92gqt;K8S_POD_INFRA_CONTAINER_ID=0bf3b68d74c13885640038031b8a92a30f83b23e382cf3a31d75481a9e916afb;K8S_POD_UID=3f53218b-ed10-46c4-8c16-3cfd12a1c9f4" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-ccc74cc7-92gqt] networking: Multus: [openshift-authentication/oauth-openshift-ccc74cc7-92gqt/3f53218b-ed10-46c4-8c16-3cfd12a1c9f4]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod 
oauth-openshift-ccc74cc7-92gqt in out of cluster comm: pod "oauth-openshift-ccc74cc7-92gqt" not found Dec 08 09:03:35 crc kubenswrapper[4776]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 08 09:03:35 crc kubenswrapper[4776]: > pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt" Dec 08 09:03:35 crc kubenswrapper[4776]: E1208 09:03:35.035208 4776 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Dec 08 09:03:35 crc kubenswrapper[4776]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-ccc74cc7-92gqt_openshift-authentication_3f53218b-ed10-46c4-8c16-3cfd12a1c9f4_0(0bf3b68d74c13885640038031b8a92a30f83b23e382cf3a31d75481a9e916afb): error adding pod openshift-authentication_oauth-openshift-ccc74cc7-92gqt to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"0bf3b68d74c13885640038031b8a92a30f83b23e382cf3a31d75481a9e916afb" Netns:"/var/run/netns/4862916d-a76c-4a2c-ba9a-34320fe2e088" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-ccc74cc7-92gqt;K8S_POD_INFRA_CONTAINER_ID=0bf3b68d74c13885640038031b8a92a30f83b23e382cf3a31d75481a9e916afb;K8S_POD_UID=3f53218b-ed10-46c4-8c16-3cfd12a1c9f4" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-ccc74cc7-92gqt] networking: Multus: [openshift-authentication/oauth-openshift-ccc74cc7-92gqt/3f53218b-ed10-46c4-8c16-3cfd12a1c9f4]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the 
pod oauth-openshift-ccc74cc7-92gqt in out of cluster comm: pod "oauth-openshift-ccc74cc7-92gqt" not found Dec 08 09:03:35 crc kubenswrapper[4776]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 08 09:03:35 crc kubenswrapper[4776]: > pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt" Dec 08 09:03:35 crc kubenswrapper[4776]: E1208 09:03:35.035271 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"oauth-openshift-ccc74cc7-92gqt_openshift-authentication(3f53218b-ed10-46c4-8c16-3cfd12a1c9f4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"oauth-openshift-ccc74cc7-92gqt_openshift-authentication(3f53218b-ed10-46c4-8c16-3cfd12a1c9f4)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-ccc74cc7-92gqt_openshift-authentication_3f53218b-ed10-46c4-8c16-3cfd12a1c9f4_0(0bf3b68d74c13885640038031b8a92a30f83b23e382cf3a31d75481a9e916afb): error adding pod openshift-authentication_oauth-openshift-ccc74cc7-92gqt to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"0bf3b68d74c13885640038031b8a92a30f83b23e382cf3a31d75481a9e916afb\\\" Netns:\\\"/var/run/netns/4862916d-a76c-4a2c-ba9a-34320fe2e088\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-ccc74cc7-92gqt;K8S_POD_INFRA_CONTAINER_ID=0bf3b68d74c13885640038031b8a92a30f83b23e382cf3a31d75481a9e916afb;K8S_POD_UID=3f53218b-ed10-46c4-8c16-3cfd12a1c9f4\\\" Path:\\\"\\\" ERRORED: error 
configuring pod [openshift-authentication/oauth-openshift-ccc74cc7-92gqt] networking: Multus: [openshift-authentication/oauth-openshift-ccc74cc7-92gqt/3f53218b-ed10-46c4-8c16-3cfd12a1c9f4]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-ccc74cc7-92gqt in out of cluster comm: pod \\\"oauth-openshift-ccc74cc7-92gqt\\\" not found\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt" podUID="3f53218b-ed10-46c4-8c16-3cfd12a1c9f4" Dec 08 09:03:35 crc kubenswrapper[4776]: I1208 09:03:35.037252 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 08 09:03:35 crc kubenswrapper[4776]: I1208 09:03:35.092433 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 08 09:03:35 crc kubenswrapper[4776]: I1208 09:03:35.225012 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 08 09:03:35 crc kubenswrapper[4776]: I1208 09:03:35.271236 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 08 09:03:35 crc kubenswrapper[4776]: I1208 09:03:35.275834 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt" Dec 08 09:03:35 crc kubenswrapper[4776]: I1208 09:03:35.277050 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt" Dec 08 09:03:35 crc kubenswrapper[4776]: I1208 09:03:35.316104 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 08 09:03:35 crc kubenswrapper[4776]: I1208 09:03:35.359134 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 08 09:03:35 crc kubenswrapper[4776]: I1208 09:03:35.415350 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 08 09:03:35 crc kubenswrapper[4776]: I1208 09:03:35.447698 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 08 09:03:35 crc kubenswrapper[4776]: I1208 09:03:35.546426 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 08 09:03:35 crc kubenswrapper[4776]: I1208 09:03:35.611051 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 08 09:03:35 crc kubenswrapper[4776]: I1208 09:03:35.637950 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 08 09:03:35 crc kubenswrapper[4776]: I1208 09:03:35.716371 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 08 09:03:35 crc kubenswrapper[4776]: I1208 09:03:35.845371 4776 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 08 09:03:35 crc kubenswrapper[4776]: I1208 09:03:35.880606 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 08 09:03:35 crc kubenswrapper[4776]: I1208 09:03:35.952700 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 08 09:03:36 crc kubenswrapper[4776]: I1208 09:03:36.041657 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 08 09:03:36 crc kubenswrapper[4776]: I1208 09:03:36.048746 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 08 09:03:36 crc kubenswrapper[4776]: I1208 09:03:36.050866 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 08 09:03:36 crc kubenswrapper[4776]: I1208 09:03:36.091035 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 08 09:03:36 crc kubenswrapper[4776]: I1208 09:03:36.129272 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 08 09:03:36 crc kubenswrapper[4776]: I1208 09:03:36.196675 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 08 09:03:36 crc kubenswrapper[4776]: I1208 09:03:36.196675 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 08 09:03:36 crc kubenswrapper[4776]: I1208 09:03:36.198240 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 08 09:03:36 crc kubenswrapper[4776]: 
I1208 09:03:36.221770 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 08 09:03:36 crc kubenswrapper[4776]: I1208 09:03:36.242075 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 08 09:03:36 crc kubenswrapper[4776]: I1208 09:03:36.243905 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 08 09:03:36 crc kubenswrapper[4776]: I1208 09:03:36.281621 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 08 09:03:36 crc kubenswrapper[4776]: I1208 09:03:36.310845 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 08 09:03:36 crc kubenswrapper[4776]: I1208 09:03:36.585551 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 08 09:03:36 crc kubenswrapper[4776]: I1208 09:03:36.713269 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 08 09:03:36 crc kubenswrapper[4776]: I1208 09:03:36.778495 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 08 09:03:36 crc kubenswrapper[4776]: I1208 09:03:36.781690 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 08 09:03:36 crc kubenswrapper[4776]: I1208 09:03:36.785793 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 08 09:03:36 crc kubenswrapper[4776]: I1208 09:03:36.789379 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 08 
09:03:36 crc kubenswrapper[4776]: I1208 09:03:36.826891 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 08 09:03:36 crc kubenswrapper[4776]: I1208 09:03:36.840891 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 08 09:03:36 crc kubenswrapper[4776]: I1208 09:03:36.870048 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 08 09:03:36 crc kubenswrapper[4776]: I1208 09:03:36.935562 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 08 09:03:36 crc kubenswrapper[4776]: I1208 09:03:36.953948 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 08 09:03:36 crc kubenswrapper[4776]: I1208 09:03:36.960310 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 08 09:03:37 crc kubenswrapper[4776]: I1208 09:03:37.042056 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 08 09:03:37 crc kubenswrapper[4776]: I1208 09:03:37.187526 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 08 09:03:37 crc kubenswrapper[4776]: I1208 09:03:37.246455 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 08 09:03:37 crc kubenswrapper[4776]: I1208 09:03:37.414079 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 08 09:03:37 crc kubenswrapper[4776]: I1208 09:03:37.646594 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 08 
09:03:37 crc kubenswrapper[4776]: I1208 09:03:37.683598 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 08 09:03:37 crc kubenswrapper[4776]: I1208 09:03:37.748772 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 08 09:03:37 crc kubenswrapper[4776]: I1208 09:03:37.782469 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 08 09:03:37 crc kubenswrapper[4776]: I1208 09:03:37.813393 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 08 09:03:37 crc kubenswrapper[4776]: I1208 09:03:37.857313 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 08 09:03:37 crc kubenswrapper[4776]: I1208 09:03:37.876933 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 08 09:03:37 crc kubenswrapper[4776]: I1208 09:03:37.883546 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 08 09:03:37 crc kubenswrapper[4776]: I1208 09:03:37.913083 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 08 09:03:37 crc kubenswrapper[4776]: I1208 09:03:37.922838 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 08 09:03:37 crc kubenswrapper[4776]: I1208 09:03:37.971489 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-ccc74cc7-92gqt"] Dec 08 09:03:38 crc kubenswrapper[4776]: I1208 09:03:38.141582 4776 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Dec 08 09:03:38 crc kubenswrapper[4776]: I1208 09:03:38.150983 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Dec 08 09:03:38 crc kubenswrapper[4776]: I1208 09:03:38.274376 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Dec 08 09:03:38 crc kubenswrapper[4776]: I1208 09:03:38.292348 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt" event={"ID":"3f53218b-ed10-46c4-8c16-3cfd12a1c9f4","Type":"ContainerStarted","Data":"030c5db5e816a883bf4fcfb8cf78743b758ebcf9b099c7589b0240f6f808e3d4"}
Dec 08 09:03:38 crc kubenswrapper[4776]: I1208 09:03:38.292387 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt" event={"ID":"3f53218b-ed10-46c4-8c16-3cfd12a1c9f4","Type":"ContainerStarted","Data":"abcee9434c75f43a238a94bf2f01c38ebd8bd5188bc45ede3d03fe662c9e6016"}
Dec 08 09:03:38 crc kubenswrapper[4776]: I1208 09:03:38.292646 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt"
Dec 08 09:03:38 crc kubenswrapper[4776]: I1208 09:03:38.311076 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Dec 08 09:03:38 crc kubenswrapper[4776]: I1208 09:03:38.321645 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt" podStartSLOduration=47.321603995 podStartE2EDuration="47.321603995s" podCreationTimestamp="2025-12-08 09:02:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:03:38.316734873 +0000 UTC m=+294.579959915" watchObservedRunningTime="2025-12-08 09:03:38.321603995 +0000 UTC m=+294.584829037"
Dec 08 09:03:38 crc kubenswrapper[4776]: I1208 09:03:38.356241 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Dec 08 09:03:38 crc kubenswrapper[4776]: I1208 09:03:38.376442 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Dec 08 09:03:38 crc kubenswrapper[4776]: I1208 09:03:38.384774 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Dec 08 09:03:38 crc kubenswrapper[4776]: I1208 09:03:38.417630 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Dec 08 09:03:38 crc kubenswrapper[4776]: I1208 09:03:38.440977 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Dec 08 09:03:38 crc kubenswrapper[4776]: I1208 09:03:38.456801 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Dec 08 09:03:38 crc kubenswrapper[4776]: I1208 09:03:38.521093 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Dec 08 09:03:38 crc kubenswrapper[4776]: I1208 09:03:38.531730 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Dec 08 09:03:38 crc kubenswrapper[4776]: I1208 09:03:38.553630 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Dec 08 09:03:38 crc kubenswrapper[4776]: I1208 09:03:38.600876 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Dec 08 09:03:38 crc kubenswrapper[4776]: I1208 09:03:38.689827 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Dec 08 09:03:38 crc kubenswrapper[4776]: I1208 09:03:38.712026 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Dec 08 09:03:38 crc kubenswrapper[4776]: I1208 09:03:38.804116 4776 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Dec 08 09:03:38 crc kubenswrapper[4776]: I1208 09:03:38.827252 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Dec 08 09:03:38 crc kubenswrapper[4776]: I1208 09:03:38.862325 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Dec 08 09:03:38 crc kubenswrapper[4776]: I1208 09:03:38.877484 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Dec 08 09:03:38 crc kubenswrapper[4776]: I1208 09:03:38.878492 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Dec 08 09:03:38 crc kubenswrapper[4776]: I1208 09:03:38.898717 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-ccc74cc7-92gqt"
Dec 08 09:03:39 crc kubenswrapper[4776]: I1208 09:03:39.105703 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Dec 08 09:03:39 crc kubenswrapper[4776]: I1208 09:03:39.193505 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Dec 08 09:03:39 crc kubenswrapper[4776]: I1208 09:03:39.208038 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Dec 08 09:03:39 crc kubenswrapper[4776]: I1208 09:03:39.233272 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Dec 08 09:03:39 crc kubenswrapper[4776]: I1208 09:03:39.243080 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Dec 08 09:03:39 crc kubenswrapper[4776]: I1208 09:03:39.328633 4776 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Dec 08 09:03:39 crc kubenswrapper[4776]: I1208 09:03:39.328948 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://dd7963498bcd26877a738bfe0ae1d4b0053ed40071a38f5da4036ddc12a0d168" gracePeriod=5
Dec 08 09:03:39 crc kubenswrapper[4776]: I1208 09:03:39.350135 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Dec 08 09:03:39 crc kubenswrapper[4776]: I1208 09:03:39.542511 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Dec 08 09:03:39 crc kubenswrapper[4776]: I1208 09:03:39.556258 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Dec 08 09:03:39 crc kubenswrapper[4776]: I1208 09:03:39.615740 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Dec 08 09:03:39 crc kubenswrapper[4776]: I1208 09:03:39.652513 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Dec 08 09:03:39 crc kubenswrapper[4776]: I1208 09:03:39.662656 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Dec 08 09:03:39 crc kubenswrapper[4776]: I1208 09:03:39.665872 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Dec 08 09:03:39 crc kubenswrapper[4776]: I1208 09:03:39.669896 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Dec 08 09:03:39 crc kubenswrapper[4776]: I1208 09:03:39.796321 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Dec 08 09:03:39 crc kubenswrapper[4776]: I1208 09:03:39.946316 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Dec 08 09:03:39 crc kubenswrapper[4776]: I1208 09:03:39.947358 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Dec 08 09:03:40 crc kubenswrapper[4776]: I1208 09:03:40.069073 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Dec 08 09:03:40 crc kubenswrapper[4776]: I1208 09:03:40.228729 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Dec 08 09:03:40 crc kubenswrapper[4776]: I1208 09:03:40.244828 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Dec 08 09:03:40 crc kubenswrapper[4776]: I1208 09:03:40.286071 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Dec 08 09:03:40 crc kubenswrapper[4776]: I1208 09:03:40.290837 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Dec 08 09:03:40 crc kubenswrapper[4776]: I1208 09:03:40.320097 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Dec 08 09:03:40 crc kubenswrapper[4776]: I1208 09:03:40.419327 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Dec 08 09:03:40 crc kubenswrapper[4776]: I1208 09:03:40.430196 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Dec 08 09:03:40 crc kubenswrapper[4776]: I1208 09:03:40.457994 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Dec 08 09:03:40 crc kubenswrapper[4776]: I1208 09:03:40.573849 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Dec 08 09:03:40 crc kubenswrapper[4776]: I1208 09:03:40.666489 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Dec 08 09:03:40 crc kubenswrapper[4776]: I1208 09:03:40.837660 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Dec 08 09:03:40 crc kubenswrapper[4776]: I1208 09:03:40.996152 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Dec 08 09:03:41 crc kubenswrapper[4776]: I1208 09:03:41.095739 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Dec 08 09:03:41 crc kubenswrapper[4776]: I1208 09:03:41.116244 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Dec 08 09:03:41 crc kubenswrapper[4776]: I1208 09:03:41.186520 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Dec 08 09:03:41 crc kubenswrapper[4776]: I1208 09:03:41.447074 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Dec 08 09:03:41 crc kubenswrapper[4776]: I1208 09:03:41.504859 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Dec 08 09:03:41 crc kubenswrapper[4776]: I1208 09:03:41.524408 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Dec 08 09:03:41 crc kubenswrapper[4776]: I1208 09:03:41.575656 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Dec 08 09:03:41 crc kubenswrapper[4776]: I1208 09:03:41.628237 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Dec 08 09:03:41 crc kubenswrapper[4776]: I1208 09:03:41.641037 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Dec 08 09:03:41 crc kubenswrapper[4776]: I1208 09:03:41.821231 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Dec 08 09:03:41 crc kubenswrapper[4776]: I1208 09:03:41.833531 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Dec 08 09:03:41 crc kubenswrapper[4776]: I1208 09:03:41.945874 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Dec 08 09:03:42 crc kubenswrapper[4776]: I1208 09:03:42.042480 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Dec 08 09:03:42 crc kubenswrapper[4776]: I1208 09:03:42.094985 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Dec 08 09:03:42 crc kubenswrapper[4776]: I1208 09:03:42.124199 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Dec 08 09:03:42 crc kubenswrapper[4776]: I1208 09:03:42.198265 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Dec 08 09:03:42 crc kubenswrapper[4776]: I1208 09:03:42.585033 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Dec 08 09:03:42 crc kubenswrapper[4776]: I1208 09:03:42.762551 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Dec 08 09:03:42 crc kubenswrapper[4776]: I1208 09:03:42.822681 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Dec 08 09:03:43 crc kubenswrapper[4776]: I1208 09:03:43.264161 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Dec 08 09:03:43 crc kubenswrapper[4776]: I1208 09:03:43.286563 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Dec 08 09:03:43 crc kubenswrapper[4776]: I1208 09:03:43.790497 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Dec 08 09:03:43 crc kubenswrapper[4776]: I1208 09:03:43.864042 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Dec 08 09:03:43 crc kubenswrapper[4776]: I1208 09:03:43.867417 4776 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials
Dec 08 09:03:44 crc kubenswrapper[4776]: I1208 09:03:44.152270 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Dec 08 09:03:44 crc kubenswrapper[4776]: I1208 09:03:44.224419 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Dec 08 09:03:45 crc kubenswrapper[4776]: I1208 09:03:45.271012 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Dec 08 09:03:45 crc kubenswrapper[4776]: I1208 09:03:45.271304 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 08 09:03:45 crc kubenswrapper[4776]: I1208 09:03:45.342651 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Dec 08 09:03:45 crc kubenswrapper[4776]: I1208 09:03:45.342700 4776 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="dd7963498bcd26877a738bfe0ae1d4b0053ed40071a38f5da4036ddc12a0d168" exitCode=137
Dec 08 09:03:45 crc kubenswrapper[4776]: I1208 09:03:45.342797 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 08 09:03:45 crc kubenswrapper[4776]: I1208 09:03:45.342882 4776 scope.go:117] "RemoveContainer" containerID="dd7963498bcd26877a738bfe0ae1d4b0053ed40071a38f5da4036ddc12a0d168"
Dec 08 09:03:45 crc kubenswrapper[4776]: I1208 09:03:45.355962 4776 scope.go:117] "RemoveContainer" containerID="dd7963498bcd26877a738bfe0ae1d4b0053ed40071a38f5da4036ddc12a0d168"
Dec 08 09:03:45 crc kubenswrapper[4776]: E1208 09:03:45.356513 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd7963498bcd26877a738bfe0ae1d4b0053ed40071a38f5da4036ddc12a0d168\": container with ID starting with dd7963498bcd26877a738bfe0ae1d4b0053ed40071a38f5da4036ddc12a0d168 not found: ID does not exist" containerID="dd7963498bcd26877a738bfe0ae1d4b0053ed40071a38f5da4036ddc12a0d168"
Dec 08 09:03:45 crc kubenswrapper[4776]: I1208 09:03:45.356547 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd7963498bcd26877a738bfe0ae1d4b0053ed40071a38f5da4036ddc12a0d168"} err="failed to get container status \"dd7963498bcd26877a738bfe0ae1d4b0053ed40071a38f5da4036ddc12a0d168\": rpc error: code = NotFound desc = could not find container \"dd7963498bcd26877a738bfe0ae1d4b0053ed40071a38f5da4036ddc12a0d168\": container with ID starting with dd7963498bcd26877a738bfe0ae1d4b0053ed40071a38f5da4036ddc12a0d168 not found: ID does not exist"
Dec 08 09:03:45 crc kubenswrapper[4776]: I1208 09:03:45.397291 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Dec 08 09:03:45 crc kubenswrapper[4776]: I1208 09:03:45.397353 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Dec 08 09:03:45 crc kubenswrapper[4776]: I1208 09:03:45.397458 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Dec 08 09:03:45 crc kubenswrapper[4776]: I1208 09:03:45.397451 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 08 09:03:45 crc kubenswrapper[4776]: I1208 09:03:45.397513 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 08 09:03:45 crc kubenswrapper[4776]: I1208 09:03:45.397521 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Dec 08 09:03:45 crc kubenswrapper[4776]: I1208 09:03:45.397585 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 08 09:03:45 crc kubenswrapper[4776]: I1208 09:03:45.397599 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 08 09:03:45 crc kubenswrapper[4776]: I1208 09:03:45.397646 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Dec 08 09:03:45 crc kubenswrapper[4776]: I1208 09:03:45.398020 4776 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\""
Dec 08 09:03:45 crc kubenswrapper[4776]: I1208 09:03:45.398032 4776 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\""
Dec 08 09:03:45 crc kubenswrapper[4776]: I1208 09:03:45.398040 4776 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\""
Dec 08 09:03:45 crc kubenswrapper[4776]: I1208 09:03:45.398050 4776 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\""
Dec 08 09:03:45 crc kubenswrapper[4776]: I1208 09:03:45.406213 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 08 09:03:45 crc kubenswrapper[4776]: I1208 09:03:45.499276 4776 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\""
Dec 08 09:03:46 crc kubenswrapper[4776]: I1208 09:03:46.351676 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes"
Dec 08 09:04:09 crc kubenswrapper[4776]: I1208 09:04:09.255935 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ld6f6"]
Dec 08 09:04:09 crc kubenswrapper[4776]: I1208 09:04:09.256689 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-ld6f6" podUID="0da3b83e-efc3-4e6d-b876-186f430d3d77" containerName="controller-manager" containerID="cri-o://f647a946cfc0edec4d9245ad0d9fe8df27b9dcede1b57c82239afb3122c6392f" gracePeriod=30
Dec 08 09:04:09 crc kubenswrapper[4776]: I1208 09:04:09.369141 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9ll9b"]
Dec 08 09:04:09 crc kubenswrapper[4776]: I1208 09:04:09.369735 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9ll9b" podUID="c2c04832-2cf3-4401-bf58-b2b5624e5c97" containerName="route-controller-manager" containerID="cri-o://e1fd7654e1547b696356eb9e25f3eaef31623c4a66478f356828e257f4725f4f" gracePeriod=30
Dec 08 09:04:09 crc kubenswrapper[4776]: I1208 09:04:09.466842 4776 generic.go:334] "Generic (PLEG): container finished" podID="0da3b83e-efc3-4e6d-b876-186f430d3d77" containerID="f647a946cfc0edec4d9245ad0d9fe8df27b9dcede1b57c82239afb3122c6392f" exitCode=0
Dec 08 09:04:09 crc kubenswrapper[4776]: I1208 09:04:09.466883 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ld6f6" event={"ID":"0da3b83e-efc3-4e6d-b876-186f430d3d77","Type":"ContainerDied","Data":"f647a946cfc0edec4d9245ad0d9fe8df27b9dcede1b57c82239afb3122c6392f"}
Dec 08 09:04:09 crc kubenswrapper[4776]: I1208 09:04:09.736763 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9ll9b"
Dec 08 09:04:09 crc kubenswrapper[4776]: I1208 09:04:09.908539 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2c04832-2cf3-4401-bf58-b2b5624e5c97-client-ca\") pod \"c2c04832-2cf3-4401-bf58-b2b5624e5c97\" (UID: \"c2c04832-2cf3-4401-bf58-b2b5624e5c97\") "
Dec 08 09:04:09 crc kubenswrapper[4776]: I1208 09:04:09.908616 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2c04832-2cf3-4401-bf58-b2b5624e5c97-config\") pod \"c2c04832-2cf3-4401-bf58-b2b5624e5c97\" (UID: \"c2c04832-2cf3-4401-bf58-b2b5624e5c97\") "
Dec 08 09:04:09 crc kubenswrapper[4776]: I1208 09:04:09.908651 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdnk7\" (UniqueName: \"kubernetes.io/projected/c2c04832-2cf3-4401-bf58-b2b5624e5c97-kube-api-access-bdnk7\") pod \"c2c04832-2cf3-4401-bf58-b2b5624e5c97\" (UID: \"c2c04832-2cf3-4401-bf58-b2b5624e5c97\") "
Dec 08 09:04:09 crc kubenswrapper[4776]: I1208 09:04:09.908697 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2c04832-2cf3-4401-bf58-b2b5624e5c97-serving-cert\") pod \"c2c04832-2cf3-4401-bf58-b2b5624e5c97\" (UID: \"c2c04832-2cf3-4401-bf58-b2b5624e5c97\") "
Dec 08 09:04:09 crc kubenswrapper[4776]: I1208 09:04:09.909687 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2c04832-2cf3-4401-bf58-b2b5624e5c97-config" (OuterVolumeSpecName: "config") pod "c2c04832-2cf3-4401-bf58-b2b5624e5c97" (UID: "c2c04832-2cf3-4401-bf58-b2b5624e5c97"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 08 09:04:09 crc kubenswrapper[4776]: I1208 09:04:09.909884 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2c04832-2cf3-4401-bf58-b2b5624e5c97-client-ca" (OuterVolumeSpecName: "client-ca") pod "c2c04832-2cf3-4401-bf58-b2b5624e5c97" (UID: "c2c04832-2cf3-4401-bf58-b2b5624e5c97"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 08 09:04:09 crc kubenswrapper[4776]: I1208 09:04:09.921674 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2c04832-2cf3-4401-bf58-b2b5624e5c97-kube-api-access-bdnk7" (OuterVolumeSpecName: "kube-api-access-bdnk7") pod "c2c04832-2cf3-4401-bf58-b2b5624e5c97" (UID: "c2c04832-2cf3-4401-bf58-b2b5624e5c97"). InnerVolumeSpecName "kube-api-access-bdnk7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 08 09:04:09 crc kubenswrapper[4776]: I1208 09:04:09.921864 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2c04832-2cf3-4401-bf58-b2b5624e5c97-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c2c04832-2cf3-4401-bf58-b2b5624e5c97" (UID: "c2c04832-2cf3-4401-bf58-b2b5624e5c97"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 08 09:04:10 crc kubenswrapper[4776]: I1208 09:04:10.009846 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2c04832-2cf3-4401-bf58-b2b5624e5c97-config\") on node \"crc\" DevicePath \"\""
Dec 08 09:04:10 crc kubenswrapper[4776]: I1208 09:04:10.009883 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdnk7\" (UniqueName: \"kubernetes.io/projected/c2c04832-2cf3-4401-bf58-b2b5624e5c97-kube-api-access-bdnk7\") on node \"crc\" DevicePath \"\""
Dec 08 09:04:10 crc kubenswrapper[4776]: I1208 09:04:10.009893 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2c04832-2cf3-4401-bf58-b2b5624e5c97-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 08 09:04:10 crc kubenswrapper[4776]: I1208 09:04:10.009901 4776 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2c04832-2cf3-4401-bf58-b2b5624e5c97-client-ca\") on node \"crc\" DevicePath \"\""
Dec 08 09:04:10 crc kubenswrapper[4776]: I1208 09:04:10.034738 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ld6f6"
Dec 08 09:04:10 crc kubenswrapper[4776]: I1208 09:04:10.212408 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0da3b83e-efc3-4e6d-b876-186f430d3d77-config\") pod \"0da3b83e-efc3-4e6d-b876-186f430d3d77\" (UID: \"0da3b83e-efc3-4e6d-b876-186f430d3d77\") "
Dec 08 09:04:10 crc kubenswrapper[4776]: I1208 09:04:10.212728 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0da3b83e-efc3-4e6d-b876-186f430d3d77-client-ca\") pod \"0da3b83e-efc3-4e6d-b876-186f430d3d77\" (UID: \"0da3b83e-efc3-4e6d-b876-186f430d3d77\") "
Dec 08 09:04:10 crc kubenswrapper[4776]: I1208 09:04:10.212759 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0da3b83e-efc3-4e6d-b876-186f430d3d77-serving-cert\") pod \"0da3b83e-efc3-4e6d-b876-186f430d3d77\" (UID: \"0da3b83e-efc3-4e6d-b876-186f430d3d77\") "
Dec 08 09:04:10 crc kubenswrapper[4776]: I1208 09:04:10.212778 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kthb8\" (UniqueName: \"kubernetes.io/projected/0da3b83e-efc3-4e6d-b876-186f430d3d77-kube-api-access-kthb8\") pod \"0da3b83e-efc3-4e6d-b876-186f430d3d77\" (UID: \"0da3b83e-efc3-4e6d-b876-186f430d3d77\") "
Dec 08 09:04:10 crc kubenswrapper[4776]: I1208 09:04:10.212794 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0da3b83e-efc3-4e6d-b876-186f430d3d77-proxy-ca-bundles\") pod \"0da3b83e-efc3-4e6d-b876-186f430d3d77\" (UID: \"0da3b83e-efc3-4e6d-b876-186f430d3d77\") "
Dec 08 09:04:10 crc kubenswrapper[4776]: I1208 09:04:10.213454 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0da3b83e-efc3-4e6d-b876-186f430d3d77-client-ca" (OuterVolumeSpecName: "client-ca") pod "0da3b83e-efc3-4e6d-b876-186f430d3d77" (UID: "0da3b83e-efc3-4e6d-b876-186f430d3d77"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 08 09:04:10 crc kubenswrapper[4776]: I1208 09:04:10.213460 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0da3b83e-efc3-4e6d-b876-186f430d3d77-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "0da3b83e-efc3-4e6d-b876-186f430d3d77" (UID: "0da3b83e-efc3-4e6d-b876-186f430d3d77"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 08 09:04:10 crc kubenswrapper[4776]: I1208 09:04:10.213768 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0da3b83e-efc3-4e6d-b876-186f430d3d77-config" (OuterVolumeSpecName: "config") pod "0da3b83e-efc3-4e6d-b876-186f430d3d77" (UID: "0da3b83e-efc3-4e6d-b876-186f430d3d77"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 08 09:04:10 crc kubenswrapper[4776]: I1208 09:04:10.216247 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0da3b83e-efc3-4e6d-b876-186f430d3d77-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0da3b83e-efc3-4e6d-b876-186f430d3d77" (UID: "0da3b83e-efc3-4e6d-b876-186f430d3d77"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 08 09:04:10 crc kubenswrapper[4776]: I1208 09:04:10.216303 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0da3b83e-efc3-4e6d-b876-186f430d3d77-kube-api-access-kthb8" (OuterVolumeSpecName: "kube-api-access-kthb8") pod "0da3b83e-efc3-4e6d-b876-186f430d3d77" (UID: "0da3b83e-efc3-4e6d-b876-186f430d3d77"). InnerVolumeSpecName "kube-api-access-kthb8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 08 09:04:10 crc kubenswrapper[4776]: I1208 09:04:10.314559 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0da3b83e-efc3-4e6d-b876-186f430d3d77-config\") on node \"crc\" DevicePath \"\""
Dec 08 09:04:10 crc kubenswrapper[4776]: I1208 09:04:10.314634 4776 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0da3b83e-efc3-4e6d-b876-186f430d3d77-client-ca\") on node \"crc\" DevicePath \"\""
Dec 08 09:04:10 crc kubenswrapper[4776]: I1208 09:04:10.314647 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0da3b83e-efc3-4e6d-b876-186f430d3d77-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 08 09:04:10 crc kubenswrapper[4776]: I1208 09:04:10.314661 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kthb8\" (UniqueName: \"kubernetes.io/projected/0da3b83e-efc3-4e6d-b876-186f430d3d77-kube-api-access-kthb8\") on node \"crc\" DevicePath \"\""
Dec 08 09:04:10 crc kubenswrapper[4776]: I1208 09:04:10.314675 4776 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0da3b83e-efc3-4e6d-b876-186f430d3d77-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Dec 08 09:04:10 crc kubenswrapper[4776]: I1208 09:04:10.473829 4776 generic.go:334] "Generic (PLEG): container finished" podID="c2c04832-2cf3-4401-bf58-b2b5624e5c97" containerID="e1fd7654e1547b696356eb9e25f3eaef31623c4a66478f356828e257f4725f4f" exitCode=0
Dec 08 09:04:10 crc kubenswrapper[4776]: I1208 09:04:10.473897 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9ll9b" event={"ID":"c2c04832-2cf3-4401-bf58-b2b5624e5c97","Type":"ContainerDied","Data":"e1fd7654e1547b696356eb9e25f3eaef31623c4a66478f356828e257f4725f4f"}
Dec 08 09:04:10 crc kubenswrapper[4776]: I1208 09:04:10.473936 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9ll9b" event={"ID":"c2c04832-2cf3-4401-bf58-b2b5624e5c97","Type":"ContainerDied","Data":"f2fee9ed6dd97a2ab0647414da25e0599a5acb29dc562ab73b69177c19a1d7e3"}
Dec 08 09:04:10 crc kubenswrapper[4776]: I1208 09:04:10.473951 4776 scope.go:117] "RemoveContainer" containerID="e1fd7654e1547b696356eb9e25f3eaef31623c4a66478f356828e257f4725f4f"
Dec 08 09:04:10 crc kubenswrapper[4776]: I1208 09:04:10.474085 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9ll9b"
Dec 08 09:04:10 crc kubenswrapper[4776]: I1208 09:04:10.477575 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ld6f6" event={"ID":"0da3b83e-efc3-4e6d-b876-186f430d3d77","Type":"ContainerDied","Data":"0a36975cbf5d05274abe9c0b361044ed57ac60e13bdb6247efc449a2fadeab20"}
Dec 08 09:04:10 crc kubenswrapper[4776]: I1208 09:04:10.477655 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ld6f6"
Dec 08 09:04:10 crc kubenswrapper[4776]: I1208 09:04:10.497279 4776 scope.go:117] "RemoveContainer" containerID="e1fd7654e1547b696356eb9e25f3eaef31623c4a66478f356828e257f4725f4f"
Dec 08 09:04:10 crc kubenswrapper[4776]: E1208 09:04:10.497662 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1fd7654e1547b696356eb9e25f3eaef31623c4a66478f356828e257f4725f4f\": container with ID starting with e1fd7654e1547b696356eb9e25f3eaef31623c4a66478f356828e257f4725f4f not found: ID does not exist" containerID="e1fd7654e1547b696356eb9e25f3eaef31623c4a66478f356828e257f4725f4f"
Dec 08 09:04:10 crc kubenswrapper[4776]: I1208 09:04:10.497681 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9ll9b"]
Dec 08 09:04:10 crc kubenswrapper[4776]: I1208 09:04:10.497690 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1fd7654e1547b696356eb9e25f3eaef31623c4a66478f356828e257f4725f4f"} err="failed to get container status \"e1fd7654e1547b696356eb9e25f3eaef31623c4a66478f356828e257f4725f4f\": rpc error: code = NotFound desc = could not find container \"e1fd7654e1547b696356eb9e25f3eaef31623c4a66478f356828e257f4725f4f\": container with ID starting with e1fd7654e1547b696356eb9e25f3eaef31623c4a66478f356828e257f4725f4f not found: ID does not exist"
Dec 08 09:04:10 crc kubenswrapper[4776]: I1208 09:04:10.497708 4776 scope.go:117] "RemoveContainer" containerID="f647a946cfc0edec4d9245ad0d9fe8df27b9dcede1b57c82239afb3122c6392f"
Dec 08 09:04:10 crc kubenswrapper[4776]: I1208 09:04:10.505078 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9ll9b"]
Dec 08 09:04:10 crc kubenswrapper[4776]: I1208 09:04:10.511737 4776 kubelet.go:2437]
"SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ld6f6"] Dec 08 09:04:10 crc kubenswrapper[4776]: I1208 09:04:10.516611 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ld6f6"] Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.417707 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-569854d77f-5hxvp"] Dec 08 09:04:11 crc kubenswrapper[4776]: E1208 09:04:11.417938 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.417950 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 08 09:04:11 crc kubenswrapper[4776]: E1208 09:04:11.417958 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2c04832-2cf3-4401-bf58-b2b5624e5c97" containerName="route-controller-manager" Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.417965 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2c04832-2cf3-4401-bf58-b2b5624e5c97" containerName="route-controller-manager" Dec 08 09:04:11 crc kubenswrapper[4776]: E1208 09:04:11.417982 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0da3b83e-efc3-4e6d-b876-186f430d3d77" containerName="controller-manager" Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.417988 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="0da3b83e-efc3-4e6d-b876-186f430d3d77" containerName="controller-manager" Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.418086 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2c04832-2cf3-4401-bf58-b2b5624e5c97" containerName="route-controller-manager" Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.418099 4776 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.418108 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="0da3b83e-efc3-4e6d-b876-186f430d3d77" containerName="controller-manager" Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.418489 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-569854d77f-5hxvp" Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.421010 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.421237 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.421344 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.422422 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.424163 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.425921 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66649d5fd7-tvsfl"] Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.426797 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66649d5fd7-tvsfl" Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.427834 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.428021 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.430437 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.434324 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66649d5fd7-tvsfl"] Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.440763 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-569854d77f-5hxvp"] Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.444965 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.445070 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.445231 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.445328 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.445399 4776 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-route-controller-manager"/"serving-cert" Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.527646 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e616fe3-8a21-4465-8ec2-2ed32269426c-config\") pod \"controller-manager-569854d77f-5hxvp\" (UID: \"8e616fe3-8a21-4465-8ec2-2ed32269426c\") " pod="openshift-controller-manager/controller-manager-569854d77f-5hxvp" Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.527682 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8e616fe3-8a21-4465-8ec2-2ed32269426c-client-ca\") pod \"controller-manager-569854d77f-5hxvp\" (UID: \"8e616fe3-8a21-4465-8ec2-2ed32269426c\") " pod="openshift-controller-manager/controller-manager-569854d77f-5hxvp" Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.527750 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-657rm\" (UniqueName: \"kubernetes.io/projected/8e616fe3-8a21-4465-8ec2-2ed32269426c-kube-api-access-657rm\") pod \"controller-manager-569854d77f-5hxvp\" (UID: \"8e616fe3-8a21-4465-8ec2-2ed32269426c\") " pod="openshift-controller-manager/controller-manager-569854d77f-5hxvp" Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.527775 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pvft\" (UniqueName: \"kubernetes.io/projected/5370de19-5ec7-4255-922e-d1bcf45cc35b-kube-api-access-9pvft\") pod \"route-controller-manager-66649d5fd7-tvsfl\" (UID: \"5370de19-5ec7-4255-922e-d1bcf45cc35b\") " pod="openshift-route-controller-manager/route-controller-manager-66649d5fd7-tvsfl" Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.527797 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5370de19-5ec7-4255-922e-d1bcf45cc35b-client-ca\") pod \"route-controller-manager-66649d5fd7-tvsfl\" (UID: \"5370de19-5ec7-4255-922e-d1bcf45cc35b\") " pod="openshift-route-controller-manager/route-controller-manager-66649d5fd7-tvsfl" Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.527815 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5370de19-5ec7-4255-922e-d1bcf45cc35b-config\") pod \"route-controller-manager-66649d5fd7-tvsfl\" (UID: \"5370de19-5ec7-4255-922e-d1bcf45cc35b\") " pod="openshift-route-controller-manager/route-controller-manager-66649d5fd7-tvsfl" Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.527991 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8e616fe3-8a21-4465-8ec2-2ed32269426c-proxy-ca-bundles\") pod \"controller-manager-569854d77f-5hxvp\" (UID: \"8e616fe3-8a21-4465-8ec2-2ed32269426c\") " pod="openshift-controller-manager/controller-manager-569854d77f-5hxvp" Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.528097 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e616fe3-8a21-4465-8ec2-2ed32269426c-serving-cert\") pod \"controller-manager-569854d77f-5hxvp\" (UID: \"8e616fe3-8a21-4465-8ec2-2ed32269426c\") " pod="openshift-controller-manager/controller-manager-569854d77f-5hxvp" Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.528127 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5370de19-5ec7-4255-922e-d1bcf45cc35b-serving-cert\") pod \"route-controller-manager-66649d5fd7-tvsfl\" (UID: \"5370de19-5ec7-4255-922e-d1bcf45cc35b\") " 
pod="openshift-route-controller-manager/route-controller-manager-66649d5fd7-tvsfl" Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.629738 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5370de19-5ec7-4255-922e-d1bcf45cc35b-serving-cert\") pod \"route-controller-manager-66649d5fd7-tvsfl\" (UID: \"5370de19-5ec7-4255-922e-d1bcf45cc35b\") " pod="openshift-route-controller-manager/route-controller-manager-66649d5fd7-tvsfl" Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.629813 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e616fe3-8a21-4465-8ec2-2ed32269426c-config\") pod \"controller-manager-569854d77f-5hxvp\" (UID: \"8e616fe3-8a21-4465-8ec2-2ed32269426c\") " pod="openshift-controller-manager/controller-manager-569854d77f-5hxvp" Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.629833 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8e616fe3-8a21-4465-8ec2-2ed32269426c-client-ca\") pod \"controller-manager-569854d77f-5hxvp\" (UID: \"8e616fe3-8a21-4465-8ec2-2ed32269426c\") " pod="openshift-controller-manager/controller-manager-569854d77f-5hxvp" Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.629869 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-657rm\" (UniqueName: \"kubernetes.io/projected/8e616fe3-8a21-4465-8ec2-2ed32269426c-kube-api-access-657rm\") pod \"controller-manager-569854d77f-5hxvp\" (UID: \"8e616fe3-8a21-4465-8ec2-2ed32269426c\") " pod="openshift-controller-manager/controller-manager-569854d77f-5hxvp" Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.629898 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pvft\" (UniqueName: 
\"kubernetes.io/projected/5370de19-5ec7-4255-922e-d1bcf45cc35b-kube-api-access-9pvft\") pod \"route-controller-manager-66649d5fd7-tvsfl\" (UID: \"5370de19-5ec7-4255-922e-d1bcf45cc35b\") " pod="openshift-route-controller-manager/route-controller-manager-66649d5fd7-tvsfl" Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.629916 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5370de19-5ec7-4255-922e-d1bcf45cc35b-client-ca\") pod \"route-controller-manager-66649d5fd7-tvsfl\" (UID: \"5370de19-5ec7-4255-922e-d1bcf45cc35b\") " pod="openshift-route-controller-manager/route-controller-manager-66649d5fd7-tvsfl" Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.629938 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5370de19-5ec7-4255-922e-d1bcf45cc35b-config\") pod \"route-controller-manager-66649d5fd7-tvsfl\" (UID: \"5370de19-5ec7-4255-922e-d1bcf45cc35b\") " pod="openshift-route-controller-manager/route-controller-manager-66649d5fd7-tvsfl" Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.629966 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8e616fe3-8a21-4465-8ec2-2ed32269426c-proxy-ca-bundles\") pod \"controller-manager-569854d77f-5hxvp\" (UID: \"8e616fe3-8a21-4465-8ec2-2ed32269426c\") " pod="openshift-controller-manager/controller-manager-569854d77f-5hxvp" Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.630005 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e616fe3-8a21-4465-8ec2-2ed32269426c-serving-cert\") pod \"controller-manager-569854d77f-5hxvp\" (UID: \"8e616fe3-8a21-4465-8ec2-2ed32269426c\") " pod="openshift-controller-manager/controller-manager-569854d77f-5hxvp" Dec 08 09:04:11 crc 
kubenswrapper[4776]: I1208 09:04:11.631113 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5370de19-5ec7-4255-922e-d1bcf45cc35b-client-ca\") pod \"route-controller-manager-66649d5fd7-tvsfl\" (UID: \"5370de19-5ec7-4255-922e-d1bcf45cc35b\") " pod="openshift-route-controller-manager/route-controller-manager-66649d5fd7-tvsfl" Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.631454 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8e616fe3-8a21-4465-8ec2-2ed32269426c-client-ca\") pod \"controller-manager-569854d77f-5hxvp\" (UID: \"8e616fe3-8a21-4465-8ec2-2ed32269426c\") " pod="openshift-controller-manager/controller-manager-569854d77f-5hxvp" Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.631536 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e616fe3-8a21-4465-8ec2-2ed32269426c-config\") pod \"controller-manager-569854d77f-5hxvp\" (UID: \"8e616fe3-8a21-4465-8ec2-2ed32269426c\") " pod="openshift-controller-manager/controller-manager-569854d77f-5hxvp" Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.631802 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5370de19-5ec7-4255-922e-d1bcf45cc35b-config\") pod \"route-controller-manager-66649d5fd7-tvsfl\" (UID: \"5370de19-5ec7-4255-922e-d1bcf45cc35b\") " pod="openshift-route-controller-manager/route-controller-manager-66649d5fd7-tvsfl" Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.632555 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8e616fe3-8a21-4465-8ec2-2ed32269426c-proxy-ca-bundles\") pod \"controller-manager-569854d77f-5hxvp\" (UID: \"8e616fe3-8a21-4465-8ec2-2ed32269426c\") " 
pod="openshift-controller-manager/controller-manager-569854d77f-5hxvp" Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.638951 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5370de19-5ec7-4255-922e-d1bcf45cc35b-serving-cert\") pod \"route-controller-manager-66649d5fd7-tvsfl\" (UID: \"5370de19-5ec7-4255-922e-d1bcf45cc35b\") " pod="openshift-route-controller-manager/route-controller-manager-66649d5fd7-tvsfl" Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.640492 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e616fe3-8a21-4465-8ec2-2ed32269426c-serving-cert\") pod \"controller-manager-569854d77f-5hxvp\" (UID: \"8e616fe3-8a21-4465-8ec2-2ed32269426c\") " pod="openshift-controller-manager/controller-manager-569854d77f-5hxvp" Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.650306 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pvft\" (UniqueName: \"kubernetes.io/projected/5370de19-5ec7-4255-922e-d1bcf45cc35b-kube-api-access-9pvft\") pod \"route-controller-manager-66649d5fd7-tvsfl\" (UID: \"5370de19-5ec7-4255-922e-d1bcf45cc35b\") " pod="openshift-route-controller-manager/route-controller-manager-66649d5fd7-tvsfl" Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.652474 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-657rm\" (UniqueName: \"kubernetes.io/projected/8e616fe3-8a21-4465-8ec2-2ed32269426c-kube-api-access-657rm\") pod \"controller-manager-569854d77f-5hxvp\" (UID: \"8e616fe3-8a21-4465-8ec2-2ed32269426c\") " pod="openshift-controller-manager/controller-manager-569854d77f-5hxvp" Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.750612 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-569854d77f-5hxvp" Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.757586 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66649d5fd7-tvsfl" Dec 08 09:04:11 crc kubenswrapper[4776]: I1208 09:04:11.950147 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66649d5fd7-tvsfl"] Dec 08 09:04:12 crc kubenswrapper[4776]: I1208 09:04:12.009965 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-569854d77f-5hxvp"] Dec 08 09:04:12 crc kubenswrapper[4776]: W1208 09:04:12.021890 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e616fe3_8a21_4465_8ec2_2ed32269426c.slice/crio-024fec29b9b802cdf366cdec263e200d44dd2dfb587301a94f72e504a266f139 WatchSource:0}: Error finding container 024fec29b9b802cdf366cdec263e200d44dd2dfb587301a94f72e504a266f139: Status 404 returned error can't find the container with id 024fec29b9b802cdf366cdec263e200d44dd2dfb587301a94f72e504a266f139 Dec 08 09:04:12 crc kubenswrapper[4776]: I1208 09:04:12.355378 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0da3b83e-efc3-4e6d-b876-186f430d3d77" path="/var/lib/kubelet/pods/0da3b83e-efc3-4e6d-b876-186f430d3d77/volumes" Dec 08 09:04:12 crc kubenswrapper[4776]: I1208 09:04:12.356149 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2c04832-2cf3-4401-bf58-b2b5624e5c97" path="/var/lib/kubelet/pods/c2c04832-2cf3-4401-bf58-b2b5624e5c97/volumes" Dec 08 09:04:12 crc kubenswrapper[4776]: I1208 09:04:12.490759 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-569854d77f-5hxvp" 
event={"ID":"8e616fe3-8a21-4465-8ec2-2ed32269426c","Type":"ContainerStarted","Data":"ef1ab10cec87da541daeace3a75a63e515b6a56ab9584622188734ebf1f7989f"} Dec 08 09:04:12 crc kubenswrapper[4776]: I1208 09:04:12.490804 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-569854d77f-5hxvp" event={"ID":"8e616fe3-8a21-4465-8ec2-2ed32269426c","Type":"ContainerStarted","Data":"024fec29b9b802cdf366cdec263e200d44dd2dfb587301a94f72e504a266f139"} Dec 08 09:04:12 crc kubenswrapper[4776]: I1208 09:04:12.491359 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-569854d77f-5hxvp" Dec 08 09:04:12 crc kubenswrapper[4776]: I1208 09:04:12.492237 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66649d5fd7-tvsfl" event={"ID":"5370de19-5ec7-4255-922e-d1bcf45cc35b","Type":"ContainerStarted","Data":"fd9f20868cf486ba60ce125f25b17c1fa0976c75c304e3cd65ce4efb0c659279"} Dec 08 09:04:12 crc kubenswrapper[4776]: I1208 09:04:12.492266 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66649d5fd7-tvsfl" event={"ID":"5370de19-5ec7-4255-922e-d1bcf45cc35b","Type":"ContainerStarted","Data":"1848625e4dd5a75671103cf97a1f5bf0802c97a37071204955c252536549d7bf"} Dec 08 09:04:12 crc kubenswrapper[4776]: I1208 09:04:12.492642 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-66649d5fd7-tvsfl" Dec 08 09:04:12 crc kubenswrapper[4776]: I1208 09:04:12.495520 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-569854d77f-5hxvp" Dec 08 09:04:12 crc kubenswrapper[4776]: I1208 09:04:12.508613 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager/controller-manager-569854d77f-5hxvp" podStartSLOduration=3.5085941739999997 podStartE2EDuration="3.508594174s" podCreationTimestamp="2025-12-08 09:04:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:04:12.505944432 +0000 UTC m=+328.769169454" watchObservedRunningTime="2025-12-08 09:04:12.508594174 +0000 UTC m=+328.771819196" Dec 08 09:04:12 crc kubenswrapper[4776]: I1208 09:04:12.558674 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-66649d5fd7-tvsfl" podStartSLOduration=3.558655104 podStartE2EDuration="3.558655104s" podCreationTimestamp="2025-12-08 09:04:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:04:12.554405528 +0000 UTC m=+328.817630580" watchObservedRunningTime="2025-12-08 09:04:12.558655104 +0000 UTC m=+328.821880136" Dec 08 09:04:13 crc kubenswrapper[4776]: I1208 09:04:13.075149 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-66649d5fd7-tvsfl" Dec 08 09:04:41 crc kubenswrapper[4776]: I1208 09:04:41.399229 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 09:04:41 crc kubenswrapper[4776]: I1208 09:04:41.399659 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Dec 08 09:05:09 crc kubenswrapper[4776]: I1208 09:05:09.257952 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66649d5fd7-tvsfl"] Dec 08 09:05:09 crc kubenswrapper[4776]: I1208 09:05:09.258872 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-66649d5fd7-tvsfl" podUID="5370de19-5ec7-4255-922e-d1bcf45cc35b" containerName="route-controller-manager" containerID="cri-o://fd9f20868cf486ba60ce125f25b17c1fa0976c75c304e3cd65ce4efb0c659279" gracePeriod=30 Dec 08 09:05:09 crc kubenswrapper[4776]: I1208 09:05:09.789610 4776 generic.go:334] "Generic (PLEG): container finished" podID="5370de19-5ec7-4255-922e-d1bcf45cc35b" containerID="fd9f20868cf486ba60ce125f25b17c1fa0976c75c304e3cd65ce4efb0c659279" exitCode=0 Dec 08 09:05:09 crc kubenswrapper[4776]: I1208 09:05:09.789652 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66649d5fd7-tvsfl" event={"ID":"5370de19-5ec7-4255-922e-d1bcf45cc35b","Type":"ContainerDied","Data":"fd9f20868cf486ba60ce125f25b17c1fa0976c75c304e3cd65ce4efb0c659279"} Dec 08 09:05:10 crc kubenswrapper[4776]: I1208 09:05:10.360212 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66649d5fd7-tvsfl" Dec 08 09:05:10 crc kubenswrapper[4776]: I1208 09:05:10.383557 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b66f8b55f-4qgf7"] Dec 08 09:05:10 crc kubenswrapper[4776]: E1208 09:05:10.383808 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5370de19-5ec7-4255-922e-d1bcf45cc35b" containerName="route-controller-manager" Dec 08 09:05:10 crc kubenswrapper[4776]: I1208 09:05:10.383831 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="5370de19-5ec7-4255-922e-d1bcf45cc35b" containerName="route-controller-manager" Dec 08 09:05:10 crc kubenswrapper[4776]: I1208 09:05:10.383954 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="5370de19-5ec7-4255-922e-d1bcf45cc35b" containerName="route-controller-manager" Dec 08 09:05:10 crc kubenswrapper[4776]: I1208 09:05:10.387646 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b66f8b55f-4qgf7" Dec 08 09:05:10 crc kubenswrapper[4776]: I1208 09:05:10.396527 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b66f8b55f-4qgf7"] Dec 08 09:05:10 crc kubenswrapper[4776]: I1208 09:05:10.538056 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5370de19-5ec7-4255-922e-d1bcf45cc35b-serving-cert\") pod \"5370de19-5ec7-4255-922e-d1bcf45cc35b\" (UID: \"5370de19-5ec7-4255-922e-d1bcf45cc35b\") " Dec 08 09:05:10 crc kubenswrapper[4776]: I1208 09:05:10.538118 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5370de19-5ec7-4255-922e-d1bcf45cc35b-config\") pod \"5370de19-5ec7-4255-922e-d1bcf45cc35b\" (UID: \"5370de19-5ec7-4255-922e-d1bcf45cc35b\") " Dec 08 09:05:10 crc kubenswrapper[4776]: I1208 09:05:10.538190 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5370de19-5ec7-4255-922e-d1bcf45cc35b-client-ca\") pod \"5370de19-5ec7-4255-922e-d1bcf45cc35b\" (UID: \"5370de19-5ec7-4255-922e-d1bcf45cc35b\") " Dec 08 09:05:10 crc kubenswrapper[4776]: I1208 09:05:10.538243 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pvft\" (UniqueName: \"kubernetes.io/projected/5370de19-5ec7-4255-922e-d1bcf45cc35b-kube-api-access-9pvft\") pod \"5370de19-5ec7-4255-922e-d1bcf45cc35b\" (UID: \"5370de19-5ec7-4255-922e-d1bcf45cc35b\") " Dec 08 09:05:10 crc kubenswrapper[4776]: I1208 09:05:10.538443 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpmsc\" (UniqueName: \"kubernetes.io/projected/beafd5c3-369d-4603-bb17-602fe9855a1e-kube-api-access-lpmsc\") pod 
\"route-controller-manager-b66f8b55f-4qgf7\" (UID: \"beafd5c3-369d-4603-bb17-602fe9855a1e\") " pod="openshift-route-controller-manager/route-controller-manager-b66f8b55f-4qgf7" Dec 08 09:05:10 crc kubenswrapper[4776]: I1208 09:05:10.538608 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/beafd5c3-369d-4603-bb17-602fe9855a1e-config\") pod \"route-controller-manager-b66f8b55f-4qgf7\" (UID: \"beafd5c3-369d-4603-bb17-602fe9855a1e\") " pod="openshift-route-controller-manager/route-controller-manager-b66f8b55f-4qgf7" Dec 08 09:05:10 crc kubenswrapper[4776]: I1208 09:05:10.538671 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/beafd5c3-369d-4603-bb17-602fe9855a1e-client-ca\") pod \"route-controller-manager-b66f8b55f-4qgf7\" (UID: \"beafd5c3-369d-4603-bb17-602fe9855a1e\") " pod="openshift-route-controller-manager/route-controller-manager-b66f8b55f-4qgf7" Dec 08 09:05:10 crc kubenswrapper[4776]: I1208 09:05:10.538704 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/beafd5c3-369d-4603-bb17-602fe9855a1e-serving-cert\") pod \"route-controller-manager-b66f8b55f-4qgf7\" (UID: \"beafd5c3-369d-4603-bb17-602fe9855a1e\") " pod="openshift-route-controller-manager/route-controller-manager-b66f8b55f-4qgf7" Dec 08 09:05:10 crc kubenswrapper[4776]: I1208 09:05:10.539729 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5370de19-5ec7-4255-922e-d1bcf45cc35b-client-ca" (OuterVolumeSpecName: "client-ca") pod "5370de19-5ec7-4255-922e-d1bcf45cc35b" (UID: "5370de19-5ec7-4255-922e-d1bcf45cc35b"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:05:10 crc kubenswrapper[4776]: I1208 09:05:10.539892 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5370de19-5ec7-4255-922e-d1bcf45cc35b-config" (OuterVolumeSpecName: "config") pod "5370de19-5ec7-4255-922e-d1bcf45cc35b" (UID: "5370de19-5ec7-4255-922e-d1bcf45cc35b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:05:10 crc kubenswrapper[4776]: I1208 09:05:10.543688 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5370de19-5ec7-4255-922e-d1bcf45cc35b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5370de19-5ec7-4255-922e-d1bcf45cc35b" (UID: "5370de19-5ec7-4255-922e-d1bcf45cc35b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:05:10 crc kubenswrapper[4776]: I1208 09:05:10.543996 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5370de19-5ec7-4255-922e-d1bcf45cc35b-kube-api-access-9pvft" (OuterVolumeSpecName: "kube-api-access-9pvft") pod "5370de19-5ec7-4255-922e-d1bcf45cc35b" (UID: "5370de19-5ec7-4255-922e-d1bcf45cc35b"). InnerVolumeSpecName "kube-api-access-9pvft". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:05:10 crc kubenswrapper[4776]: I1208 09:05:10.640107 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/beafd5c3-369d-4603-bb17-602fe9855a1e-client-ca\") pod \"route-controller-manager-b66f8b55f-4qgf7\" (UID: \"beafd5c3-369d-4603-bb17-602fe9855a1e\") " pod="openshift-route-controller-manager/route-controller-manager-b66f8b55f-4qgf7" Dec 08 09:05:10 crc kubenswrapper[4776]: I1208 09:05:10.640202 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/beafd5c3-369d-4603-bb17-602fe9855a1e-serving-cert\") pod \"route-controller-manager-b66f8b55f-4qgf7\" (UID: \"beafd5c3-369d-4603-bb17-602fe9855a1e\") " pod="openshift-route-controller-manager/route-controller-manager-b66f8b55f-4qgf7" Dec 08 09:05:10 crc kubenswrapper[4776]: I1208 09:05:10.640235 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpmsc\" (UniqueName: \"kubernetes.io/projected/beafd5c3-369d-4603-bb17-602fe9855a1e-kube-api-access-lpmsc\") pod \"route-controller-manager-b66f8b55f-4qgf7\" (UID: \"beafd5c3-369d-4603-bb17-602fe9855a1e\") " pod="openshift-route-controller-manager/route-controller-manager-b66f8b55f-4qgf7" Dec 08 09:05:10 crc kubenswrapper[4776]: I1208 09:05:10.640298 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/beafd5c3-369d-4603-bb17-602fe9855a1e-config\") pod \"route-controller-manager-b66f8b55f-4qgf7\" (UID: \"beafd5c3-369d-4603-bb17-602fe9855a1e\") " pod="openshift-route-controller-manager/route-controller-manager-b66f8b55f-4qgf7" Dec 08 09:05:10 crc kubenswrapper[4776]: I1208 09:05:10.641485 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pvft\" (UniqueName: 
\"kubernetes.io/projected/5370de19-5ec7-4255-922e-d1bcf45cc35b-kube-api-access-9pvft\") on node \"crc\" DevicePath \"\"" Dec 08 09:05:10 crc kubenswrapper[4776]: I1208 09:05:10.641544 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5370de19-5ec7-4255-922e-d1bcf45cc35b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 08 09:05:10 crc kubenswrapper[4776]: I1208 09:05:10.641559 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5370de19-5ec7-4255-922e-d1bcf45cc35b-config\") on node \"crc\" DevicePath \"\"" Dec 08 09:05:10 crc kubenswrapper[4776]: I1208 09:05:10.641585 4776 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5370de19-5ec7-4255-922e-d1bcf45cc35b-client-ca\") on node \"crc\" DevicePath \"\"" Dec 08 09:05:10 crc kubenswrapper[4776]: I1208 09:05:10.642295 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/beafd5c3-369d-4603-bb17-602fe9855a1e-client-ca\") pod \"route-controller-manager-b66f8b55f-4qgf7\" (UID: \"beafd5c3-369d-4603-bb17-602fe9855a1e\") " pod="openshift-route-controller-manager/route-controller-manager-b66f8b55f-4qgf7" Dec 08 09:05:10 crc kubenswrapper[4776]: I1208 09:05:10.643308 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/beafd5c3-369d-4603-bb17-602fe9855a1e-config\") pod \"route-controller-manager-b66f8b55f-4qgf7\" (UID: \"beafd5c3-369d-4603-bb17-602fe9855a1e\") " pod="openshift-route-controller-manager/route-controller-manager-b66f8b55f-4qgf7" Dec 08 09:05:10 crc kubenswrapper[4776]: I1208 09:05:10.646104 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/beafd5c3-369d-4603-bb17-602fe9855a1e-serving-cert\") pod 
\"route-controller-manager-b66f8b55f-4qgf7\" (UID: \"beafd5c3-369d-4603-bb17-602fe9855a1e\") " pod="openshift-route-controller-manager/route-controller-manager-b66f8b55f-4qgf7" Dec 08 09:05:10 crc kubenswrapper[4776]: I1208 09:05:10.661283 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpmsc\" (UniqueName: \"kubernetes.io/projected/beafd5c3-369d-4603-bb17-602fe9855a1e-kube-api-access-lpmsc\") pod \"route-controller-manager-b66f8b55f-4qgf7\" (UID: \"beafd5c3-369d-4603-bb17-602fe9855a1e\") " pod="openshift-route-controller-manager/route-controller-manager-b66f8b55f-4qgf7" Dec 08 09:05:10 crc kubenswrapper[4776]: I1208 09:05:10.714845 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b66f8b55f-4qgf7" Dec 08 09:05:10 crc kubenswrapper[4776]: I1208 09:05:10.802697 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66649d5fd7-tvsfl" event={"ID":"5370de19-5ec7-4255-922e-d1bcf45cc35b","Type":"ContainerDied","Data":"1848625e4dd5a75671103cf97a1f5bf0802c97a37071204955c252536549d7bf"} Dec 08 09:05:10 crc kubenswrapper[4776]: I1208 09:05:10.803030 4776 scope.go:117] "RemoveContainer" containerID="fd9f20868cf486ba60ce125f25b17c1fa0976c75c304e3cd65ce4efb0c659279" Dec 08 09:05:10 crc kubenswrapper[4776]: I1208 09:05:10.803187 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66649d5fd7-tvsfl" Dec 08 09:05:10 crc kubenswrapper[4776]: I1208 09:05:10.829700 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66649d5fd7-tvsfl"] Dec 08 09:05:10 crc kubenswrapper[4776]: I1208 09:05:10.833534 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66649d5fd7-tvsfl"] Dec 08 09:05:11 crc kubenswrapper[4776]: I1208 09:05:11.097979 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b66f8b55f-4qgf7"] Dec 08 09:05:11 crc kubenswrapper[4776]: I1208 09:05:11.398901 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 09:05:11 crc kubenswrapper[4776]: I1208 09:05:11.398975 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 09:05:11 crc kubenswrapper[4776]: I1208 09:05:11.810084 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b66f8b55f-4qgf7" event={"ID":"beafd5c3-369d-4603-bb17-602fe9855a1e","Type":"ContainerStarted","Data":"d205b9a1f3fc665db8545b0b5c453dc1d40ddb5c448e9a6a22feb1e451c72171"} Dec 08 09:05:12 crc kubenswrapper[4776]: I1208 09:05:12.356769 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5370de19-5ec7-4255-922e-d1bcf45cc35b" 
path="/var/lib/kubelet/pods/5370de19-5ec7-4255-922e-d1bcf45cc35b/volumes" Dec 08 09:05:12 crc kubenswrapper[4776]: I1208 09:05:12.816785 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b66f8b55f-4qgf7" event={"ID":"beafd5c3-369d-4603-bb17-602fe9855a1e","Type":"ContainerStarted","Data":"e1abc16e9ea64f4891eeeebe7876d1246ed49d24db95e76dc39a384ac9bb760d"} Dec 08 09:05:12 crc kubenswrapper[4776]: I1208 09:05:12.817161 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-b66f8b55f-4qgf7" Dec 08 09:05:12 crc kubenswrapper[4776]: I1208 09:05:12.822968 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-b66f8b55f-4qgf7" Dec 08 09:05:12 crc kubenswrapper[4776]: I1208 09:05:12.832070 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-b66f8b55f-4qgf7" podStartSLOduration=3.832055428 podStartE2EDuration="3.832055428s" podCreationTimestamp="2025-12-08 09:05:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:05:12.830744323 +0000 UTC m=+389.093969355" watchObservedRunningTime="2025-12-08 09:05:12.832055428 +0000 UTC m=+389.095280450" Dec 08 09:05:13 crc kubenswrapper[4776]: I1208 09:05:13.498437 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-f9l58"] Dec 08 09:05:13 crc kubenswrapper[4776]: I1208 09:05:13.499073 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-f9l58" Dec 08 09:05:13 crc kubenswrapper[4776]: I1208 09:05:13.509343 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-f9l58"] Dec 08 09:05:13 crc kubenswrapper[4776]: I1208 09:05:13.680667 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfckr\" (UniqueName: \"kubernetes.io/projected/ebc6e8ea-644d-441b-a46a-a6e75eb667b1-kube-api-access-jfckr\") pod \"image-registry-66df7c8f76-f9l58\" (UID: \"ebc6e8ea-644d-441b-a46a-a6e75eb667b1\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9l58" Dec 08 09:05:13 crc kubenswrapper[4776]: I1208 09:05:13.680956 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-f9l58\" (UID: \"ebc6e8ea-644d-441b-a46a-a6e75eb667b1\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9l58" Dec 08 09:05:13 crc kubenswrapper[4776]: I1208 09:05:13.681000 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ebc6e8ea-644d-441b-a46a-a6e75eb667b1-ca-trust-extracted\") pod \"image-registry-66df7c8f76-f9l58\" (UID: \"ebc6e8ea-644d-441b-a46a-a6e75eb667b1\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9l58" Dec 08 09:05:13 crc kubenswrapper[4776]: I1208 09:05:13.681051 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ebc6e8ea-644d-441b-a46a-a6e75eb667b1-bound-sa-token\") pod \"image-registry-66df7c8f76-f9l58\" (UID: \"ebc6e8ea-644d-441b-a46a-a6e75eb667b1\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-f9l58" Dec 08 09:05:13 crc kubenswrapper[4776]: I1208 09:05:13.681153 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ebc6e8ea-644d-441b-a46a-a6e75eb667b1-trusted-ca\") pod \"image-registry-66df7c8f76-f9l58\" (UID: \"ebc6e8ea-644d-441b-a46a-a6e75eb667b1\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9l58" Dec 08 09:05:13 crc kubenswrapper[4776]: I1208 09:05:13.681209 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ebc6e8ea-644d-441b-a46a-a6e75eb667b1-registry-certificates\") pod \"image-registry-66df7c8f76-f9l58\" (UID: \"ebc6e8ea-644d-441b-a46a-a6e75eb667b1\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9l58" Dec 08 09:05:13 crc kubenswrapper[4776]: I1208 09:05:13.681273 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ebc6e8ea-644d-441b-a46a-a6e75eb667b1-registry-tls\") pod \"image-registry-66df7c8f76-f9l58\" (UID: \"ebc6e8ea-644d-441b-a46a-a6e75eb667b1\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9l58" Dec 08 09:05:13 crc kubenswrapper[4776]: I1208 09:05:13.681291 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ebc6e8ea-644d-441b-a46a-a6e75eb667b1-installation-pull-secrets\") pod \"image-registry-66df7c8f76-f9l58\" (UID: \"ebc6e8ea-644d-441b-a46a-a6e75eb667b1\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9l58" Dec 08 09:05:13 crc kubenswrapper[4776]: I1208 09:05:13.699985 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-f9l58\" (UID: \"ebc6e8ea-644d-441b-a46a-a6e75eb667b1\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9l58" Dec 08 09:05:13 crc kubenswrapper[4776]: I1208 09:05:13.782404 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ebc6e8ea-644d-441b-a46a-a6e75eb667b1-bound-sa-token\") pod \"image-registry-66df7c8f76-f9l58\" (UID: \"ebc6e8ea-644d-441b-a46a-a6e75eb667b1\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9l58" Dec 08 09:05:13 crc kubenswrapper[4776]: I1208 09:05:13.782491 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ebc6e8ea-644d-441b-a46a-a6e75eb667b1-trusted-ca\") pod \"image-registry-66df7c8f76-f9l58\" (UID: \"ebc6e8ea-644d-441b-a46a-a6e75eb667b1\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9l58" Dec 08 09:05:13 crc kubenswrapper[4776]: I1208 09:05:13.782522 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ebc6e8ea-644d-441b-a46a-a6e75eb667b1-registry-certificates\") pod \"image-registry-66df7c8f76-f9l58\" (UID: \"ebc6e8ea-644d-441b-a46a-a6e75eb667b1\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9l58" Dec 08 09:05:13 crc kubenswrapper[4776]: I1208 09:05:13.782560 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ebc6e8ea-644d-441b-a46a-a6e75eb667b1-registry-tls\") pod \"image-registry-66df7c8f76-f9l58\" (UID: \"ebc6e8ea-644d-441b-a46a-a6e75eb667b1\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9l58" Dec 08 09:05:13 crc kubenswrapper[4776]: I1208 09:05:13.782584 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ebc6e8ea-644d-441b-a46a-a6e75eb667b1-installation-pull-secrets\") pod \"image-registry-66df7c8f76-f9l58\" (UID: \"ebc6e8ea-644d-441b-a46a-a6e75eb667b1\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9l58" Dec 08 09:05:13 crc kubenswrapper[4776]: I1208 09:05:13.782609 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfckr\" (UniqueName: \"kubernetes.io/projected/ebc6e8ea-644d-441b-a46a-a6e75eb667b1-kube-api-access-jfckr\") pod \"image-registry-66df7c8f76-f9l58\" (UID: \"ebc6e8ea-644d-441b-a46a-a6e75eb667b1\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9l58" Dec 08 09:05:13 crc kubenswrapper[4776]: I1208 09:05:13.782656 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ebc6e8ea-644d-441b-a46a-a6e75eb667b1-ca-trust-extracted\") pod \"image-registry-66df7c8f76-f9l58\" (UID: \"ebc6e8ea-644d-441b-a46a-a6e75eb667b1\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9l58" Dec 08 09:05:13 crc kubenswrapper[4776]: I1208 09:05:13.783156 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ebc6e8ea-644d-441b-a46a-a6e75eb667b1-ca-trust-extracted\") pod \"image-registry-66df7c8f76-f9l58\" (UID: \"ebc6e8ea-644d-441b-a46a-a6e75eb667b1\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9l58" Dec 08 09:05:13 crc kubenswrapper[4776]: I1208 09:05:13.783697 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ebc6e8ea-644d-441b-a46a-a6e75eb667b1-trusted-ca\") pod \"image-registry-66df7c8f76-f9l58\" (UID: \"ebc6e8ea-644d-441b-a46a-a6e75eb667b1\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9l58" Dec 08 09:05:13 
crc kubenswrapper[4776]: I1208 09:05:13.784934 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ebc6e8ea-644d-441b-a46a-a6e75eb667b1-registry-certificates\") pod \"image-registry-66df7c8f76-f9l58\" (UID: \"ebc6e8ea-644d-441b-a46a-a6e75eb667b1\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9l58" Dec 08 09:05:13 crc kubenswrapper[4776]: I1208 09:05:13.788606 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ebc6e8ea-644d-441b-a46a-a6e75eb667b1-registry-tls\") pod \"image-registry-66df7c8f76-f9l58\" (UID: \"ebc6e8ea-644d-441b-a46a-a6e75eb667b1\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9l58" Dec 08 09:05:13 crc kubenswrapper[4776]: I1208 09:05:13.788975 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ebc6e8ea-644d-441b-a46a-a6e75eb667b1-installation-pull-secrets\") pod \"image-registry-66df7c8f76-f9l58\" (UID: \"ebc6e8ea-644d-441b-a46a-a6e75eb667b1\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9l58" Dec 08 09:05:13 crc kubenswrapper[4776]: I1208 09:05:13.799037 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfckr\" (UniqueName: \"kubernetes.io/projected/ebc6e8ea-644d-441b-a46a-a6e75eb667b1-kube-api-access-jfckr\") pod \"image-registry-66df7c8f76-f9l58\" (UID: \"ebc6e8ea-644d-441b-a46a-a6e75eb667b1\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9l58" Dec 08 09:05:13 crc kubenswrapper[4776]: I1208 09:05:13.799503 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ebc6e8ea-644d-441b-a46a-a6e75eb667b1-bound-sa-token\") pod \"image-registry-66df7c8f76-f9l58\" (UID: \"ebc6e8ea-644d-441b-a46a-a6e75eb667b1\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-f9l58" Dec 08 09:05:13 crc kubenswrapper[4776]: I1208 09:05:13.815707 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-f9l58" Dec 08 09:05:14 crc kubenswrapper[4776]: I1208 09:05:14.265806 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-f9l58"] Dec 08 09:05:14 crc kubenswrapper[4776]: I1208 09:05:14.827473 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-f9l58" event={"ID":"ebc6e8ea-644d-441b-a46a-a6e75eb667b1","Type":"ContainerStarted","Data":"8f072d77ea4d33033b6f80216a31ce22ade0ba8e7f9b7754e632bb808d08cb0d"} Dec 08 09:05:14 crc kubenswrapper[4776]: I1208 09:05:14.827803 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-f9l58" event={"ID":"ebc6e8ea-644d-441b-a46a-a6e75eb667b1","Type":"ContainerStarted","Data":"2c6b7b241daa3d5ab966af674377c004cc58d56e9626b2339143c4f174135aad"} Dec 08 09:05:14 crc kubenswrapper[4776]: I1208 09:05:14.845661 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-f9l58" podStartSLOduration=1.845641455 podStartE2EDuration="1.845641455s" podCreationTimestamp="2025-12-08 09:05:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:05:14.841907373 +0000 UTC m=+391.105132415" watchObservedRunningTime="2025-12-08 09:05:14.845641455 +0000 UTC m=+391.108866467" Dec 08 09:05:15 crc kubenswrapper[4776]: I1208 09:05:15.833127 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-f9l58" Dec 08 09:05:29 crc kubenswrapper[4776]: I1208 09:05:29.283322 4776 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-controller-manager/controller-manager-569854d77f-5hxvp"] Dec 08 09:05:29 crc kubenswrapper[4776]: I1208 09:05:29.284226 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-569854d77f-5hxvp" podUID="8e616fe3-8a21-4465-8ec2-2ed32269426c" containerName="controller-manager" containerID="cri-o://ef1ab10cec87da541daeace3a75a63e515b6a56ab9584622188734ebf1f7989f" gracePeriod=30 Dec 08 09:05:31 crc kubenswrapper[4776]: I1208 09:05:31.751242 4776 patch_prober.go:28] interesting pod/controller-manager-569854d77f-5hxvp container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.57:8443/healthz\": dial tcp 10.217.0.57:8443: connect: connection refused" start-of-body= Dec 08 09:05:31 crc kubenswrapper[4776]: I1208 09:05:31.751316 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-569854d77f-5hxvp" podUID="8e616fe3-8a21-4465-8ec2-2ed32269426c" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.57:8443/healthz\": dial tcp 10.217.0.57:8443: connect: connection refused" Dec 08 09:05:32 crc kubenswrapper[4776]: I1208 09:05:32.950606 4776 generic.go:334] "Generic (PLEG): container finished" podID="8e616fe3-8a21-4465-8ec2-2ed32269426c" containerID="ef1ab10cec87da541daeace3a75a63e515b6a56ab9584622188734ebf1f7989f" exitCode=0 Dec 08 09:05:32 crc kubenswrapper[4776]: I1208 09:05:32.950648 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-569854d77f-5hxvp" event={"ID":"8e616fe3-8a21-4465-8ec2-2ed32269426c","Type":"ContainerDied","Data":"ef1ab10cec87da541daeace3a75a63e515b6a56ab9584622188734ebf1f7989f"} Dec 08 09:05:33 crc kubenswrapper[4776]: I1208 09:05:33.613033 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-569854d77f-5hxvp" Dec 08 09:05:33 crc kubenswrapper[4776]: I1208 09:05:33.644284 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7bf4cf4665-5n2qr"] Dec 08 09:05:33 crc kubenswrapper[4776]: E1208 09:05:33.644590 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e616fe3-8a21-4465-8ec2-2ed32269426c" containerName="controller-manager" Dec 08 09:05:33 crc kubenswrapper[4776]: I1208 09:05:33.644604 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e616fe3-8a21-4465-8ec2-2ed32269426c" containerName="controller-manager" Dec 08 09:05:33 crc kubenswrapper[4776]: I1208 09:05:33.644715 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e616fe3-8a21-4465-8ec2-2ed32269426c" containerName="controller-manager" Dec 08 09:05:33 crc kubenswrapper[4776]: I1208 09:05:33.645098 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7bf4cf4665-5n2qr" Dec 08 09:05:33 crc kubenswrapper[4776]: I1208 09:05:33.647449 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7bf4cf4665-5n2qr"] Dec 08 09:05:33 crc kubenswrapper[4776]: I1208 09:05:33.783092 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-657rm\" (UniqueName: \"kubernetes.io/projected/8e616fe3-8a21-4465-8ec2-2ed32269426c-kube-api-access-657rm\") pod \"8e616fe3-8a21-4465-8ec2-2ed32269426c\" (UID: \"8e616fe3-8a21-4465-8ec2-2ed32269426c\") " Dec 08 09:05:33 crc kubenswrapper[4776]: I1208 09:05:33.783151 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8e616fe3-8a21-4465-8ec2-2ed32269426c-client-ca\") pod \"8e616fe3-8a21-4465-8ec2-2ed32269426c\" (UID: \"8e616fe3-8a21-4465-8ec2-2ed32269426c\") " Dec 08 09:05:33 crc kubenswrapper[4776]: I1208 09:05:33.783194 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e616fe3-8a21-4465-8ec2-2ed32269426c-config\") pod \"8e616fe3-8a21-4465-8ec2-2ed32269426c\" (UID: \"8e616fe3-8a21-4465-8ec2-2ed32269426c\") " Dec 08 09:05:33 crc kubenswrapper[4776]: I1208 09:05:33.783229 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e616fe3-8a21-4465-8ec2-2ed32269426c-serving-cert\") pod \"8e616fe3-8a21-4465-8ec2-2ed32269426c\" (UID: \"8e616fe3-8a21-4465-8ec2-2ed32269426c\") " Dec 08 09:05:33 crc kubenswrapper[4776]: I1208 09:05:33.783246 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8e616fe3-8a21-4465-8ec2-2ed32269426c-proxy-ca-bundles\") pod \"8e616fe3-8a21-4465-8ec2-2ed32269426c\" (UID: 
\"8e616fe3-8a21-4465-8ec2-2ed32269426c\") " Dec 08 09:05:33 crc kubenswrapper[4776]: I1208 09:05:33.783403 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/500f3b13-d526-4ab8-872a-a8fe78aecc16-config\") pod \"controller-manager-7bf4cf4665-5n2qr\" (UID: \"500f3b13-d526-4ab8-872a-a8fe78aecc16\") " pod="openshift-controller-manager/controller-manager-7bf4cf4665-5n2qr" Dec 08 09:05:33 crc kubenswrapper[4776]: I1208 09:05:33.783450 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tnzp\" (UniqueName: \"kubernetes.io/projected/500f3b13-d526-4ab8-872a-a8fe78aecc16-kube-api-access-5tnzp\") pod \"controller-manager-7bf4cf4665-5n2qr\" (UID: \"500f3b13-d526-4ab8-872a-a8fe78aecc16\") " pod="openshift-controller-manager/controller-manager-7bf4cf4665-5n2qr" Dec 08 09:05:33 crc kubenswrapper[4776]: I1208 09:05:33.783476 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/500f3b13-d526-4ab8-872a-a8fe78aecc16-client-ca\") pod \"controller-manager-7bf4cf4665-5n2qr\" (UID: \"500f3b13-d526-4ab8-872a-a8fe78aecc16\") " pod="openshift-controller-manager/controller-manager-7bf4cf4665-5n2qr" Dec 08 09:05:33 crc kubenswrapper[4776]: I1208 09:05:33.783515 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/500f3b13-d526-4ab8-872a-a8fe78aecc16-proxy-ca-bundles\") pod \"controller-manager-7bf4cf4665-5n2qr\" (UID: \"500f3b13-d526-4ab8-872a-a8fe78aecc16\") " pod="openshift-controller-manager/controller-manager-7bf4cf4665-5n2qr" Dec 08 09:05:33 crc kubenswrapper[4776]: I1208 09:05:33.783533 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/500f3b13-d526-4ab8-872a-a8fe78aecc16-serving-cert\") pod \"controller-manager-7bf4cf4665-5n2qr\" (UID: \"500f3b13-d526-4ab8-872a-a8fe78aecc16\") " pod="openshift-controller-manager/controller-manager-7bf4cf4665-5n2qr"
Dec 08 09:05:33 crc kubenswrapper[4776]: I1208 09:05:33.784581 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e616fe3-8a21-4465-8ec2-2ed32269426c-client-ca" (OuterVolumeSpecName: "client-ca") pod "8e616fe3-8a21-4465-8ec2-2ed32269426c" (UID: "8e616fe3-8a21-4465-8ec2-2ed32269426c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 08 09:05:33 crc kubenswrapper[4776]: I1208 09:05:33.784621 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e616fe3-8a21-4465-8ec2-2ed32269426c-config" (OuterVolumeSpecName: "config") pod "8e616fe3-8a21-4465-8ec2-2ed32269426c" (UID: "8e616fe3-8a21-4465-8ec2-2ed32269426c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 08 09:05:33 crc kubenswrapper[4776]: I1208 09:05:33.785208 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e616fe3-8a21-4465-8ec2-2ed32269426c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "8e616fe3-8a21-4465-8ec2-2ed32269426c" (UID: "8e616fe3-8a21-4465-8ec2-2ed32269426c"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 08 09:05:33 crc kubenswrapper[4776]: I1208 09:05:33.789294 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e616fe3-8a21-4465-8ec2-2ed32269426c-kube-api-access-657rm" (OuterVolumeSpecName: "kube-api-access-657rm") pod "8e616fe3-8a21-4465-8ec2-2ed32269426c" (UID: "8e616fe3-8a21-4465-8ec2-2ed32269426c"). InnerVolumeSpecName "kube-api-access-657rm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 08 09:05:33 crc kubenswrapper[4776]: I1208 09:05:33.789891 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e616fe3-8a21-4465-8ec2-2ed32269426c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8e616fe3-8a21-4465-8ec2-2ed32269426c" (UID: "8e616fe3-8a21-4465-8ec2-2ed32269426c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 08 09:05:33 crc kubenswrapper[4776]: I1208 09:05:33.821903 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-f9l58"
Dec 08 09:05:33 crc kubenswrapper[4776]: I1208 09:05:33.867383 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-p5sqv"]
Dec 08 09:05:33 crc kubenswrapper[4776]: I1208 09:05:33.884831 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tnzp\" (UniqueName: \"kubernetes.io/projected/500f3b13-d526-4ab8-872a-a8fe78aecc16-kube-api-access-5tnzp\") pod \"controller-manager-7bf4cf4665-5n2qr\" (UID: \"500f3b13-d526-4ab8-872a-a8fe78aecc16\") " pod="openshift-controller-manager/controller-manager-7bf4cf4665-5n2qr"
Dec 08 09:05:33 crc kubenswrapper[4776]: I1208 09:05:33.885188 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/500f3b13-d526-4ab8-872a-a8fe78aecc16-client-ca\") pod \"controller-manager-7bf4cf4665-5n2qr\" (UID: \"500f3b13-d526-4ab8-872a-a8fe78aecc16\") " pod="openshift-controller-manager/controller-manager-7bf4cf4665-5n2qr"
Dec 08 09:05:33 crc kubenswrapper[4776]: I1208 09:05:33.885269 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/500f3b13-d526-4ab8-872a-a8fe78aecc16-proxy-ca-bundles\") pod \"controller-manager-7bf4cf4665-5n2qr\" (UID: \"500f3b13-d526-4ab8-872a-a8fe78aecc16\") " pod="openshift-controller-manager/controller-manager-7bf4cf4665-5n2qr"
Dec 08 09:05:33 crc kubenswrapper[4776]: I1208 09:05:33.885305 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/500f3b13-d526-4ab8-872a-a8fe78aecc16-serving-cert\") pod \"controller-manager-7bf4cf4665-5n2qr\" (UID: \"500f3b13-d526-4ab8-872a-a8fe78aecc16\") " pod="openshift-controller-manager/controller-manager-7bf4cf4665-5n2qr"
Dec 08 09:05:33 crc kubenswrapper[4776]: I1208 09:05:33.885396 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/500f3b13-d526-4ab8-872a-a8fe78aecc16-config\") pod \"controller-manager-7bf4cf4665-5n2qr\" (UID: \"500f3b13-d526-4ab8-872a-a8fe78aecc16\") " pod="openshift-controller-manager/controller-manager-7bf4cf4665-5n2qr"
Dec 08 09:05:33 crc kubenswrapper[4776]: I1208 09:05:33.885483 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-657rm\" (UniqueName: \"kubernetes.io/projected/8e616fe3-8a21-4465-8ec2-2ed32269426c-kube-api-access-657rm\") on node \"crc\" DevicePath \"\""
Dec 08 09:05:33 crc kubenswrapper[4776]: I1208 09:05:33.885498 4776 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8e616fe3-8a21-4465-8ec2-2ed32269426c-client-ca\") on node \"crc\" DevicePath \"\""
Dec 08 09:05:33 crc kubenswrapper[4776]: I1208 09:05:33.885510 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e616fe3-8a21-4465-8ec2-2ed32269426c-config\") on node \"crc\" DevicePath \"\""
Dec 08 09:05:33 crc kubenswrapper[4776]: I1208 09:05:33.885520 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e616fe3-8a21-4465-8ec2-2ed32269426c-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 08 09:05:33 crc kubenswrapper[4776]: I1208 09:05:33.885531 4776 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8e616fe3-8a21-4465-8ec2-2ed32269426c-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Dec 08 09:05:33 crc kubenswrapper[4776]: I1208 09:05:33.886871 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/500f3b13-d526-4ab8-872a-a8fe78aecc16-config\") pod \"controller-manager-7bf4cf4665-5n2qr\" (UID: \"500f3b13-d526-4ab8-872a-a8fe78aecc16\") " pod="openshift-controller-manager/controller-manager-7bf4cf4665-5n2qr"
Dec 08 09:05:33 crc kubenswrapper[4776]: I1208 09:05:33.888986 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/500f3b13-d526-4ab8-872a-a8fe78aecc16-client-ca\") pod \"controller-manager-7bf4cf4665-5n2qr\" (UID: \"500f3b13-d526-4ab8-872a-a8fe78aecc16\") " pod="openshift-controller-manager/controller-manager-7bf4cf4665-5n2qr"
Dec 08 09:05:33 crc kubenswrapper[4776]: I1208 09:05:33.889204 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/500f3b13-d526-4ab8-872a-a8fe78aecc16-proxy-ca-bundles\") pod \"controller-manager-7bf4cf4665-5n2qr\" (UID: \"500f3b13-d526-4ab8-872a-a8fe78aecc16\") " pod="openshift-controller-manager/controller-manager-7bf4cf4665-5n2qr"
Dec 08 09:05:33 crc kubenswrapper[4776]: I1208 09:05:33.889373 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/500f3b13-d526-4ab8-872a-a8fe78aecc16-serving-cert\") pod \"controller-manager-7bf4cf4665-5n2qr\" (UID: \"500f3b13-d526-4ab8-872a-a8fe78aecc16\") " pod="openshift-controller-manager/controller-manager-7bf4cf4665-5n2qr"
Dec 08 09:05:33 crc kubenswrapper[4776]: I1208 09:05:33.901781 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tnzp\" (UniqueName: \"kubernetes.io/projected/500f3b13-d526-4ab8-872a-a8fe78aecc16-kube-api-access-5tnzp\") pod \"controller-manager-7bf4cf4665-5n2qr\" (UID: \"500f3b13-d526-4ab8-872a-a8fe78aecc16\") " pod="openshift-controller-manager/controller-manager-7bf4cf4665-5n2qr"
Dec 08 09:05:33 crc kubenswrapper[4776]: I1208 09:05:33.956054 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-569854d77f-5hxvp" event={"ID":"8e616fe3-8a21-4465-8ec2-2ed32269426c","Type":"ContainerDied","Data":"024fec29b9b802cdf366cdec263e200d44dd2dfb587301a94f72e504a266f139"}
Dec 08 09:05:33 crc kubenswrapper[4776]: I1208 09:05:33.956099 4776 scope.go:117] "RemoveContainer" containerID="ef1ab10cec87da541daeace3a75a63e515b6a56ab9584622188734ebf1f7989f"
Dec 08 09:05:33 crc kubenswrapper[4776]: I1208 09:05:33.956203 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-569854d77f-5hxvp"
Dec 08 09:05:33 crc kubenswrapper[4776]: I1208 09:05:33.975086 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7bf4cf4665-5n2qr"
Dec 08 09:05:33 crc kubenswrapper[4776]: I1208 09:05:33.995622 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-569854d77f-5hxvp"]
Dec 08 09:05:33 crc kubenswrapper[4776]: I1208 09:05:33.998480 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-569854d77f-5hxvp"]
Dec 08 09:05:34 crc kubenswrapper[4776]: I1208 09:05:34.355362 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e616fe3-8a21-4465-8ec2-2ed32269426c" path="/var/lib/kubelet/pods/8e616fe3-8a21-4465-8ec2-2ed32269426c/volumes"
Dec 08 09:05:34 crc kubenswrapper[4776]: I1208 09:05:34.356193 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7bf4cf4665-5n2qr"]
Dec 08 09:05:34 crc kubenswrapper[4776]: W1208 09:05:34.356377 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod500f3b13_d526_4ab8_872a_a8fe78aecc16.slice/crio-ef1da93ba4a9d6c51815196e08d63fff797ef86ef69df654cda44a8dc07671b0 WatchSource:0}: Error finding container ef1da93ba4a9d6c51815196e08d63fff797ef86ef69df654cda44a8dc07671b0: Status 404 returned error can't find the container with id ef1da93ba4a9d6c51815196e08d63fff797ef86ef69df654cda44a8dc07671b0
Dec 08 09:05:34 crc kubenswrapper[4776]: I1208 09:05:34.962965 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7bf4cf4665-5n2qr" event={"ID":"500f3b13-d526-4ab8-872a-a8fe78aecc16","Type":"ContainerStarted","Data":"ef1da93ba4a9d6c51815196e08d63fff797ef86ef69df654cda44a8dc07671b0"}
Dec 08 09:05:35 crc kubenswrapper[4776]: I1208 09:05:35.969763 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7bf4cf4665-5n2qr" event={"ID":"500f3b13-d526-4ab8-872a-a8fe78aecc16","Type":"ContainerStarted","Data":"1fa3105d98ab5dd011922ddf97b807e821bc52c1392371f763cd7dedeb075a15"}
Dec 08 09:05:35 crc kubenswrapper[4776]: I1208 09:05:35.970046 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7bf4cf4665-5n2qr"
Dec 08 09:05:35 crc kubenswrapper[4776]: I1208 09:05:35.974961 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7bf4cf4665-5n2qr"
Dec 08 09:05:35 crc kubenswrapper[4776]: I1208 09:05:35.989338 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7bf4cf4665-5n2qr" podStartSLOduration=6.989317952 podStartE2EDuration="6.989317952s" podCreationTimestamp="2025-12-08 09:05:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:05:35.98816538 +0000 UTC m=+412.251390412" watchObservedRunningTime="2025-12-08 09:05:35.989317952 +0000 UTC m=+412.252543004"
Dec 08 09:05:41 crc kubenswrapper[4776]: I1208 09:05:41.400918 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 08 09:05:41 crc kubenswrapper[4776]: I1208 09:05:41.401923 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 08 09:05:41 crc kubenswrapper[4776]: I1208 09:05:41.402008 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn"
Dec 08 09:05:41 crc kubenswrapper[4776]: I1208 09:05:41.404117 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3686f95c2750ae2f6fecf0ef1b9e49c85b6866553ae81497ae9ec17dd913386b"} pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 08 09:05:41 crc kubenswrapper[4776]: I1208 09:05:41.404559 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" containerID="cri-o://3686f95c2750ae2f6fecf0ef1b9e49c85b6866553ae81497ae9ec17dd913386b" gracePeriod=600
Dec 08 09:05:45 crc kubenswrapper[4776]: I1208 09:05:45.077009 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-swlvb"]
Dec 08 09:05:45 crc kubenswrapper[4776]: I1208 09:05:45.082665 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-swlvb" podUID="b6a339c1-f955-4a08-bba1-0df39a886324" containerName="registry-server" containerID="cri-o://baaa60cadffadcf80c904850964a4f0d880a28ef8278ad5161844973525e5b09" gracePeriod=30
Dec 08 09:05:45 crc kubenswrapper[4776]: I1208 09:05:45.086241 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-z2qsw"]
Dec 08 09:05:45 crc kubenswrapper[4776]: I1208 09:05:45.086606 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-z2qsw" podUID="9f1cf0fc-eed0-4fba-8b89-b29bf78cadac" containerName="registry-server" containerID="cri-o://9636a56e89e398e2143e9da6b6d1d18b4704c2e92b280289711701f9f8cf8881" gracePeriod=30
Dec 08 09:05:45 crc kubenswrapper[4776]: I1208 09:05:45.109945 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jf2nl"]
Dec 08 09:05:45 crc kubenswrapper[4776]: I1208 09:05:45.110315 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-jf2nl" podUID="50e1cbc5-727f-42ca-881c-fdd0b07ca739" containerName="marketplace-operator" containerID="cri-o://726cc14fd07cdccd06b06e7db795eb13bf7b9ce8069a7c9f2f0bcdefd1c5a77c" gracePeriod=30
Dec 08 09:05:45 crc kubenswrapper[4776]: I1208 09:05:45.124435 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-82j27"]
Dec 08 09:05:45 crc kubenswrapper[4776]: I1208 09:05:45.124726 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-82j27" podUID="0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9" containerName="registry-server" containerID="cri-o://920717a9fa8482ea3eb51645c7938b76c2f77c2ff318e53fd9fa1d5fcc81ff83" gracePeriod=30
Dec 08 09:05:45 crc kubenswrapper[4776]: I1208 09:05:45.127817 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fbmf6"]
Dec 08 09:05:45 crc kubenswrapper[4776]: I1208 09:05:45.128167 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fbmf6" podUID="bde03b49-eb1e-4941-b49e-e361cb8d83f4" containerName="registry-server" containerID="cri-o://840533b00ccbf019c1b1ad45d63acac9dc28cdd233141ef50008afcdc65278f7" gracePeriod=30
Dec 08 09:05:45 crc kubenswrapper[4776]: I1208 09:05:45.137720 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8ctxn"]
Dec 08 09:05:45 crc kubenswrapper[4776]: I1208 09:05:45.138305 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8ctxn"]
Dec 08 09:05:45 crc kubenswrapper[4776]: I1208 09:05:45.138388 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8ctxn"
Dec 08 09:05:45 crc kubenswrapper[4776]: I1208 09:05:45.153296 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b93e12f9-d5c1-4ee8-9786-85d352d62076-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8ctxn\" (UID: \"b93e12f9-d5c1-4ee8-9786-85d352d62076\") " pod="openshift-marketplace/marketplace-operator-79b997595-8ctxn"
Dec 08 09:05:45 crc kubenswrapper[4776]: I1208 09:05:45.153332 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w27xs\" (UniqueName: \"kubernetes.io/projected/b93e12f9-d5c1-4ee8-9786-85d352d62076-kube-api-access-w27xs\") pod \"marketplace-operator-79b997595-8ctxn\" (UID: \"b93e12f9-d5c1-4ee8-9786-85d352d62076\") " pod="openshift-marketplace/marketplace-operator-79b997595-8ctxn"
Dec 08 09:05:45 crc kubenswrapper[4776]: I1208 09:05:45.153425 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b93e12f9-d5c1-4ee8-9786-85d352d62076-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8ctxn\" (UID: \"b93e12f9-d5c1-4ee8-9786-85d352d62076\") " pod="openshift-marketplace/marketplace-operator-79b997595-8ctxn"
Dec 08 09:05:45 crc kubenswrapper[4776]: I1208 09:05:45.254994 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b93e12f9-d5c1-4ee8-9786-85d352d62076-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8ctxn\" (UID: \"b93e12f9-d5c1-4ee8-9786-85d352d62076\") " pod="openshift-marketplace/marketplace-operator-79b997595-8ctxn"
Dec 08 09:05:45 crc kubenswrapper[4776]: I1208 09:05:45.255033 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w27xs\" (UniqueName: \"kubernetes.io/projected/b93e12f9-d5c1-4ee8-9786-85d352d62076-kube-api-access-w27xs\") pod \"marketplace-operator-79b997595-8ctxn\" (UID: \"b93e12f9-d5c1-4ee8-9786-85d352d62076\") " pod="openshift-marketplace/marketplace-operator-79b997595-8ctxn"
Dec 08 09:05:45 crc kubenswrapper[4776]: I1208 09:05:45.255061 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b93e12f9-d5c1-4ee8-9786-85d352d62076-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8ctxn\" (UID: \"b93e12f9-d5c1-4ee8-9786-85d352d62076\") " pod="openshift-marketplace/marketplace-operator-79b997595-8ctxn"
Dec 08 09:05:45 crc kubenswrapper[4776]: I1208 09:05:45.256307 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b93e12f9-d5c1-4ee8-9786-85d352d62076-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8ctxn\" (UID: \"b93e12f9-d5c1-4ee8-9786-85d352d62076\") " pod="openshift-marketplace/marketplace-operator-79b997595-8ctxn"
Dec 08 09:05:45 crc kubenswrapper[4776]: I1208 09:05:45.262499 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b93e12f9-d5c1-4ee8-9786-85d352d62076-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8ctxn\" (UID: \"b93e12f9-d5c1-4ee8-9786-85d352d62076\") " pod="openshift-marketplace/marketplace-operator-79b997595-8ctxn"
Dec 08 09:05:45 crc kubenswrapper[4776]: I1208 09:05:45.283440 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w27xs\" (UniqueName: \"kubernetes.io/projected/b93e12f9-d5c1-4ee8-9786-85d352d62076-kube-api-access-w27xs\") pod \"marketplace-operator-79b997595-8ctxn\" (UID: \"b93e12f9-d5c1-4ee8-9786-85d352d62076\") " pod="openshift-marketplace/marketplace-operator-79b997595-8ctxn"
Dec 08 09:05:45 crc kubenswrapper[4776]: I1208 09:05:45.466272 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8ctxn"
Dec 08 09:05:45 crc kubenswrapper[4776]: I1208 09:05:45.943841 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8ctxn"]
Dec 08 09:05:45 crc kubenswrapper[4776]: W1208 09:05:45.954967 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb93e12f9_d5c1_4ee8_9786_85d352d62076.slice/crio-092e3a80aaf38e125d34f8c2a29a0577d80b4465355409a5efe04287e1da4840 WatchSource:0}: Error finding container 092e3a80aaf38e125d34f8c2a29a0577d80b4465355409a5efe04287e1da4840: Status 404 returned error can't find the container with id 092e3a80aaf38e125d34f8c2a29a0577d80b4465355409a5efe04287e1da4840
Dec 08 09:05:46 crc kubenswrapper[4776]: I1208 09:05:46.024556 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8ctxn" event={"ID":"b93e12f9-d5c1-4ee8-9786-85d352d62076","Type":"ContainerStarted","Data":"092e3a80aaf38e125d34f8c2a29a0577d80b4465355409a5efe04287e1da4840"}
Dec 08 09:05:47 crc kubenswrapper[4776]: I1208 09:05:47.862309 4776 generic.go:334] "Generic (PLEG): container finished" podID="c9788ab1-1031-4103-a769-a4b3177c7268" containerID="3686f95c2750ae2f6fecf0ef1b9e49c85b6866553ae81497ae9ec17dd913386b" exitCode=0
Dec 08 09:05:47 crc kubenswrapper[4776]: I1208 09:05:47.862382 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" event={"ID":"c9788ab1-1031-4103-a769-a4b3177c7268","Type":"ContainerDied","Data":"3686f95c2750ae2f6fecf0ef1b9e49c85b6866553ae81497ae9ec17dd913386b"}
Dec 08 09:05:47 crc kubenswrapper[4776]: I1208 09:05:47.862659 4776 scope.go:117] "RemoveContainer" containerID="860aab951ce65f5efd9e17e4d93c383ea508088d26c2f758bf8d7176d2fb96f8"
Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.196006 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-swlvb"
Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.297643 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6a339c1-f955-4a08-bba1-0df39a886324-utilities\") pod \"b6a339c1-f955-4a08-bba1-0df39a886324\" (UID: \"b6a339c1-f955-4a08-bba1-0df39a886324\") "
Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.298017 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqkrn\" (UniqueName: \"kubernetes.io/projected/b6a339c1-f955-4a08-bba1-0df39a886324-kube-api-access-kqkrn\") pod \"b6a339c1-f955-4a08-bba1-0df39a886324\" (UID: \"b6a339c1-f955-4a08-bba1-0df39a886324\") "
Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.298049 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6a339c1-f955-4a08-bba1-0df39a886324-catalog-content\") pod \"b6a339c1-f955-4a08-bba1-0df39a886324\" (UID: \"b6a339c1-f955-4a08-bba1-0df39a886324\") "
Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.298603 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6a339c1-f955-4a08-bba1-0df39a886324-utilities" (OuterVolumeSpecName: "utilities") pod "b6a339c1-f955-4a08-bba1-0df39a886324" (UID: "b6a339c1-f955-4a08-bba1-0df39a886324"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.305258 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6a339c1-f955-4a08-bba1-0df39a886324-kube-api-access-kqkrn" (OuterVolumeSpecName: "kube-api-access-kqkrn") pod "b6a339c1-f955-4a08-bba1-0df39a886324" (UID: "b6a339c1-f955-4a08-bba1-0df39a886324"). InnerVolumeSpecName "kube-api-access-kqkrn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.347735 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6a339c1-f955-4a08-bba1-0df39a886324-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b6a339c1-f955-4a08-bba1-0df39a886324" (UID: "b6a339c1-f955-4a08-bba1-0df39a886324"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.399466 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6a339c1-f955-4a08-bba1-0df39a886324-utilities\") on node \"crc\" DevicePath \"\""
Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.399503 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqkrn\" (UniqueName: \"kubernetes.io/projected/b6a339c1-f955-4a08-bba1-0df39a886324-kube-api-access-kqkrn\") on node \"crc\" DevicePath \"\""
Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.399515 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6a339c1-f955-4a08-bba1-0df39a886324-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.542349 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fbmf6"
Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.552620 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-z2qsw"
Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.557810 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-82j27"
Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.569972 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jf2nl"
Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.617532 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f1cf0fc-eed0-4fba-8b89-b29bf78cadac-catalog-content\") pod \"9f1cf0fc-eed0-4fba-8b89-b29bf78cadac\" (UID: \"9f1cf0fc-eed0-4fba-8b89-b29bf78cadac\") "
Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.617856 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbt94\" (UniqueName: \"kubernetes.io/projected/bde03b49-eb1e-4941-b49e-e361cb8d83f4-kube-api-access-mbt94\") pod \"bde03b49-eb1e-4941-b49e-e361cb8d83f4\" (UID: \"bde03b49-eb1e-4941-b49e-e361cb8d83f4\") "
Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.617889 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/50e1cbc5-727f-42ca-881c-fdd0b07ca739-marketplace-operator-metrics\") pod \"50e1cbc5-727f-42ca-881c-fdd0b07ca739\" (UID: \"50e1cbc5-727f-42ca-881c-fdd0b07ca739\") "
Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.617911 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bde03b49-eb1e-4941-b49e-e361cb8d83f4-catalog-content\") pod \"bde03b49-eb1e-4941-b49e-e361cb8d83f4\" (UID: \"bde03b49-eb1e-4941-b49e-e361cb8d83f4\") "
Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.617933 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bde03b49-eb1e-4941-b49e-e361cb8d83f4-utilities\") pod \"bde03b49-eb1e-4941-b49e-e361cb8d83f4\" (UID: \"bde03b49-eb1e-4941-b49e-e361cb8d83f4\") "
Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.617960 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqkcs\" (UniqueName: \"kubernetes.io/projected/0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9-kube-api-access-fqkcs\") pod \"0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9\" (UID: \"0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9\") "
Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.618015 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9-catalog-content\") pod \"0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9\" (UID: \"0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9\") "
Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.618047 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9-utilities\") pod \"0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9\" (UID: \"0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9\") "
Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.618075 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/50e1cbc5-727f-42ca-881c-fdd0b07ca739-marketplace-trusted-ca\") pod \"50e1cbc5-727f-42ca-881c-fdd0b07ca739\" (UID: \"50e1cbc5-727f-42ca-881c-fdd0b07ca739\") "
Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.618108 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpp6m\" (UniqueName: \"kubernetes.io/projected/50e1cbc5-727f-42ca-881c-fdd0b07ca739-kube-api-access-rpp6m\") pod \"50e1cbc5-727f-42ca-881c-fdd0b07ca739\" (UID: \"50e1cbc5-727f-42ca-881c-fdd0b07ca739\") "
Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.618153 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f1cf0fc-eed0-4fba-8b89-b29bf78cadac-utilities\") pod \"9f1cf0fc-eed0-4fba-8b89-b29bf78cadac\" (UID: \"9f1cf0fc-eed0-4fba-8b89-b29bf78cadac\") "
Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.618204 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mj4sg\" (UniqueName: \"kubernetes.io/projected/9f1cf0fc-eed0-4fba-8b89-b29bf78cadac-kube-api-access-mj4sg\") pod \"9f1cf0fc-eed0-4fba-8b89-b29bf78cadac\" (UID: \"9f1cf0fc-eed0-4fba-8b89-b29bf78cadac\") "
Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.621772 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50e1cbc5-727f-42ca-881c-fdd0b07ca739-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "50e1cbc5-727f-42ca-881c-fdd0b07ca739" (UID: "50e1cbc5-727f-42ca-881c-fdd0b07ca739"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.622398 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f1cf0fc-eed0-4fba-8b89-b29bf78cadac-utilities" (OuterVolumeSpecName: "utilities") pod "9f1cf0fc-eed0-4fba-8b89-b29bf78cadac" (UID: "9f1cf0fc-eed0-4fba-8b89-b29bf78cadac"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.626772 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9-utilities" (OuterVolumeSpecName: "utilities") pod "0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9" (UID: "0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.632048 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50e1cbc5-727f-42ca-881c-fdd0b07ca739-kube-api-access-rpp6m" (OuterVolumeSpecName: "kube-api-access-rpp6m") pod "50e1cbc5-727f-42ca-881c-fdd0b07ca739" (UID: "50e1cbc5-727f-42ca-881c-fdd0b07ca739"). InnerVolumeSpecName "kube-api-access-rpp6m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.632421 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f1cf0fc-eed0-4fba-8b89-b29bf78cadac-kube-api-access-mj4sg" (OuterVolumeSpecName: "kube-api-access-mj4sg") pod "9f1cf0fc-eed0-4fba-8b89-b29bf78cadac" (UID: "9f1cf0fc-eed0-4fba-8b89-b29bf78cadac"). InnerVolumeSpecName "kube-api-access-mj4sg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.634103 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bde03b49-eb1e-4941-b49e-e361cb8d83f4-utilities" (OuterVolumeSpecName: "utilities") pod "bde03b49-eb1e-4941-b49e-e361cb8d83f4" (UID: "bde03b49-eb1e-4941-b49e-e361cb8d83f4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.635424 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9-kube-api-access-fqkcs" (OuterVolumeSpecName: "kube-api-access-fqkcs") pod "0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9" (UID: "0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9"). InnerVolumeSpecName "kube-api-access-fqkcs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.635536 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bde03b49-eb1e-4941-b49e-e361cb8d83f4-kube-api-access-mbt94" (OuterVolumeSpecName: "kube-api-access-mbt94") pod "bde03b49-eb1e-4941-b49e-e361cb8d83f4" (UID: "bde03b49-eb1e-4941-b49e-e361cb8d83f4"). InnerVolumeSpecName "kube-api-access-mbt94". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.650553 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9" (UID: "0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.669633 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50e1cbc5-727f-42ca-881c-fdd0b07ca739-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "50e1cbc5-727f-42ca-881c-fdd0b07ca739" (UID: "50e1cbc5-727f-42ca-881c-fdd0b07ca739"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.676459 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f1cf0fc-eed0-4fba-8b89-b29bf78cadac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9f1cf0fc-eed0-4fba-8b89-b29bf78cadac" (UID: "9f1cf0fc-eed0-4fba-8b89-b29bf78cadac"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.719368 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.719630 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9-utilities\") on node \"crc\" DevicePath \"\""
Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.719697 4776 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/50e1cbc5-727f-42ca-881c-fdd0b07ca739-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.719761 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpp6m\" (UniqueName: \"kubernetes.io/projected/50e1cbc5-727f-42ca-881c-fdd0b07ca739-kube-api-access-rpp6m\") on node \"crc\" DevicePath \"\""
Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.719864 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f1cf0fc-eed0-4fba-8b89-b29bf78cadac-utilities\") on node \"crc\" DevicePath \"\""
Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.719929 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mj4sg\" (UniqueName: \"kubernetes.io/projected/9f1cf0fc-eed0-4fba-8b89-b29bf78cadac-kube-api-access-mj4sg\") on node \"crc\" DevicePath \"\""
Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.719996 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbt94\" (UniqueName: \"kubernetes.io/projected/bde03b49-eb1e-4941-b49e-e361cb8d83f4-kube-api-access-mbt94\") on node \"crc\" DevicePath \"\""
Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.720055 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f1cf0fc-eed0-4fba-8b89-b29bf78cadac-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.720798 4776 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/50e1cbc5-727f-42ca-881c-fdd0b07ca739-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.720871 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bde03b49-eb1e-4941-b49e-e361cb8d83f4-utilities\") on node \"crc\" DevicePath \"\""
Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.720931 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqkcs\" (UniqueName: \"kubernetes.io/projected/0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9-kube-api-access-fqkcs\") on node \"crc\" DevicePath \"\""
Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.748056 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bde03b49-eb1e-4941-b49e-e361cb8d83f4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bde03b49-eb1e-4941-b49e-e361cb8d83f4" (UID: "bde03b49-eb1e-4941-b49e-e361cb8d83f4"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.822478 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bde03b49-eb1e-4941-b49e-e361cb8d83f4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.870797 4776 generic.go:334] "Generic (PLEG): container finished" podID="b6a339c1-f955-4a08-bba1-0df39a886324" containerID="baaa60cadffadcf80c904850964a4f0d880a28ef8278ad5161844973525e5b09" exitCode=0 Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.870863 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-swlvb" Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.870889 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-swlvb" event={"ID":"b6a339c1-f955-4a08-bba1-0df39a886324","Type":"ContainerDied","Data":"baaa60cadffadcf80c904850964a4f0d880a28ef8278ad5161844973525e5b09"} Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.872410 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-swlvb" event={"ID":"b6a339c1-f955-4a08-bba1-0df39a886324","Type":"ContainerDied","Data":"66e39df1ffbfbfdd34b59edf64ae391cf2aab8f8cd7f272946f40878b8617774"} Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.872442 4776 scope.go:117] "RemoveContainer" containerID="baaa60cadffadcf80c904850964a4f0d880a28ef8278ad5161844973525e5b09" Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.875320 4776 generic.go:334] "Generic (PLEG): container finished" podID="bde03b49-eb1e-4941-b49e-e361cb8d83f4" containerID="840533b00ccbf019c1b1ad45d63acac9dc28cdd233141ef50008afcdc65278f7" exitCode=0 Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.875397 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-fbmf6" event={"ID":"bde03b49-eb1e-4941-b49e-e361cb8d83f4","Type":"ContainerDied","Data":"840533b00ccbf019c1b1ad45d63acac9dc28cdd233141ef50008afcdc65278f7"} Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.875433 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbmf6" event={"ID":"bde03b49-eb1e-4941-b49e-e361cb8d83f4","Type":"ContainerDied","Data":"8f9a0df32342894c6eff6c89a36d5563930510cf300e754f0ca9afb490a9e75c"} Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.876258 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fbmf6" Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.883996 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8ctxn" event={"ID":"b93e12f9-d5c1-4ee8-9786-85d352d62076","Type":"ContainerStarted","Data":"1f62a008de817b9623b068f88932d970ecda84f756dfa7a0b96d26117f7ba330"} Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.884554 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-8ctxn" Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.885598 4776 generic.go:334] "Generic (PLEG): container finished" podID="50e1cbc5-727f-42ca-881c-fdd0b07ca739" containerID="726cc14fd07cdccd06b06e7db795eb13bf7b9ce8069a7c9f2f0bcdefd1c5a77c" exitCode=0 Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.885644 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jf2nl" event={"ID":"50e1cbc5-727f-42ca-881c-fdd0b07ca739","Type":"ContainerDied","Data":"726cc14fd07cdccd06b06e7db795eb13bf7b9ce8069a7c9f2f0bcdefd1c5a77c"} Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.885660 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-jf2nl" event={"ID":"50e1cbc5-727f-42ca-881c-fdd0b07ca739","Type":"ContainerDied","Data":"387f6c13d8ddc52c4b6f49d3f003b41a4e0e766c93075bd4c99a1c80a6d9fd5a"} Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.885701 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jf2nl" Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.890018 4776 generic.go:334] "Generic (PLEG): container finished" podID="0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9" containerID="920717a9fa8482ea3eb51645c7938b76c2f77c2ff318e53fd9fa1d5fcc81ff83" exitCode=0 Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.890088 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-82j27" event={"ID":"0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9","Type":"ContainerDied","Data":"920717a9fa8482ea3eb51645c7938b76c2f77c2ff318e53fd9fa1d5fcc81ff83"} Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.890115 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-82j27" event={"ID":"0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9","Type":"ContainerDied","Data":"71b68a8eb858fd1b3e75bef1902efb78815ffec85a19de9d56cb17c0050ed8dd"} Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.890211 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-82j27" Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.893915 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-8ctxn" Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.898773 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" event={"ID":"c9788ab1-1031-4103-a769-a4b3177c7268","Type":"ContainerStarted","Data":"60dbb3e7c44241db89caa5cb2272dfcb89d62fdbf75c7153dbb476fd01b77752"} Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.900319 4776 scope.go:117] "RemoveContainer" containerID="f4302c6b15eecdb8e2b42d69c6614264c5c70265656ef272d30960e4ae20e4e3" Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.900928 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-swlvb"] Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.905370 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-swlvb"] Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.906970 4776 generic.go:334] "Generic (PLEG): container finished" podID="9f1cf0fc-eed0-4fba-8b89-b29bf78cadac" containerID="9636a56e89e398e2143e9da6b6d1d18b4704c2e92b280289711701f9f8cf8881" exitCode=0 Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.907010 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-z2qsw" Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.907018 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z2qsw" event={"ID":"9f1cf0fc-eed0-4fba-8b89-b29bf78cadac","Type":"ContainerDied","Data":"9636a56e89e398e2143e9da6b6d1d18b4704c2e92b280289711701f9f8cf8881"} Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.907050 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z2qsw" event={"ID":"9f1cf0fc-eed0-4fba-8b89-b29bf78cadac","Type":"ContainerDied","Data":"360703b853412245904335459f733e558000226ff4569205758fc692bbdcc6c3"} Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.916824 4776 scope.go:117] "RemoveContainer" containerID="fbcb29d00241802beedb3901d3bbfa7f9e3d28561312424366078ac538370e4a" Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.922031 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-8ctxn" podStartSLOduration=3.922009971 podStartE2EDuration="3.922009971s" podCreationTimestamp="2025-12-08 09:05:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:05:48.91902032 +0000 UTC m=+425.182245362" watchObservedRunningTime="2025-12-08 09:05:48.922009971 +0000 UTC m=+425.185234993" Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.946599 4776 scope.go:117] "RemoveContainer" containerID="baaa60cadffadcf80c904850964a4f0d880a28ef8278ad5161844973525e5b09" Dec 08 09:05:48 crc kubenswrapper[4776]: E1208 09:05:48.947288 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"baaa60cadffadcf80c904850964a4f0d880a28ef8278ad5161844973525e5b09\": container with ID starting with 
baaa60cadffadcf80c904850964a4f0d880a28ef8278ad5161844973525e5b09 not found: ID does not exist" containerID="baaa60cadffadcf80c904850964a4f0d880a28ef8278ad5161844973525e5b09" Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.947396 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"baaa60cadffadcf80c904850964a4f0d880a28ef8278ad5161844973525e5b09"} err="failed to get container status \"baaa60cadffadcf80c904850964a4f0d880a28ef8278ad5161844973525e5b09\": rpc error: code = NotFound desc = could not find container \"baaa60cadffadcf80c904850964a4f0d880a28ef8278ad5161844973525e5b09\": container with ID starting with baaa60cadffadcf80c904850964a4f0d880a28ef8278ad5161844973525e5b09 not found: ID does not exist" Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.947597 4776 scope.go:117] "RemoveContainer" containerID="f4302c6b15eecdb8e2b42d69c6614264c5c70265656ef272d30960e4ae20e4e3" Dec 08 09:05:48 crc kubenswrapper[4776]: E1208 09:05:48.950145 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4302c6b15eecdb8e2b42d69c6614264c5c70265656ef272d30960e4ae20e4e3\": container with ID starting with f4302c6b15eecdb8e2b42d69c6614264c5c70265656ef272d30960e4ae20e4e3 not found: ID does not exist" containerID="f4302c6b15eecdb8e2b42d69c6614264c5c70265656ef272d30960e4ae20e4e3" Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.951883 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4302c6b15eecdb8e2b42d69c6614264c5c70265656ef272d30960e4ae20e4e3"} err="failed to get container status \"f4302c6b15eecdb8e2b42d69c6614264c5c70265656ef272d30960e4ae20e4e3\": rpc error: code = NotFound desc = could not find container \"f4302c6b15eecdb8e2b42d69c6614264c5c70265656ef272d30960e4ae20e4e3\": container with ID starting with f4302c6b15eecdb8e2b42d69c6614264c5c70265656ef272d30960e4ae20e4e3 not found: ID does not 
exist" Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.951921 4776 scope.go:117] "RemoveContainer" containerID="fbcb29d00241802beedb3901d3bbfa7f9e3d28561312424366078ac538370e4a" Dec 08 09:05:48 crc kubenswrapper[4776]: E1208 09:05:48.952408 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbcb29d00241802beedb3901d3bbfa7f9e3d28561312424366078ac538370e4a\": container with ID starting with fbcb29d00241802beedb3901d3bbfa7f9e3d28561312424366078ac538370e4a not found: ID does not exist" containerID="fbcb29d00241802beedb3901d3bbfa7f9e3d28561312424366078ac538370e4a" Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.952432 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbcb29d00241802beedb3901d3bbfa7f9e3d28561312424366078ac538370e4a"} err="failed to get container status \"fbcb29d00241802beedb3901d3bbfa7f9e3d28561312424366078ac538370e4a\": rpc error: code = NotFound desc = could not find container \"fbcb29d00241802beedb3901d3bbfa7f9e3d28561312424366078ac538370e4a\": container with ID starting with fbcb29d00241802beedb3901d3bbfa7f9e3d28561312424366078ac538370e4a not found: ID does not exist" Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.952447 4776 scope.go:117] "RemoveContainer" containerID="840533b00ccbf019c1b1ad45d63acac9dc28cdd233141ef50008afcdc65278f7" Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.959258 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-82j27"] Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.963872 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-82j27"] Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.971957 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fbmf6"] Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.978498 
4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fbmf6"] Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.979019 4776 scope.go:117] "RemoveContainer" containerID="dcd98ab358772c6b5be8ed5aa6d89ab06dee840684094ed9ee2af6352362d64a" Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.983884 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jf2nl"] Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.988323 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jf2nl"] Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.993965 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-z2qsw"] Dec 08 09:05:48 crc kubenswrapper[4776]: I1208 09:05:48.997444 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-z2qsw"] Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.006155 4776 scope.go:117] "RemoveContainer" containerID="e8e8a636749d16442469a262b654d77373c822f9ce11dff6319fa5b27aec9a27" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.024236 4776 scope.go:117] "RemoveContainer" containerID="840533b00ccbf019c1b1ad45d63acac9dc28cdd233141ef50008afcdc65278f7" Dec 08 09:05:49 crc kubenswrapper[4776]: E1208 09:05:49.024765 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"840533b00ccbf019c1b1ad45d63acac9dc28cdd233141ef50008afcdc65278f7\": container with ID starting with 840533b00ccbf019c1b1ad45d63acac9dc28cdd233141ef50008afcdc65278f7 not found: ID does not exist" containerID="840533b00ccbf019c1b1ad45d63acac9dc28cdd233141ef50008afcdc65278f7" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.024831 4776 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"840533b00ccbf019c1b1ad45d63acac9dc28cdd233141ef50008afcdc65278f7"} err="failed to get container status \"840533b00ccbf019c1b1ad45d63acac9dc28cdd233141ef50008afcdc65278f7\": rpc error: code = NotFound desc = could not find container \"840533b00ccbf019c1b1ad45d63acac9dc28cdd233141ef50008afcdc65278f7\": container with ID starting with 840533b00ccbf019c1b1ad45d63acac9dc28cdd233141ef50008afcdc65278f7 not found: ID does not exist" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.024892 4776 scope.go:117] "RemoveContainer" containerID="dcd98ab358772c6b5be8ed5aa6d89ab06dee840684094ed9ee2af6352362d64a" Dec 08 09:05:49 crc kubenswrapper[4776]: E1208 09:05:49.025391 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcd98ab358772c6b5be8ed5aa6d89ab06dee840684094ed9ee2af6352362d64a\": container with ID starting with dcd98ab358772c6b5be8ed5aa6d89ab06dee840684094ed9ee2af6352362d64a not found: ID does not exist" containerID="dcd98ab358772c6b5be8ed5aa6d89ab06dee840684094ed9ee2af6352362d64a" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.025425 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcd98ab358772c6b5be8ed5aa6d89ab06dee840684094ed9ee2af6352362d64a"} err="failed to get container status \"dcd98ab358772c6b5be8ed5aa6d89ab06dee840684094ed9ee2af6352362d64a\": rpc error: code = NotFound desc = could not find container \"dcd98ab358772c6b5be8ed5aa6d89ab06dee840684094ed9ee2af6352362d64a\": container with ID starting with dcd98ab358772c6b5be8ed5aa6d89ab06dee840684094ed9ee2af6352362d64a not found: ID does not exist" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.025451 4776 scope.go:117] "RemoveContainer" containerID="e8e8a636749d16442469a262b654d77373c822f9ce11dff6319fa5b27aec9a27" Dec 08 09:05:49 crc kubenswrapper[4776]: E1208 09:05:49.025744 4776 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"e8e8a636749d16442469a262b654d77373c822f9ce11dff6319fa5b27aec9a27\": container with ID starting with e8e8a636749d16442469a262b654d77373c822f9ce11dff6319fa5b27aec9a27 not found: ID does not exist" containerID="e8e8a636749d16442469a262b654d77373c822f9ce11dff6319fa5b27aec9a27" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.025779 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8e8a636749d16442469a262b654d77373c822f9ce11dff6319fa5b27aec9a27"} err="failed to get container status \"e8e8a636749d16442469a262b654d77373c822f9ce11dff6319fa5b27aec9a27\": rpc error: code = NotFound desc = could not find container \"e8e8a636749d16442469a262b654d77373c822f9ce11dff6319fa5b27aec9a27\": container with ID starting with e8e8a636749d16442469a262b654d77373c822f9ce11dff6319fa5b27aec9a27 not found: ID does not exist" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.025801 4776 scope.go:117] "RemoveContainer" containerID="726cc14fd07cdccd06b06e7db795eb13bf7b9ce8069a7c9f2f0bcdefd1c5a77c" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.042018 4776 scope.go:117] "RemoveContainer" containerID="726cc14fd07cdccd06b06e7db795eb13bf7b9ce8069a7c9f2f0bcdefd1c5a77c" Dec 08 09:05:49 crc kubenswrapper[4776]: E1208 09:05:49.042520 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"726cc14fd07cdccd06b06e7db795eb13bf7b9ce8069a7c9f2f0bcdefd1c5a77c\": container with ID starting with 726cc14fd07cdccd06b06e7db795eb13bf7b9ce8069a7c9f2f0bcdefd1c5a77c not found: ID does not exist" containerID="726cc14fd07cdccd06b06e7db795eb13bf7b9ce8069a7c9f2f0bcdefd1c5a77c" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.042569 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"726cc14fd07cdccd06b06e7db795eb13bf7b9ce8069a7c9f2f0bcdefd1c5a77c"} 
err="failed to get container status \"726cc14fd07cdccd06b06e7db795eb13bf7b9ce8069a7c9f2f0bcdefd1c5a77c\": rpc error: code = NotFound desc = could not find container \"726cc14fd07cdccd06b06e7db795eb13bf7b9ce8069a7c9f2f0bcdefd1c5a77c\": container with ID starting with 726cc14fd07cdccd06b06e7db795eb13bf7b9ce8069a7c9f2f0bcdefd1c5a77c not found: ID does not exist" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.042608 4776 scope.go:117] "RemoveContainer" containerID="920717a9fa8482ea3eb51645c7938b76c2f77c2ff318e53fd9fa1d5fcc81ff83" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.055861 4776 scope.go:117] "RemoveContainer" containerID="7e3e905b36fd827d475beae9e48acd14ce7d0d60978ee57c529179a7bad0646d" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.067872 4776 scope.go:117] "RemoveContainer" containerID="7f4b22d9011a80a4e8ad069237a38f0c65307ae8d30af7804a1f3569464b229d" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.079940 4776 scope.go:117] "RemoveContainer" containerID="920717a9fa8482ea3eb51645c7938b76c2f77c2ff318e53fd9fa1d5fcc81ff83" Dec 08 09:05:49 crc kubenswrapper[4776]: E1208 09:05:49.080359 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"920717a9fa8482ea3eb51645c7938b76c2f77c2ff318e53fd9fa1d5fcc81ff83\": container with ID starting with 920717a9fa8482ea3eb51645c7938b76c2f77c2ff318e53fd9fa1d5fcc81ff83 not found: ID does not exist" containerID="920717a9fa8482ea3eb51645c7938b76c2f77c2ff318e53fd9fa1d5fcc81ff83" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.080401 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"920717a9fa8482ea3eb51645c7938b76c2f77c2ff318e53fd9fa1d5fcc81ff83"} err="failed to get container status \"920717a9fa8482ea3eb51645c7938b76c2f77c2ff318e53fd9fa1d5fcc81ff83\": rpc error: code = NotFound desc = could not find container 
\"920717a9fa8482ea3eb51645c7938b76c2f77c2ff318e53fd9fa1d5fcc81ff83\": container with ID starting with 920717a9fa8482ea3eb51645c7938b76c2f77c2ff318e53fd9fa1d5fcc81ff83 not found: ID does not exist" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.080428 4776 scope.go:117] "RemoveContainer" containerID="7e3e905b36fd827d475beae9e48acd14ce7d0d60978ee57c529179a7bad0646d" Dec 08 09:05:49 crc kubenswrapper[4776]: E1208 09:05:49.080754 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e3e905b36fd827d475beae9e48acd14ce7d0d60978ee57c529179a7bad0646d\": container with ID starting with 7e3e905b36fd827d475beae9e48acd14ce7d0d60978ee57c529179a7bad0646d not found: ID does not exist" containerID="7e3e905b36fd827d475beae9e48acd14ce7d0d60978ee57c529179a7bad0646d" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.080786 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e3e905b36fd827d475beae9e48acd14ce7d0d60978ee57c529179a7bad0646d"} err="failed to get container status \"7e3e905b36fd827d475beae9e48acd14ce7d0d60978ee57c529179a7bad0646d\": rpc error: code = NotFound desc = could not find container \"7e3e905b36fd827d475beae9e48acd14ce7d0d60978ee57c529179a7bad0646d\": container with ID starting with 7e3e905b36fd827d475beae9e48acd14ce7d0d60978ee57c529179a7bad0646d not found: ID does not exist" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.080805 4776 scope.go:117] "RemoveContainer" containerID="7f4b22d9011a80a4e8ad069237a38f0c65307ae8d30af7804a1f3569464b229d" Dec 08 09:05:49 crc kubenswrapper[4776]: E1208 09:05:49.081078 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f4b22d9011a80a4e8ad069237a38f0c65307ae8d30af7804a1f3569464b229d\": container with ID starting with 7f4b22d9011a80a4e8ad069237a38f0c65307ae8d30af7804a1f3569464b229d not found: ID does not exist" 
containerID="7f4b22d9011a80a4e8ad069237a38f0c65307ae8d30af7804a1f3569464b229d" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.081117 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f4b22d9011a80a4e8ad069237a38f0c65307ae8d30af7804a1f3569464b229d"} err="failed to get container status \"7f4b22d9011a80a4e8ad069237a38f0c65307ae8d30af7804a1f3569464b229d\": rpc error: code = NotFound desc = could not find container \"7f4b22d9011a80a4e8ad069237a38f0c65307ae8d30af7804a1f3569464b229d\": container with ID starting with 7f4b22d9011a80a4e8ad069237a38f0c65307ae8d30af7804a1f3569464b229d not found: ID does not exist" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.081146 4776 scope.go:117] "RemoveContainer" containerID="9636a56e89e398e2143e9da6b6d1d18b4704c2e92b280289711701f9f8cf8881" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.094020 4776 scope.go:117] "RemoveContainer" containerID="bf0e289111ef12669e0e64d39ade41f882e72a0190c4ea452586a7c9eeb711cb" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.109634 4776 scope.go:117] "RemoveContainer" containerID="b76859968451e9c9aeb9184231ad6cbb61121a8dbd4a0b425070c0da82e3a55a" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.122768 4776 scope.go:117] "RemoveContainer" containerID="9636a56e89e398e2143e9da6b6d1d18b4704c2e92b280289711701f9f8cf8881" Dec 08 09:05:49 crc kubenswrapper[4776]: E1208 09:05:49.123257 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9636a56e89e398e2143e9da6b6d1d18b4704c2e92b280289711701f9f8cf8881\": container with ID starting with 9636a56e89e398e2143e9da6b6d1d18b4704c2e92b280289711701f9f8cf8881 not found: ID does not exist" containerID="9636a56e89e398e2143e9da6b6d1d18b4704c2e92b280289711701f9f8cf8881" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.123294 4776 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9636a56e89e398e2143e9da6b6d1d18b4704c2e92b280289711701f9f8cf8881"} err="failed to get container status \"9636a56e89e398e2143e9da6b6d1d18b4704c2e92b280289711701f9f8cf8881\": rpc error: code = NotFound desc = could not find container \"9636a56e89e398e2143e9da6b6d1d18b4704c2e92b280289711701f9f8cf8881\": container with ID starting with 9636a56e89e398e2143e9da6b6d1d18b4704c2e92b280289711701f9f8cf8881 not found: ID does not exist" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.123322 4776 scope.go:117] "RemoveContainer" containerID="bf0e289111ef12669e0e64d39ade41f882e72a0190c4ea452586a7c9eeb711cb" Dec 08 09:05:49 crc kubenswrapper[4776]: E1208 09:05:49.123667 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf0e289111ef12669e0e64d39ade41f882e72a0190c4ea452586a7c9eeb711cb\": container with ID starting with bf0e289111ef12669e0e64d39ade41f882e72a0190c4ea452586a7c9eeb711cb not found: ID does not exist" containerID="bf0e289111ef12669e0e64d39ade41f882e72a0190c4ea452586a7c9eeb711cb" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.123685 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf0e289111ef12669e0e64d39ade41f882e72a0190c4ea452586a7c9eeb711cb"} err="failed to get container status \"bf0e289111ef12669e0e64d39ade41f882e72a0190c4ea452586a7c9eeb711cb\": rpc error: code = NotFound desc = could not find container \"bf0e289111ef12669e0e64d39ade41f882e72a0190c4ea452586a7c9eeb711cb\": container with ID starting with bf0e289111ef12669e0e64d39ade41f882e72a0190c4ea452586a7c9eeb711cb not found: ID does not exist" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.123697 4776 scope.go:117] "RemoveContainer" containerID="b76859968451e9c9aeb9184231ad6cbb61121a8dbd4a0b425070c0da82e3a55a" Dec 08 09:05:49 crc kubenswrapper[4776]: E1208 09:05:49.124023 4776 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"b76859968451e9c9aeb9184231ad6cbb61121a8dbd4a0b425070c0da82e3a55a\": container with ID starting with b76859968451e9c9aeb9184231ad6cbb61121a8dbd4a0b425070c0da82e3a55a not found: ID does not exist" containerID="b76859968451e9c9aeb9184231ad6cbb61121a8dbd4a0b425070c0da82e3a55a" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.124041 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b76859968451e9c9aeb9184231ad6cbb61121a8dbd4a0b425070c0da82e3a55a"} err="failed to get container status \"b76859968451e9c9aeb9184231ad6cbb61121a8dbd4a0b425070c0da82e3a55a\": rpc error: code = NotFound desc = could not find container \"b76859968451e9c9aeb9184231ad6cbb61121a8dbd4a0b425070c0da82e3a55a\": container with ID starting with b76859968451e9c9aeb9184231ad6cbb61121a8dbd4a0b425070c0da82e3a55a not found: ID does not exist" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.488527 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xbmh8"] Dec 08 09:05:49 crc kubenswrapper[4776]: E1208 09:05:49.489019 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f1cf0fc-eed0-4fba-8b89-b29bf78cadac" containerName="extract-content" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.489052 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f1cf0fc-eed0-4fba-8b89-b29bf78cadac" containerName="extract-content" Dec 08 09:05:49 crc kubenswrapper[4776]: E1208 09:05:49.489064 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9" containerName="extract-utilities" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.489070 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9" containerName="extract-utilities" Dec 08 09:05:49 crc kubenswrapper[4776]: E1208 09:05:49.489078 4776 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="b6a339c1-f955-4a08-bba1-0df39a886324" containerName="registry-server" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.489085 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6a339c1-f955-4a08-bba1-0df39a886324" containerName="registry-server" Dec 08 09:05:49 crc kubenswrapper[4776]: E1208 09:05:49.489095 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f1cf0fc-eed0-4fba-8b89-b29bf78cadac" containerName="registry-server" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.489100 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f1cf0fc-eed0-4fba-8b89-b29bf78cadac" containerName="registry-server" Dec 08 09:05:49 crc kubenswrapper[4776]: E1208 09:05:49.489112 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6a339c1-f955-4a08-bba1-0df39a886324" containerName="extract-content" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.489118 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6a339c1-f955-4a08-bba1-0df39a886324" containerName="extract-content" Dec 08 09:05:49 crc kubenswrapper[4776]: E1208 09:05:49.489127 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9" containerName="extract-content" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.489133 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9" containerName="extract-content" Dec 08 09:05:49 crc kubenswrapper[4776]: E1208 09:05:49.489143 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bde03b49-eb1e-4941-b49e-e361cb8d83f4" containerName="extract-utilities" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.489148 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="bde03b49-eb1e-4941-b49e-e361cb8d83f4" containerName="extract-utilities" Dec 08 09:05:49 crc kubenswrapper[4776]: E1208 09:05:49.489158 4776 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="50e1cbc5-727f-42ca-881c-fdd0b07ca739" containerName="marketplace-operator" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.489164 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="50e1cbc5-727f-42ca-881c-fdd0b07ca739" containerName="marketplace-operator" Dec 08 09:05:49 crc kubenswrapper[4776]: E1208 09:05:49.489188 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6a339c1-f955-4a08-bba1-0df39a886324" containerName="extract-utilities" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.489194 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6a339c1-f955-4a08-bba1-0df39a886324" containerName="extract-utilities" Dec 08 09:05:49 crc kubenswrapper[4776]: E1208 09:05:49.489203 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f1cf0fc-eed0-4fba-8b89-b29bf78cadac" containerName="extract-utilities" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.489209 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f1cf0fc-eed0-4fba-8b89-b29bf78cadac" containerName="extract-utilities" Dec 08 09:05:49 crc kubenswrapper[4776]: E1208 09:05:49.489216 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9" containerName="registry-server" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.489222 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9" containerName="registry-server" Dec 08 09:05:49 crc kubenswrapper[4776]: E1208 09:05:49.489230 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bde03b49-eb1e-4941-b49e-e361cb8d83f4" containerName="registry-server" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.489235 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="bde03b49-eb1e-4941-b49e-e361cb8d83f4" containerName="registry-server" Dec 08 09:05:49 crc kubenswrapper[4776]: E1208 09:05:49.489244 4776 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bde03b49-eb1e-4941-b49e-e361cb8d83f4" containerName="extract-content" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.489249 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="bde03b49-eb1e-4941-b49e-e361cb8d83f4" containerName="extract-content" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.489329 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f1cf0fc-eed0-4fba-8b89-b29bf78cadac" containerName="registry-server" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.489340 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6a339c1-f955-4a08-bba1-0df39a886324" containerName="registry-server" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.489351 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9" containerName="registry-server" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.489359 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="bde03b49-eb1e-4941-b49e-e361cb8d83f4" containerName="registry-server" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.489370 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="50e1cbc5-727f-42ca-881c-fdd0b07ca739" containerName="marketplace-operator" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.490024 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xbmh8" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.492397 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.502608 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xbmh8"] Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.531661 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhkrh\" (UniqueName: \"kubernetes.io/projected/d01dc6cb-3ab5-494a-b89f-63e94c2e91ee-kube-api-access-fhkrh\") pod \"community-operators-xbmh8\" (UID: \"d01dc6cb-3ab5-494a-b89f-63e94c2e91ee\") " pod="openshift-marketplace/community-operators-xbmh8" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.531930 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d01dc6cb-3ab5-494a-b89f-63e94c2e91ee-utilities\") pod \"community-operators-xbmh8\" (UID: \"d01dc6cb-3ab5-494a-b89f-63e94c2e91ee\") " pod="openshift-marketplace/community-operators-xbmh8" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.532061 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d01dc6cb-3ab5-494a-b89f-63e94c2e91ee-catalog-content\") pod \"community-operators-xbmh8\" (UID: \"d01dc6cb-3ab5-494a-b89f-63e94c2e91ee\") " pod="openshift-marketplace/community-operators-xbmh8" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.633675 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d01dc6cb-3ab5-494a-b89f-63e94c2e91ee-catalog-content\") pod \"community-operators-xbmh8\" (UID: 
\"d01dc6cb-3ab5-494a-b89f-63e94c2e91ee\") " pod="openshift-marketplace/community-operators-xbmh8" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.633751 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhkrh\" (UniqueName: \"kubernetes.io/projected/d01dc6cb-3ab5-494a-b89f-63e94c2e91ee-kube-api-access-fhkrh\") pod \"community-operators-xbmh8\" (UID: \"d01dc6cb-3ab5-494a-b89f-63e94c2e91ee\") " pod="openshift-marketplace/community-operators-xbmh8" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.633785 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d01dc6cb-3ab5-494a-b89f-63e94c2e91ee-utilities\") pod \"community-operators-xbmh8\" (UID: \"d01dc6cb-3ab5-494a-b89f-63e94c2e91ee\") " pod="openshift-marketplace/community-operators-xbmh8" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.634514 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d01dc6cb-3ab5-494a-b89f-63e94c2e91ee-utilities\") pod \"community-operators-xbmh8\" (UID: \"d01dc6cb-3ab5-494a-b89f-63e94c2e91ee\") " pod="openshift-marketplace/community-operators-xbmh8" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.634604 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d01dc6cb-3ab5-494a-b89f-63e94c2e91ee-catalog-content\") pod \"community-operators-xbmh8\" (UID: \"d01dc6cb-3ab5-494a-b89f-63e94c2e91ee\") " pod="openshift-marketplace/community-operators-xbmh8" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.660274 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhkrh\" (UniqueName: \"kubernetes.io/projected/d01dc6cb-3ab5-494a-b89f-63e94c2e91ee-kube-api-access-fhkrh\") pod \"community-operators-xbmh8\" (UID: 
\"d01dc6cb-3ab5-494a-b89f-63e94c2e91ee\") " pod="openshift-marketplace/community-operators-xbmh8" Dec 08 09:05:49 crc kubenswrapper[4776]: I1208 09:05:49.806284 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xbmh8" Dec 08 09:05:50 crc kubenswrapper[4776]: I1208 09:05:50.200502 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xbmh8"] Dec 08 09:05:50 crc kubenswrapper[4776]: I1208 09:05:50.357953 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9" path="/var/lib/kubelet/pods/0f76ec41-a3f1-4fdc-96d1-1a1b2dd5f4f9/volumes" Dec 08 09:05:50 crc kubenswrapper[4776]: I1208 09:05:50.358677 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50e1cbc5-727f-42ca-881c-fdd0b07ca739" path="/var/lib/kubelet/pods/50e1cbc5-727f-42ca-881c-fdd0b07ca739/volumes" Dec 08 09:05:50 crc kubenswrapper[4776]: I1208 09:05:50.359146 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f1cf0fc-eed0-4fba-8b89-b29bf78cadac" path="/var/lib/kubelet/pods/9f1cf0fc-eed0-4fba-8b89-b29bf78cadac/volumes" Dec 08 09:05:50 crc kubenswrapper[4776]: I1208 09:05:50.360280 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6a339c1-f955-4a08-bba1-0df39a886324" path="/var/lib/kubelet/pods/b6a339c1-f955-4a08-bba1-0df39a886324/volumes" Dec 08 09:05:50 crc kubenswrapper[4776]: I1208 09:05:50.360902 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bde03b49-eb1e-4941-b49e-e361cb8d83f4" path="/var/lib/kubelet/pods/bde03b49-eb1e-4941-b49e-e361cb8d83f4/volumes" Dec 08 09:05:50 crc kubenswrapper[4776]: I1208 09:05:50.938215 4776 generic.go:334] "Generic (PLEG): container finished" podID="d01dc6cb-3ab5-494a-b89f-63e94c2e91ee" containerID="910d5eceea73e1b8d8c737529d53ed5ddfad72fc6af7d9271bf8a83a8783f6b9" exitCode=0 Dec 08 09:05:50 crc 
kubenswrapper[4776]: I1208 09:05:50.938288 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xbmh8" event={"ID":"d01dc6cb-3ab5-494a-b89f-63e94c2e91ee","Type":"ContainerDied","Data":"910d5eceea73e1b8d8c737529d53ed5ddfad72fc6af7d9271bf8a83a8783f6b9"} Dec 08 09:05:50 crc kubenswrapper[4776]: I1208 09:05:50.938342 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xbmh8" event={"ID":"d01dc6cb-3ab5-494a-b89f-63e94c2e91ee","Type":"ContainerStarted","Data":"f0effd35d0fc11366812e09a2d53ac6e8b076b33f7a86dfd5b7cce738833365a"} Dec 08 09:05:51 crc kubenswrapper[4776]: I1208 09:05:51.288138 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-txmws"] Dec 08 09:05:51 crc kubenswrapper[4776]: I1208 09:05:51.289330 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-txmws" Dec 08 09:05:51 crc kubenswrapper[4776]: I1208 09:05:51.291944 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 08 09:05:51 crc kubenswrapper[4776]: I1208 09:05:51.300425 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-txmws"] Dec 08 09:05:51 crc kubenswrapper[4776]: I1208 09:05:51.361044 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aeec433a-2b07-4008-9829-d266f85b5cf1-utilities\") pod \"redhat-marketplace-txmws\" (UID: \"aeec433a-2b07-4008-9829-d266f85b5cf1\") " pod="openshift-marketplace/redhat-marketplace-txmws" Dec 08 09:05:51 crc kubenswrapper[4776]: I1208 09:05:51.361404 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/aeec433a-2b07-4008-9829-d266f85b5cf1-catalog-content\") pod \"redhat-marketplace-txmws\" (UID: \"aeec433a-2b07-4008-9829-d266f85b5cf1\") " pod="openshift-marketplace/redhat-marketplace-txmws" Dec 08 09:05:51 crc kubenswrapper[4776]: I1208 09:05:51.361438 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpdfm\" (UniqueName: \"kubernetes.io/projected/aeec433a-2b07-4008-9829-d266f85b5cf1-kube-api-access-hpdfm\") pod \"redhat-marketplace-txmws\" (UID: \"aeec433a-2b07-4008-9829-d266f85b5cf1\") " pod="openshift-marketplace/redhat-marketplace-txmws" Dec 08 09:05:51 crc kubenswrapper[4776]: I1208 09:05:51.463262 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aeec433a-2b07-4008-9829-d266f85b5cf1-utilities\") pod \"redhat-marketplace-txmws\" (UID: \"aeec433a-2b07-4008-9829-d266f85b5cf1\") " pod="openshift-marketplace/redhat-marketplace-txmws" Dec 08 09:05:51 crc kubenswrapper[4776]: I1208 09:05:51.463296 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aeec433a-2b07-4008-9829-d266f85b5cf1-catalog-content\") pod \"redhat-marketplace-txmws\" (UID: \"aeec433a-2b07-4008-9829-d266f85b5cf1\") " pod="openshift-marketplace/redhat-marketplace-txmws" Dec 08 09:05:51 crc kubenswrapper[4776]: I1208 09:05:51.463323 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpdfm\" (UniqueName: \"kubernetes.io/projected/aeec433a-2b07-4008-9829-d266f85b5cf1-kube-api-access-hpdfm\") pod \"redhat-marketplace-txmws\" (UID: \"aeec433a-2b07-4008-9829-d266f85b5cf1\") " pod="openshift-marketplace/redhat-marketplace-txmws" Dec 08 09:05:51 crc kubenswrapper[4776]: I1208 09:05:51.463744 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/aeec433a-2b07-4008-9829-d266f85b5cf1-utilities\") pod \"redhat-marketplace-txmws\" (UID: \"aeec433a-2b07-4008-9829-d266f85b5cf1\") " pod="openshift-marketplace/redhat-marketplace-txmws" Dec 08 09:05:51 crc kubenswrapper[4776]: I1208 09:05:51.464074 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aeec433a-2b07-4008-9829-d266f85b5cf1-catalog-content\") pod \"redhat-marketplace-txmws\" (UID: \"aeec433a-2b07-4008-9829-d266f85b5cf1\") " pod="openshift-marketplace/redhat-marketplace-txmws" Dec 08 09:05:51 crc kubenswrapper[4776]: I1208 09:05:51.486557 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpdfm\" (UniqueName: \"kubernetes.io/projected/aeec433a-2b07-4008-9829-d266f85b5cf1-kube-api-access-hpdfm\") pod \"redhat-marketplace-txmws\" (UID: \"aeec433a-2b07-4008-9829-d266f85b5cf1\") " pod="openshift-marketplace/redhat-marketplace-txmws" Dec 08 09:05:51 crc kubenswrapper[4776]: I1208 09:05:51.605532 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-txmws" Dec 08 09:05:51 crc kubenswrapper[4776]: I1208 09:05:51.887654 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-c4zt9"] Dec 08 09:05:51 crc kubenswrapper[4776]: I1208 09:05:51.889188 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c4zt9" Dec 08 09:05:51 crc kubenswrapper[4776]: I1208 09:05:51.891912 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 08 09:05:51 crc kubenswrapper[4776]: I1208 09:05:51.909281 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c4zt9"] Dec 08 09:05:51 crc kubenswrapper[4776]: I1208 09:05:51.969278 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30120f8b-a384-40f4-9211-ba3d8b3154f0-catalog-content\") pod \"redhat-operators-c4zt9\" (UID: \"30120f8b-a384-40f4-9211-ba3d8b3154f0\") " pod="openshift-marketplace/redhat-operators-c4zt9" Dec 08 09:05:51 crc kubenswrapper[4776]: I1208 09:05:51.969376 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx9dt\" (UniqueName: \"kubernetes.io/projected/30120f8b-a384-40f4-9211-ba3d8b3154f0-kube-api-access-xx9dt\") pod \"redhat-operators-c4zt9\" (UID: \"30120f8b-a384-40f4-9211-ba3d8b3154f0\") " pod="openshift-marketplace/redhat-operators-c4zt9" Dec 08 09:05:51 crc kubenswrapper[4776]: I1208 09:05:51.969445 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30120f8b-a384-40f4-9211-ba3d8b3154f0-utilities\") pod \"redhat-operators-c4zt9\" (UID: \"30120f8b-a384-40f4-9211-ba3d8b3154f0\") " pod="openshift-marketplace/redhat-operators-c4zt9" Dec 08 09:05:51 crc kubenswrapper[4776]: I1208 09:05:51.982140 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-txmws"] Dec 08 09:05:51 crc kubenswrapper[4776]: W1208 09:05:51.988763 4776 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaeec433a_2b07_4008_9829_d266f85b5cf1.slice/crio-0f93c1719c85dcc242c74159d600cafea9c268a3533e70b86a68719cdad1dc78 WatchSource:0}: Error finding container 0f93c1719c85dcc242c74159d600cafea9c268a3533e70b86a68719cdad1dc78: Status 404 returned error can't find the container with id 0f93c1719c85dcc242c74159d600cafea9c268a3533e70b86a68719cdad1dc78 Dec 08 09:05:52 crc kubenswrapper[4776]: I1208 09:05:52.070954 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30120f8b-a384-40f4-9211-ba3d8b3154f0-catalog-content\") pod \"redhat-operators-c4zt9\" (UID: \"30120f8b-a384-40f4-9211-ba3d8b3154f0\") " pod="openshift-marketplace/redhat-operators-c4zt9" Dec 08 09:05:52 crc kubenswrapper[4776]: I1208 09:05:52.071000 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx9dt\" (UniqueName: \"kubernetes.io/projected/30120f8b-a384-40f4-9211-ba3d8b3154f0-kube-api-access-xx9dt\") pod \"redhat-operators-c4zt9\" (UID: \"30120f8b-a384-40f4-9211-ba3d8b3154f0\") " pod="openshift-marketplace/redhat-operators-c4zt9" Dec 08 09:05:52 crc kubenswrapper[4776]: I1208 09:05:52.071028 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30120f8b-a384-40f4-9211-ba3d8b3154f0-utilities\") pod \"redhat-operators-c4zt9\" (UID: \"30120f8b-a384-40f4-9211-ba3d8b3154f0\") " pod="openshift-marketplace/redhat-operators-c4zt9" Dec 08 09:05:52 crc kubenswrapper[4776]: I1208 09:05:52.071463 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30120f8b-a384-40f4-9211-ba3d8b3154f0-catalog-content\") pod \"redhat-operators-c4zt9\" (UID: \"30120f8b-a384-40f4-9211-ba3d8b3154f0\") " pod="openshift-marketplace/redhat-operators-c4zt9" Dec 08 09:05:52 crc 
kubenswrapper[4776]: I1208 09:05:52.071469 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30120f8b-a384-40f4-9211-ba3d8b3154f0-utilities\") pod \"redhat-operators-c4zt9\" (UID: \"30120f8b-a384-40f4-9211-ba3d8b3154f0\") " pod="openshift-marketplace/redhat-operators-c4zt9" Dec 08 09:05:52 crc kubenswrapper[4776]: I1208 09:05:52.091142 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx9dt\" (UniqueName: \"kubernetes.io/projected/30120f8b-a384-40f4-9211-ba3d8b3154f0-kube-api-access-xx9dt\") pod \"redhat-operators-c4zt9\" (UID: \"30120f8b-a384-40f4-9211-ba3d8b3154f0\") " pod="openshift-marketplace/redhat-operators-c4zt9" Dec 08 09:05:52 crc kubenswrapper[4776]: I1208 09:05:52.207626 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c4zt9" Dec 08 09:05:52 crc kubenswrapper[4776]: I1208 09:05:52.571002 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c4zt9"] Dec 08 09:05:52 crc kubenswrapper[4776]: W1208 09:05:52.578363 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30120f8b_a384_40f4_9211_ba3d8b3154f0.slice/crio-44cdef27d42388ce65767f113bc5ed355de2283b2ed98299f78df76829d54677 WatchSource:0}: Error finding container 44cdef27d42388ce65767f113bc5ed355de2283b2ed98299f78df76829d54677: Status 404 returned error can't find the container with id 44cdef27d42388ce65767f113bc5ed355de2283b2ed98299f78df76829d54677 Dec 08 09:05:52 crc kubenswrapper[4776]: I1208 09:05:52.947863 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c4zt9" event={"ID":"30120f8b-a384-40f4-9211-ba3d8b3154f0","Type":"ContainerStarted","Data":"44cdef27d42388ce65767f113bc5ed355de2283b2ed98299f78df76829d54677"} Dec 08 09:05:52 crc 
kubenswrapper[4776]: I1208 09:05:52.949395 4776 generic.go:334] "Generic (PLEG): container finished" podID="aeec433a-2b07-4008-9829-d266f85b5cf1" containerID="10c24edbd96b1211e8f971e6396eb7173997514dd169efdd0237347d889d24d1" exitCode=0 Dec 08 09:05:52 crc kubenswrapper[4776]: I1208 09:05:52.949434 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-txmws" event={"ID":"aeec433a-2b07-4008-9829-d266f85b5cf1","Type":"ContainerDied","Data":"10c24edbd96b1211e8f971e6396eb7173997514dd169efdd0237347d889d24d1"} Dec 08 09:05:52 crc kubenswrapper[4776]: I1208 09:05:52.949482 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-txmws" event={"ID":"aeec433a-2b07-4008-9829-d266f85b5cf1","Type":"ContainerStarted","Data":"0f93c1719c85dcc242c74159d600cafea9c268a3533e70b86a68719cdad1dc78"} Dec 08 09:05:52 crc kubenswrapper[4776]: I1208 09:05:52.951138 4776 generic.go:334] "Generic (PLEG): container finished" podID="d01dc6cb-3ab5-494a-b89f-63e94c2e91ee" containerID="dc9abc793cabd6cadda8e42f2c35bc8256ba605e5330c9a7a55353258b938258" exitCode=0 Dec 08 09:05:52 crc kubenswrapper[4776]: I1208 09:05:52.951197 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xbmh8" event={"ID":"d01dc6cb-3ab5-494a-b89f-63e94c2e91ee","Type":"ContainerDied","Data":"dc9abc793cabd6cadda8e42f2c35bc8256ba605e5330c9a7a55353258b938258"} Dec 08 09:05:53 crc kubenswrapper[4776]: I1208 09:05:53.694740 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dqj6b"] Dec 08 09:05:53 crc kubenswrapper[4776]: I1208 09:05:53.696539 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dqj6b" Dec 08 09:05:53 crc kubenswrapper[4776]: I1208 09:05:53.700872 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 08 09:05:53 crc kubenswrapper[4776]: I1208 09:05:53.707052 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dqj6b"] Dec 08 09:05:53 crc kubenswrapper[4776]: I1208 09:05:53.789627 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03a186d8-ec36-41ef-b882-f0cba34a0913-utilities\") pod \"certified-operators-dqj6b\" (UID: \"03a186d8-ec36-41ef-b882-f0cba34a0913\") " pod="openshift-marketplace/certified-operators-dqj6b" Dec 08 09:05:53 crc kubenswrapper[4776]: I1208 09:05:53.789709 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngtpl\" (UniqueName: \"kubernetes.io/projected/03a186d8-ec36-41ef-b882-f0cba34a0913-kube-api-access-ngtpl\") pod \"certified-operators-dqj6b\" (UID: \"03a186d8-ec36-41ef-b882-f0cba34a0913\") " pod="openshift-marketplace/certified-operators-dqj6b" Dec 08 09:05:53 crc kubenswrapper[4776]: I1208 09:05:53.789776 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03a186d8-ec36-41ef-b882-f0cba34a0913-catalog-content\") pod \"certified-operators-dqj6b\" (UID: \"03a186d8-ec36-41ef-b882-f0cba34a0913\") " pod="openshift-marketplace/certified-operators-dqj6b" Dec 08 09:05:53 crc kubenswrapper[4776]: I1208 09:05:53.890560 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngtpl\" (UniqueName: \"kubernetes.io/projected/03a186d8-ec36-41ef-b882-f0cba34a0913-kube-api-access-ngtpl\") pod \"certified-operators-dqj6b\" 
(UID: \"03a186d8-ec36-41ef-b882-f0cba34a0913\") " pod="openshift-marketplace/certified-operators-dqj6b" Dec 08 09:05:53 crc kubenswrapper[4776]: I1208 09:05:53.890619 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03a186d8-ec36-41ef-b882-f0cba34a0913-catalog-content\") pod \"certified-operators-dqj6b\" (UID: \"03a186d8-ec36-41ef-b882-f0cba34a0913\") " pod="openshift-marketplace/certified-operators-dqj6b" Dec 08 09:05:53 crc kubenswrapper[4776]: I1208 09:05:53.890667 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03a186d8-ec36-41ef-b882-f0cba34a0913-utilities\") pod \"certified-operators-dqj6b\" (UID: \"03a186d8-ec36-41ef-b882-f0cba34a0913\") " pod="openshift-marketplace/certified-operators-dqj6b" Dec 08 09:05:53 crc kubenswrapper[4776]: I1208 09:05:53.891078 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03a186d8-ec36-41ef-b882-f0cba34a0913-utilities\") pod \"certified-operators-dqj6b\" (UID: \"03a186d8-ec36-41ef-b882-f0cba34a0913\") " pod="openshift-marketplace/certified-operators-dqj6b" Dec 08 09:05:53 crc kubenswrapper[4776]: I1208 09:05:53.891168 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03a186d8-ec36-41ef-b882-f0cba34a0913-catalog-content\") pod \"certified-operators-dqj6b\" (UID: \"03a186d8-ec36-41ef-b882-f0cba34a0913\") " pod="openshift-marketplace/certified-operators-dqj6b" Dec 08 09:05:53 crc kubenswrapper[4776]: I1208 09:05:53.910028 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngtpl\" (UniqueName: \"kubernetes.io/projected/03a186d8-ec36-41ef-b882-f0cba34a0913-kube-api-access-ngtpl\") pod \"certified-operators-dqj6b\" (UID: \"03a186d8-ec36-41ef-b882-f0cba34a0913\") " 
pod="openshift-marketplace/certified-operators-dqj6b" Dec 08 09:05:53 crc kubenswrapper[4776]: I1208 09:05:53.958148 4776 generic.go:334] "Generic (PLEG): container finished" podID="30120f8b-a384-40f4-9211-ba3d8b3154f0" containerID="aae7f6b2532aadc449b85d4c38c7b97e85329898aeab7778a386cfdb502c04ab" exitCode=0 Dec 08 09:05:53 crc kubenswrapper[4776]: I1208 09:05:53.958223 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c4zt9" event={"ID":"30120f8b-a384-40f4-9211-ba3d8b3154f0","Type":"ContainerDied","Data":"aae7f6b2532aadc449b85d4c38c7b97e85329898aeab7778a386cfdb502c04ab"} Dec 08 09:05:54 crc kubenswrapper[4776]: I1208 09:05:54.020412 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dqj6b" Dec 08 09:05:54 crc kubenswrapper[4776]: I1208 09:05:54.394768 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dqj6b"] Dec 08 09:05:54 crc kubenswrapper[4776]: W1208 09:05:54.417791 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03a186d8_ec36_41ef_b882_f0cba34a0913.slice/crio-9adf10d7d4268f7a8b4ee50f95266e820a86f38743880d1e6e77e2571c3bd14e WatchSource:0}: Error finding container 9adf10d7d4268f7a8b4ee50f95266e820a86f38743880d1e6e77e2571c3bd14e: Status 404 returned error can't find the container with id 9adf10d7d4268f7a8b4ee50f95266e820a86f38743880d1e6e77e2571c3bd14e Dec 08 09:05:54 crc kubenswrapper[4776]: I1208 09:05:54.964978 4776 generic.go:334] "Generic (PLEG): container finished" podID="03a186d8-ec36-41ef-b882-f0cba34a0913" containerID="313161108e191ce2e8aed85757e2458285cb8e5074e61da5f64e83346247d127" exitCode=0 Dec 08 09:05:54 crc kubenswrapper[4776]: I1208 09:05:54.965240 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dqj6b" 
event={"ID":"03a186d8-ec36-41ef-b882-f0cba34a0913","Type":"ContainerDied","Data":"313161108e191ce2e8aed85757e2458285cb8e5074e61da5f64e83346247d127"} Dec 08 09:05:54 crc kubenswrapper[4776]: I1208 09:05:54.965311 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dqj6b" event={"ID":"03a186d8-ec36-41ef-b882-f0cba34a0913","Type":"ContainerStarted","Data":"9adf10d7d4268f7a8b4ee50f95266e820a86f38743880d1e6e77e2571c3bd14e"} Dec 08 09:05:55 crc kubenswrapper[4776]: I1208 09:05:55.974758 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-txmws" event={"ID":"aeec433a-2b07-4008-9829-d266f85b5cf1","Type":"ContainerStarted","Data":"bf5136e5d075ad1c242d7f25f88d3463dbc4e71b44b55de5b1a6c142d012f980"} Dec 08 09:05:55 crc kubenswrapper[4776]: I1208 09:05:55.981215 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xbmh8" event={"ID":"d01dc6cb-3ab5-494a-b89f-63e94c2e91ee","Type":"ContainerStarted","Data":"c669acd7367bd227ea35c5e3769f9435961430df63063ba1e70ab0e7d3f24955"} Dec 08 09:05:55 crc kubenswrapper[4776]: I1208 09:05:55.983580 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c4zt9" event={"ID":"30120f8b-a384-40f4-9211-ba3d8b3154f0","Type":"ContainerStarted","Data":"8559708363f99af1440c9831e86bb5db9591cbe956ef5e50280734a830e99d11"} Dec 08 09:05:56 crc kubenswrapper[4776]: I1208 09:05:56.035926 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xbmh8" podStartSLOduration=3.041645059 podStartE2EDuration="7.035903635s" podCreationTimestamp="2025-12-08 09:05:49 +0000 UTC" firstStartedPulling="2025-12-08 09:05:50.93996874 +0000 UTC m=+427.203193762" lastFinishedPulling="2025-12-08 09:05:54.934227316 +0000 UTC m=+431.197452338" observedRunningTime="2025-12-08 09:05:56.013732628 +0000 UTC m=+432.276957650" 
watchObservedRunningTime="2025-12-08 09:05:56.035903635 +0000 UTC m=+432.299128657" Dec 08 09:05:56 crc kubenswrapper[4776]: I1208 09:05:56.991230 4776 generic.go:334] "Generic (PLEG): container finished" podID="aeec433a-2b07-4008-9829-d266f85b5cf1" containerID="bf5136e5d075ad1c242d7f25f88d3463dbc4e71b44b55de5b1a6c142d012f980" exitCode=0 Dec 08 09:05:56 crc kubenswrapper[4776]: I1208 09:05:56.991330 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-txmws" event={"ID":"aeec433a-2b07-4008-9829-d266f85b5cf1","Type":"ContainerDied","Data":"bf5136e5d075ad1c242d7f25f88d3463dbc4e71b44b55de5b1a6c142d012f980"} Dec 08 09:05:56 crc kubenswrapper[4776]: I1208 09:05:56.995304 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dqj6b" event={"ID":"03a186d8-ec36-41ef-b882-f0cba34a0913","Type":"ContainerStarted","Data":"4d53b7692b4000e41b46393271cc7a25c752f7158fc6a8aa7fca5d4701d5aa55"} Dec 08 09:05:58 crc kubenswrapper[4776]: I1208 09:05:58.001219 4776 generic.go:334] "Generic (PLEG): container finished" podID="30120f8b-a384-40f4-9211-ba3d8b3154f0" containerID="8559708363f99af1440c9831e86bb5db9591cbe956ef5e50280734a830e99d11" exitCode=0 Dec 08 09:05:58 crc kubenswrapper[4776]: I1208 09:05:58.001301 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c4zt9" event={"ID":"30120f8b-a384-40f4-9211-ba3d8b3154f0","Type":"ContainerDied","Data":"8559708363f99af1440c9831e86bb5db9591cbe956ef5e50280734a830e99d11"} Dec 08 09:05:58 crc kubenswrapper[4776]: I1208 09:05:58.004449 4776 generic.go:334] "Generic (PLEG): container finished" podID="03a186d8-ec36-41ef-b882-f0cba34a0913" containerID="4d53b7692b4000e41b46393271cc7a25c752f7158fc6a8aa7fca5d4701d5aa55" exitCode=0 Dec 08 09:05:58 crc kubenswrapper[4776]: I1208 09:05:58.004487 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dqj6b" 
event={"ID":"03a186d8-ec36-41ef-b882-f0cba34a0913","Type":"ContainerDied","Data":"4d53b7692b4000e41b46393271cc7a25c752f7158fc6a8aa7fca5d4701d5aa55"} Dec 08 09:05:58 crc kubenswrapper[4776]: I1208 09:05:58.920266 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" podUID="5bda5be9-3cbb-4005-8c2b-a740d8a7ab57" containerName="registry" containerID="cri-o://c5841f37747c1c44c8558f8149dbb1dab60f47777cd6528d332a6505ec664a1c" gracePeriod=30 Dec 08 09:05:59 crc kubenswrapper[4776]: I1208 09:05:59.807412 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xbmh8" Dec 08 09:05:59 crc kubenswrapper[4776]: I1208 09:05:59.807731 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xbmh8" Dec 08 09:05:59 crc kubenswrapper[4776]: I1208 09:05:59.863588 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xbmh8" Dec 08 09:06:00 crc kubenswrapper[4776]: I1208 09:06:00.016182 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-txmws" event={"ID":"aeec433a-2b07-4008-9829-d266f85b5cf1","Type":"ContainerStarted","Data":"8c6f63b1aa4718c195793bec1764692bef3b981983749c48140580366fd69a20"} Dec 08 09:06:00 crc kubenswrapper[4776]: I1208 09:06:00.019084 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dqj6b" event={"ID":"03a186d8-ec36-41ef-b882-f0cba34a0913","Type":"ContainerStarted","Data":"2f9ece35245eaa694dc17218af5bfe28dd299144916a7d778b296a38a8e3c8d2"} Dec 08 09:06:00 crc kubenswrapper[4776]: I1208 09:06:00.020648 4776 generic.go:334] "Generic (PLEG): container finished" podID="5bda5be9-3cbb-4005-8c2b-a740d8a7ab57" containerID="c5841f37747c1c44c8558f8149dbb1dab60f47777cd6528d332a6505ec664a1c" exitCode=0 Dec 
08 09:06:00 crc kubenswrapper[4776]: I1208 09:06:00.020684 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" event={"ID":"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57","Type":"ContainerDied","Data":"c5841f37747c1c44c8558f8149dbb1dab60f47777cd6528d332a6505ec664a1c"} Dec 08 09:06:00 crc kubenswrapper[4776]: I1208 09:06:00.037506 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-txmws" podStartSLOduration=3.37140214 podStartE2EDuration="9.037484021s" podCreationTimestamp="2025-12-08 09:05:51 +0000 UTC" firstStartedPulling="2025-12-08 09:05:53.960370033 +0000 UTC m=+430.223595055" lastFinishedPulling="2025-12-08 09:05:59.626451914 +0000 UTC m=+435.889676936" observedRunningTime="2025-12-08 09:06:00.033320488 +0000 UTC m=+436.296545500" watchObservedRunningTime="2025-12-08 09:06:00.037484021 +0000 UTC m=+436.300709043" Dec 08 09:06:00 crc kubenswrapper[4776]: I1208 09:06:00.051383 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dqj6b" podStartSLOduration=2.286939999 podStartE2EDuration="7.051367151s" podCreationTimestamp="2025-12-08 09:05:53 +0000 UTC" firstStartedPulling="2025-12-08 09:05:54.967474725 +0000 UTC m=+431.230699737" lastFinishedPulling="2025-12-08 09:05:59.731901867 +0000 UTC m=+435.995126889" observedRunningTime="2025-12-08 09:06:00.048540044 +0000 UTC m=+436.311765066" watchObservedRunningTime="2025-12-08 09:06:00.051367151 +0000 UTC m=+436.314592173" Dec 08 09:06:00 crc kubenswrapper[4776]: I1208 09:06:00.066450 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xbmh8" Dec 08 09:06:00 crc kubenswrapper[4776]: I1208 09:06:00.497288 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:06:00 crc kubenswrapper[4776]: I1208 09:06:00.677028 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5bda5be9-3cbb-4005-8c2b-a740d8a7ab57-registry-certificates\") pod \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " Dec 08 09:06:00 crc kubenswrapper[4776]: I1208 09:06:00.677090 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5bda5be9-3cbb-4005-8c2b-a740d8a7ab57-installation-pull-secrets\") pod \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " Dec 08 09:06:00 crc kubenswrapper[4776]: I1208 09:06:00.677111 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5bda5be9-3cbb-4005-8c2b-a740d8a7ab57-registry-tls\") pod \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " Dec 08 09:06:00 crc kubenswrapper[4776]: I1208 09:06:00.677136 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26659\" (UniqueName: \"kubernetes.io/projected/5bda5be9-3cbb-4005-8c2b-a740d8a7ab57-kube-api-access-26659\") pod \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " Dec 08 09:06:00 crc kubenswrapper[4776]: I1208 09:06:00.677315 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " Dec 08 09:06:00 crc kubenswrapper[4776]: I1208 09:06:00.677353 4776 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5bda5be9-3cbb-4005-8c2b-a740d8a7ab57-trusted-ca\") pod \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " Dec 08 09:06:00 crc kubenswrapper[4776]: I1208 09:06:00.677416 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5bda5be9-3cbb-4005-8c2b-a740d8a7ab57-bound-sa-token\") pod \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " Dec 08 09:06:00 crc kubenswrapper[4776]: I1208 09:06:00.677437 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5bda5be9-3cbb-4005-8c2b-a740d8a7ab57-ca-trust-extracted\") pod \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\" (UID: \"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57\") " Dec 08 09:06:00 crc kubenswrapper[4776]: I1208 09:06:00.682056 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bda5be9-3cbb-4005-8c2b-a740d8a7ab57-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:06:00 crc kubenswrapper[4776]: I1208 09:06:00.682769 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bda5be9-3cbb-4005-8c2b-a740d8a7ab57-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:06:00 crc kubenswrapper[4776]: I1208 09:06:00.688038 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bda5be9-3cbb-4005-8c2b-a740d8a7ab57-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:06:00 crc kubenswrapper[4776]: I1208 09:06:00.688604 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bda5be9-3cbb-4005-8c2b-a740d8a7ab57-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:06:00 crc kubenswrapper[4776]: I1208 09:06:00.695577 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bda5be9-3cbb-4005-8c2b-a740d8a7ab57-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:06:00 crc kubenswrapper[4776]: I1208 09:06:00.698018 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bda5be9-3cbb-4005-8c2b-a740d8a7ab57-kube-api-access-26659" (OuterVolumeSpecName: "kube-api-access-26659") pod "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57"). InnerVolumeSpecName "kube-api-access-26659". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:06:00 crc kubenswrapper[4776]: I1208 09:06:00.701253 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 08 09:06:00 crc kubenswrapper[4776]: I1208 09:06:00.710768 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bda5be9-3cbb-4005-8c2b-a740d8a7ab57-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57" (UID: "5bda5be9-3cbb-4005-8c2b-a740d8a7ab57"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:06:00 crc kubenswrapper[4776]: I1208 09:06:00.779061 4776 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5bda5be9-3cbb-4005-8c2b-a740d8a7ab57-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 08 09:06:00 crc kubenswrapper[4776]: I1208 09:06:00.779103 4776 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5bda5be9-3cbb-4005-8c2b-a740d8a7ab57-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 08 09:06:00 crc kubenswrapper[4776]: I1208 09:06:00.779113 4776 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5bda5be9-3cbb-4005-8c2b-a740d8a7ab57-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 08 09:06:00 crc kubenswrapper[4776]: I1208 09:06:00.779122 4776 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/5bda5be9-3cbb-4005-8c2b-a740d8a7ab57-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 08 09:06:00 crc kubenswrapper[4776]: I1208 09:06:00.779132 4776 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5bda5be9-3cbb-4005-8c2b-a740d8a7ab57-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 08 09:06:00 crc kubenswrapper[4776]: I1208 09:06:00.779139 4776 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5bda5be9-3cbb-4005-8c2b-a740d8a7ab57-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 08 09:06:00 crc kubenswrapper[4776]: I1208 09:06:00.779150 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26659\" (UniqueName: \"kubernetes.io/projected/5bda5be9-3cbb-4005-8c2b-a740d8a7ab57-kube-api-access-26659\") on node \"crc\" DevicePath \"\"" Dec 08 09:06:01 crc kubenswrapper[4776]: I1208 09:06:01.028518 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c4zt9" event={"ID":"30120f8b-a384-40f4-9211-ba3d8b3154f0","Type":"ContainerStarted","Data":"c072bb9ea89a1695150db0de7b3edd5f8bea6c8418fceda2d002495cdc036101"} Dec 08 09:06:01 crc kubenswrapper[4776]: I1208 09:06:01.029978 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" Dec 08 09:06:01 crc kubenswrapper[4776]: I1208 09:06:01.029968 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-p5sqv" event={"ID":"5bda5be9-3cbb-4005-8c2b-a740d8a7ab57","Type":"ContainerDied","Data":"462f722382159c5e98a210082c413621f3698f47b0506247320635e57459a17a"} Dec 08 09:06:01 crc kubenswrapper[4776]: I1208 09:06:01.030111 4776 scope.go:117] "RemoveContainer" containerID="c5841f37747c1c44c8558f8149dbb1dab60f47777cd6528d332a6505ec664a1c" Dec 08 09:06:01 crc kubenswrapper[4776]: I1208 09:06:01.066746 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-c4zt9" podStartSLOduration=3.877443875 podStartE2EDuration="10.06672447s" podCreationTimestamp="2025-12-08 09:05:51 +0000 UTC" firstStartedPulling="2025-12-08 09:05:53.960379733 +0000 UTC m=+430.223604755" lastFinishedPulling="2025-12-08 09:06:00.149660338 +0000 UTC m=+436.412885350" observedRunningTime="2025-12-08 09:06:01.062855883 +0000 UTC m=+437.326080905" watchObservedRunningTime="2025-12-08 09:06:01.06672447 +0000 UTC m=+437.329949492" Dec 08 09:06:01 crc kubenswrapper[4776]: I1208 09:06:01.606093 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-txmws" Dec 08 09:06:01 crc kubenswrapper[4776]: I1208 09:06:01.606962 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-txmws" Dec 08 09:06:01 crc kubenswrapper[4776]: I1208 09:06:01.647604 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-txmws" Dec 08 09:06:02 crc kubenswrapper[4776]: I1208 09:06:02.005729 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-p5sqv"] Dec 08 09:06:02 crc 
kubenswrapper[4776]: I1208 09:06:02.009954 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-p5sqv"] Dec 08 09:06:02 crc kubenswrapper[4776]: I1208 09:06:02.208622 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-c4zt9" Dec 08 09:06:02 crc kubenswrapper[4776]: I1208 09:06:02.209553 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-c4zt9" Dec 08 09:06:02 crc kubenswrapper[4776]: I1208 09:06:02.354033 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bda5be9-3cbb-4005-8c2b-a740d8a7ab57" path="/var/lib/kubelet/pods/5bda5be9-3cbb-4005-8c2b-a740d8a7ab57/volumes" Dec 08 09:06:03 crc kubenswrapper[4776]: I1208 09:06:03.249531 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-c4zt9" podUID="30120f8b-a384-40f4-9211-ba3d8b3154f0" containerName="registry-server" probeResult="failure" output=< Dec 08 09:06:03 crc kubenswrapper[4776]: timeout: failed to connect service ":50051" within 1s Dec 08 09:06:03 crc kubenswrapper[4776]: > Dec 08 09:06:04 crc kubenswrapper[4776]: I1208 09:06:04.021488 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dqj6b" Dec 08 09:06:04 crc kubenswrapper[4776]: I1208 09:06:04.022218 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dqj6b" Dec 08 09:06:04 crc kubenswrapper[4776]: I1208 09:06:04.066027 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dqj6b" Dec 08 09:06:05 crc kubenswrapper[4776]: I1208 09:06:05.107303 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dqj6b" Dec 08 09:06:11 crc 
kubenswrapper[4776]: I1208 09:06:11.669612 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-txmws" Dec 08 09:06:12 crc kubenswrapper[4776]: I1208 09:06:12.254859 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-c4zt9" Dec 08 09:06:12 crc kubenswrapper[4776]: I1208 09:06:12.293732 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-c4zt9" Dec 08 09:06:17 crc kubenswrapper[4776]: I1208 09:06:17.736421 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-rllj9"] Dec 08 09:06:17 crc kubenswrapper[4776]: E1208 09:06:17.737199 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bda5be9-3cbb-4005-8c2b-a740d8a7ab57" containerName="registry" Dec 08 09:06:17 crc kubenswrapper[4776]: I1208 09:06:17.737216 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bda5be9-3cbb-4005-8c2b-a740d8a7ab57" containerName="registry" Dec 08 09:06:17 crc kubenswrapper[4776]: I1208 09:06:17.737311 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bda5be9-3cbb-4005-8c2b-a740d8a7ab57" containerName="registry" Dec 08 09:06:17 crc kubenswrapper[4776]: I1208 09:06:17.737653 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-rllj9" Dec 08 09:06:17 crc kubenswrapper[4776]: I1208 09:06:17.739555 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Dec 08 09:06:17 crc kubenswrapper[4776]: I1208 09:06:17.740404 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Dec 08 09:06:17 crc kubenswrapper[4776]: I1208 09:06:17.740458 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Dec 08 09:06:17 crc kubenswrapper[4776]: I1208 09:06:17.742601 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-dockercfg-wwt9l" Dec 08 09:06:17 crc kubenswrapper[4776]: I1208 09:06:17.742737 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Dec 08 09:06:17 crc kubenswrapper[4776]: I1208 09:06:17.755307 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-rllj9"] Dec 08 09:06:17 crc kubenswrapper[4776]: I1208 09:06:17.882988 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/4ca421fc-c2f5-4e6c-a2cc-8967632bce0a-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-rllj9\" (UID: \"4ca421fc-c2f5-4e6c-a2cc-8967632bce0a\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-rllj9" Dec 08 09:06:17 crc kubenswrapper[4776]: I1208 09:06:17.883049 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4ca421fc-c2f5-4e6c-a2cc-8967632bce0a-cluster-monitoring-operator-tls\") pod 
\"cluster-monitoring-operator-6d5b84845-rllj9\" (UID: \"4ca421fc-c2f5-4e6c-a2cc-8967632bce0a\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-rllj9" Dec 08 09:06:17 crc kubenswrapper[4776]: I1208 09:06:17.883073 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvsjg\" (UniqueName: \"kubernetes.io/projected/4ca421fc-c2f5-4e6c-a2cc-8967632bce0a-kube-api-access-dvsjg\") pod \"cluster-monitoring-operator-6d5b84845-rllj9\" (UID: \"4ca421fc-c2f5-4e6c-a2cc-8967632bce0a\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-rllj9" Dec 08 09:06:17 crc kubenswrapper[4776]: I1208 09:06:17.984757 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4ca421fc-c2f5-4e6c-a2cc-8967632bce0a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-rllj9\" (UID: \"4ca421fc-c2f5-4e6c-a2cc-8967632bce0a\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-rllj9" Dec 08 09:06:17 crc kubenswrapper[4776]: I1208 09:06:17.984827 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvsjg\" (UniqueName: \"kubernetes.io/projected/4ca421fc-c2f5-4e6c-a2cc-8967632bce0a-kube-api-access-dvsjg\") pod \"cluster-monitoring-operator-6d5b84845-rllj9\" (UID: \"4ca421fc-c2f5-4e6c-a2cc-8967632bce0a\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-rllj9" Dec 08 09:06:17 crc kubenswrapper[4776]: I1208 09:06:17.984921 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/4ca421fc-c2f5-4e6c-a2cc-8967632bce0a-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-rllj9\" (UID: \"4ca421fc-c2f5-4e6c-a2cc-8967632bce0a\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-rllj9" Dec 08 09:06:17 crc 
kubenswrapper[4776]: I1208 09:06:17.986139 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/4ca421fc-c2f5-4e6c-a2cc-8967632bce0a-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-rllj9\" (UID: \"4ca421fc-c2f5-4e6c-a2cc-8967632bce0a\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-rllj9" Dec 08 09:06:17 crc kubenswrapper[4776]: I1208 09:06:17.993166 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4ca421fc-c2f5-4e6c-a2cc-8967632bce0a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-rllj9\" (UID: \"4ca421fc-c2f5-4e6c-a2cc-8967632bce0a\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-rllj9" Dec 08 09:06:18 crc kubenswrapper[4776]: I1208 09:06:18.005016 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvsjg\" (UniqueName: \"kubernetes.io/projected/4ca421fc-c2f5-4e6c-a2cc-8967632bce0a-kube-api-access-dvsjg\") pod \"cluster-monitoring-operator-6d5b84845-rllj9\" (UID: \"4ca421fc-c2f5-4e6c-a2cc-8967632bce0a\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-rllj9" Dec 08 09:06:18 crc kubenswrapper[4776]: I1208 09:06:18.066779 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-rllj9" Dec 08 09:06:18 crc kubenswrapper[4776]: I1208 09:06:18.492823 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-rllj9"] Dec 08 09:06:18 crc kubenswrapper[4776]: W1208 09:06:18.494318 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ca421fc_c2f5_4e6c_a2cc_8967632bce0a.slice/crio-2c12d4c87a8903b9d0a0164e2d0b88d6edbcf2fe693c54cc8ad68b135429c9bf WatchSource:0}: Error finding container 2c12d4c87a8903b9d0a0164e2d0b88d6edbcf2fe693c54cc8ad68b135429c9bf: Status 404 returned error can't find the container with id 2c12d4c87a8903b9d0a0164e2d0b88d6edbcf2fe693c54cc8ad68b135429c9bf Dec 08 09:06:19 crc kubenswrapper[4776]: I1208 09:06:19.130087 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-rllj9" event={"ID":"4ca421fc-c2f5-4e6c-a2cc-8967632bce0a","Type":"ContainerStarted","Data":"2c12d4c87a8903b9d0a0164e2d0b88d6edbcf2fe693c54cc8ad68b135429c9bf"} Dec 08 09:06:22 crc kubenswrapper[4776]: I1208 09:06:22.139488 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8fxz9"] Dec 08 09:06:22 crc kubenswrapper[4776]: I1208 09:06:22.141824 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8fxz9" Dec 08 09:06:22 crc kubenswrapper[4776]: I1208 09:06:22.144800 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-dockercfg-zt64w" Dec 08 09:06:22 crc kubenswrapper[4776]: I1208 09:06:22.146371 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-rllj9" event={"ID":"4ca421fc-c2f5-4e6c-a2cc-8967632bce0a","Type":"ContainerStarted","Data":"0bdabf9e78c23be997f20a2b62437652ca91d68c735c3c3332a6523266bd1b38"} Dec 08 09:06:22 crc kubenswrapper[4776]: I1208 09:06:22.155250 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Dec 08 09:06:22 crc kubenswrapper[4776]: I1208 09:06:22.157120 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8fxz9"] Dec 08 09:06:22 crc kubenswrapper[4776]: I1208 09:06:22.196473 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-rllj9" podStartSLOduration=2.1012155359999998 podStartE2EDuration="5.196458388s" podCreationTimestamp="2025-12-08 09:06:17 +0000 UTC" firstStartedPulling="2025-12-08 09:06:18.495918216 +0000 UTC m=+454.759143238" lastFinishedPulling="2025-12-08 09:06:21.591161048 +0000 UTC m=+457.854386090" observedRunningTime="2025-12-08 09:06:22.193506518 +0000 UTC m=+458.456731540" watchObservedRunningTime="2025-12-08 09:06:22.196458388 +0000 UTC m=+458.459683410" Dec 08 09:06:22 crc kubenswrapper[4776]: I1208 09:06:22.242917 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/9884d687-dd63-45af-8727-9f80e784452e-tls-certificates\") pod 
\"prometheus-operator-admission-webhook-f54c54754-8fxz9\" (UID: \"9884d687-dd63-45af-8727-9f80e784452e\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8fxz9" Dec 08 09:06:22 crc kubenswrapper[4776]: I1208 09:06:22.343765 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/9884d687-dd63-45af-8727-9f80e784452e-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-8fxz9\" (UID: \"9884d687-dd63-45af-8727-9f80e784452e\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8fxz9" Dec 08 09:06:22 crc kubenswrapper[4776]: I1208 09:06:22.349066 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/9884d687-dd63-45af-8727-9f80e784452e-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-8fxz9\" (UID: \"9884d687-dd63-45af-8727-9f80e784452e\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8fxz9" Dec 08 09:06:22 crc kubenswrapper[4776]: I1208 09:06:22.467797 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8fxz9" Dec 08 09:06:22 crc kubenswrapper[4776]: I1208 09:06:22.912274 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8fxz9"] Dec 08 09:06:22 crc kubenswrapper[4776]: I1208 09:06:22.921511 4776 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 08 09:06:23 crc kubenswrapper[4776]: I1208 09:06:23.152062 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8fxz9" event={"ID":"9884d687-dd63-45af-8727-9f80e784452e","Type":"ContainerStarted","Data":"909c06b24a720e4228613ea234216d459966a10a82de8cd073016cbc7968c26e"} Dec 08 09:06:26 crc kubenswrapper[4776]: I1208 09:06:26.169029 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8fxz9" event={"ID":"9884d687-dd63-45af-8727-9f80e784452e","Type":"ContainerStarted","Data":"67d8fd08e0df6a788e8dd46164863889a1b138c39d3f2d3fdad4e02f6a9d9a2d"} Dec 08 09:06:26 crc kubenswrapper[4776]: I1208 09:06:26.169356 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8fxz9" Dec 08 09:06:26 crc kubenswrapper[4776]: I1208 09:06:26.177020 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8fxz9" Dec 08 09:06:26 crc kubenswrapper[4776]: I1208 09:06:26.182361 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8fxz9" podStartSLOduration=1.937995236 podStartE2EDuration="4.182340812s" podCreationTimestamp="2025-12-08 09:06:22 +0000 UTC" firstStartedPulling="2025-12-08 09:06:22.921297257 +0000 
UTC m=+459.184522279" lastFinishedPulling="2025-12-08 09:06:25.165642823 +0000 UTC m=+461.428867855" observedRunningTime="2025-12-08 09:06:26.182113356 +0000 UTC m=+462.445338378" watchObservedRunningTime="2025-12-08 09:06:26.182340812 +0000 UTC m=+462.445565834" Dec 08 09:06:27 crc kubenswrapper[4776]: I1208 09:06:27.201004 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-z7zlc"] Dec 08 09:06:27 crc kubenswrapper[4776]: I1208 09:06:27.202907 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-db54df47d-z7zlc" Dec 08 09:06:27 crc kubenswrapper[4776]: I1208 09:06:27.207453 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-x7cvm" Dec 08 09:06:27 crc kubenswrapper[4776]: I1208 09:06:27.207523 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Dec 08 09:06:27 crc kubenswrapper[4776]: I1208 09:06:27.207455 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Dec 08 09:06:27 crc kubenswrapper[4776]: I1208 09:06:27.208254 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Dec 08 09:06:27 crc kubenswrapper[4776]: I1208 09:06:27.213887 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-z7zlc"] Dec 08 09:06:27 crc kubenswrapper[4776]: I1208 09:06:27.309875 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw5qk\" (UniqueName: \"kubernetes.io/projected/77f5bf58-670f-470c-8ce9-416a3c52ad22-kube-api-access-pw5qk\") pod \"prometheus-operator-db54df47d-z7zlc\" (UID: \"77f5bf58-670f-470c-8ce9-416a3c52ad22\") " 
pod="openshift-monitoring/prometheus-operator-db54df47d-z7zlc" Dec 08 09:06:27 crc kubenswrapper[4776]: I1208 09:06:27.309934 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/77f5bf58-670f-470c-8ce9-416a3c52ad22-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-z7zlc\" (UID: \"77f5bf58-670f-470c-8ce9-416a3c52ad22\") " pod="openshift-monitoring/prometheus-operator-db54df47d-z7zlc" Dec 08 09:06:27 crc kubenswrapper[4776]: I1208 09:06:27.309953 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/77f5bf58-670f-470c-8ce9-416a3c52ad22-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-z7zlc\" (UID: \"77f5bf58-670f-470c-8ce9-416a3c52ad22\") " pod="openshift-monitoring/prometheus-operator-db54df47d-z7zlc" Dec 08 09:06:27 crc kubenswrapper[4776]: I1208 09:06:27.310011 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/77f5bf58-670f-470c-8ce9-416a3c52ad22-metrics-client-ca\") pod \"prometheus-operator-db54df47d-z7zlc\" (UID: \"77f5bf58-670f-470c-8ce9-416a3c52ad22\") " pod="openshift-monitoring/prometheus-operator-db54df47d-z7zlc" Dec 08 09:06:27 crc kubenswrapper[4776]: I1208 09:06:27.411467 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/77f5bf58-670f-470c-8ce9-416a3c52ad22-metrics-client-ca\") pod \"prometheus-operator-db54df47d-z7zlc\" (UID: \"77f5bf58-670f-470c-8ce9-416a3c52ad22\") " pod="openshift-monitoring/prometheus-operator-db54df47d-z7zlc" Dec 08 09:06:27 crc kubenswrapper[4776]: I1208 09:06:27.411613 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-pw5qk\" (UniqueName: \"kubernetes.io/projected/77f5bf58-670f-470c-8ce9-416a3c52ad22-kube-api-access-pw5qk\") pod \"prometheus-operator-db54df47d-z7zlc\" (UID: \"77f5bf58-670f-470c-8ce9-416a3c52ad22\") " pod="openshift-monitoring/prometheus-operator-db54df47d-z7zlc" Dec 08 09:06:27 crc kubenswrapper[4776]: I1208 09:06:27.411652 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/77f5bf58-670f-470c-8ce9-416a3c52ad22-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-z7zlc\" (UID: \"77f5bf58-670f-470c-8ce9-416a3c52ad22\") " pod="openshift-monitoring/prometheus-operator-db54df47d-z7zlc" Dec 08 09:06:27 crc kubenswrapper[4776]: I1208 09:06:27.411678 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/77f5bf58-670f-470c-8ce9-416a3c52ad22-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-z7zlc\" (UID: \"77f5bf58-670f-470c-8ce9-416a3c52ad22\") " pod="openshift-monitoring/prometheus-operator-db54df47d-z7zlc" Dec 08 09:06:27 crc kubenswrapper[4776]: E1208 09:06:27.411867 4776 secret.go:188] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Dec 08 09:06:27 crc kubenswrapper[4776]: E1208 09:06:27.411983 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77f5bf58-670f-470c-8ce9-416a3c52ad22-prometheus-operator-tls podName:77f5bf58-670f-470c-8ce9-416a3c52ad22 nodeName:}" failed. No retries permitted until 2025-12-08 09:06:27.911963343 +0000 UTC m=+464.175188365 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/77f5bf58-670f-470c-8ce9-416a3c52ad22-prometheus-operator-tls") pod "prometheus-operator-db54df47d-z7zlc" (UID: "77f5bf58-670f-470c-8ce9-416a3c52ad22") : secret "prometheus-operator-tls" not found Dec 08 09:06:27 crc kubenswrapper[4776]: I1208 09:06:27.412464 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/77f5bf58-670f-470c-8ce9-416a3c52ad22-metrics-client-ca\") pod \"prometheus-operator-db54df47d-z7zlc\" (UID: \"77f5bf58-670f-470c-8ce9-416a3c52ad22\") " pod="openshift-monitoring/prometheus-operator-db54df47d-z7zlc" Dec 08 09:06:27 crc kubenswrapper[4776]: I1208 09:06:27.419135 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/77f5bf58-670f-470c-8ce9-416a3c52ad22-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-z7zlc\" (UID: \"77f5bf58-670f-470c-8ce9-416a3c52ad22\") " pod="openshift-monitoring/prometheus-operator-db54df47d-z7zlc" Dec 08 09:06:27 crc kubenswrapper[4776]: I1208 09:06:27.432553 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw5qk\" (UniqueName: \"kubernetes.io/projected/77f5bf58-670f-470c-8ce9-416a3c52ad22-kube-api-access-pw5qk\") pod \"prometheus-operator-db54df47d-z7zlc\" (UID: \"77f5bf58-670f-470c-8ce9-416a3c52ad22\") " pod="openshift-monitoring/prometheus-operator-db54df47d-z7zlc" Dec 08 09:06:27 crc kubenswrapper[4776]: I1208 09:06:27.917687 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/77f5bf58-670f-470c-8ce9-416a3c52ad22-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-z7zlc\" (UID: \"77f5bf58-670f-470c-8ce9-416a3c52ad22\") " 
pod="openshift-monitoring/prometheus-operator-db54df47d-z7zlc" Dec 08 09:06:27 crc kubenswrapper[4776]: I1208 09:06:27.922830 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/77f5bf58-670f-470c-8ce9-416a3c52ad22-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-z7zlc\" (UID: \"77f5bf58-670f-470c-8ce9-416a3c52ad22\") " pod="openshift-monitoring/prometheus-operator-db54df47d-z7zlc" Dec 08 09:06:28 crc kubenswrapper[4776]: I1208 09:06:28.120628 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-db54df47d-z7zlc" Dec 08 09:06:28 crc kubenswrapper[4776]: I1208 09:06:28.526850 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-z7zlc"] Dec 08 09:06:29 crc kubenswrapper[4776]: I1208 09:06:29.185191 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-z7zlc" event={"ID":"77f5bf58-670f-470c-8ce9-416a3c52ad22","Type":"ContainerStarted","Data":"d88b0dc2208d341cb23a7cdf31629d62eb9ed31dd8deaec2427554188011ead8"} Dec 08 09:06:31 crc kubenswrapper[4776]: I1208 09:06:31.195831 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-z7zlc" event={"ID":"77f5bf58-670f-470c-8ce9-416a3c52ad22","Type":"ContainerStarted","Data":"ef6a5888bfac4090f27084c2f8cccc1d1397b9195f919c9d94b98d86ecee3181"} Dec 08 09:06:32 crc kubenswrapper[4776]: I1208 09:06:32.204305 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-z7zlc" event={"ID":"77f5bf58-670f-470c-8ce9-416a3c52ad22","Type":"ContainerStarted","Data":"9e3bdcadce365ecac510c2b5f572319e4ba87fc676a3260aecc8344d0bb587b4"} Dec 08 09:06:32 crc kubenswrapper[4776]: I1208 09:06:32.222068 4776 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-monitoring/prometheus-operator-db54df47d-z7zlc" podStartSLOduration=2.732703787 podStartE2EDuration="5.222046641s" podCreationTimestamp="2025-12-08 09:06:27 +0000 UTC" firstStartedPulling="2025-12-08 09:06:28.53689545 +0000 UTC m=+464.800120472" lastFinishedPulling="2025-12-08 09:06:31.026238304 +0000 UTC m=+467.289463326" observedRunningTime="2025-12-08 09:06:32.219500271 +0000 UTC m=+468.482725293" watchObservedRunningTime="2025-12-08 09:06:32.222046641 +0000 UTC m=+468.485271663" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.576272 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-mpf6m"] Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.577484 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-mpf6m" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.608236 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-f6rkz" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.608502 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.623537 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.625523 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-5hvn2"] Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.626985 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-566fddb674-5hvn2" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.630586 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-5hvn2"] Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.635621 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-77pz7" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.635873 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.635953 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.637272 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-65pk7"] Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.638597 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-65pk7" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.640998 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.642366 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.642625 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.642873 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-r9kbv" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.652690 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-65pk7"] Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.710886 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/50f69926-f0e6-404e-b2ac-e95320410132-sys\") pod \"node-exporter-mpf6m\" (UID: \"50f69926-f0e6-404e-b2ac-e95320410132\") " pod="openshift-monitoring/node-exporter-mpf6m" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.710943 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/50f69926-f0e6-404e-b2ac-e95320410132-metrics-client-ca\") pod \"node-exporter-mpf6m\" (UID: \"50f69926-f0e6-404e-b2ac-e95320410132\") " pod="openshift-monitoring/node-exporter-mpf6m" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.710975 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/281eee4e-e67e-4157-8de8-3738ca845b09-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-5hvn2\" (UID: \"281eee4e-e67e-4157-8de8-3738ca845b09\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-5hvn2" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.710994 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jzxg\" (UniqueName: \"kubernetes.io/projected/50f69926-f0e6-404e-b2ac-e95320410132-kube-api-access-7jzxg\") pod \"node-exporter-mpf6m\" (UID: \"50f69926-f0e6-404e-b2ac-e95320410132\") " pod="openshift-monitoring/node-exporter-mpf6m" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.711158 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/50f69926-f0e6-404e-b2ac-e95320410132-node-exporter-tls\") pod \"node-exporter-mpf6m\" (UID: \"50f69926-f0e6-404e-b2ac-e95320410132\") " pod="openshift-monitoring/node-exporter-mpf6m" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.711236 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5lxx\" (UniqueName: \"kubernetes.io/projected/281eee4e-e67e-4157-8de8-3738ca845b09-kube-api-access-q5lxx\") pod \"openshift-state-metrics-566fddb674-5hvn2\" (UID: \"281eee4e-e67e-4157-8de8-3738ca845b09\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-5hvn2" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.711267 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/50f69926-f0e6-404e-b2ac-e95320410132-root\") pod \"node-exporter-mpf6m\" (UID: \"50f69926-f0e6-404e-b2ac-e95320410132\") " pod="openshift-monitoring/node-exporter-mpf6m" Dec 08 09:06:34 crc 
kubenswrapper[4776]: I1208 09:06:34.711297 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/50f69926-f0e6-404e-b2ac-e95320410132-node-exporter-wtmp\") pod \"node-exporter-mpf6m\" (UID: \"50f69926-f0e6-404e-b2ac-e95320410132\") " pod="openshift-monitoring/node-exporter-mpf6m" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.711325 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/50f69926-f0e6-404e-b2ac-e95320410132-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-mpf6m\" (UID: \"50f69926-f0e6-404e-b2ac-e95320410132\") " pod="openshift-monitoring/node-exporter-mpf6m" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.711366 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/281eee4e-e67e-4157-8de8-3738ca845b09-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-5hvn2\" (UID: \"281eee4e-e67e-4157-8de8-3738ca845b09\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-5hvn2" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.711394 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/50f69926-f0e6-404e-b2ac-e95320410132-node-exporter-textfile\") pod \"node-exporter-mpf6m\" (UID: \"50f69926-f0e6-404e-b2ac-e95320410132\") " pod="openshift-monitoring/node-exporter-mpf6m" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.711436 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/281eee4e-e67e-4157-8de8-3738ca845b09-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-5hvn2\" (UID: \"281eee4e-e67e-4157-8de8-3738ca845b09\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-5hvn2" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.812152 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/882f75c0-25ca-4dee-af83-c4bf5cf295e6-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-65pk7\" (UID: \"882f75c0-25ca-4dee-af83-c4bf5cf295e6\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-65pk7" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.812217 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/50f69926-f0e6-404e-b2ac-e95320410132-node-exporter-tls\") pod \"node-exporter-mpf6m\" (UID: \"50f69926-f0e6-404e-b2ac-e95320410132\") " pod="openshift-monitoring/node-exporter-mpf6m" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.812272 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/882f75c0-25ca-4dee-af83-c4bf5cf295e6-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-65pk7\" (UID: \"882f75c0-25ca-4dee-af83-c4bf5cf295e6\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-65pk7" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.812317 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5lxx\" (UniqueName: \"kubernetes.io/projected/281eee4e-e67e-4157-8de8-3738ca845b09-kube-api-access-q5lxx\") pod \"openshift-state-metrics-566fddb674-5hvn2\" (UID: \"281eee4e-e67e-4157-8de8-3738ca845b09\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-5hvn2" Dec 08 09:06:34 crc 
kubenswrapper[4776]: I1208 09:06:34.812339 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbdtb\" (UniqueName: \"kubernetes.io/projected/882f75c0-25ca-4dee-af83-c4bf5cf295e6-kube-api-access-fbdtb\") pod \"kube-state-metrics-777cb5bd5d-65pk7\" (UID: \"882f75c0-25ca-4dee-af83-c4bf5cf295e6\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-65pk7" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.812362 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/50f69926-f0e6-404e-b2ac-e95320410132-root\") pod \"node-exporter-mpf6m\" (UID: \"50f69926-f0e6-404e-b2ac-e95320410132\") " pod="openshift-monitoring/node-exporter-mpf6m" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.812379 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/50f69926-f0e6-404e-b2ac-e95320410132-node-exporter-wtmp\") pod \"node-exporter-mpf6m\" (UID: \"50f69926-f0e6-404e-b2ac-e95320410132\") " pod="openshift-monitoring/node-exporter-mpf6m" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.812396 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/50f69926-f0e6-404e-b2ac-e95320410132-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-mpf6m\" (UID: \"50f69926-f0e6-404e-b2ac-e95320410132\") " pod="openshift-monitoring/node-exporter-mpf6m" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.812416 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/882f75c0-25ca-4dee-af83-c4bf5cf295e6-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-65pk7\" (UID: 
\"882f75c0-25ca-4dee-af83-c4bf5cf295e6\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-65pk7" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.812436 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/882f75c0-25ca-4dee-af83-c4bf5cf295e6-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-65pk7\" (UID: \"882f75c0-25ca-4dee-af83-c4bf5cf295e6\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-65pk7" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.812456 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/281eee4e-e67e-4157-8de8-3738ca845b09-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-5hvn2\" (UID: \"281eee4e-e67e-4157-8de8-3738ca845b09\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-5hvn2" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.812476 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/50f69926-f0e6-404e-b2ac-e95320410132-node-exporter-textfile\") pod \"node-exporter-mpf6m\" (UID: \"50f69926-f0e6-404e-b2ac-e95320410132\") " pod="openshift-monitoring/node-exporter-mpf6m" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.812478 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/50f69926-f0e6-404e-b2ac-e95320410132-root\") pod \"node-exporter-mpf6m\" (UID: \"50f69926-f0e6-404e-b2ac-e95320410132\") " pod="openshift-monitoring/node-exporter-mpf6m" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.812668 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: 
\"kubernetes.io/host-path/50f69926-f0e6-404e-b2ac-e95320410132-node-exporter-wtmp\") pod \"node-exporter-mpf6m\" (UID: \"50f69926-f0e6-404e-b2ac-e95320410132\") " pod="openshift-monitoring/node-exporter-mpf6m" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.812491 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/281eee4e-e67e-4157-8de8-3738ca845b09-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-5hvn2\" (UID: \"281eee4e-e67e-4157-8de8-3738ca845b09\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-5hvn2" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.813260 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/50f69926-f0e6-404e-b2ac-e95320410132-node-exporter-textfile\") pod \"node-exporter-mpf6m\" (UID: \"50f69926-f0e6-404e-b2ac-e95320410132\") " pod="openshift-monitoring/node-exporter-mpf6m" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.813395 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/882f75c0-25ca-4dee-af83-c4bf5cf295e6-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-65pk7\" (UID: \"882f75c0-25ca-4dee-af83-c4bf5cf295e6\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-65pk7" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.813467 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/281eee4e-e67e-4157-8de8-3738ca845b09-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-5hvn2\" (UID: \"281eee4e-e67e-4157-8de8-3738ca845b09\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-5hvn2" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.813479 
4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/50f69926-f0e6-404e-b2ac-e95320410132-sys\") pod \"node-exporter-mpf6m\" (UID: \"50f69926-f0e6-404e-b2ac-e95320410132\") " pod="openshift-monitoring/node-exporter-mpf6m" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.813505 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/50f69926-f0e6-404e-b2ac-e95320410132-sys\") pod \"node-exporter-mpf6m\" (UID: \"50f69926-f0e6-404e-b2ac-e95320410132\") " pod="openshift-monitoring/node-exporter-mpf6m" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.813526 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/50f69926-f0e6-404e-b2ac-e95320410132-metrics-client-ca\") pod \"node-exporter-mpf6m\" (UID: \"50f69926-f0e6-404e-b2ac-e95320410132\") " pod="openshift-monitoring/node-exporter-mpf6m" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.813568 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/281eee4e-e67e-4157-8de8-3738ca845b09-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-5hvn2\" (UID: \"281eee4e-e67e-4157-8de8-3738ca845b09\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-5hvn2" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.813587 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jzxg\" (UniqueName: \"kubernetes.io/projected/50f69926-f0e6-404e-b2ac-e95320410132-kube-api-access-7jzxg\") pod \"node-exporter-mpf6m\" (UID: \"50f69926-f0e6-404e-b2ac-e95320410132\") " pod="openshift-monitoring/node-exporter-mpf6m" Dec 08 09:06:34 crc kubenswrapper[4776]: E1208 09:06:34.814469 4776 secret.go:188] Couldn't get secret 
openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Dec 08 09:06:34 crc kubenswrapper[4776]: E1208 09:06:34.814545 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/281eee4e-e67e-4157-8de8-3738ca845b09-openshift-state-metrics-tls podName:281eee4e-e67e-4157-8de8-3738ca845b09 nodeName:}" failed. No retries permitted until 2025-12-08 09:06:35.314517294 +0000 UTC m=+471.577742316 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/281eee4e-e67e-4157-8de8-3738ca845b09-openshift-state-metrics-tls") pod "openshift-state-metrics-566fddb674-5hvn2" (UID: "281eee4e-e67e-4157-8de8-3738ca845b09") : secret "openshift-state-metrics-tls" not found Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.814851 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/50f69926-f0e6-404e-b2ac-e95320410132-metrics-client-ca\") pod \"node-exporter-mpf6m\" (UID: \"50f69926-f0e6-404e-b2ac-e95320410132\") " pod="openshift-monitoring/node-exporter-mpf6m" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.821376 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/50f69926-f0e6-404e-b2ac-e95320410132-node-exporter-tls\") pod \"node-exporter-mpf6m\" (UID: \"50f69926-f0e6-404e-b2ac-e95320410132\") " pod="openshift-monitoring/node-exporter-mpf6m" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.831095 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5lxx\" (UniqueName: \"kubernetes.io/projected/281eee4e-e67e-4157-8de8-3738ca845b09-kube-api-access-q5lxx\") pod \"openshift-state-metrics-566fddb674-5hvn2\" (UID: \"281eee4e-e67e-4157-8de8-3738ca845b09\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-5hvn2" Dec 08 
09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.835075 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jzxg\" (UniqueName: \"kubernetes.io/projected/50f69926-f0e6-404e-b2ac-e95320410132-kube-api-access-7jzxg\") pod \"node-exporter-mpf6m\" (UID: \"50f69926-f0e6-404e-b2ac-e95320410132\") " pod="openshift-monitoring/node-exporter-mpf6m" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.837587 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/50f69926-f0e6-404e-b2ac-e95320410132-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-mpf6m\" (UID: \"50f69926-f0e6-404e-b2ac-e95320410132\") " pod="openshift-monitoring/node-exporter-mpf6m" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.843709 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/281eee4e-e67e-4157-8de8-3738ca845b09-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-5hvn2\" (UID: \"281eee4e-e67e-4157-8de8-3738ca845b09\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-5hvn2" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.914489 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/882f75c0-25ca-4dee-af83-c4bf5cf295e6-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-65pk7\" (UID: \"882f75c0-25ca-4dee-af83-c4bf5cf295e6\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-65pk7" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.914852 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/882f75c0-25ca-4dee-af83-c4bf5cf295e6-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-65pk7\" (UID: \"882f75c0-25ca-4dee-af83-c4bf5cf295e6\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-65pk7" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.914919 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/882f75c0-25ca-4dee-af83-c4bf5cf295e6-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-65pk7\" (UID: \"882f75c0-25ca-4dee-af83-c4bf5cf295e6\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-65pk7" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.915001 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/882f75c0-25ca-4dee-af83-c4bf5cf295e6-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-65pk7\" (UID: \"882f75c0-25ca-4dee-af83-c4bf5cf295e6\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-65pk7" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.915034 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/882f75c0-25ca-4dee-af83-c4bf5cf295e6-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-65pk7\" (UID: \"882f75c0-25ca-4dee-af83-c4bf5cf295e6\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-65pk7" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.915061 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbdtb\" (UniqueName: \"kubernetes.io/projected/882f75c0-25ca-4dee-af83-c4bf5cf295e6-kube-api-access-fbdtb\") pod \"kube-state-metrics-777cb5bd5d-65pk7\" (UID: \"882f75c0-25ca-4dee-af83-c4bf5cf295e6\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-65pk7" Dec 08 
09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.915734 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/882f75c0-25ca-4dee-af83-c4bf5cf295e6-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-65pk7\" (UID: \"882f75c0-25ca-4dee-af83-c4bf5cf295e6\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-65pk7" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.915821 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/882f75c0-25ca-4dee-af83-c4bf5cf295e6-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-65pk7\" (UID: \"882f75c0-25ca-4dee-af83-c4bf5cf295e6\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-65pk7" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.916454 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/882f75c0-25ca-4dee-af83-c4bf5cf295e6-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-65pk7\" (UID: \"882f75c0-25ca-4dee-af83-c4bf5cf295e6\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-65pk7" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.918220 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/882f75c0-25ca-4dee-af83-c4bf5cf295e6-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-65pk7\" (UID: \"882f75c0-25ca-4dee-af83-c4bf5cf295e6\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-65pk7" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.918345 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/882f75c0-25ca-4dee-af83-c4bf5cf295e6-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-65pk7\" (UID: \"882f75c0-25ca-4dee-af83-c4bf5cf295e6\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-65pk7" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.928244 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-mpf6m" Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.943818 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbdtb\" (UniqueName: \"kubernetes.io/projected/882f75c0-25ca-4dee-af83-c4bf5cf295e6-kube-api-access-fbdtb\") pod \"kube-state-metrics-777cb5bd5d-65pk7\" (UID: \"882f75c0-25ca-4dee-af83-c4bf5cf295e6\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-65pk7" Dec 08 09:06:34 crc kubenswrapper[4776]: W1208 09:06:34.948044 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50f69926_f0e6_404e_b2ac_e95320410132.slice/crio-9a615ffc44670f052d19c3374a1e895f512b8b607b534f568ad1aef45722dcbd WatchSource:0}: Error finding container 9a615ffc44670f052d19c3374a1e895f512b8b607b534f568ad1aef45722dcbd: Status 404 returned error can't find the container with id 9a615ffc44670f052d19c3374a1e895f512b8b607b534f568ad1aef45722dcbd Dec 08 09:06:34 crc kubenswrapper[4776]: I1208 09:06:34.959206 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-65pk7" Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.220213 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-mpf6m" event={"ID":"50f69926-f0e6-404e-b2ac-e95320410132","Type":"ContainerStarted","Data":"9a615ffc44670f052d19c3374a1e895f512b8b607b534f568ad1aef45722dcbd"} Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.320304 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/281eee4e-e67e-4157-8de8-3738ca845b09-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-5hvn2\" (UID: \"281eee4e-e67e-4157-8de8-3738ca845b09\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-5hvn2" Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.329418 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/281eee4e-e67e-4157-8de8-3738ca845b09-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-5hvn2\" (UID: \"281eee4e-e67e-4157-8de8-3738ca845b09\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-5hvn2" Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.349920 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-65pk7"] Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.551738 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-566fddb674-5hvn2" Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.662765 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.671463 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.676123 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.677058 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.677217 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.677295 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.677407 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.681249 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.681406 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.681498 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls" Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.681524 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.688238 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-m7wj2" Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.835944 4776 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e0db156e-b1cd-4b14-9ba1-8027e5516672-web-config\") pod \"alertmanager-main-0\" (UID: \"e0db156e-b1cd-4b14-9ba1-8027e5516672\") " pod="openshift-monitoring/alertmanager-main-0" Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.836001 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e0db156e-b1cd-4b14-9ba1-8027e5516672-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"e0db156e-b1cd-4b14-9ba1-8027e5516672\") " pod="openshift-monitoring/alertmanager-main-0" Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.836078 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0db156e-b1cd-4b14-9ba1-8027e5516672-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"e0db156e-b1cd-4b14-9ba1-8027e5516672\") " pod="openshift-monitoring/alertmanager-main-0" Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.836166 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e0db156e-b1cd-4b14-9ba1-8027e5516672-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"e0db156e-b1cd-4b14-9ba1-8027e5516672\") " pod="openshift-monitoring/alertmanager-main-0" Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.836280 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e0db156e-b1cd-4b14-9ba1-8027e5516672-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: 
\"e0db156e-b1cd-4b14-9ba1-8027e5516672\") " pod="openshift-monitoring/alertmanager-main-0" Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.836339 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e0db156e-b1cd-4b14-9ba1-8027e5516672-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"e0db156e-b1cd-4b14-9ba1-8027e5516672\") " pod="openshift-monitoring/alertmanager-main-0" Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.836411 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e0db156e-b1cd-4b14-9ba1-8027e5516672-config-out\") pod \"alertmanager-main-0\" (UID: \"e0db156e-b1cd-4b14-9ba1-8027e5516672\") " pod="openshift-monitoring/alertmanager-main-0" Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.836441 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e0db156e-b1cd-4b14-9ba1-8027e5516672-tls-assets\") pod \"alertmanager-main-0\" (UID: \"e0db156e-b1cd-4b14-9ba1-8027e5516672\") " pod="openshift-monitoring/alertmanager-main-0" Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.836507 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9jwg\" (UniqueName: \"kubernetes.io/projected/e0db156e-b1cd-4b14-9ba1-8027e5516672-kube-api-access-k9jwg\") pod \"alertmanager-main-0\" (UID: \"e0db156e-b1cd-4b14-9ba1-8027e5516672\") " pod="openshift-monitoring/alertmanager-main-0" Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.836533 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: 
\"kubernetes.io/secret/e0db156e-b1cd-4b14-9ba1-8027e5516672-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"e0db156e-b1cd-4b14-9ba1-8027e5516672\") " pod="openshift-monitoring/alertmanager-main-0" Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.836587 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e0db156e-b1cd-4b14-9ba1-8027e5516672-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"e0db156e-b1cd-4b14-9ba1-8027e5516672\") " pod="openshift-monitoring/alertmanager-main-0" Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.836615 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e0db156e-b1cd-4b14-9ba1-8027e5516672-config-volume\") pod \"alertmanager-main-0\" (UID: \"e0db156e-b1cd-4b14-9ba1-8027e5516672\") " pod="openshift-monitoring/alertmanager-main-0" Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.937731 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e0db156e-b1cd-4b14-9ba1-8027e5516672-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"e0db156e-b1cd-4b14-9ba1-8027e5516672\") " pod="openshift-monitoring/alertmanager-main-0" Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.937776 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e0db156e-b1cd-4b14-9ba1-8027e5516672-config-volume\") pod \"alertmanager-main-0\" (UID: \"e0db156e-b1cd-4b14-9ba1-8027e5516672\") " pod="openshift-monitoring/alertmanager-main-0" Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.937819 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e0db156e-b1cd-4b14-9ba1-8027e5516672-web-config\") pod \"alertmanager-main-0\" (UID: \"e0db156e-b1cd-4b14-9ba1-8027e5516672\") " pod="openshift-monitoring/alertmanager-main-0" Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.937839 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e0db156e-b1cd-4b14-9ba1-8027e5516672-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"e0db156e-b1cd-4b14-9ba1-8027e5516672\") " pod="openshift-monitoring/alertmanager-main-0" Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.937865 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0db156e-b1cd-4b14-9ba1-8027e5516672-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"e0db156e-b1cd-4b14-9ba1-8027e5516672\") " pod="openshift-monitoring/alertmanager-main-0" Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.937897 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e0db156e-b1cd-4b14-9ba1-8027e5516672-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"e0db156e-b1cd-4b14-9ba1-8027e5516672\") " pod="openshift-monitoring/alertmanager-main-0" Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.937921 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e0db156e-b1cd-4b14-9ba1-8027e5516672-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"e0db156e-b1cd-4b14-9ba1-8027e5516672\") " pod="openshift-monitoring/alertmanager-main-0" Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.937940 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e0db156e-b1cd-4b14-9ba1-8027e5516672-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"e0db156e-b1cd-4b14-9ba1-8027e5516672\") " pod="openshift-monitoring/alertmanager-main-0" Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.937961 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e0db156e-b1cd-4b14-9ba1-8027e5516672-config-out\") pod \"alertmanager-main-0\" (UID: \"e0db156e-b1cd-4b14-9ba1-8027e5516672\") " pod="openshift-monitoring/alertmanager-main-0" Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.937975 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e0db156e-b1cd-4b14-9ba1-8027e5516672-tls-assets\") pod \"alertmanager-main-0\" (UID: \"e0db156e-b1cd-4b14-9ba1-8027e5516672\") " pod="openshift-monitoring/alertmanager-main-0" Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.938000 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9jwg\" (UniqueName: \"kubernetes.io/projected/e0db156e-b1cd-4b14-9ba1-8027e5516672-kube-api-access-k9jwg\") pod \"alertmanager-main-0\" (UID: \"e0db156e-b1cd-4b14-9ba1-8027e5516672\") " pod="openshift-monitoring/alertmanager-main-0" Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.938019 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e0db156e-b1cd-4b14-9ba1-8027e5516672-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"e0db156e-b1cd-4b14-9ba1-8027e5516672\") " pod="openshift-monitoring/alertmanager-main-0" Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.939226 4776 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e0db156e-b1cd-4b14-9ba1-8027e5516672-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"e0db156e-b1cd-4b14-9ba1-8027e5516672\") " pod="openshift-monitoring/alertmanager-main-0" Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.939887 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e0db156e-b1cd-4b14-9ba1-8027e5516672-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"e0db156e-b1cd-4b14-9ba1-8027e5516672\") " pod="openshift-monitoring/alertmanager-main-0" Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.940608 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0db156e-b1cd-4b14-9ba1-8027e5516672-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"e0db156e-b1cd-4b14-9ba1-8027e5516672\") " pod="openshift-monitoring/alertmanager-main-0" Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.942378 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e0db156e-b1cd-4b14-9ba1-8027e5516672-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"e0db156e-b1cd-4b14-9ba1-8027e5516672\") " pod="openshift-monitoring/alertmanager-main-0" Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.942665 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e0db156e-b1cd-4b14-9ba1-8027e5516672-config-volume\") pod \"alertmanager-main-0\" (UID: \"e0db156e-b1cd-4b14-9ba1-8027e5516672\") " pod="openshift-monitoring/alertmanager-main-0" Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.943061 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e0db156e-b1cd-4b14-9ba1-8027e5516672-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"e0db156e-b1cd-4b14-9ba1-8027e5516672\") " pod="openshift-monitoring/alertmanager-main-0" Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.947886 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e0db156e-b1cd-4b14-9ba1-8027e5516672-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"e0db156e-b1cd-4b14-9ba1-8027e5516672\") " pod="openshift-monitoring/alertmanager-main-0" Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.948092 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e0db156e-b1cd-4b14-9ba1-8027e5516672-config-out\") pod \"alertmanager-main-0\" (UID: \"e0db156e-b1cd-4b14-9ba1-8027e5516672\") " pod="openshift-monitoring/alertmanager-main-0" Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.949009 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e0db156e-b1cd-4b14-9ba1-8027e5516672-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"e0db156e-b1cd-4b14-9ba1-8027e5516672\") " pod="openshift-monitoring/alertmanager-main-0" Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.953837 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9jwg\" (UniqueName: \"kubernetes.io/projected/e0db156e-b1cd-4b14-9ba1-8027e5516672-kube-api-access-k9jwg\") pod \"alertmanager-main-0\" (UID: \"e0db156e-b1cd-4b14-9ba1-8027e5516672\") " pod="openshift-monitoring/alertmanager-main-0" Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.959558 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e0db156e-b1cd-4b14-9ba1-8027e5516672-tls-assets\") pod \"alertmanager-main-0\" (UID: \"e0db156e-b1cd-4b14-9ba1-8027e5516672\") " pod="openshift-monitoring/alertmanager-main-0" Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.960601 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e0db156e-b1cd-4b14-9ba1-8027e5516672-web-config\") pod \"alertmanager-main-0\" (UID: \"e0db156e-b1cd-4b14-9ba1-8027e5516672\") " pod="openshift-monitoring/alertmanager-main-0" Dec 08 09:06:35 crc kubenswrapper[4776]: I1208 09:06:35.996438 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Dec 08 09:06:36 crc kubenswrapper[4776]: I1208 09:06:36.226262 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-65pk7" event={"ID":"882f75c0-25ca-4dee-af83-c4bf5cf295e6","Type":"ContainerStarted","Data":"db0a987cd9bf30782ba5afd24f02a1d97c0ee1389ebbee511b62edb5293f3ce2"} Dec 08 09:06:36 crc kubenswrapper[4776]: I1208 09:06:36.449131 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Dec 08 09:06:36 crc kubenswrapper[4776]: I1208 09:06:36.522031 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-55b6bb7c7-lfptj"] Dec 08 09:06:36 crc kubenswrapper[4776]: I1208 09:06:36.523551 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-55b6bb7c7-lfptj" Dec 08 09:06:36 crc kubenswrapper[4776]: I1208 09:06:36.526343 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-dockercfg-vqbh5" Dec 08 09:06:36 crc kubenswrapper[4776]: I1208 09:06:36.526402 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Dec 08 09:06:36 crc kubenswrapper[4776]: I1208 09:06:36.526681 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-esma5dljt9vi9" Dec 08 09:06:36 crc kubenswrapper[4776]: I1208 09:06:36.527320 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics" Dec 08 09:06:36 crc kubenswrapper[4776]: I1208 09:06:36.527373 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web" Dec 08 09:06:36 crc kubenswrapper[4776]: I1208 09:06:36.528384 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Dec 08 09:06:36 crc kubenswrapper[4776]: I1208 09:06:36.528545 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Dec 08 09:06:36 crc kubenswrapper[4776]: I1208 09:06:36.533290 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-55b6bb7c7-lfptj"] Dec 08 09:06:36 crc kubenswrapper[4776]: W1208 09:06:36.620805 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0db156e_b1cd_4b14_9ba1_8027e5516672.slice/crio-deb9d4ecb2650dbd9b001d115af301e47ea6e861d6cb833c41bbeed073c8ba60 WatchSource:0}: Error finding container deb9d4ecb2650dbd9b001d115af301e47ea6e861d6cb833c41bbeed073c8ba60: Status 
404 returned error can't find the container with id deb9d4ecb2650dbd9b001d115af301e47ea6e861d6cb833c41bbeed073c8ba60 Dec 08 09:06:36 crc kubenswrapper[4776]: I1208 09:06:36.625890 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-5hvn2"] Dec 08 09:06:36 crc kubenswrapper[4776]: W1208 09:06:36.633316 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod281eee4e_e67e_4157_8de8_3738ca845b09.slice/crio-3c05a085a3b1aa9b1beb1d95e0c3b635573b2a4ab77e840f707a8149e0becf4c WatchSource:0}: Error finding container 3c05a085a3b1aa9b1beb1d95e0c3b635573b2a4ab77e840f707a8149e0becf4c: Status 404 returned error can't find the container with id 3c05a085a3b1aa9b1beb1d95e0c3b635573b2a4ab77e840f707a8149e0becf4c Dec 08 09:06:36 crc kubenswrapper[4776]: I1208 09:06:36.646201 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/b65ab5f4-b464-4d05-92da-5a274c3ac92d-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-55b6bb7c7-lfptj\" (UID: \"b65ab5f4-b464-4d05-92da-5a274c3ac92d\") " pod="openshift-monitoring/thanos-querier-55b6bb7c7-lfptj" Dec 08 09:06:36 crc kubenswrapper[4776]: I1208 09:06:36.646246 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9crd\" (UniqueName: \"kubernetes.io/projected/b65ab5f4-b464-4d05-92da-5a274c3ac92d-kube-api-access-h9crd\") pod \"thanos-querier-55b6bb7c7-lfptj\" (UID: \"b65ab5f4-b464-4d05-92da-5a274c3ac92d\") " pod="openshift-monitoring/thanos-querier-55b6bb7c7-lfptj" Dec 08 09:06:36 crc kubenswrapper[4776]: I1208 09:06:36.646272 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: 
\"kubernetes.io/secret/b65ab5f4-b464-4d05-92da-5a274c3ac92d-secret-thanos-querier-tls\") pod \"thanos-querier-55b6bb7c7-lfptj\" (UID: \"b65ab5f4-b464-4d05-92da-5a274c3ac92d\") " pod="openshift-monitoring/thanos-querier-55b6bb7c7-lfptj" Dec 08 09:06:36 crc kubenswrapper[4776]: I1208 09:06:36.646292 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b65ab5f4-b464-4d05-92da-5a274c3ac92d-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-55b6bb7c7-lfptj\" (UID: \"b65ab5f4-b464-4d05-92da-5a274c3ac92d\") " pod="openshift-monitoring/thanos-querier-55b6bb7c7-lfptj" Dec 08 09:06:36 crc kubenswrapper[4776]: I1208 09:06:36.646323 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/b65ab5f4-b464-4d05-92da-5a274c3ac92d-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-55b6bb7c7-lfptj\" (UID: \"b65ab5f4-b464-4d05-92da-5a274c3ac92d\") " pod="openshift-monitoring/thanos-querier-55b6bb7c7-lfptj" Dec 08 09:06:36 crc kubenswrapper[4776]: I1208 09:06:36.646420 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b65ab5f4-b464-4d05-92da-5a274c3ac92d-metrics-client-ca\") pod \"thanos-querier-55b6bb7c7-lfptj\" (UID: \"b65ab5f4-b464-4d05-92da-5a274c3ac92d\") " pod="openshift-monitoring/thanos-querier-55b6bb7c7-lfptj" Dec 08 09:06:36 crc kubenswrapper[4776]: I1208 09:06:36.646480 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b65ab5f4-b464-4d05-92da-5a274c3ac92d-secret-grpc-tls\") pod \"thanos-querier-55b6bb7c7-lfptj\" (UID: \"b65ab5f4-b464-4d05-92da-5a274c3ac92d\") " 
pod="openshift-monitoring/thanos-querier-55b6bb7c7-lfptj" Dec 08 09:06:36 crc kubenswrapper[4776]: I1208 09:06:36.646538 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b65ab5f4-b464-4d05-92da-5a274c3ac92d-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-55b6bb7c7-lfptj\" (UID: \"b65ab5f4-b464-4d05-92da-5a274c3ac92d\") " pod="openshift-monitoring/thanos-querier-55b6bb7c7-lfptj" Dec 08 09:06:36 crc kubenswrapper[4776]: I1208 09:06:36.747918 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/b65ab5f4-b464-4d05-92da-5a274c3ac92d-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-55b6bb7c7-lfptj\" (UID: \"b65ab5f4-b464-4d05-92da-5a274c3ac92d\") " pod="openshift-monitoring/thanos-querier-55b6bb7c7-lfptj" Dec 08 09:06:36 crc kubenswrapper[4776]: I1208 09:06:36.748912 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9crd\" (UniqueName: \"kubernetes.io/projected/b65ab5f4-b464-4d05-92da-5a274c3ac92d-kube-api-access-h9crd\") pod \"thanos-querier-55b6bb7c7-lfptj\" (UID: \"b65ab5f4-b464-4d05-92da-5a274c3ac92d\") " pod="openshift-monitoring/thanos-querier-55b6bb7c7-lfptj" Dec 08 09:06:36 crc kubenswrapper[4776]: I1208 09:06:36.748945 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/b65ab5f4-b464-4d05-92da-5a274c3ac92d-secret-thanos-querier-tls\") pod \"thanos-querier-55b6bb7c7-lfptj\" (UID: \"b65ab5f4-b464-4d05-92da-5a274c3ac92d\") " pod="openshift-monitoring/thanos-querier-55b6bb7c7-lfptj" Dec 08 09:06:36 crc kubenswrapper[4776]: I1208 09:06:36.748973 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b65ab5f4-b464-4d05-92da-5a274c3ac92d-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-55b6bb7c7-lfptj\" (UID: \"b65ab5f4-b464-4d05-92da-5a274c3ac92d\") " pod="openshift-monitoring/thanos-querier-55b6bb7c7-lfptj" Dec 08 09:06:36 crc kubenswrapper[4776]: I1208 09:06:36.749014 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/b65ab5f4-b464-4d05-92da-5a274c3ac92d-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-55b6bb7c7-lfptj\" (UID: \"b65ab5f4-b464-4d05-92da-5a274c3ac92d\") " pod="openshift-monitoring/thanos-querier-55b6bb7c7-lfptj" Dec 08 09:06:36 crc kubenswrapper[4776]: I1208 09:06:36.749077 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b65ab5f4-b464-4d05-92da-5a274c3ac92d-metrics-client-ca\") pod \"thanos-querier-55b6bb7c7-lfptj\" (UID: \"b65ab5f4-b464-4d05-92da-5a274c3ac92d\") " pod="openshift-monitoring/thanos-querier-55b6bb7c7-lfptj" Dec 08 09:06:36 crc kubenswrapper[4776]: I1208 09:06:36.749112 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b65ab5f4-b464-4d05-92da-5a274c3ac92d-secret-grpc-tls\") pod \"thanos-querier-55b6bb7c7-lfptj\" (UID: \"b65ab5f4-b464-4d05-92da-5a274c3ac92d\") " pod="openshift-monitoring/thanos-querier-55b6bb7c7-lfptj" Dec 08 09:06:36 crc kubenswrapper[4776]: I1208 09:06:36.749152 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b65ab5f4-b464-4d05-92da-5a274c3ac92d-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-55b6bb7c7-lfptj\" (UID: \"b65ab5f4-b464-4d05-92da-5a274c3ac92d\") " 
pod="openshift-monitoring/thanos-querier-55b6bb7c7-lfptj" Dec 08 09:06:36 crc kubenswrapper[4776]: I1208 09:06:36.750415 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b65ab5f4-b464-4d05-92da-5a274c3ac92d-metrics-client-ca\") pod \"thanos-querier-55b6bb7c7-lfptj\" (UID: \"b65ab5f4-b464-4d05-92da-5a274c3ac92d\") " pod="openshift-monitoring/thanos-querier-55b6bb7c7-lfptj" Dec 08 09:06:36 crc kubenswrapper[4776]: I1208 09:06:36.758048 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b65ab5f4-b464-4d05-92da-5a274c3ac92d-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-55b6bb7c7-lfptj\" (UID: \"b65ab5f4-b464-4d05-92da-5a274c3ac92d\") " pod="openshift-monitoring/thanos-querier-55b6bb7c7-lfptj" Dec 08 09:06:36 crc kubenswrapper[4776]: I1208 09:06:36.758140 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b65ab5f4-b464-4d05-92da-5a274c3ac92d-secret-grpc-tls\") pod \"thanos-querier-55b6bb7c7-lfptj\" (UID: \"b65ab5f4-b464-4d05-92da-5a274c3ac92d\") " pod="openshift-monitoring/thanos-querier-55b6bb7c7-lfptj" Dec 08 09:06:36 crc kubenswrapper[4776]: I1208 09:06:36.758433 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/b65ab5f4-b464-4d05-92da-5a274c3ac92d-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-55b6bb7c7-lfptj\" (UID: \"b65ab5f4-b464-4d05-92da-5a274c3ac92d\") " pod="openshift-monitoring/thanos-querier-55b6bb7c7-lfptj" Dec 08 09:06:36 crc kubenswrapper[4776]: I1208 09:06:36.758511 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: 
\"kubernetes.io/secret/b65ab5f4-b464-4d05-92da-5a274c3ac92d-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-55b6bb7c7-lfptj\" (UID: \"b65ab5f4-b464-4d05-92da-5a274c3ac92d\") " pod="openshift-monitoring/thanos-querier-55b6bb7c7-lfptj" Dec 08 09:06:36 crc kubenswrapper[4776]: I1208 09:06:36.758613 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/b65ab5f4-b464-4d05-92da-5a274c3ac92d-secret-thanos-querier-tls\") pod \"thanos-querier-55b6bb7c7-lfptj\" (UID: \"b65ab5f4-b464-4d05-92da-5a274c3ac92d\") " pod="openshift-monitoring/thanos-querier-55b6bb7c7-lfptj" Dec 08 09:06:36 crc kubenswrapper[4776]: I1208 09:06:36.758839 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b65ab5f4-b464-4d05-92da-5a274c3ac92d-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-55b6bb7c7-lfptj\" (UID: \"b65ab5f4-b464-4d05-92da-5a274c3ac92d\") " pod="openshift-monitoring/thanos-querier-55b6bb7c7-lfptj" Dec 08 09:06:36 crc kubenswrapper[4776]: I1208 09:06:36.764328 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9crd\" (UniqueName: \"kubernetes.io/projected/b65ab5f4-b464-4d05-92da-5a274c3ac92d-kube-api-access-h9crd\") pod \"thanos-querier-55b6bb7c7-lfptj\" (UID: \"b65ab5f4-b464-4d05-92da-5a274c3ac92d\") " pod="openshift-monitoring/thanos-querier-55b6bb7c7-lfptj" Dec 08 09:06:36 crc kubenswrapper[4776]: I1208 09:06:36.839512 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-55b6bb7c7-lfptj" Dec 08 09:06:37 crc kubenswrapper[4776]: I1208 09:06:37.235993 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e0db156e-b1cd-4b14-9ba1-8027e5516672","Type":"ContainerStarted","Data":"deb9d4ecb2650dbd9b001d115af301e47ea6e861d6cb833c41bbeed073c8ba60"} Dec 08 09:06:37 crc kubenswrapper[4776]: I1208 09:06:37.237717 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-5hvn2" event={"ID":"281eee4e-e67e-4157-8de8-3738ca845b09","Type":"ContainerStarted","Data":"0ac047aeb1761c987b01be434489e91ccb209eed3a36ed7a80af6841d6d3436e"} Dec 08 09:06:37 crc kubenswrapper[4776]: I1208 09:06:37.237741 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-5hvn2" event={"ID":"281eee4e-e67e-4157-8de8-3738ca845b09","Type":"ContainerStarted","Data":"cb360174c1c4e442d5200fce00e663afe3a1fc340409aaee9e7005a27918c4cb"} Dec 08 09:06:37 crc kubenswrapper[4776]: I1208 09:06:37.237772 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-5hvn2" event={"ID":"281eee4e-e67e-4157-8de8-3738ca845b09","Type":"ContainerStarted","Data":"3c05a085a3b1aa9b1beb1d95e0c3b635573b2a4ab77e840f707a8149e0becf4c"} Dec 08 09:06:37 crc kubenswrapper[4776]: I1208 09:06:37.893965 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-55b6bb7c7-lfptj"] Dec 08 09:06:38 crc kubenswrapper[4776]: I1208 09:06:38.244103 4776 generic.go:334] "Generic (PLEG): container finished" podID="50f69926-f0e6-404e-b2ac-e95320410132" containerID="2d4fc064b6977fde5c3db0d9448a8c447d85530cea1e6f36c519c7a66b2ca85a" exitCode=0 Dec 08 09:06:38 crc kubenswrapper[4776]: I1208 09:06:38.244192 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/node-exporter-mpf6m" event={"ID":"50f69926-f0e6-404e-b2ac-e95320410132","Type":"ContainerDied","Data":"2d4fc064b6977fde5c3db0d9448a8c447d85530cea1e6f36c519c7a66b2ca85a"} Dec 08 09:06:38 crc kubenswrapper[4776]: I1208 09:06:38.245548 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-55b6bb7c7-lfptj" event={"ID":"b65ab5f4-b464-4d05-92da-5a274c3ac92d","Type":"ContainerStarted","Data":"8a40fb2196bac3e4f34bdb96bcb471e469157516b846af3ba1c6d871c86bf2f1"} Dec 08 09:06:39 crc kubenswrapper[4776]: E1208 09:06:39.108680 4776 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0db156e_b1cd_4b14_9ba1_8027e5516672.slice/crio-conmon-53b28f58f58bbb0195f1d80583856544a0b7fc31c087b545c7b46d1dcd699e38.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0db156e_b1cd_4b14_9ba1_8027e5516672.slice/crio-53b28f58f58bbb0195f1d80583856544a0b7fc31c087b545c7b46d1dcd699e38.scope\": RecentStats: unable to find data in memory cache]" Dec 08 09:06:39 crc kubenswrapper[4776]: I1208 09:06:39.254706 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-5hvn2" event={"ID":"281eee4e-e67e-4157-8de8-3738ca845b09","Type":"ContainerStarted","Data":"e6930215e6eb8bc5bf78ef0d0ffede206a92bb005ccc8447bbd0c3e0ae872fd6"} Dec 08 09:06:39 crc kubenswrapper[4776]: I1208 09:06:39.257380 4776 generic.go:334] "Generic (PLEG): container finished" podID="e0db156e-b1cd-4b14-9ba1-8027e5516672" containerID="53b28f58f58bbb0195f1d80583856544a0b7fc31c087b545c7b46d1dcd699e38" exitCode=0 Dec 08 09:06:39 crc kubenswrapper[4776]: I1208 09:06:39.257446 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"e0db156e-b1cd-4b14-9ba1-8027e5516672","Type":"ContainerDied","Data":"53b28f58f58bbb0195f1d80583856544a0b7fc31c087b545c7b46d1dcd699e38"} Dec 08 09:06:39 crc kubenswrapper[4776]: I1208 09:06:39.261518 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-65pk7" event={"ID":"882f75c0-25ca-4dee-af83-c4bf5cf295e6","Type":"ContainerStarted","Data":"8049e79546107dcf8d799a6ce4ad7c644118ecdb8c5aa8d829027270f6da4df1"} Dec 08 09:06:39 crc kubenswrapper[4776]: I1208 09:06:39.261562 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-65pk7" event={"ID":"882f75c0-25ca-4dee-af83-c4bf5cf295e6","Type":"ContainerStarted","Data":"d9eab293618c9dece7f8b58afedd936004a3c3252459a3d50f8a2aa584d2747d"} Dec 08 09:06:39 crc kubenswrapper[4776]: I1208 09:06:39.264966 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-mpf6m" event={"ID":"50f69926-f0e6-404e-b2ac-e95320410132","Type":"ContainerStarted","Data":"7ce916de6b03345c993df643beace5e310a09b8bba5e4204850129eb24d4e255"} Dec 08 09:06:39 crc kubenswrapper[4776]: I1208 09:06:39.265009 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-mpf6m" event={"ID":"50f69926-f0e6-404e-b2ac-e95320410132","Type":"ContainerStarted","Data":"f577703d2966b3201b9cbb001206f38e2fb68ce513753e5c5f13ab9d2af0af30"} Dec 08 09:06:39 crc kubenswrapper[4776]: I1208 09:06:39.274415 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-566fddb674-5hvn2" podStartSLOduration=3.3944846269999998 podStartE2EDuration="5.274393588s" podCreationTimestamp="2025-12-08 09:06:34 +0000 UTC" firstStartedPulling="2025-12-08 09:06:36.90522712 +0000 UTC m=+473.168452142" lastFinishedPulling="2025-12-08 09:06:38.785136071 +0000 UTC m=+475.048361103" observedRunningTime="2025-12-08 09:06:39.270989535 +0000 UTC 
m=+475.534214557" watchObservedRunningTime="2025-12-08 09:06:39.274393588 +0000 UTC m=+475.537618610" Dec 08 09:06:39 crc kubenswrapper[4776]: I1208 09:06:39.287754 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-mpf6m" podStartSLOduration=2.878227202 podStartE2EDuration="5.287740703s" podCreationTimestamp="2025-12-08 09:06:34 +0000 UTC" firstStartedPulling="2025-12-08 09:06:34.950839982 +0000 UTC m=+471.214065004" lastFinishedPulling="2025-12-08 09:06:37.360353483 +0000 UTC m=+473.623578505" observedRunningTime="2025-12-08 09:06:39.286210142 +0000 UTC m=+475.549435164" watchObservedRunningTime="2025-12-08 09:06:39.287740703 +0000 UTC m=+475.550965725" Dec 08 09:06:39 crc kubenswrapper[4776]: I1208 09:06:39.414844 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5c6d8d7477-vqqb2"] Dec 08 09:06:39 crc kubenswrapper[4776]: I1208 09:06:39.421072 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5c6d8d7477-vqqb2" Dec 08 09:06:39 crc kubenswrapper[4776]: I1208 09:06:39.450066 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c6d8d7477-vqqb2"] Dec 08 09:06:39 crc kubenswrapper[4776]: I1208 09:06:39.585978 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8e6675d3-9087-4401-a973-b3ffe5856e2f-oauth-serving-cert\") pod \"console-5c6d8d7477-vqqb2\" (UID: \"8e6675d3-9087-4401-a973-b3ffe5856e2f\") " pod="openshift-console/console-5c6d8d7477-vqqb2" Dec 08 09:06:39 crc kubenswrapper[4776]: I1208 09:06:39.586028 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9lg5\" (UniqueName: \"kubernetes.io/projected/8e6675d3-9087-4401-a973-b3ffe5856e2f-kube-api-access-j9lg5\") pod \"console-5c6d8d7477-vqqb2\" (UID: \"8e6675d3-9087-4401-a973-b3ffe5856e2f\") " pod="openshift-console/console-5c6d8d7477-vqqb2" Dec 08 09:06:39 crc kubenswrapper[4776]: I1208 09:06:39.586073 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8e6675d3-9087-4401-a973-b3ffe5856e2f-console-serving-cert\") pod \"console-5c6d8d7477-vqqb2\" (UID: \"8e6675d3-9087-4401-a973-b3ffe5856e2f\") " pod="openshift-console/console-5c6d8d7477-vqqb2" Dec 08 09:06:39 crc kubenswrapper[4776]: I1208 09:06:39.586146 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8e6675d3-9087-4401-a973-b3ffe5856e2f-service-ca\") pod \"console-5c6d8d7477-vqqb2\" (UID: \"8e6675d3-9087-4401-a973-b3ffe5856e2f\") " pod="openshift-console/console-5c6d8d7477-vqqb2" Dec 08 09:06:39 crc kubenswrapper[4776]: I1208 09:06:39.586274 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e6675d3-9087-4401-a973-b3ffe5856e2f-trusted-ca-bundle\") pod \"console-5c6d8d7477-vqqb2\" (UID: \"8e6675d3-9087-4401-a973-b3ffe5856e2f\") " pod="openshift-console/console-5c6d8d7477-vqqb2" Dec 08 09:06:39 crc kubenswrapper[4776]: I1208 09:06:39.586315 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8e6675d3-9087-4401-a973-b3ffe5856e2f-console-config\") pod \"console-5c6d8d7477-vqqb2\" (UID: \"8e6675d3-9087-4401-a973-b3ffe5856e2f\") " pod="openshift-console/console-5c6d8d7477-vqqb2" Dec 08 09:06:39 crc kubenswrapper[4776]: I1208 09:06:39.586335 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8e6675d3-9087-4401-a973-b3ffe5856e2f-console-oauth-config\") pod \"console-5c6d8d7477-vqqb2\" (UID: \"8e6675d3-9087-4401-a973-b3ffe5856e2f\") " pod="openshift-console/console-5c6d8d7477-vqqb2" Dec 08 09:06:39 crc kubenswrapper[4776]: I1208 09:06:39.687683 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8e6675d3-9087-4401-a973-b3ffe5856e2f-console-serving-cert\") pod \"console-5c6d8d7477-vqqb2\" (UID: \"8e6675d3-9087-4401-a973-b3ffe5856e2f\") " pod="openshift-console/console-5c6d8d7477-vqqb2" Dec 08 09:06:39 crc kubenswrapper[4776]: I1208 09:06:39.687743 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8e6675d3-9087-4401-a973-b3ffe5856e2f-service-ca\") pod \"console-5c6d8d7477-vqqb2\" (UID: \"8e6675d3-9087-4401-a973-b3ffe5856e2f\") " pod="openshift-console/console-5c6d8d7477-vqqb2" Dec 08 09:06:39 crc kubenswrapper[4776]: I1208 09:06:39.687770 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e6675d3-9087-4401-a973-b3ffe5856e2f-trusted-ca-bundle\") pod \"console-5c6d8d7477-vqqb2\" (UID: \"8e6675d3-9087-4401-a973-b3ffe5856e2f\") " pod="openshift-console/console-5c6d8d7477-vqqb2" Dec 08 09:06:39 crc kubenswrapper[4776]: I1208 09:06:39.687800 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8e6675d3-9087-4401-a973-b3ffe5856e2f-console-config\") pod \"console-5c6d8d7477-vqqb2\" (UID: \"8e6675d3-9087-4401-a973-b3ffe5856e2f\") " pod="openshift-console/console-5c6d8d7477-vqqb2" Dec 08 09:06:39 crc kubenswrapper[4776]: I1208 09:06:39.687820 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8e6675d3-9087-4401-a973-b3ffe5856e2f-console-oauth-config\") pod \"console-5c6d8d7477-vqqb2\" (UID: \"8e6675d3-9087-4401-a973-b3ffe5856e2f\") " pod="openshift-console/console-5c6d8d7477-vqqb2" Dec 08 09:06:39 crc kubenswrapper[4776]: I1208 09:06:39.687864 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8e6675d3-9087-4401-a973-b3ffe5856e2f-oauth-serving-cert\") pod \"console-5c6d8d7477-vqqb2\" (UID: \"8e6675d3-9087-4401-a973-b3ffe5856e2f\") " pod="openshift-console/console-5c6d8d7477-vqqb2" Dec 08 09:06:39 crc kubenswrapper[4776]: I1208 09:06:39.687881 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9lg5\" (UniqueName: \"kubernetes.io/projected/8e6675d3-9087-4401-a973-b3ffe5856e2f-kube-api-access-j9lg5\") pod \"console-5c6d8d7477-vqqb2\" (UID: \"8e6675d3-9087-4401-a973-b3ffe5856e2f\") " pod="openshift-console/console-5c6d8d7477-vqqb2" Dec 08 09:06:39 crc kubenswrapper[4776]: I1208 09:06:39.688731 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8e6675d3-9087-4401-a973-b3ffe5856e2f-console-config\") pod \"console-5c6d8d7477-vqqb2\" (UID: \"8e6675d3-9087-4401-a973-b3ffe5856e2f\") " pod="openshift-console/console-5c6d8d7477-vqqb2" Dec 08 09:06:39 crc kubenswrapper[4776]: I1208 09:06:39.688789 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8e6675d3-9087-4401-a973-b3ffe5856e2f-service-ca\") pod \"console-5c6d8d7477-vqqb2\" (UID: \"8e6675d3-9087-4401-a973-b3ffe5856e2f\") " pod="openshift-console/console-5c6d8d7477-vqqb2" Dec 08 09:06:39 crc kubenswrapper[4776]: I1208 09:06:39.688828 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e6675d3-9087-4401-a973-b3ffe5856e2f-trusted-ca-bundle\") pod \"console-5c6d8d7477-vqqb2\" (UID: \"8e6675d3-9087-4401-a973-b3ffe5856e2f\") " pod="openshift-console/console-5c6d8d7477-vqqb2" Dec 08 09:06:39 crc kubenswrapper[4776]: I1208 09:06:39.689350 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8e6675d3-9087-4401-a973-b3ffe5856e2f-oauth-serving-cert\") pod \"console-5c6d8d7477-vqqb2\" (UID: \"8e6675d3-9087-4401-a973-b3ffe5856e2f\") " pod="openshift-console/console-5c6d8d7477-vqqb2" Dec 08 09:06:39 crc kubenswrapper[4776]: I1208 09:06:39.693692 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8e6675d3-9087-4401-a973-b3ffe5856e2f-console-serving-cert\") pod \"console-5c6d8d7477-vqqb2\" (UID: \"8e6675d3-9087-4401-a973-b3ffe5856e2f\") " pod="openshift-console/console-5c6d8d7477-vqqb2" Dec 08 09:06:39 crc kubenswrapper[4776]: I1208 09:06:39.699748 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8e6675d3-9087-4401-a973-b3ffe5856e2f-console-oauth-config\") pod \"console-5c6d8d7477-vqqb2\" (UID: \"8e6675d3-9087-4401-a973-b3ffe5856e2f\") " pod="openshift-console/console-5c6d8d7477-vqqb2" Dec 08 09:06:39 crc kubenswrapper[4776]: I1208 09:06:39.704990 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9lg5\" (UniqueName: \"kubernetes.io/projected/8e6675d3-9087-4401-a973-b3ffe5856e2f-kube-api-access-j9lg5\") pod \"console-5c6d8d7477-vqqb2\" (UID: \"8e6675d3-9087-4401-a973-b3ffe5856e2f\") " pod="openshift-console/console-5c6d8d7477-vqqb2" Dec 08 09:06:39 crc kubenswrapper[4776]: I1208 09:06:39.751732 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5c6d8d7477-vqqb2" Dec 08 09:06:39 crc kubenswrapper[4776]: I1208 09:06:39.858023 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-848cc8f989-mlg49"] Dec 08 09:06:39 crc kubenswrapper[4776]: I1208 09:06:39.858731 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-848cc8f989-mlg49" Dec 08 09:06:39 crc kubenswrapper[4776]: I1208 09:06:39.862596 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Dec 08 09:06:39 crc kubenswrapper[4776]: I1208 09:06:39.863764 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-5zq5w" Dec 08 09:06:39 crc kubenswrapper[4776]: I1208 09:06:39.863925 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Dec 08 09:06:39 crc kubenswrapper[4776]: I1208 09:06:39.864819 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Dec 08 09:06:39 crc kubenswrapper[4776]: I1208 09:06:39.864962 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Dec 08 09:06:39 crc kubenswrapper[4776]: I1208 09:06:39.865107 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dpnb41eml4ill" Dec 08 09:06:39 crc kubenswrapper[4776]: I1208 09:06:39.869739 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-848cc8f989-mlg49"] Dec 08 09:06:39 crc kubenswrapper[4776]: I1208 09:06:39.992040 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d2a2eeef-31f7-4104-972c-72db7eb01755-secret-metrics-client-certs\") pod \"metrics-server-848cc8f989-mlg49\" (UID: \"d2a2eeef-31f7-4104-972c-72db7eb01755\") " pod="openshift-monitoring/metrics-server-848cc8f989-mlg49" Dec 08 09:06:39 crc kubenswrapper[4776]: I1208 09:06:39.992099 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" 
(UniqueName: \"kubernetes.io/secret/d2a2eeef-31f7-4104-972c-72db7eb01755-secret-metrics-server-tls\") pod \"metrics-server-848cc8f989-mlg49\" (UID: \"d2a2eeef-31f7-4104-972c-72db7eb01755\") " pod="openshift-monitoring/metrics-server-848cc8f989-mlg49" Dec 08 09:06:39 crc kubenswrapper[4776]: I1208 09:06:39.992131 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/d2a2eeef-31f7-4104-972c-72db7eb01755-audit-log\") pod \"metrics-server-848cc8f989-mlg49\" (UID: \"d2a2eeef-31f7-4104-972c-72db7eb01755\") " pod="openshift-monitoring/metrics-server-848cc8f989-mlg49" Dec 08 09:06:39 crc kubenswrapper[4776]: I1208 09:06:39.992354 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/d2a2eeef-31f7-4104-972c-72db7eb01755-metrics-server-audit-profiles\") pod \"metrics-server-848cc8f989-mlg49\" (UID: \"d2a2eeef-31f7-4104-972c-72db7eb01755\") " pod="openshift-monitoring/metrics-server-848cc8f989-mlg49" Dec 08 09:06:39 crc kubenswrapper[4776]: I1208 09:06:39.992429 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2a2eeef-31f7-4104-972c-72db7eb01755-client-ca-bundle\") pod \"metrics-server-848cc8f989-mlg49\" (UID: \"d2a2eeef-31f7-4104-972c-72db7eb01755\") " pod="openshift-monitoring/metrics-server-848cc8f989-mlg49" Dec 08 09:06:39 crc kubenswrapper[4776]: I1208 09:06:39.992465 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2a2eeef-31f7-4104-972c-72db7eb01755-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-848cc8f989-mlg49\" (UID: \"d2a2eeef-31f7-4104-972c-72db7eb01755\") " 
pod="openshift-monitoring/metrics-server-848cc8f989-mlg49" Dec 08 09:06:39 crc kubenswrapper[4776]: I1208 09:06:39.992559 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wwwz\" (UniqueName: \"kubernetes.io/projected/d2a2eeef-31f7-4104-972c-72db7eb01755-kube-api-access-8wwwz\") pod \"metrics-server-848cc8f989-mlg49\" (UID: \"d2a2eeef-31f7-4104-972c-72db7eb01755\") " pod="openshift-monitoring/metrics-server-848cc8f989-mlg49" Dec 08 09:06:40 crc kubenswrapper[4776]: I1208 09:06:40.093682 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d2a2eeef-31f7-4104-972c-72db7eb01755-secret-metrics-client-certs\") pod \"metrics-server-848cc8f989-mlg49\" (UID: \"d2a2eeef-31f7-4104-972c-72db7eb01755\") " pod="openshift-monitoring/metrics-server-848cc8f989-mlg49" Dec 08 09:06:40 crc kubenswrapper[4776]: I1208 09:06:40.093763 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/d2a2eeef-31f7-4104-972c-72db7eb01755-secret-metrics-server-tls\") pod \"metrics-server-848cc8f989-mlg49\" (UID: \"d2a2eeef-31f7-4104-972c-72db7eb01755\") " pod="openshift-monitoring/metrics-server-848cc8f989-mlg49" Dec 08 09:06:40 crc kubenswrapper[4776]: I1208 09:06:40.093814 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/d2a2eeef-31f7-4104-972c-72db7eb01755-audit-log\") pod \"metrics-server-848cc8f989-mlg49\" (UID: \"d2a2eeef-31f7-4104-972c-72db7eb01755\") " pod="openshift-monitoring/metrics-server-848cc8f989-mlg49" Dec 08 09:06:40 crc kubenswrapper[4776]: I1208 09:06:40.093873 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: 
\"kubernetes.io/configmap/d2a2eeef-31f7-4104-972c-72db7eb01755-metrics-server-audit-profiles\") pod \"metrics-server-848cc8f989-mlg49\" (UID: \"d2a2eeef-31f7-4104-972c-72db7eb01755\") " pod="openshift-monitoring/metrics-server-848cc8f989-mlg49" Dec 08 09:06:40 crc kubenswrapper[4776]: I1208 09:06:40.093901 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2a2eeef-31f7-4104-972c-72db7eb01755-client-ca-bundle\") pod \"metrics-server-848cc8f989-mlg49\" (UID: \"d2a2eeef-31f7-4104-972c-72db7eb01755\") " pod="openshift-monitoring/metrics-server-848cc8f989-mlg49" Dec 08 09:06:40 crc kubenswrapper[4776]: I1208 09:06:40.093930 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2a2eeef-31f7-4104-972c-72db7eb01755-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-848cc8f989-mlg49\" (UID: \"d2a2eeef-31f7-4104-972c-72db7eb01755\") " pod="openshift-monitoring/metrics-server-848cc8f989-mlg49" Dec 08 09:06:40 crc kubenswrapper[4776]: I1208 09:06:40.093964 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wwwz\" (UniqueName: \"kubernetes.io/projected/d2a2eeef-31f7-4104-972c-72db7eb01755-kube-api-access-8wwwz\") pod \"metrics-server-848cc8f989-mlg49\" (UID: \"d2a2eeef-31f7-4104-972c-72db7eb01755\") " pod="openshift-monitoring/metrics-server-848cc8f989-mlg49" Dec 08 09:06:40 crc kubenswrapper[4776]: I1208 09:06:40.095798 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/d2a2eeef-31f7-4104-972c-72db7eb01755-audit-log\") pod \"metrics-server-848cc8f989-mlg49\" (UID: \"d2a2eeef-31f7-4104-972c-72db7eb01755\") " pod="openshift-monitoring/metrics-server-848cc8f989-mlg49" Dec 08 09:06:40 crc kubenswrapper[4776]: I1208 09:06:40.096337 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2a2eeef-31f7-4104-972c-72db7eb01755-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-848cc8f989-mlg49\" (UID: \"d2a2eeef-31f7-4104-972c-72db7eb01755\") " pod="openshift-monitoring/metrics-server-848cc8f989-mlg49" Dec 08 09:06:40 crc kubenswrapper[4776]: I1208 09:06:40.096876 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/d2a2eeef-31f7-4104-972c-72db7eb01755-metrics-server-audit-profiles\") pod \"metrics-server-848cc8f989-mlg49\" (UID: \"d2a2eeef-31f7-4104-972c-72db7eb01755\") " pod="openshift-monitoring/metrics-server-848cc8f989-mlg49" Dec 08 09:06:40 crc kubenswrapper[4776]: I1208 09:06:40.099818 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/d2a2eeef-31f7-4104-972c-72db7eb01755-secret-metrics-server-tls\") pod \"metrics-server-848cc8f989-mlg49\" (UID: \"d2a2eeef-31f7-4104-972c-72db7eb01755\") " pod="openshift-monitoring/metrics-server-848cc8f989-mlg49" Dec 08 09:06:40 crc kubenswrapper[4776]: I1208 09:06:40.100702 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2a2eeef-31f7-4104-972c-72db7eb01755-client-ca-bundle\") pod \"metrics-server-848cc8f989-mlg49\" (UID: \"d2a2eeef-31f7-4104-972c-72db7eb01755\") " pod="openshift-monitoring/metrics-server-848cc8f989-mlg49" Dec 08 09:06:40 crc kubenswrapper[4776]: I1208 09:06:40.107404 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d2a2eeef-31f7-4104-972c-72db7eb01755-secret-metrics-client-certs\") pod \"metrics-server-848cc8f989-mlg49\" (UID: \"d2a2eeef-31f7-4104-972c-72db7eb01755\") " 
pod="openshift-monitoring/metrics-server-848cc8f989-mlg49" Dec 08 09:06:40 crc kubenswrapper[4776]: I1208 09:06:40.111385 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wwwz\" (UniqueName: \"kubernetes.io/projected/d2a2eeef-31f7-4104-972c-72db7eb01755-kube-api-access-8wwwz\") pod \"metrics-server-848cc8f989-mlg49\" (UID: \"d2a2eeef-31f7-4104-972c-72db7eb01755\") " pod="openshift-monitoring/metrics-server-848cc8f989-mlg49" Dec 08 09:06:40 crc kubenswrapper[4776]: I1208 09:06:40.165543 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c6d8d7477-vqqb2"] Dec 08 09:06:40 crc kubenswrapper[4776]: I1208 09:06:40.187296 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-848cc8f989-mlg49" Dec 08 09:06:40 crc kubenswrapper[4776]: I1208 09:06:40.306462 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-65pk7" event={"ID":"882f75c0-25ca-4dee-af83-c4bf5cf295e6","Type":"ContainerStarted","Data":"e1e9d150f30b3fca770c7970737957558dd783e54546623cd4ecdaa738f6b2b6"} Dec 08 09:06:40 crc kubenswrapper[4776]: I1208 09:06:40.308275 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c6d8d7477-vqqb2" event={"ID":"8e6675d3-9087-4401-a973-b3ffe5856e2f","Type":"ContainerStarted","Data":"81eddc01695133c7d156ce94ef0c4c6fe11abd8cd35fbbadbb45261452870eb7"} Dec 08 09:06:40 crc kubenswrapper[4776]: I1208 09:06:40.331764 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-65pk7" podStartSLOduration=2.905931269 podStartE2EDuration="6.331745959s" podCreationTimestamp="2025-12-08 09:06:34 +0000 UTC" firstStartedPulling="2025-12-08 09:06:35.359240429 +0000 UTC m=+471.622465451" lastFinishedPulling="2025-12-08 09:06:38.785055119 +0000 UTC m=+475.048280141" observedRunningTime="2025-12-08 
09:06:40.329682622 +0000 UTC m=+476.592907654" watchObservedRunningTime="2025-12-08 09:06:40.331745959 +0000 UTC m=+476.594970981" Dec 08 09:06:40 crc kubenswrapper[4776]: I1208 09:06:40.371783 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-787d456dd8-9svrh"] Dec 08 09:06:40 crc kubenswrapper[4776]: I1208 09:06:40.373110 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-787d456dd8-9svrh" Dec 08 09:06:40 crc kubenswrapper[4776]: I1208 09:06:40.377206 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-6tstp" Dec 08 09:06:40 crc kubenswrapper[4776]: I1208 09:06:40.377340 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert" Dec 08 09:06:40 crc kubenswrapper[4776]: I1208 09:06:40.381061 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-787d456dd8-9svrh"] Dec 08 09:06:40 crc kubenswrapper[4776]: I1208 09:06:40.504074 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d10744d9-5819-4ca3-815c-5a8782037204-monitoring-plugin-cert\") pod \"monitoring-plugin-787d456dd8-9svrh\" (UID: \"d10744d9-5819-4ca3-815c-5a8782037204\") " pod="openshift-monitoring/monitoring-plugin-787d456dd8-9svrh" Dec 08 09:06:40 crc kubenswrapper[4776]: I1208 09:06:40.606254 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d10744d9-5819-4ca3-815c-5a8782037204-monitoring-plugin-cert\") pod \"monitoring-plugin-787d456dd8-9svrh\" (UID: \"d10744d9-5819-4ca3-815c-5a8782037204\") " pod="openshift-monitoring/monitoring-plugin-787d456dd8-9svrh" Dec 08 09:06:40 crc kubenswrapper[4776]: I1208 09:06:40.611511 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d10744d9-5819-4ca3-815c-5a8782037204-monitoring-plugin-cert\") pod \"monitoring-plugin-787d456dd8-9svrh\" (UID: \"d10744d9-5819-4ca3-815c-5a8782037204\") " pod="openshift-monitoring/monitoring-plugin-787d456dd8-9svrh" Dec 08 09:06:40 crc kubenswrapper[4776]: I1208 09:06:40.620970 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-848cc8f989-mlg49"] Dec 08 09:06:40 crc kubenswrapper[4776]: W1208 09:06:40.627793 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2a2eeef_31f7_4104_972c_72db7eb01755.slice/crio-7ff1de7041dfc90877b2f639c3d9423f37ee73bf3380e1dfbb1730760eb68e6a WatchSource:0}: Error finding container 7ff1de7041dfc90877b2f639c3d9423f37ee73bf3380e1dfbb1730760eb68e6a: Status 404 returned error can't find the container with id 7ff1de7041dfc90877b2f639c3d9423f37ee73bf3380e1dfbb1730760eb68e6a Dec 08 09:06:40 crc kubenswrapper[4776]: I1208 09:06:40.697192 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-787d456dd8-9svrh" Dec 08 09:06:40 crc kubenswrapper[4776]: I1208 09:06:40.898975 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Dec 08 09:06:40 crc kubenswrapper[4776]: I1208 09:06:40.900820 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:40 crc kubenswrapper[4776]: I1208 09:06:40.923557 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0" Dec 08 09:06:40 crc kubenswrapper[4776]: I1208 09:06:40.923716 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy" Dec 08 09:06:40 crc kubenswrapper[4776]: I1208 09:06:40.923863 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web" Dec 08 09:06:40 crc kubenswrapper[4776]: I1208 09:06:40.924320 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config" Dec 08 09:06:40 crc kubenswrapper[4776]: I1208 09:06:40.924439 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file" Dec 08 09:06:40 crc kubenswrapper[4776]: I1208 09:06:40.926441 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls" Dec 08 09:06:40 crc kubenswrapper[4776]: I1208 09:06:40.926675 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s" Dec 08 09:06:40 crc kubenswrapper[4776]: I1208 09:06:40.927454 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle" Dec 08 09:06:40 crc kubenswrapper[4776]: I1208 09:06:40.927509 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-dockercfg-79v76" Dec 08 09:06:40 crc kubenswrapper[4776]: I1208 09:06:40.927466 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle" Dec 08 09:06:40 crc kubenswrapper[4776]: I1208 09:06:40.927694 4776 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls" Dec 08 09:06:40 crc kubenswrapper[4776]: I1208 09:06:40.931623 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-1oerr6ke53k8t" Dec 08 09:06:40 crc kubenswrapper[4776]: I1208 09:06:40.932235 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Dec 08 09:06:40 crc kubenswrapper[4776]: I1208 09:06:40.933037 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0" Dec 08 09:06:40 crc kubenswrapper[4776]: I1208 09:06:40.984489 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-787d456dd8-9svrh"] Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.011213 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/50fef50b-6cb5-4316-b137-1cb6f462778f-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.011261 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50fef50b-6cb5-4316-b137-1cb6f462778f-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.011295 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/50fef50b-6cb5-4316-b137-1cb6f462778f-config\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " 
pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.011313 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/50fef50b-6cb5-4316-b137-1cb6f462778f-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.011336 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/50fef50b-6cb5-4316-b137-1cb6f462778f-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.011440 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/50fef50b-6cb5-4316-b137-1cb6f462778f-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.011489 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/50fef50b-6cb5-4316-b137-1cb6f462778f-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.011508 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wr9x\" (UniqueName: 
\"kubernetes.io/projected/50fef50b-6cb5-4316-b137-1cb6f462778f-kube-api-access-5wr9x\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.011570 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/50fef50b-6cb5-4316-b137-1cb6f462778f-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.011594 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50fef50b-6cb5-4316-b137-1cb6f462778f-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.011615 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/50fef50b-6cb5-4316-b137-1cb6f462778f-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.011652 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/50fef50b-6cb5-4316-b137-1cb6f462778f-config-out\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.011684 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/50fef50b-6cb5-4316-b137-1cb6f462778f-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.011860 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/50fef50b-6cb5-4316-b137-1cb6f462778f-web-config\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.011906 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50fef50b-6cb5-4316-b137-1cb6f462778f-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.011964 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/50fef50b-6cb5-4316-b137-1cb6f462778f-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.012014 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/50fef50b-6cb5-4316-b137-1cb6f462778f-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.012056 
4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/50fef50b-6cb5-4316-b137-1cb6f462778f-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.114945 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/50fef50b-6cb5-4316-b137-1cb6f462778f-config-out\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.115008 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/50fef50b-6cb5-4316-b137-1cb6f462778f-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.115041 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/50fef50b-6cb5-4316-b137-1cb6f462778f-web-config\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.115076 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50fef50b-6cb5-4316-b137-1cb6f462778f-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.115104 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/50fef50b-6cb5-4316-b137-1cb6f462778f-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.115135 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/50fef50b-6cb5-4316-b137-1cb6f462778f-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.115156 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/50fef50b-6cb5-4316-b137-1cb6f462778f-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.115231 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/50fef50b-6cb5-4316-b137-1cb6f462778f-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.115258 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50fef50b-6cb5-4316-b137-1cb6f462778f-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.115294 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/secret/50fef50b-6cb5-4316-b137-1cb6f462778f-config\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.115338 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/50fef50b-6cb5-4316-b137-1cb6f462778f-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.115366 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/50fef50b-6cb5-4316-b137-1cb6f462778f-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.115389 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/50fef50b-6cb5-4316-b137-1cb6f462778f-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.115417 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/50fef50b-6cb5-4316-b137-1cb6f462778f-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.115438 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wr9x\" 
(UniqueName: \"kubernetes.io/projected/50fef50b-6cb5-4316-b137-1cb6f462778f-kube-api-access-5wr9x\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.115473 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/50fef50b-6cb5-4316-b137-1cb6f462778f-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.115498 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50fef50b-6cb5-4316-b137-1cb6f462778f-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.115524 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/50fef50b-6cb5-4316-b137-1cb6f462778f-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.117567 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50fef50b-6cb5-4316-b137-1cb6f462778f-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.118734 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50fef50b-6cb5-4316-b137-1cb6f462778f-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.119308 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/50fef50b-6cb5-4316-b137-1cb6f462778f-config-out\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.122297 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/50fef50b-6cb5-4316-b137-1cb6f462778f-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.125089 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/50fef50b-6cb5-4316-b137-1cb6f462778f-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.125571 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/50fef50b-6cb5-4316-b137-1cb6f462778f-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.125631 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: 
\"kubernetes.io/secret/50fef50b-6cb5-4316-b137-1cb6f462778f-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.125852 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/50fef50b-6cb5-4316-b137-1cb6f462778f-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.126464 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/50fef50b-6cb5-4316-b137-1cb6f462778f-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.126980 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/50fef50b-6cb5-4316-b137-1cb6f462778f-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.127409 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50fef50b-6cb5-4316-b137-1cb6f462778f-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.127324 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/50fef50b-6cb5-4316-b137-1cb6f462778f-config\") pod 
\"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.127487 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/50fef50b-6cb5-4316-b137-1cb6f462778f-web-config\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.129366 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/50fef50b-6cb5-4316-b137-1cb6f462778f-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.136116 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/50fef50b-6cb5-4316-b137-1cb6f462778f-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.140222 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wr9x\" (UniqueName: \"kubernetes.io/projected/50fef50b-6cb5-4316-b137-1cb6f462778f-kube-api-access-5wr9x\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.141111 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/50fef50b-6cb5-4316-b137-1cb6f462778f-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " 
pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.144060 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/50fef50b-6cb5-4316-b137-1cb6f462778f-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"50fef50b-6cb5-4316-b137-1cb6f462778f\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.253512 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.317540 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c6d8d7477-vqqb2" event={"ID":"8e6675d3-9087-4401-a973-b3ffe5856e2f","Type":"ContainerStarted","Data":"1369cf70c305c53ced1fbff9834a5fddb305a0c271115911bb57dc7c6c9ae181"} Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.319396 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-848cc8f989-mlg49" event={"ID":"d2a2eeef-31f7-4104-972c-72db7eb01755","Type":"ContainerStarted","Data":"7ff1de7041dfc90877b2f639c3d9423f37ee73bf3380e1dfbb1730760eb68e6a"} Dec 08 09:06:41 crc kubenswrapper[4776]: I1208 09:06:41.337568 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5c6d8d7477-vqqb2" podStartSLOduration=2.33755185 podStartE2EDuration="2.33755185s" podCreationTimestamp="2025-12-08 09:06:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:06:41.337156049 +0000 UTC m=+477.600381091" watchObservedRunningTime="2025-12-08 09:06:41.33755185 +0000 UTC m=+477.600776872" Dec 08 09:06:42 crc kubenswrapper[4776]: I1208 09:06:42.331701 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/monitoring-plugin-787d456dd8-9svrh" event={"ID":"d10744d9-5819-4ca3-815c-5a8782037204","Type":"ContainerStarted","Data":"fad12c1dab23d4e1044e03c784b4bce142513729457643d67bc91ab8e6b154e6"} Dec 08 09:06:42 crc kubenswrapper[4776]: I1208 09:06:42.659705 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Dec 08 09:06:42 crc kubenswrapper[4776]: W1208 09:06:42.667358 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50fef50b_6cb5_4316_b137_1cb6f462778f.slice/crio-cc501488840bfaf46fd3059aa73399e79414aa2b621905951c5a0ec0b5e4770e WatchSource:0}: Error finding container cc501488840bfaf46fd3059aa73399e79414aa2b621905951c5a0ec0b5e4770e: Status 404 returned error can't find the container with id cc501488840bfaf46fd3059aa73399e79414aa2b621905951c5a0ec0b5e4770e Dec 08 09:06:43 crc kubenswrapper[4776]: I1208 09:06:43.345738 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-55b6bb7c7-lfptj" event={"ID":"b65ab5f4-b464-4d05-92da-5a274c3ac92d","Type":"ContainerStarted","Data":"9bf4c2b8d4385a305da6418dfdd49c50739313266dc66cf3689a0cd0d84cba93"} Dec 08 09:06:43 crc kubenswrapper[4776]: I1208 09:06:43.345789 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-55b6bb7c7-lfptj" event={"ID":"b65ab5f4-b464-4d05-92da-5a274c3ac92d","Type":"ContainerStarted","Data":"9108da0a9d507a25ee39d6078de017038239890cee00d8c30f5a31af115ea4e2"} Dec 08 09:06:43 crc kubenswrapper[4776]: I1208 09:06:43.345802 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-55b6bb7c7-lfptj" event={"ID":"b65ab5f4-b464-4d05-92da-5a274c3ac92d","Type":"ContainerStarted","Data":"9c31ceef90a60a04129c187e50780d5462df9094c2e6d92670e61e763386d37e"} Dec 08 09:06:43 crc kubenswrapper[4776]: I1208 09:06:43.348200 4776 generic.go:334] "Generic 
(PLEG): container finished" podID="50fef50b-6cb5-4316-b137-1cb6f462778f" containerID="3b511e3234c36f74312d487ba94c971148ef0a0ca11c9e0e35ea70db4f92f483" exitCode=0 Dec 08 09:06:43 crc kubenswrapper[4776]: I1208 09:06:43.348276 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"50fef50b-6cb5-4316-b137-1cb6f462778f","Type":"ContainerDied","Data":"3b511e3234c36f74312d487ba94c971148ef0a0ca11c9e0e35ea70db4f92f483"} Dec 08 09:06:43 crc kubenswrapper[4776]: I1208 09:06:43.348300 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"50fef50b-6cb5-4316-b137-1cb6f462778f","Type":"ContainerStarted","Data":"cc501488840bfaf46fd3059aa73399e79414aa2b621905951c5a0ec0b5e4770e"} Dec 08 09:06:43 crc kubenswrapper[4776]: I1208 09:06:43.355158 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e0db156e-b1cd-4b14-9ba1-8027e5516672","Type":"ContainerStarted","Data":"610ba39b95ce7bfbb4da7fda1d0cf3003394430d507528b7aeafcf89281f83dc"} Dec 08 09:06:43 crc kubenswrapper[4776]: I1208 09:06:43.355245 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e0db156e-b1cd-4b14-9ba1-8027e5516672","Type":"ContainerStarted","Data":"96067a140d98af3f00243efe21d7651014ae9989a01cc7dae2b16b63c0a849dd"} Dec 08 09:06:43 crc kubenswrapper[4776]: I1208 09:06:43.355259 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e0db156e-b1cd-4b14-9ba1-8027e5516672","Type":"ContainerStarted","Data":"1a4c35a126b7a16dfd72929723b1140f0368b7a99f67f9216e5307d1e6447d40"} Dec 08 09:06:43 crc kubenswrapper[4776]: I1208 09:06:43.355278 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"e0db156e-b1cd-4b14-9ba1-8027e5516672","Type":"ContainerStarted","Data":"015d87614415df6f89894713daa4bd8b01baf377ece2e32f72aa3f358b0369d8"} Dec 08 09:06:43 crc kubenswrapper[4776]: I1208 09:06:43.355292 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e0db156e-b1cd-4b14-9ba1-8027e5516672","Type":"ContainerStarted","Data":"560403d0b562c3627bdefb19a23c0f3b7200132ee6b046cb8dd7166553e416bd"} Dec 08 09:06:45 crc kubenswrapper[4776]: I1208 09:06:45.367368 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-848cc8f989-mlg49" event={"ID":"d2a2eeef-31f7-4104-972c-72db7eb01755","Type":"ContainerStarted","Data":"12805c36751b7aceb553763f957ff0c720cf8924a10ba205870dcf111a42cb88"} Dec 08 09:06:45 crc kubenswrapper[4776]: I1208 09:06:45.372060 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-55b6bb7c7-lfptj" event={"ID":"b65ab5f4-b464-4d05-92da-5a274c3ac92d","Type":"ContainerStarted","Data":"d23536d9651ecf84f7e065f7a99821eb1122862d67ff14821a7ba739ef95b17c"} Dec 08 09:06:45 crc kubenswrapper[4776]: I1208 09:06:45.372125 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-55b6bb7c7-lfptj" event={"ID":"b65ab5f4-b464-4d05-92da-5a274c3ac92d","Type":"ContainerStarted","Data":"ef73ecfe605d8be52cba39b41caa0620aa2525c3089de763d757fdca340a6ac4"} Dec 08 09:06:45 crc kubenswrapper[4776]: I1208 09:06:45.372141 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-55b6bb7c7-lfptj" event={"ID":"b65ab5f4-b464-4d05-92da-5a274c3ac92d","Type":"ContainerStarted","Data":"610009fd0a9c6d07af1fe3b7cc89947f747e285fa42c3ed9b7e44ca1180cabf5"} Dec 08 09:06:45 crc kubenswrapper[4776]: I1208 09:06:45.372202 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/thanos-querier-55b6bb7c7-lfptj" Dec 08 09:06:45 crc 
kubenswrapper[4776]: I1208 09:06:45.376066 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e0db156e-b1cd-4b14-9ba1-8027e5516672","Type":"ContainerStarted","Data":"02534c87eb2252a767cf29e4d8c6ecc833b70151511854382c884ab8b808be21"} Dec 08 09:06:45 crc kubenswrapper[4776]: I1208 09:06:45.377951 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-787d456dd8-9svrh" event={"ID":"d10744d9-5819-4ca3-815c-5a8782037204","Type":"ContainerStarted","Data":"0a2c0c94dbb8dcb0ad147d47eab6e198f155e3a0704772e12dc05a3bee1778e1"} Dec 08 09:06:45 crc kubenswrapper[4776]: I1208 09:06:45.378614 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-787d456dd8-9svrh" Dec 08 09:06:45 crc kubenswrapper[4776]: I1208 09:06:45.385946 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-787d456dd8-9svrh" Dec 08 09:06:45 crc kubenswrapper[4776]: I1208 09:06:45.420527 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-848cc8f989-mlg49" podStartSLOduration=2.474227486 podStartE2EDuration="6.420506906s" podCreationTimestamp="2025-12-08 09:06:39 +0000 UTC" firstStartedPulling="2025-12-08 09:06:40.631797133 +0000 UTC m=+476.895022155" lastFinishedPulling="2025-12-08 09:06:44.578076553 +0000 UTC m=+480.841301575" observedRunningTime="2025-12-08 09:06:45.388617475 +0000 UTC m=+481.651842497" watchObservedRunningTime="2025-12-08 09:06:45.420506906 +0000 UTC m=+481.683731928" Dec 08 09:06:45 crc kubenswrapper[4776]: I1208 09:06:45.442072 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-787d456dd8-9svrh" podStartSLOduration=3.098809886 podStartE2EDuration="5.442045706s" podCreationTimestamp="2025-12-08 09:06:40 +0000 UTC" 
firstStartedPulling="2025-12-08 09:06:42.235414609 +0000 UTC m=+478.498639631" lastFinishedPulling="2025-12-08 09:06:44.578650429 +0000 UTC m=+480.841875451" observedRunningTime="2025-12-08 09:06:45.434139959 +0000 UTC m=+481.697364991" watchObservedRunningTime="2025-12-08 09:06:45.442045706 +0000 UTC m=+481.705270728" Dec 08 09:06:45 crc kubenswrapper[4776]: I1208 09:06:45.444000 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.141423417 podStartE2EDuration="10.443989739s" podCreationTimestamp="2025-12-08 09:06:35 +0000 UTC" firstStartedPulling="2025-12-08 09:06:36.623764723 +0000 UTC m=+472.886989745" lastFinishedPulling="2025-12-08 09:06:44.926331045 +0000 UTC m=+481.189556067" observedRunningTime="2025-12-08 09:06:45.415616944 +0000 UTC m=+481.678841966" watchObservedRunningTime="2025-12-08 09:06:45.443989739 +0000 UTC m=+481.707214761" Dec 08 09:06:45 crc kubenswrapper[4776]: I1208 09:06:45.470928 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-55b6bb7c7-lfptj" podStartSLOduration=2.994080733 podStartE2EDuration="9.470902734s" podCreationTimestamp="2025-12-08 09:06:36 +0000 UTC" firstStartedPulling="2025-12-08 09:06:38.101213351 +0000 UTC m=+474.364438373" lastFinishedPulling="2025-12-08 09:06:44.578035352 +0000 UTC m=+480.841260374" observedRunningTime="2025-12-08 09:06:45.464622643 +0000 UTC m=+481.727847665" watchObservedRunningTime="2025-12-08 09:06:45.470902734 +0000 UTC m=+481.734127756" Dec 08 09:06:48 crc kubenswrapper[4776]: I1208 09:06:48.399435 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"50fef50b-6cb5-4316-b137-1cb6f462778f","Type":"ContainerStarted","Data":"7f91bda7e7cf2a2855fe78a999d2922191477273477f38e9be89a95e15ab2105"} Dec 08 09:06:49 crc kubenswrapper[4776]: I1208 09:06:49.407869 4776 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"50fef50b-6cb5-4316-b137-1cb6f462778f","Type":"ContainerStarted","Data":"122c2137dd49bc3f67925953f237f43720e4f5434ae1e2c33c0a4f74e62a097e"} Dec 08 09:06:49 crc kubenswrapper[4776]: I1208 09:06:49.408238 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"50fef50b-6cb5-4316-b137-1cb6f462778f","Type":"ContainerStarted","Data":"f218755bd8ba0cfea3a95b90e0d38e552502c0ea73851d28bf2fce89817ad0d8"} Dec 08 09:06:49 crc kubenswrapper[4776]: I1208 09:06:49.408257 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"50fef50b-6cb5-4316-b137-1cb6f462778f","Type":"ContainerStarted","Data":"03b78888bbe4065964e6ed5a4947b0c03021b3caed20d8fbf653150a1ddce19d"} Dec 08 09:06:49 crc kubenswrapper[4776]: I1208 09:06:49.408270 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"50fef50b-6cb5-4316-b137-1cb6f462778f","Type":"ContainerStarted","Data":"2c8ef87f3b91b3e204a92f8890fbce2cbb9ab14329ba6125e7ff56af36feca3d"} Dec 08 09:06:49 crc kubenswrapper[4776]: I1208 09:06:49.408282 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"50fef50b-6cb5-4316-b137-1cb6f462778f","Type":"ContainerStarted","Data":"7bdecf51855e2b886a37b2fd6579e6ac6dc609aaf16dece19baffe9d11c1b2c1"} Dec 08 09:06:49 crc kubenswrapper[4776]: I1208 09:06:49.752909 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5c6d8d7477-vqqb2" Dec 08 09:06:49 crc kubenswrapper[4776]: I1208 09:06:49.752957 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5c6d8d7477-vqqb2" Dec 08 09:06:49 crc kubenswrapper[4776]: I1208 09:06:49.758108 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-console/console-5c6d8d7477-vqqb2" Dec 08 09:06:49 crc kubenswrapper[4776]: I1208 09:06:49.788886 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=5.260180435 podStartE2EDuration="9.788864689s" podCreationTimestamp="2025-12-08 09:06:40 +0000 UTC" firstStartedPulling="2025-12-08 09:06:43.349528272 +0000 UTC m=+479.612753294" lastFinishedPulling="2025-12-08 09:06:47.878212526 +0000 UTC m=+484.141437548" observedRunningTime="2025-12-08 09:06:49.438623202 +0000 UTC m=+485.701848224" watchObservedRunningTime="2025-12-08 09:06:49.788864689 +0000 UTC m=+486.052089711" Dec 08 09:06:50 crc kubenswrapper[4776]: I1208 09:06:50.420025 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5c6d8d7477-vqqb2" Dec 08 09:06:50 crc kubenswrapper[4776]: I1208 09:06:50.474609 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-8dm9l"] Dec 08 09:06:51 crc kubenswrapper[4776]: I1208 09:06:51.254470 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:06:51 crc kubenswrapper[4776]: I1208 09:06:51.848860 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-55b6bb7c7-lfptj" Dec 08 09:07:00 crc kubenswrapper[4776]: I1208 09:07:00.188502 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-848cc8f989-mlg49" Dec 08 09:07:00 crc kubenswrapper[4776]: I1208 09:07:00.188839 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-848cc8f989-mlg49" Dec 08 09:07:15 crc kubenswrapper[4776]: I1208 09:07:15.532406 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-8dm9l" 
podUID="b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7" containerName="console" containerID="cri-o://6c7eb012cb7d51a3b063f85749435e8197befa048320da5ee4fd8e3835a4e382" gracePeriod=15 Dec 08 09:07:16 crc kubenswrapper[4776]: I1208 09:07:16.106108 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-8dm9l_b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7/console/0.log" Dec 08 09:07:16 crc kubenswrapper[4776]: I1208 09:07:16.106451 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-8dm9l" Dec 08 09:07:16 crc kubenswrapper[4776]: I1208 09:07:16.222757 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7-oauth-serving-cert\") pod \"b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7\" (UID: \"b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7\") " Dec 08 09:07:16 crc kubenswrapper[4776]: I1208 09:07:16.222831 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7-console-serving-cert\") pod \"b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7\" (UID: \"b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7\") " Dec 08 09:07:16 crc kubenswrapper[4776]: I1208 09:07:16.222861 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7-console-oauth-config\") pod \"b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7\" (UID: \"b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7\") " Dec 08 09:07:16 crc kubenswrapper[4776]: I1208 09:07:16.222913 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7-trusted-ca-bundle\") pod \"b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7\" (UID: 
\"b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7\") " Dec 08 09:07:16 crc kubenswrapper[4776]: I1208 09:07:16.222969 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2wx9\" (UniqueName: \"kubernetes.io/projected/b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7-kube-api-access-x2wx9\") pod \"b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7\" (UID: \"b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7\") " Dec 08 09:07:16 crc kubenswrapper[4776]: I1208 09:07:16.223137 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7-service-ca\") pod \"b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7\" (UID: \"b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7\") " Dec 08 09:07:16 crc kubenswrapper[4776]: I1208 09:07:16.223167 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7-console-config\") pod \"b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7\" (UID: \"b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7\") " Dec 08 09:07:16 crc kubenswrapper[4776]: I1208 09:07:16.223912 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7" (UID: "b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:07:16 crc kubenswrapper[4776]: I1208 09:07:16.223923 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7-console-config" (OuterVolumeSpecName: "console-config") pod "b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7" (UID: "b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:07:16 crc kubenswrapper[4776]: I1208 09:07:16.224261 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7-service-ca" (OuterVolumeSpecName: "service-ca") pod "b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7" (UID: "b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:07:16 crc kubenswrapper[4776]: I1208 09:07:16.224360 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7" (UID: "b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:07:16 crc kubenswrapper[4776]: I1208 09:07:16.228477 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7" (UID: "b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:07:16 crc kubenswrapper[4776]: I1208 09:07:16.228516 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7-kube-api-access-x2wx9" (OuterVolumeSpecName: "kube-api-access-x2wx9") pod "b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7" (UID: "b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7"). InnerVolumeSpecName "kube-api-access-x2wx9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:07:16 crc kubenswrapper[4776]: I1208 09:07:16.230355 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7" (UID: "b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:07:16 crc kubenswrapper[4776]: I1208 09:07:16.324374 4776 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 08 09:07:16 crc kubenswrapper[4776]: I1208 09:07:16.324425 4776 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:07:16 crc kubenswrapper[4776]: I1208 09:07:16.324440 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2wx9\" (UniqueName: \"kubernetes.io/projected/b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7-kube-api-access-x2wx9\") on node \"crc\" DevicePath \"\"" Dec 08 09:07:16 crc kubenswrapper[4776]: I1208 09:07:16.324454 4776 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7-service-ca\") on node \"crc\" DevicePath \"\"" Dec 08 09:07:16 crc kubenswrapper[4776]: I1208 09:07:16.324465 4776 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7-console-config\") on node \"crc\" DevicePath \"\"" Dec 08 09:07:16 crc kubenswrapper[4776]: I1208 09:07:16.324475 4776 reconciler_common.go:293] "Volume detached for volume 
\"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 08 09:07:16 crc kubenswrapper[4776]: I1208 09:07:16.324486 4776 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 08 09:07:16 crc kubenswrapper[4776]: I1208 09:07:16.615370 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-8dm9l_b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7/console/0.log" Dec 08 09:07:16 crc kubenswrapper[4776]: I1208 09:07:16.615426 4776 generic.go:334] "Generic (PLEG): container finished" podID="b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7" containerID="6c7eb012cb7d51a3b063f85749435e8197befa048320da5ee4fd8e3835a4e382" exitCode=2 Dec 08 09:07:16 crc kubenswrapper[4776]: I1208 09:07:16.615467 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-8dm9l" event={"ID":"b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7","Type":"ContainerDied","Data":"6c7eb012cb7d51a3b063f85749435e8197befa048320da5ee4fd8e3835a4e382"} Dec 08 09:07:16 crc kubenswrapper[4776]: I1208 09:07:16.615516 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-8dm9l" event={"ID":"b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7","Type":"ContainerDied","Data":"2417e37c033a85edb2c7086c7ee3a96dde2b860e19053f0a335f62b734cd3c84"} Dec 08 09:07:16 crc kubenswrapper[4776]: I1208 09:07:16.615534 4776 scope.go:117] "RemoveContainer" containerID="6c7eb012cb7d51a3b063f85749435e8197befa048320da5ee4fd8e3835a4e382" Dec 08 09:07:16 crc kubenswrapper[4776]: I1208 09:07:16.615708 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-8dm9l" Dec 08 09:07:16 crc kubenswrapper[4776]: I1208 09:07:16.633942 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-8dm9l"] Dec 08 09:07:16 crc kubenswrapper[4776]: I1208 09:07:16.635899 4776 scope.go:117] "RemoveContainer" containerID="6c7eb012cb7d51a3b063f85749435e8197befa048320da5ee4fd8e3835a4e382" Dec 08 09:07:16 crc kubenswrapper[4776]: E1208 09:07:16.636495 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c7eb012cb7d51a3b063f85749435e8197befa048320da5ee4fd8e3835a4e382\": container with ID starting with 6c7eb012cb7d51a3b063f85749435e8197befa048320da5ee4fd8e3835a4e382 not found: ID does not exist" containerID="6c7eb012cb7d51a3b063f85749435e8197befa048320da5ee4fd8e3835a4e382" Dec 08 09:07:16 crc kubenswrapper[4776]: I1208 09:07:16.636551 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c7eb012cb7d51a3b063f85749435e8197befa048320da5ee4fd8e3835a4e382"} err="failed to get container status \"6c7eb012cb7d51a3b063f85749435e8197befa048320da5ee4fd8e3835a4e382\": rpc error: code = NotFound desc = could not find container \"6c7eb012cb7d51a3b063f85749435e8197befa048320da5ee4fd8e3835a4e382\": container with ID starting with 6c7eb012cb7d51a3b063f85749435e8197befa048320da5ee4fd8e3835a4e382 not found: ID does not exist" Dec 08 09:07:16 crc kubenswrapper[4776]: I1208 09:07:16.638207 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-8dm9l"] Dec 08 09:07:18 crc kubenswrapper[4776]: I1208 09:07:18.357320 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7" path="/var/lib/kubelet/pods/b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7/volumes" Dec 08 09:07:20 crc kubenswrapper[4776]: I1208 09:07:20.193964 4776 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-monitoring/metrics-server-848cc8f989-mlg49" Dec 08 09:07:20 crc kubenswrapper[4776]: I1208 09:07:20.198348 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-848cc8f989-mlg49" Dec 08 09:07:41 crc kubenswrapper[4776]: I1208 09:07:41.254228 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:07:41 crc kubenswrapper[4776]: I1208 09:07:41.285972 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:07:41 crc kubenswrapper[4776]: I1208 09:07:41.816951 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Dec 08 09:07:44 crc kubenswrapper[4776]: I1208 09:07:44.623488 4776 scope.go:117] "RemoveContainer" containerID="3fba4c7e859ac4f2fa12d87d3a1a5ca36e5495dd91e991f29e391c8acf541069" Dec 08 09:07:57 crc kubenswrapper[4776]: I1208 09:07:57.517483 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-796dfb4c97-zckps"] Dec 08 09:07:57 crc kubenswrapper[4776]: E1208 09:07:57.518593 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7" containerName="console" Dec 08 09:07:57 crc kubenswrapper[4776]: I1208 09:07:57.518611 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7" containerName="console" Dec 08 09:07:57 crc kubenswrapper[4776]: I1208 09:07:57.518746 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="b933dd2b-2d11-49c1-b7d0-c9ffb69ac3d7" containerName="console" Dec 08 09:07:57 crc kubenswrapper[4776]: I1208 09:07:57.519341 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-796dfb4c97-zckps" Dec 08 09:07:57 crc kubenswrapper[4776]: I1208 09:07:57.542138 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-796dfb4c97-zckps"] Dec 08 09:07:57 crc kubenswrapper[4776]: I1208 09:07:57.583423 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aa22c984-c7d5-497f-b165-484bd945318c-oauth-serving-cert\") pod \"console-796dfb4c97-zckps\" (UID: \"aa22c984-c7d5-497f-b165-484bd945318c\") " pod="openshift-console/console-796dfb4c97-zckps" Dec 08 09:07:57 crc kubenswrapper[4776]: I1208 09:07:57.583505 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aa22c984-c7d5-497f-b165-484bd945318c-service-ca\") pod \"console-796dfb4c97-zckps\" (UID: \"aa22c984-c7d5-497f-b165-484bd945318c\") " pod="openshift-console/console-796dfb4c97-zckps" Dec 08 09:07:57 crc kubenswrapper[4776]: I1208 09:07:57.583540 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzmm7\" (UniqueName: \"kubernetes.io/projected/aa22c984-c7d5-497f-b165-484bd945318c-kube-api-access-wzmm7\") pod \"console-796dfb4c97-zckps\" (UID: \"aa22c984-c7d5-497f-b165-484bd945318c\") " pod="openshift-console/console-796dfb4c97-zckps" Dec 08 09:07:57 crc kubenswrapper[4776]: I1208 09:07:57.583568 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aa22c984-c7d5-497f-b165-484bd945318c-console-config\") pod \"console-796dfb4c97-zckps\" (UID: \"aa22c984-c7d5-497f-b165-484bd945318c\") " pod="openshift-console/console-796dfb4c97-zckps" Dec 08 09:07:57 crc kubenswrapper[4776]: I1208 09:07:57.583699 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa22c984-c7d5-497f-b165-484bd945318c-trusted-ca-bundle\") pod \"console-796dfb4c97-zckps\" (UID: \"aa22c984-c7d5-497f-b165-484bd945318c\") " pod="openshift-console/console-796dfb4c97-zckps" Dec 08 09:07:57 crc kubenswrapper[4776]: I1208 09:07:57.583733 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aa22c984-c7d5-497f-b165-484bd945318c-console-oauth-config\") pod \"console-796dfb4c97-zckps\" (UID: \"aa22c984-c7d5-497f-b165-484bd945318c\") " pod="openshift-console/console-796dfb4c97-zckps" Dec 08 09:07:57 crc kubenswrapper[4776]: I1208 09:07:57.583801 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa22c984-c7d5-497f-b165-484bd945318c-console-serving-cert\") pod \"console-796dfb4c97-zckps\" (UID: \"aa22c984-c7d5-497f-b165-484bd945318c\") " pod="openshift-console/console-796dfb4c97-zckps" Dec 08 09:07:57 crc kubenswrapper[4776]: I1208 09:07:57.684496 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aa22c984-c7d5-497f-b165-484bd945318c-oauth-serving-cert\") pod \"console-796dfb4c97-zckps\" (UID: \"aa22c984-c7d5-497f-b165-484bd945318c\") " pod="openshift-console/console-796dfb4c97-zckps" Dec 08 09:07:57 crc kubenswrapper[4776]: I1208 09:07:57.684612 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aa22c984-c7d5-497f-b165-484bd945318c-service-ca\") pod \"console-796dfb4c97-zckps\" (UID: \"aa22c984-c7d5-497f-b165-484bd945318c\") " pod="openshift-console/console-796dfb4c97-zckps" Dec 08 09:07:57 crc kubenswrapper[4776]: I1208 09:07:57.684656 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzmm7\" (UniqueName: \"kubernetes.io/projected/aa22c984-c7d5-497f-b165-484bd945318c-kube-api-access-wzmm7\") pod \"console-796dfb4c97-zckps\" (UID: \"aa22c984-c7d5-497f-b165-484bd945318c\") " pod="openshift-console/console-796dfb4c97-zckps" Dec 08 09:07:57 crc kubenswrapper[4776]: I1208 09:07:57.684688 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aa22c984-c7d5-497f-b165-484bd945318c-console-config\") pod \"console-796dfb4c97-zckps\" (UID: \"aa22c984-c7d5-497f-b165-484bd945318c\") " pod="openshift-console/console-796dfb4c97-zckps" Dec 08 09:07:57 crc kubenswrapper[4776]: I1208 09:07:57.684756 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa22c984-c7d5-497f-b165-484bd945318c-trusted-ca-bundle\") pod \"console-796dfb4c97-zckps\" (UID: \"aa22c984-c7d5-497f-b165-484bd945318c\") " pod="openshift-console/console-796dfb4c97-zckps" Dec 08 09:07:57 crc kubenswrapper[4776]: I1208 09:07:57.684794 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aa22c984-c7d5-497f-b165-484bd945318c-console-oauth-config\") pod \"console-796dfb4c97-zckps\" (UID: \"aa22c984-c7d5-497f-b165-484bd945318c\") " pod="openshift-console/console-796dfb4c97-zckps" Dec 08 09:07:57 crc kubenswrapper[4776]: I1208 09:07:57.684864 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa22c984-c7d5-497f-b165-484bd945318c-console-serving-cert\") pod \"console-796dfb4c97-zckps\" (UID: \"aa22c984-c7d5-497f-b165-484bd945318c\") " pod="openshift-console/console-796dfb4c97-zckps" Dec 08 09:07:57 crc kubenswrapper[4776]: I1208 09:07:57.685803 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aa22c984-c7d5-497f-b165-484bd945318c-service-ca\") pod \"console-796dfb4c97-zckps\" (UID: \"aa22c984-c7d5-497f-b165-484bd945318c\") " pod="openshift-console/console-796dfb4c97-zckps" Dec 08 09:07:57 crc kubenswrapper[4776]: I1208 09:07:57.685832 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aa22c984-c7d5-497f-b165-484bd945318c-console-config\") pod \"console-796dfb4c97-zckps\" (UID: \"aa22c984-c7d5-497f-b165-484bd945318c\") " pod="openshift-console/console-796dfb4c97-zckps" Dec 08 09:07:57 crc kubenswrapper[4776]: I1208 09:07:57.685854 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aa22c984-c7d5-497f-b165-484bd945318c-oauth-serving-cert\") pod \"console-796dfb4c97-zckps\" (UID: \"aa22c984-c7d5-497f-b165-484bd945318c\") " pod="openshift-console/console-796dfb4c97-zckps" Dec 08 09:07:57 crc kubenswrapper[4776]: I1208 09:07:57.686063 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa22c984-c7d5-497f-b165-484bd945318c-trusted-ca-bundle\") pod \"console-796dfb4c97-zckps\" (UID: \"aa22c984-c7d5-497f-b165-484bd945318c\") " pod="openshift-console/console-796dfb4c97-zckps" Dec 08 09:07:57 crc kubenswrapper[4776]: I1208 09:07:57.691609 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa22c984-c7d5-497f-b165-484bd945318c-console-serving-cert\") pod \"console-796dfb4c97-zckps\" (UID: \"aa22c984-c7d5-497f-b165-484bd945318c\") " pod="openshift-console/console-796dfb4c97-zckps" Dec 08 09:07:57 crc kubenswrapper[4776]: I1208 09:07:57.692601 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aa22c984-c7d5-497f-b165-484bd945318c-console-oauth-config\") pod \"console-796dfb4c97-zckps\" (UID: \"aa22c984-c7d5-497f-b165-484bd945318c\") " pod="openshift-console/console-796dfb4c97-zckps" Dec 08 09:07:57 crc kubenswrapper[4776]: I1208 09:07:57.704015 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzmm7\" (UniqueName: \"kubernetes.io/projected/aa22c984-c7d5-497f-b165-484bd945318c-kube-api-access-wzmm7\") pod \"console-796dfb4c97-zckps\" (UID: \"aa22c984-c7d5-497f-b165-484bd945318c\") " pod="openshift-console/console-796dfb4c97-zckps" Dec 08 09:07:57 crc kubenswrapper[4776]: I1208 09:07:57.845795 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-796dfb4c97-zckps" Dec 08 09:07:58 crc kubenswrapper[4776]: I1208 09:07:58.128070 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-796dfb4c97-zckps"] Dec 08 09:07:58 crc kubenswrapper[4776]: I1208 09:07:58.228153 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-796dfb4c97-zckps" event={"ID":"aa22c984-c7d5-497f-b165-484bd945318c","Type":"ContainerStarted","Data":"d0853a1efba37437854e67d0918eb49f6d3ab215a6a2d31e375fbace68e731a9"} Dec 08 09:07:59 crc kubenswrapper[4776]: I1208 09:07:59.236051 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-796dfb4c97-zckps" event={"ID":"aa22c984-c7d5-497f-b165-484bd945318c","Type":"ContainerStarted","Data":"f1853047fafe7bea95051435c12647845ee744a4c7c2672f21a97874c721b856"} Dec 08 09:07:59 crc kubenswrapper[4776]: I1208 09:07:59.259841 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-796dfb4c97-zckps" podStartSLOduration=2.259814608 podStartE2EDuration="2.259814608s" podCreationTimestamp="2025-12-08 09:07:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:07:59.257849136 +0000 UTC m=+555.521074158" watchObservedRunningTime="2025-12-08 09:07:59.259814608 +0000 UTC m=+555.523039630" Dec 08 09:08:07 crc kubenswrapper[4776]: I1208 09:08:07.846676 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-796dfb4c97-zckps" Dec 08 09:08:07 crc kubenswrapper[4776]: I1208 09:08:07.848223 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-796dfb4c97-zckps" Dec 08 09:08:07 crc kubenswrapper[4776]: I1208 09:08:07.852301 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-796dfb4c97-zckps" Dec 08 09:08:08 crc kubenswrapper[4776]: I1208 09:08:08.299566 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-796dfb4c97-zckps" Dec 08 09:08:08 crc kubenswrapper[4776]: I1208 09:08:08.391808 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5c6d8d7477-vqqb2"] Dec 08 09:08:11 crc kubenswrapper[4776]: I1208 09:08:11.398973 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 09:08:11 crc kubenswrapper[4776]: I1208 09:08:11.399382 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 09:08:33 crc kubenswrapper[4776]: I1208 09:08:33.448012 4776 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-console/console-5c6d8d7477-vqqb2" podUID="8e6675d3-9087-4401-a973-b3ffe5856e2f" containerName="console" containerID="cri-o://1369cf70c305c53ced1fbff9834a5fddb305a0c271115911bb57dc7c6c9ae181" gracePeriod=15 Dec 08 09:08:33 crc kubenswrapper[4776]: I1208 09:08:33.762273 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5c6d8d7477-vqqb2_8e6675d3-9087-4401-a973-b3ffe5856e2f/console/0.log" Dec 08 09:08:33 crc kubenswrapper[4776]: I1208 09:08:33.762684 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5c6d8d7477-vqqb2" Dec 08 09:08:33 crc kubenswrapper[4776]: I1208 09:08:33.897402 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8e6675d3-9087-4401-a973-b3ffe5856e2f-service-ca\") pod \"8e6675d3-9087-4401-a973-b3ffe5856e2f\" (UID: \"8e6675d3-9087-4401-a973-b3ffe5856e2f\") " Dec 08 09:08:33 crc kubenswrapper[4776]: I1208 09:08:33.897495 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9lg5\" (UniqueName: \"kubernetes.io/projected/8e6675d3-9087-4401-a973-b3ffe5856e2f-kube-api-access-j9lg5\") pod \"8e6675d3-9087-4401-a973-b3ffe5856e2f\" (UID: \"8e6675d3-9087-4401-a973-b3ffe5856e2f\") " Dec 08 09:08:33 crc kubenswrapper[4776]: I1208 09:08:33.897620 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e6675d3-9087-4401-a973-b3ffe5856e2f-trusted-ca-bundle\") pod \"8e6675d3-9087-4401-a973-b3ffe5856e2f\" (UID: \"8e6675d3-9087-4401-a973-b3ffe5856e2f\") " Dec 08 09:08:33 crc kubenswrapper[4776]: I1208 09:08:33.897647 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8e6675d3-9087-4401-a973-b3ffe5856e2f-console-serving-cert\") pod 
\"8e6675d3-9087-4401-a973-b3ffe5856e2f\" (UID: \"8e6675d3-9087-4401-a973-b3ffe5856e2f\") " Dec 08 09:08:33 crc kubenswrapper[4776]: I1208 09:08:33.897695 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8e6675d3-9087-4401-a973-b3ffe5856e2f-oauth-serving-cert\") pod \"8e6675d3-9087-4401-a973-b3ffe5856e2f\" (UID: \"8e6675d3-9087-4401-a973-b3ffe5856e2f\") " Dec 08 09:08:33 crc kubenswrapper[4776]: I1208 09:08:33.897715 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8e6675d3-9087-4401-a973-b3ffe5856e2f-console-oauth-config\") pod \"8e6675d3-9087-4401-a973-b3ffe5856e2f\" (UID: \"8e6675d3-9087-4401-a973-b3ffe5856e2f\") " Dec 08 09:08:33 crc kubenswrapper[4776]: I1208 09:08:33.897731 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8e6675d3-9087-4401-a973-b3ffe5856e2f-console-config\") pod \"8e6675d3-9087-4401-a973-b3ffe5856e2f\" (UID: \"8e6675d3-9087-4401-a973-b3ffe5856e2f\") " Dec 08 09:08:33 crc kubenswrapper[4776]: I1208 09:08:33.898463 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e6675d3-9087-4401-a973-b3ffe5856e2f-service-ca" (OuterVolumeSpecName: "service-ca") pod "8e6675d3-9087-4401-a973-b3ffe5856e2f" (UID: "8e6675d3-9087-4401-a973-b3ffe5856e2f"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:08:33 crc kubenswrapper[4776]: I1208 09:08:33.898665 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e6675d3-9087-4401-a973-b3ffe5856e2f-console-config" (OuterVolumeSpecName: "console-config") pod "8e6675d3-9087-4401-a973-b3ffe5856e2f" (UID: "8e6675d3-9087-4401-a973-b3ffe5856e2f"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:08:33 crc kubenswrapper[4776]: I1208 09:08:33.898881 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e6675d3-9087-4401-a973-b3ffe5856e2f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "8e6675d3-9087-4401-a973-b3ffe5856e2f" (UID: "8e6675d3-9087-4401-a973-b3ffe5856e2f"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:08:33 crc kubenswrapper[4776]: I1208 09:08:33.898903 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e6675d3-9087-4401-a973-b3ffe5856e2f-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "8e6675d3-9087-4401-a973-b3ffe5856e2f" (UID: "8e6675d3-9087-4401-a973-b3ffe5856e2f"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:08:33 crc kubenswrapper[4776]: I1208 09:08:33.909397 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e6675d3-9087-4401-a973-b3ffe5856e2f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "8e6675d3-9087-4401-a973-b3ffe5856e2f" (UID: "8e6675d3-9087-4401-a973-b3ffe5856e2f"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:08:33 crc kubenswrapper[4776]: I1208 09:08:33.909448 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e6675d3-9087-4401-a973-b3ffe5856e2f-kube-api-access-j9lg5" (OuterVolumeSpecName: "kube-api-access-j9lg5") pod "8e6675d3-9087-4401-a973-b3ffe5856e2f" (UID: "8e6675d3-9087-4401-a973-b3ffe5856e2f"). InnerVolumeSpecName "kube-api-access-j9lg5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:08:33 crc kubenswrapper[4776]: I1208 09:08:33.910305 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e6675d3-9087-4401-a973-b3ffe5856e2f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "8e6675d3-9087-4401-a973-b3ffe5856e2f" (UID: "8e6675d3-9087-4401-a973-b3ffe5856e2f"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:08:33 crc kubenswrapper[4776]: I1208 09:08:33.998937 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9lg5\" (UniqueName: \"kubernetes.io/projected/8e6675d3-9087-4401-a973-b3ffe5856e2f-kube-api-access-j9lg5\") on node \"crc\" DevicePath \"\"" Dec 08 09:08:33 crc kubenswrapper[4776]: I1208 09:08:33.999273 4776 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8e6675d3-9087-4401-a973-b3ffe5856e2f-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 08 09:08:33 crc kubenswrapper[4776]: I1208 09:08:33.999284 4776 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e6675d3-9087-4401-a973-b3ffe5856e2f-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:08:33 crc kubenswrapper[4776]: I1208 09:08:33.999292 4776 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8e6675d3-9087-4401-a973-b3ffe5856e2f-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 08 09:08:33 crc kubenswrapper[4776]: I1208 09:08:33.999300 4776 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8e6675d3-9087-4401-a973-b3ffe5856e2f-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 08 09:08:33 crc kubenswrapper[4776]: I1208 09:08:33.999308 4776 reconciler_common.go:293] "Volume 
detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8e6675d3-9087-4401-a973-b3ffe5856e2f-console-config\") on node \"crc\" DevicePath \"\"" Dec 08 09:08:33 crc kubenswrapper[4776]: I1208 09:08:33.999316 4776 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8e6675d3-9087-4401-a973-b3ffe5856e2f-service-ca\") on node \"crc\" DevicePath \"\"" Dec 08 09:08:34 crc kubenswrapper[4776]: I1208 09:08:34.448143 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5c6d8d7477-vqqb2_8e6675d3-9087-4401-a973-b3ffe5856e2f/console/0.log" Dec 08 09:08:34 crc kubenswrapper[4776]: I1208 09:08:34.448223 4776 generic.go:334] "Generic (PLEG): container finished" podID="8e6675d3-9087-4401-a973-b3ffe5856e2f" containerID="1369cf70c305c53ced1fbff9834a5fddb305a0c271115911bb57dc7c6c9ae181" exitCode=2 Dec 08 09:08:34 crc kubenswrapper[4776]: I1208 09:08:34.448255 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c6d8d7477-vqqb2" event={"ID":"8e6675d3-9087-4401-a973-b3ffe5856e2f","Type":"ContainerDied","Data":"1369cf70c305c53ced1fbff9834a5fddb305a0c271115911bb57dc7c6c9ae181"} Dec 08 09:08:34 crc kubenswrapper[4776]: I1208 09:08:34.448284 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c6d8d7477-vqqb2" event={"ID":"8e6675d3-9087-4401-a973-b3ffe5856e2f","Type":"ContainerDied","Data":"81eddc01695133c7d156ce94ef0c4c6fe11abd8cd35fbbadbb45261452870eb7"} Dec 08 09:08:34 crc kubenswrapper[4776]: I1208 09:08:34.448326 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5c6d8d7477-vqqb2" Dec 08 09:08:34 crc kubenswrapper[4776]: I1208 09:08:34.448327 4776 scope.go:117] "RemoveContainer" containerID="1369cf70c305c53ced1fbff9834a5fddb305a0c271115911bb57dc7c6c9ae181" Dec 08 09:08:34 crc kubenswrapper[4776]: I1208 09:08:34.471965 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5c6d8d7477-vqqb2"] Dec 08 09:08:34 crc kubenswrapper[4776]: I1208 09:08:34.472720 4776 scope.go:117] "RemoveContainer" containerID="1369cf70c305c53ced1fbff9834a5fddb305a0c271115911bb57dc7c6c9ae181" Dec 08 09:08:34 crc kubenswrapper[4776]: E1208 09:08:34.473598 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1369cf70c305c53ced1fbff9834a5fddb305a0c271115911bb57dc7c6c9ae181\": container with ID starting with 1369cf70c305c53ced1fbff9834a5fddb305a0c271115911bb57dc7c6c9ae181 not found: ID does not exist" containerID="1369cf70c305c53ced1fbff9834a5fddb305a0c271115911bb57dc7c6c9ae181" Dec 08 09:08:34 crc kubenswrapper[4776]: I1208 09:08:34.473656 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1369cf70c305c53ced1fbff9834a5fddb305a0c271115911bb57dc7c6c9ae181"} err="failed to get container status \"1369cf70c305c53ced1fbff9834a5fddb305a0c271115911bb57dc7c6c9ae181\": rpc error: code = NotFound desc = could not find container \"1369cf70c305c53ced1fbff9834a5fddb305a0c271115911bb57dc7c6c9ae181\": container with ID starting with 1369cf70c305c53ced1fbff9834a5fddb305a0c271115911bb57dc7c6c9ae181 not found: ID does not exist" Dec 08 09:08:34 crc kubenswrapper[4776]: I1208 09:08:34.477246 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5c6d8d7477-vqqb2"] Dec 08 09:08:36 crc kubenswrapper[4776]: I1208 09:08:36.351849 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="8e6675d3-9087-4401-a973-b3ffe5856e2f" path="/var/lib/kubelet/pods/8e6675d3-9087-4401-a973-b3ffe5856e2f/volumes" Dec 08 09:08:41 crc kubenswrapper[4776]: I1208 09:08:41.399767 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 09:08:41 crc kubenswrapper[4776]: I1208 09:08:41.400312 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 09:08:44 crc kubenswrapper[4776]: I1208 09:08:44.665457 4776 scope.go:117] "RemoveContainer" containerID="a74de1a7cfdcb0fe7a05e5b6882920042f87c165ea1352b894123e0aa72f9f84" Dec 08 09:08:44 crc kubenswrapper[4776]: I1208 09:08:44.687410 4776 scope.go:117] "RemoveContainer" containerID="3c3aef931e2786ce259ef07fb205acc6d62c8d53e708f7b3ff8d3a8e5fa0ca13" Dec 08 09:09:11 crc kubenswrapper[4776]: I1208 09:09:11.399400 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 09:09:11 crc kubenswrapper[4776]: I1208 09:09:11.400030 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 09:09:11 crc 
kubenswrapper[4776]: I1208 09:09:11.400084 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" Dec 08 09:09:11 crc kubenswrapper[4776]: I1208 09:09:11.400947 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"60dbb3e7c44241db89caa5cb2272dfcb89d62fdbf75c7153dbb476fd01b77752"} pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 08 09:09:11 crc kubenswrapper[4776]: I1208 09:09:11.401012 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" containerID="cri-o://60dbb3e7c44241db89caa5cb2272dfcb89d62fdbf75c7153dbb476fd01b77752" gracePeriod=600 Dec 08 09:09:11 crc kubenswrapper[4776]: I1208 09:09:11.693815 4776 generic.go:334] "Generic (PLEG): container finished" podID="c9788ab1-1031-4103-a769-a4b3177c7268" containerID="60dbb3e7c44241db89caa5cb2272dfcb89d62fdbf75c7153dbb476fd01b77752" exitCode=0 Dec 08 09:09:11 crc kubenswrapper[4776]: I1208 09:09:11.693874 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" event={"ID":"c9788ab1-1031-4103-a769-a4b3177c7268","Type":"ContainerDied","Data":"60dbb3e7c44241db89caa5cb2272dfcb89d62fdbf75c7153dbb476fd01b77752"} Dec 08 09:09:11 crc kubenswrapper[4776]: I1208 09:09:11.694318 4776 scope.go:117] "RemoveContainer" containerID="3686f95c2750ae2f6fecf0ef1b9e49c85b6866553ae81497ae9ec17dd913386b" Dec 08 09:09:12 crc kubenswrapper[4776]: I1208 09:09:12.702898 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" 
event={"ID":"c9788ab1-1031-4103-a769-a4b3177c7268","Type":"ContainerStarted","Data":"abac38e42f2fdbb7423dde9370109f19a92ff63c4313fd19999ad68bdb72ed2b"} Dec 08 09:11:11 crc kubenswrapper[4776]: I1208 09:11:11.399109 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 09:11:11 crc kubenswrapper[4776]: I1208 09:11:11.399753 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 09:11:25 crc kubenswrapper[4776]: I1208 09:11:25.881631 4776 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 08 09:11:37 crc kubenswrapper[4776]: I1208 09:11:37.703248 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210wkgz5"] Dec 08 09:11:37 crc kubenswrapper[4776]: E1208 09:11:37.705516 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e6675d3-9087-4401-a973-b3ffe5856e2f" containerName="console" Dec 08 09:11:37 crc kubenswrapper[4776]: I1208 09:11:37.705669 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e6675d3-9087-4401-a973-b3ffe5856e2f" containerName="console" Dec 08 09:11:37 crc kubenswrapper[4776]: I1208 09:11:37.705965 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e6675d3-9087-4401-a973-b3ffe5856e2f" containerName="console" Dec 08 09:11:37 crc kubenswrapper[4776]: I1208 09:11:37.707585 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210wkgz5" Dec 08 09:11:37 crc kubenswrapper[4776]: I1208 09:11:37.711009 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 08 09:11:37 crc kubenswrapper[4776]: I1208 09:11:37.716662 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210wkgz5"] Dec 08 09:11:37 crc kubenswrapper[4776]: I1208 09:11:37.759716 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210wkgz5\" (UID: \"bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210wkgz5" Dec 08 09:11:37 crc kubenswrapper[4776]: I1208 09:11:37.759784 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210wkgz5\" (UID: \"bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210wkgz5" Dec 08 09:11:37 crc kubenswrapper[4776]: I1208 09:11:37.759928 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gkxp\" (UniqueName: \"kubernetes.io/projected/bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b-kube-api-access-2gkxp\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210wkgz5\" (UID: \"bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210wkgz5" Dec 08 09:11:37 crc kubenswrapper[4776]: 
I1208 09:11:37.861619 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210wkgz5\" (UID: \"bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210wkgz5" Dec 08 09:11:37 crc kubenswrapper[4776]: I1208 09:11:37.861677 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210wkgz5\" (UID: \"bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210wkgz5" Dec 08 09:11:37 crc kubenswrapper[4776]: I1208 09:11:37.861723 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gkxp\" (UniqueName: \"kubernetes.io/projected/bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b-kube-api-access-2gkxp\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210wkgz5\" (UID: \"bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210wkgz5" Dec 08 09:11:37 crc kubenswrapper[4776]: I1208 09:11:37.862133 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210wkgz5\" (UID: \"bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210wkgz5" Dec 08 09:11:37 crc kubenswrapper[4776]: I1208 09:11:37.862344 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210wkgz5\" (UID: \"bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210wkgz5" Dec 08 09:11:37 crc kubenswrapper[4776]: I1208 09:11:37.896761 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gkxp\" (UniqueName: \"kubernetes.io/projected/bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b-kube-api-access-2gkxp\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210wkgz5\" (UID: \"bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210wkgz5" Dec 08 09:11:38 crc kubenswrapper[4776]: I1208 09:11:38.039407 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210wkgz5" Dec 08 09:11:38 crc kubenswrapper[4776]: I1208 09:11:38.532680 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210wkgz5"] Dec 08 09:11:38 crc kubenswrapper[4776]: I1208 09:11:38.685830 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210wkgz5" event={"ID":"bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b","Type":"ContainerStarted","Data":"8fefae77f5b4761e489363ae32e4c8c34d9fe2dc0b2408e0cf3943cf73899f14"} Dec 08 09:11:39 crc kubenswrapper[4776]: I1208 09:11:39.696697 4776 generic.go:334] "Generic (PLEG): container finished" podID="bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b" containerID="b453e8badd7d21a01b83df95dcfc2026361e49ccbb591095e1333d23aab510cc" exitCode=0 Dec 08 09:11:39 crc kubenswrapper[4776]: I1208 09:11:39.696774 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210wkgz5" event={"ID":"bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b","Type":"ContainerDied","Data":"b453e8badd7d21a01b83df95dcfc2026361e49ccbb591095e1333d23aab510cc"} Dec 08 09:11:39 crc kubenswrapper[4776]: I1208 09:11:39.698894 4776 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 08 09:11:40 crc kubenswrapper[4776]: I1208 09:11:40.021145 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wlz88"] Dec 08 09:11:40 crc kubenswrapper[4776]: I1208 09:11:40.022213 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wlz88" Dec 08 09:11:40 crc kubenswrapper[4776]: I1208 09:11:40.042988 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wlz88"] Dec 08 09:11:40 crc kubenswrapper[4776]: I1208 09:11:40.097211 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57476415-2f48-4b7d-824d-61fd5702c5d6-utilities\") pod \"redhat-operators-wlz88\" (UID: \"57476415-2f48-4b7d-824d-61fd5702c5d6\") " pod="openshift-marketplace/redhat-operators-wlz88" Dec 08 09:11:40 crc kubenswrapper[4776]: I1208 09:11:40.097275 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9hzn\" (UniqueName: \"kubernetes.io/projected/57476415-2f48-4b7d-824d-61fd5702c5d6-kube-api-access-w9hzn\") pod \"redhat-operators-wlz88\" (UID: \"57476415-2f48-4b7d-824d-61fd5702c5d6\") " pod="openshift-marketplace/redhat-operators-wlz88" Dec 08 09:11:40 crc kubenswrapper[4776]: I1208 09:11:40.097299 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/57476415-2f48-4b7d-824d-61fd5702c5d6-catalog-content\") pod \"redhat-operators-wlz88\" (UID: \"57476415-2f48-4b7d-824d-61fd5702c5d6\") " pod="openshift-marketplace/redhat-operators-wlz88" Dec 08 09:11:40 crc kubenswrapper[4776]: I1208 09:11:40.199440 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9hzn\" (UniqueName: \"kubernetes.io/projected/57476415-2f48-4b7d-824d-61fd5702c5d6-kube-api-access-w9hzn\") pod \"redhat-operators-wlz88\" (UID: \"57476415-2f48-4b7d-824d-61fd5702c5d6\") " pod="openshift-marketplace/redhat-operators-wlz88" Dec 08 09:11:40 crc kubenswrapper[4776]: I1208 09:11:40.199502 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57476415-2f48-4b7d-824d-61fd5702c5d6-catalog-content\") pod \"redhat-operators-wlz88\" (UID: \"57476415-2f48-4b7d-824d-61fd5702c5d6\") " pod="openshift-marketplace/redhat-operators-wlz88" Dec 08 09:11:40 crc kubenswrapper[4776]: I1208 09:11:40.199624 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57476415-2f48-4b7d-824d-61fd5702c5d6-utilities\") pod \"redhat-operators-wlz88\" (UID: \"57476415-2f48-4b7d-824d-61fd5702c5d6\") " pod="openshift-marketplace/redhat-operators-wlz88" Dec 08 09:11:40 crc kubenswrapper[4776]: I1208 09:11:40.200073 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57476415-2f48-4b7d-824d-61fd5702c5d6-catalog-content\") pod \"redhat-operators-wlz88\" (UID: \"57476415-2f48-4b7d-824d-61fd5702c5d6\") " pod="openshift-marketplace/redhat-operators-wlz88" Dec 08 09:11:40 crc kubenswrapper[4776]: I1208 09:11:40.200273 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/57476415-2f48-4b7d-824d-61fd5702c5d6-utilities\") pod \"redhat-operators-wlz88\" (UID: \"57476415-2f48-4b7d-824d-61fd5702c5d6\") " pod="openshift-marketplace/redhat-operators-wlz88" Dec 08 09:11:40 crc kubenswrapper[4776]: I1208 09:11:40.234251 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9hzn\" (UniqueName: \"kubernetes.io/projected/57476415-2f48-4b7d-824d-61fd5702c5d6-kube-api-access-w9hzn\") pod \"redhat-operators-wlz88\" (UID: \"57476415-2f48-4b7d-824d-61fd5702c5d6\") " pod="openshift-marketplace/redhat-operators-wlz88" Dec 08 09:11:40 crc kubenswrapper[4776]: I1208 09:11:40.336639 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wlz88" Dec 08 09:11:40 crc kubenswrapper[4776]: I1208 09:11:40.613727 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wlz88"] Dec 08 09:11:40 crc kubenswrapper[4776]: W1208 09:11:40.627377 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57476415_2f48_4b7d_824d_61fd5702c5d6.slice/crio-54d1b2b8e74c11fdfeb361f40ad082b446eb8a9b8d8dd20036c96360664df741 WatchSource:0}: Error finding container 54d1b2b8e74c11fdfeb361f40ad082b446eb8a9b8d8dd20036c96360664df741: Status 404 returned error can't find the container with id 54d1b2b8e74c11fdfeb361f40ad082b446eb8a9b8d8dd20036c96360664df741 Dec 08 09:11:40 crc kubenswrapper[4776]: I1208 09:11:40.706616 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wlz88" event={"ID":"57476415-2f48-4b7d-824d-61fd5702c5d6","Type":"ContainerStarted","Data":"54d1b2b8e74c11fdfeb361f40ad082b446eb8a9b8d8dd20036c96360664df741"} Dec 08 09:11:41 crc kubenswrapper[4776]: I1208 09:11:41.398565 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 09:11:41 crc kubenswrapper[4776]: I1208 09:11:41.398940 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 09:11:41 crc kubenswrapper[4776]: I1208 09:11:41.715205 4776 generic.go:334] "Generic (PLEG): container finished" podID="bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b" containerID="47a9a2ff976476f2def63f5504880cbb4f4d18d0c039e227e17b4007d5a30367" exitCode=0 Dec 08 09:11:41 crc kubenswrapper[4776]: I1208 09:11:41.715355 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210wkgz5" event={"ID":"bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b","Type":"ContainerDied","Data":"47a9a2ff976476f2def63f5504880cbb4f4d18d0c039e227e17b4007d5a30367"} Dec 08 09:11:41 crc kubenswrapper[4776]: I1208 09:11:41.717463 4776 generic.go:334] "Generic (PLEG): container finished" podID="57476415-2f48-4b7d-824d-61fd5702c5d6" containerID="c9a46a23dcf7ff8d0c458e79088c78de3b08576b9936d57e453d2414467d20ba" exitCode=0 Dec 08 09:11:41 crc kubenswrapper[4776]: I1208 09:11:41.717512 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wlz88" event={"ID":"57476415-2f48-4b7d-824d-61fd5702c5d6","Type":"ContainerDied","Data":"c9a46a23dcf7ff8d0c458e79088c78de3b08576b9936d57e453d2414467d20ba"} Dec 08 09:11:42 crc kubenswrapper[4776]: I1208 09:11:42.729710 4776 generic.go:334] "Generic (PLEG): container finished" podID="bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b" 
containerID="603b8dba616d948d454cd2a70d6979832faa9d7621dc0e407852273a8a107308" exitCode=0 Dec 08 09:11:42 crc kubenswrapper[4776]: I1208 09:11:42.729912 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210wkgz5" event={"ID":"bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b","Type":"ContainerDied","Data":"603b8dba616d948d454cd2a70d6979832faa9d7621dc0e407852273a8a107308"} Dec 08 09:11:42 crc kubenswrapper[4776]: I1208 09:11:42.738153 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wlz88" event={"ID":"57476415-2f48-4b7d-824d-61fd5702c5d6","Type":"ContainerStarted","Data":"9dbc6298523724cb1b6943c6aa71c9c8b34e3a73b28e400846c7d8c15cf42512"} Dec 08 09:11:44 crc kubenswrapper[4776]: I1208 09:11:44.275478 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210wkgz5" Dec 08 09:11:44 crc kubenswrapper[4776]: I1208 09:11:44.365678 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b-bundle\") pod \"bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b\" (UID: \"bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b\") " Dec 08 09:11:44 crc kubenswrapper[4776]: I1208 09:11:44.365794 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gkxp\" (UniqueName: \"kubernetes.io/projected/bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b-kube-api-access-2gkxp\") pod \"bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b\" (UID: \"bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b\") " Dec 08 09:11:44 crc kubenswrapper[4776]: I1208 09:11:44.365897 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b-util\") pod \"bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b\" 
(UID: \"bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b\") " Dec 08 09:11:44 crc kubenswrapper[4776]: I1208 09:11:44.367984 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b-bundle" (OuterVolumeSpecName: "bundle") pod "bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b" (UID: "bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:11:44 crc kubenswrapper[4776]: I1208 09:11:44.378385 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b-kube-api-access-2gkxp" (OuterVolumeSpecName: "kube-api-access-2gkxp") pod "bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b" (UID: "bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b"). InnerVolumeSpecName "kube-api-access-2gkxp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:11:44 crc kubenswrapper[4776]: I1208 09:11:44.410113 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b-util" (OuterVolumeSpecName: "util") pod "bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b" (UID: "bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:11:44 crc kubenswrapper[4776]: I1208 09:11:44.467650 4776 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b-util\") on node \"crc\" DevicePath \"\"" Dec 08 09:11:44 crc kubenswrapper[4776]: I1208 09:11:44.467689 4776 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:11:44 crc kubenswrapper[4776]: I1208 09:11:44.467699 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gkxp\" (UniqueName: \"kubernetes.io/projected/bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b-kube-api-access-2gkxp\") on node \"crc\" DevicePath \"\"" Dec 08 09:11:44 crc kubenswrapper[4776]: I1208 09:11:44.751449 4776 generic.go:334] "Generic (PLEG): container finished" podID="57476415-2f48-4b7d-824d-61fd5702c5d6" containerID="9dbc6298523724cb1b6943c6aa71c9c8b34e3a73b28e400846c7d8c15cf42512" exitCode=0 Dec 08 09:11:44 crc kubenswrapper[4776]: I1208 09:11:44.751563 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wlz88" event={"ID":"57476415-2f48-4b7d-824d-61fd5702c5d6","Type":"ContainerDied","Data":"9dbc6298523724cb1b6943c6aa71c9c8b34e3a73b28e400846c7d8c15cf42512"} Dec 08 09:11:44 crc kubenswrapper[4776]: I1208 09:11:44.757358 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210wkgz5" Dec 08 09:11:44 crc kubenswrapper[4776]: I1208 09:11:44.757457 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210wkgz5" event={"ID":"bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b","Type":"ContainerDied","Data":"8fefae77f5b4761e489363ae32e4c8c34d9fe2dc0b2408e0cf3943cf73899f14"} Dec 08 09:11:44 crc kubenswrapper[4776]: I1208 09:11:44.757504 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8fefae77f5b4761e489363ae32e4c8c34d9fe2dc0b2408e0cf3943cf73899f14" Dec 08 09:11:45 crc kubenswrapper[4776]: I1208 09:11:45.765259 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wlz88" event={"ID":"57476415-2f48-4b7d-824d-61fd5702c5d6","Type":"ContainerStarted","Data":"e0718be14ed2fa3a7747a8348ffe2a6f8a60cbc25874ce3b8836bc4820347fb3"} Dec 08 09:11:45 crc kubenswrapper[4776]: I1208 09:11:45.788582 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wlz88" podStartSLOduration=2.354779065 podStartE2EDuration="5.788563232s" podCreationTimestamp="2025-12-08 09:11:40 +0000 UTC" firstStartedPulling="2025-12-08 09:11:41.718686928 +0000 UTC m=+777.981911950" lastFinishedPulling="2025-12-08 09:11:45.152471085 +0000 UTC m=+781.415696117" observedRunningTime="2025-12-08 09:11:45.786150946 +0000 UTC m=+782.049375978" watchObservedRunningTime="2025-12-08 09:11:45.788563232 +0000 UTC m=+782.051788254" Dec 08 09:11:48 crc kubenswrapper[4776]: I1208 09:11:48.685164 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-swbsc"] Dec 08 09:11:48 crc kubenswrapper[4776]: I1208 09:11:48.685996 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" 
podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerName="nbdb" containerID="cri-o://a95beb9eadee827797268eb99d707a8782239962485a6f258adea6f3ee994c8b" gracePeriod=30 Dec 08 09:11:48 crc kubenswrapper[4776]: I1208 09:11:48.686128 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerName="sbdb" containerID="cri-o://e8d4d85ae7a43d2800c11a9bfdc7f1c827bdb6628891b9a0c29140bf869307ca" gracePeriod=30 Dec 08 09:11:48 crc kubenswrapper[4776]: I1208 09:11:48.686107 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://712accfff1a4a51efc4d3d2a90e3e2d6c47eb04f5cd3d72b800e0778f0963b55" gracePeriod=30 Dec 08 09:11:48 crc kubenswrapper[4776]: I1208 09:11:48.686214 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerName="ovn-acl-logging" containerID="cri-o://9afb59149e1f7b1462fc1ae949150d5c137905504ba2366db083abd13b4db6fc" gracePeriod=30 Dec 08 09:11:48 crc kubenswrapper[4776]: I1208 09:11:48.686204 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerName="kube-rbac-proxy-node" containerID="cri-o://0f688c89d388f271d5313cac8f933d6e43da463b2d6da546151a2bbff7350f8b" gracePeriod=30 Dec 08 09:11:48 crc kubenswrapper[4776]: I1208 09:11:48.685957 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerName="ovn-controller" containerID="cri-o://9e06c373c06904c837e3c6b4b8d26d2b2e34aa17db8079de76e1e33594d284b2" 
gracePeriod=30 Dec 08 09:11:48 crc kubenswrapper[4776]: I1208 09:11:48.686160 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerName="northd" containerID="cri-o://3cf78bd635e2922e1ade70a5fb24aebf3845f59d6aae829d8cba7c1c44da75ab" gracePeriod=30 Dec 08 09:11:48 crc kubenswrapper[4776]: I1208 09:11:48.745387 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerName="ovnkube-controller" containerID="cri-o://e8a5e6b5f6ff41d95ccfe47343f422e5875677f90ac071f2d68d251e92a234b0" gracePeriod=30 Dec 08 09:11:49 crc kubenswrapper[4776]: I1208 09:11:49.800990 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swbsc_1e518469-5b3b-4055-a0f0-075dc48b1c79/ovnkube-controller/3.log" Dec 08 09:11:49 crc kubenswrapper[4776]: I1208 09:11:49.803825 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swbsc_1e518469-5b3b-4055-a0f0-075dc48b1c79/ovn-acl-logging/0.log" Dec 08 09:11:49 crc kubenswrapper[4776]: I1208 09:11:49.804239 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swbsc_1e518469-5b3b-4055-a0f0-075dc48b1c79/ovn-controller/0.log" Dec 08 09:11:49 crc kubenswrapper[4776]: I1208 09:11:49.804579 4776 generic.go:334] "Generic (PLEG): container finished" podID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerID="e8a5e6b5f6ff41d95ccfe47343f422e5875677f90ac071f2d68d251e92a234b0" exitCode=0 Dec 08 09:11:49 crc kubenswrapper[4776]: I1208 09:11:49.804607 4776 generic.go:334] "Generic (PLEG): container finished" podID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerID="e8d4d85ae7a43d2800c11a9bfdc7f1c827bdb6628891b9a0c29140bf869307ca" exitCode=0 Dec 08 09:11:49 crc kubenswrapper[4776]: I1208 
09:11:49.804618 4776 generic.go:334] "Generic (PLEG): container finished" podID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerID="a95beb9eadee827797268eb99d707a8782239962485a6f258adea6f3ee994c8b" exitCode=0 Dec 08 09:11:49 crc kubenswrapper[4776]: I1208 09:11:49.804627 4776 generic.go:334] "Generic (PLEG): container finished" podID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerID="3cf78bd635e2922e1ade70a5fb24aebf3845f59d6aae829d8cba7c1c44da75ab" exitCode=0 Dec 08 09:11:49 crc kubenswrapper[4776]: I1208 09:11:49.804636 4776 generic.go:334] "Generic (PLEG): container finished" podID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerID="9afb59149e1f7b1462fc1ae949150d5c137905504ba2366db083abd13b4db6fc" exitCode=143 Dec 08 09:11:49 crc kubenswrapper[4776]: I1208 09:11:49.804646 4776 generic.go:334] "Generic (PLEG): container finished" podID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerID="9e06c373c06904c837e3c6b4b8d26d2b2e34aa17db8079de76e1e33594d284b2" exitCode=143 Dec 08 09:11:49 crc kubenswrapper[4776]: I1208 09:11:49.804703 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" event={"ID":"1e518469-5b3b-4055-a0f0-075dc48b1c79","Type":"ContainerDied","Data":"e8a5e6b5f6ff41d95ccfe47343f422e5875677f90ac071f2d68d251e92a234b0"} Dec 08 09:11:49 crc kubenswrapper[4776]: I1208 09:11:49.804740 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" event={"ID":"1e518469-5b3b-4055-a0f0-075dc48b1c79","Type":"ContainerDied","Data":"e8d4d85ae7a43d2800c11a9bfdc7f1c827bdb6628891b9a0c29140bf869307ca"} Dec 08 09:11:49 crc kubenswrapper[4776]: I1208 09:11:49.804754 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" event={"ID":"1e518469-5b3b-4055-a0f0-075dc48b1c79","Type":"ContainerDied","Data":"a95beb9eadee827797268eb99d707a8782239962485a6f258adea6f3ee994c8b"} Dec 08 09:11:49 crc kubenswrapper[4776]: I1208 09:11:49.804810 4776 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" event={"ID":"1e518469-5b3b-4055-a0f0-075dc48b1c79","Type":"ContainerDied","Data":"3cf78bd635e2922e1ade70a5fb24aebf3845f59d6aae829d8cba7c1c44da75ab"} Dec 08 09:11:49 crc kubenswrapper[4776]: I1208 09:11:49.804823 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" event={"ID":"1e518469-5b3b-4055-a0f0-075dc48b1c79","Type":"ContainerDied","Data":"9afb59149e1f7b1462fc1ae949150d5c137905504ba2366db083abd13b4db6fc"} Dec 08 09:11:49 crc kubenswrapper[4776]: I1208 09:11:49.804835 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" event={"ID":"1e518469-5b3b-4055-a0f0-075dc48b1c79","Type":"ContainerDied","Data":"9e06c373c06904c837e3c6b4b8d26d2b2e34aa17db8079de76e1e33594d284b2"} Dec 08 09:11:49 crc kubenswrapper[4776]: I1208 09:11:49.804853 4776 scope.go:117] "RemoveContainer" containerID="b728069c5c670cfef1888e64d211dfcfefb2de8c9ea9cf0a346c4538578b557e" Dec 08 09:11:49 crc kubenswrapper[4776]: I1208 09:11:49.811967 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-555j6_775b9e97-3ad5-4003-a2c2-fc8dd58b69cc/kube-multus/2.log" Dec 08 09:11:49 crc kubenswrapper[4776]: I1208 09:11:49.812430 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-555j6_775b9e97-3ad5-4003-a2c2-fc8dd58b69cc/kube-multus/1.log" Dec 08 09:11:49 crc kubenswrapper[4776]: I1208 09:11:49.812467 4776 generic.go:334] "Generic (PLEG): container finished" podID="775b9e97-3ad5-4003-a2c2-fc8dd58b69cc" containerID="69d2876b5cbb01bb020eec751d903bc19a2687f73ca0e18de2aaf643d15143d7" exitCode=2 Dec 08 09:11:49 crc kubenswrapper[4776]: I1208 09:11:49.812499 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-555j6" 
event={"ID":"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc","Type":"ContainerDied","Data":"69d2876b5cbb01bb020eec751d903bc19a2687f73ca0e18de2aaf643d15143d7"} Dec 08 09:11:49 crc kubenswrapper[4776]: I1208 09:11:49.812979 4776 scope.go:117] "RemoveContainer" containerID="69d2876b5cbb01bb020eec751d903bc19a2687f73ca0e18de2aaf643d15143d7" Dec 08 09:11:49 crc kubenswrapper[4776]: I1208 09:11:49.868469 4776 scope.go:117] "RemoveContainer" containerID="bf6eb111dbbc6dec73baadf4e88ff08f03050658b7682b28c960ecdb80973eae" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.065223 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swbsc_1e518469-5b3b-4055-a0f0-075dc48b1c79/ovn-acl-logging/0.log" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.065689 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swbsc_1e518469-5b3b-4055-a0f0-075dc48b1c79/ovn-controller/0.log" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.066238 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.146572 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-tgwm2"] Dec 08 09:11:50 crc kubenswrapper[4776]: E1208 09:11:50.146857 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerName="ovn-controller" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.146873 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerName="ovn-controller" Dec 08 09:11:50 crc kubenswrapper[4776]: E1208 09:11:50.146883 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerName="ovnkube-controller" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.146890 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerName="ovnkube-controller" Dec 08 09:11:50 crc kubenswrapper[4776]: E1208 09:11:50.146899 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerName="kube-rbac-proxy-ovn-metrics" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.146907 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerName="kube-rbac-proxy-ovn-metrics" Dec 08 09:11:50 crc kubenswrapper[4776]: E1208 09:11:50.146914 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b" containerName="extract" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.146921 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b" containerName="extract" Dec 08 09:11:50 crc kubenswrapper[4776]: E1208 09:11:50.146934 4776 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b" containerName="util" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.146940 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b" containerName="util" Dec 08 09:11:50 crc kubenswrapper[4776]: E1208 09:11:50.146950 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerName="kubecfg-setup" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.146956 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerName="kubecfg-setup" Dec 08 09:11:50 crc kubenswrapper[4776]: E1208 09:11:50.146963 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerName="northd" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.146969 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerName="northd" Dec 08 09:11:50 crc kubenswrapper[4776]: E1208 09:11:50.146977 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerName="nbdb" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.146983 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerName="nbdb" Dec 08 09:11:50 crc kubenswrapper[4776]: E1208 09:11:50.146991 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerName="ovnkube-controller" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.146997 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerName="ovnkube-controller" Dec 08 09:11:50 crc kubenswrapper[4776]: E1208 09:11:50.147004 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" 
containerName="ovn-acl-logging" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.147010 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerName="ovn-acl-logging" Dec 08 09:11:50 crc kubenswrapper[4776]: E1208 09:11:50.147024 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerName="kube-rbac-proxy-node" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.147029 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerName="kube-rbac-proxy-node" Dec 08 09:11:50 crc kubenswrapper[4776]: E1208 09:11:50.147039 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b" containerName="pull" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.147045 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b" containerName="pull" Dec 08 09:11:50 crc kubenswrapper[4776]: E1208 09:11:50.147053 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerName="sbdb" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.147059 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerName="sbdb" Dec 08 09:11:50 crc kubenswrapper[4776]: E1208 09:11:50.147074 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerName="ovnkube-controller" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.147081 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerName="ovnkube-controller" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.147219 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerName="kube-rbac-proxy-node" Dec 08 
09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.147230 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerName="kube-rbac-proxy-ovn-metrics" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.147237 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerName="ovnkube-controller" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.147247 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b" containerName="extract" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.147254 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerName="ovnkube-controller" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.147263 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerName="sbdb" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.147269 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerName="ovnkube-controller" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.147277 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerName="northd" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.147285 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerName="nbdb" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.147294 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerName="ovn-acl-logging" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.147300 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" 
containerName="ovn-controller" Dec 08 09:11:50 crc kubenswrapper[4776]: E1208 09:11:50.147403 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerName="ovnkube-controller" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.147410 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerName="ovnkube-controller" Dec 08 09:11:50 crc kubenswrapper[4776]: E1208 09:11:50.147420 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerName="ovnkube-controller" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.147426 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerName="ovnkube-controller" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.147536 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerName="ovnkube-controller" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.147718 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerName="ovnkube-controller" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.149334 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.213611 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1e518469-5b3b-4055-a0f0-075dc48b1c79-env-overrides\") pod \"1e518469-5b3b-4055-a0f0-075dc48b1c79\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.213654 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-systemd-units\") pod \"1e518469-5b3b-4055-a0f0-075dc48b1c79\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.213674 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-host-cni-netd\") pod \"1e518469-5b3b-4055-a0f0-075dc48b1c79\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.213688 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-node-log\") pod \"1e518469-5b3b-4055-a0f0-075dc48b1c79\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.213704 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-run-systemd\") pod \"1e518469-5b3b-4055-a0f0-075dc48b1c79\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.213720 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-etc-openvswitch\") pod \"1e518469-5b3b-4055-a0f0-075dc48b1c79\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.213743 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-run-ovn\") pod \"1e518469-5b3b-4055-a0f0-075dc48b1c79\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.213774 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-log-socket\") pod \"1e518469-5b3b-4055-a0f0-075dc48b1c79\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.213810 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1e518469-5b3b-4055-a0f0-075dc48b1c79-ovnkube-script-lib\") pod \"1e518469-5b3b-4055-a0f0-075dc48b1c79\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.213832 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-host-slash\") pod \"1e518469-5b3b-4055-a0f0-075dc48b1c79\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.213874 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xks4z\" (UniqueName: \"kubernetes.io/projected/1e518469-5b3b-4055-a0f0-075dc48b1c79-kube-api-access-xks4z\") pod \"1e518469-5b3b-4055-a0f0-075dc48b1c79\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 
09:11:50.213889 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1e518469-5b3b-4055-a0f0-075dc48b1c79-ovnkube-config\") pod \"1e518469-5b3b-4055-a0f0-075dc48b1c79\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.213910 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-host-run-netns\") pod \"1e518469-5b3b-4055-a0f0-075dc48b1c79\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.213935 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-host-kubelet\") pod \"1e518469-5b3b-4055-a0f0-075dc48b1c79\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.213963 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-run-openvswitch\") pod \"1e518469-5b3b-4055-a0f0-075dc48b1c79\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.213984 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-host-run-ovn-kubernetes\") pod \"1e518469-5b3b-4055-a0f0-075dc48b1c79\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.214013 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/1e518469-5b3b-4055-a0f0-075dc48b1c79-ovn-node-metrics-cert\") pod \"1e518469-5b3b-4055-a0f0-075dc48b1c79\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.214034 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-host-cni-bin\") pod \"1e518469-5b3b-4055-a0f0-075dc48b1c79\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.214056 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-var-lib-openvswitch\") pod \"1e518469-5b3b-4055-a0f0-075dc48b1c79\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.214075 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-host-var-lib-cni-networks-ovn-kubernetes\") pod \"1e518469-5b3b-4055-a0f0-075dc48b1c79\" (UID: \"1e518469-5b3b-4055-a0f0-075dc48b1c79\") " Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.214364 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "1e518469-5b3b-4055-a0f0-075dc48b1c79" (UID: "1e518469-5b3b-4055-a0f0-075dc48b1c79"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.214703 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e518469-5b3b-4055-a0f0-075dc48b1c79-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "1e518469-5b3b-4055-a0f0-075dc48b1c79" (UID: "1e518469-5b3b-4055-a0f0-075dc48b1c79"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.214736 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "1e518469-5b3b-4055-a0f0-075dc48b1c79" (UID: "1e518469-5b3b-4055-a0f0-075dc48b1c79"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.214777 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "1e518469-5b3b-4055-a0f0-075dc48b1c79" (UID: "1e518469-5b3b-4055-a0f0-075dc48b1c79"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.214794 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-node-log" (OuterVolumeSpecName: "node-log") pod "1e518469-5b3b-4055-a0f0-075dc48b1c79" (UID: "1e518469-5b3b-4055-a0f0-075dc48b1c79"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.215506 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "1e518469-5b3b-4055-a0f0-075dc48b1c79" (UID: "1e518469-5b3b-4055-a0f0-075dc48b1c79"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.215589 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-log-socket" (OuterVolumeSpecName: "log-socket") pod "1e518469-5b3b-4055-a0f0-075dc48b1c79" (UID: "1e518469-5b3b-4055-a0f0-075dc48b1c79"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.215706 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "1e518469-5b3b-4055-a0f0-075dc48b1c79" (UID: "1e518469-5b3b-4055-a0f0-075dc48b1c79"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.215624 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-host-slash" (OuterVolumeSpecName: "host-slash") pod "1e518469-5b3b-4055-a0f0-075dc48b1c79" (UID: "1e518469-5b3b-4055-a0f0-075dc48b1c79"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.215661 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "1e518469-5b3b-4055-a0f0-075dc48b1c79" (UID: "1e518469-5b3b-4055-a0f0-075dc48b1c79"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.215652 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "1e518469-5b3b-4055-a0f0-075dc48b1c79" (UID: "1e518469-5b3b-4055-a0f0-075dc48b1c79"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.215682 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "1e518469-5b3b-4055-a0f0-075dc48b1c79" (UID: "1e518469-5b3b-4055-a0f0-075dc48b1c79"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.215652 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "1e518469-5b3b-4055-a0f0-075dc48b1c79" (UID: "1e518469-5b3b-4055-a0f0-075dc48b1c79"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.215703 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "1e518469-5b3b-4055-a0f0-075dc48b1c79" (UID: "1e518469-5b3b-4055-a0f0-075dc48b1c79"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.215950 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e518469-5b3b-4055-a0f0-075dc48b1c79-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "1e518469-5b3b-4055-a0f0-075dc48b1c79" (UID: "1e518469-5b3b-4055-a0f0-075dc48b1c79"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.216083 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e518469-5b3b-4055-a0f0-075dc48b1c79-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "1e518469-5b3b-4055-a0f0-075dc48b1c79" (UID: "1e518469-5b3b-4055-a0f0-075dc48b1c79"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.216093 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "1e518469-5b3b-4055-a0f0-075dc48b1c79" (UID: "1e518469-5b3b-4055-a0f0-075dc48b1c79"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.221281 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e518469-5b3b-4055-a0f0-075dc48b1c79-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "1e518469-5b3b-4055-a0f0-075dc48b1c79" (UID: "1e518469-5b3b-4055-a0f0-075dc48b1c79"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.226502 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e518469-5b3b-4055-a0f0-075dc48b1c79-kube-api-access-xks4z" (OuterVolumeSpecName: "kube-api-access-xks4z") pod "1e518469-5b3b-4055-a0f0-075dc48b1c79" (UID: "1e518469-5b3b-4055-a0f0-075dc48b1c79"). InnerVolumeSpecName "kube-api-access-xks4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.229929 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "1e518469-5b3b-4055-a0f0-075dc48b1c79" (UID: "1e518469-5b3b-4055-a0f0-075dc48b1c79"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.315475 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-run-systemd\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.315536 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-ovnkube-config\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.315562 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.315591 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-host-run-ovn-kubernetes\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.315623 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-ovnkube-script-lib\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.315651 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-etc-openvswitch\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.315724 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-host-cni-bin\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.315781 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-log-socket\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.315828 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-systemd-units\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.315845 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-run-openvswitch\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.315860 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-var-lib-openvswitch\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.315901 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-ovn-node-metrics-cert\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.315939 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-run-ovn\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.315960 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-host-cni-netd\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.315974 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-node-log\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.315992 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-host-kubelet\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.316011 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-host-run-netns\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.316056 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlq9m\" (UniqueName: \"kubernetes.io/projected/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-kube-api-access-hlq9m\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.316072 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-env-overrides\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.316087 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-host-slash\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.316182 4776 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.316196 4776 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1e518469-5b3b-4055-a0f0-075dc48b1c79-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.316205 4776 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.316222 4776 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.316232 4776 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.316241 4776 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1e518469-5b3b-4055-a0f0-075dc48b1c79-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 
09:11:50.316250 4776 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.316258 4776 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.316267 4776 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-node-log\") on node \"crc\" DevicePath \"\"" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.316275 4776 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.316283 4776 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.316291 4776 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.316299 4776 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-log-socket\") on node \"crc\" DevicePath \"\"" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.316308 4776 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/1e518469-5b3b-4055-a0f0-075dc48b1c79-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.316316 4776 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-host-slash\") on node \"crc\" DevicePath \"\"" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.316324 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xks4z\" (UniqueName: \"kubernetes.io/projected/1e518469-5b3b-4055-a0f0-075dc48b1c79-kube-api-access-xks4z\") on node \"crc\" DevicePath \"\"" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.316333 4776 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1e518469-5b3b-4055-a0f0-075dc48b1c79-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.316341 4776 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.316349 4776 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.316360 4776 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e518469-5b3b-4055-a0f0-075dc48b1c79-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.337573 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wlz88" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.337893 4776 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wlz88" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.441576 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlq9m\" (UniqueName: \"kubernetes.io/projected/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-kube-api-access-hlq9m\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.441612 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-env-overrides\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.441632 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-host-slash\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.441657 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-run-systemd\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.441677 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-ovnkube-config\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.441693 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.441710 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-host-run-ovn-kubernetes\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.441731 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-ovnkube-script-lib\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.441748 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-etc-openvswitch\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.441768 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-host-cni-bin\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.441788 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-log-socket\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.441812 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-systemd-units\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.441827 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-run-openvswitch\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.441842 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-var-lib-openvswitch\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.441873 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-ovn-node-metrics-cert\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" Dec 08 09:11:50 
crc kubenswrapper[4776]: I1208 09:11:50.441896 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-run-ovn\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2"
Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.441913 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-host-cni-netd\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2"
Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.441926 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-node-log\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2"
Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.441944 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-host-kubelet\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2"
Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.441960 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-host-run-netns\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2"
Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.442027 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-host-run-netns\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2"
Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.442869 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-env-overrides\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2"
Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.442910 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-host-slash\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2"
Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.442931 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-run-systemd\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2"
Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.443258 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-run-openvswitch\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2"
Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.443332 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2"
Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.443345 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-ovnkube-config\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2"
Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.443362 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-host-run-ovn-kubernetes\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2"
Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.443380 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-var-lib-openvswitch\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2"
Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.443911 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-ovnkube-script-lib\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2"
Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.443956 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-etc-openvswitch\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2"
Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.443980 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-host-cni-bin\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2"
Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.444002 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-log-socket\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2"
Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.444032 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-systemd-units\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2"
Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.444058 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-host-cni-netd\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2"
Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.444081 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-run-ovn\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2"
Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.444104 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-node-log\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2"
Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.444123 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-host-kubelet\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2"
Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.446801 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-ovn-node-metrics-cert\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2"
Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.478385 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlq9m\" (UniqueName: \"kubernetes.io/projected/e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5-kube-api-access-hlq9m\") pod \"ovnkube-node-tgwm2\" (UID: \"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2"
Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.763212 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2"
Dec 08 09:11:50 crc kubenswrapper[4776]: W1208 09:11:50.785902 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7d51fc9_4de6_49d6_ad0a_dd91c6f914e5.slice/crio-a89323708f7a6b5f926f4cec28305d4a95368b72f362c89901d3094b824a5bb8 WatchSource:0}: Error finding container a89323708f7a6b5f926f4cec28305d4a95368b72f362c89901d3094b824a5bb8: Status 404 returned error can't find the container with id a89323708f7a6b5f926f4cec28305d4a95368b72f362c89901d3094b824a5bb8
Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.834003 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-555j6_775b9e97-3ad5-4003-a2c2-fc8dd58b69cc/kube-multus/2.log"
Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.834088 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-555j6" event={"ID":"775b9e97-3ad5-4003-a2c2-fc8dd58b69cc","Type":"ContainerStarted","Data":"cd32d37d6a6d0629c7714e7968af377203f360e1b7c81c61af06bcbf318da845"}
Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.837480 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" event={"ID":"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5","Type":"ContainerStarted","Data":"a89323708f7a6b5f926f4cec28305d4a95368b72f362c89901d3094b824a5bb8"}
Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.841897 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swbsc_1e518469-5b3b-4055-a0f0-075dc48b1c79/ovn-acl-logging/0.log"
Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.842422 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swbsc_1e518469-5b3b-4055-a0f0-075dc48b1c79/ovn-controller/0.log"
Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.842783 4776 generic.go:334] "Generic (PLEG): container finished" podID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerID="712accfff1a4a51efc4d3d2a90e3e2d6c47eb04f5cd3d72b800e0778f0963b55" exitCode=0
Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.842808 4776 generic.go:334] "Generic (PLEG): container finished" podID="1e518469-5b3b-4055-a0f0-075dc48b1c79" containerID="0f688c89d388f271d5313cac8f933d6e43da463b2d6da546151a2bbff7350f8b" exitCode=0
Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.842833 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" event={"ID":"1e518469-5b3b-4055-a0f0-075dc48b1c79","Type":"ContainerDied","Data":"712accfff1a4a51efc4d3d2a90e3e2d6c47eb04f5cd3d72b800e0778f0963b55"}
Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.842873 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" event={"ID":"1e518469-5b3b-4055-a0f0-075dc48b1c79","Type":"ContainerDied","Data":"0f688c89d388f271d5313cac8f933d6e43da463b2d6da546151a2bbff7350f8b"}
Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.842884 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc" event={"ID":"1e518469-5b3b-4055-a0f0-075dc48b1c79","Type":"ContainerDied","Data":"500f23bea6efe33ec35e970e14b4348a4da597e5b10327f258814e19eb122b2b"}
Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.842895 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-swbsc"
Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.842903 4776 scope.go:117] "RemoveContainer" containerID="e8a5e6b5f6ff41d95ccfe47343f422e5875677f90ac071f2d68d251e92a234b0"
Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.889556 4776 scope.go:117] "RemoveContainer" containerID="e8d4d85ae7a43d2800c11a9bfdc7f1c827bdb6628891b9a0c29140bf869307ca"
Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.901209 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-swbsc"]
Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.907706 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-swbsc"]
Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.923438 4776 scope.go:117] "RemoveContainer" containerID="a95beb9eadee827797268eb99d707a8782239962485a6f258adea6f3ee994c8b"
Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.939348 4776 scope.go:117] "RemoveContainer" containerID="3cf78bd635e2922e1ade70a5fb24aebf3845f59d6aae829d8cba7c1c44da75ab"
Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.960766 4776 scope.go:117] "RemoveContainer" containerID="712accfff1a4a51efc4d3d2a90e3e2d6c47eb04f5cd3d72b800e0778f0963b55"
Dec 08 09:11:50 crc kubenswrapper[4776]: I1208 09:11:50.978677 4776 scope.go:117] "RemoveContainer" containerID="0f688c89d388f271d5313cac8f933d6e43da463b2d6da546151a2bbff7350f8b"
Dec 08 09:11:51 crc kubenswrapper[4776]: I1208 09:11:51.025328 4776 scope.go:117] "RemoveContainer" containerID="9afb59149e1f7b1462fc1ae949150d5c137905504ba2366db083abd13b4db6fc"
Dec 08 09:11:51 crc kubenswrapper[4776]: I1208 09:11:51.043986 4776 scope.go:117] "RemoveContainer" containerID="9e06c373c06904c837e3c6b4b8d26d2b2e34aa17db8079de76e1e33594d284b2"
Dec 08 09:11:51 crc kubenswrapper[4776]: I1208 09:11:51.071310 4776 scope.go:117] "RemoveContainer" containerID="1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e"
Dec 08 09:11:51 crc kubenswrapper[4776]: I1208 09:11:51.097383 4776 scope.go:117] "RemoveContainer" containerID="e8a5e6b5f6ff41d95ccfe47343f422e5875677f90ac071f2d68d251e92a234b0"
Dec 08 09:11:51 crc kubenswrapper[4776]: E1208 09:11:51.098515 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8a5e6b5f6ff41d95ccfe47343f422e5875677f90ac071f2d68d251e92a234b0\": container with ID starting with e8a5e6b5f6ff41d95ccfe47343f422e5875677f90ac071f2d68d251e92a234b0 not found: ID does not exist" containerID="e8a5e6b5f6ff41d95ccfe47343f422e5875677f90ac071f2d68d251e92a234b0"
Dec 08 09:11:51 crc kubenswrapper[4776]: I1208 09:11:51.098553 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8a5e6b5f6ff41d95ccfe47343f422e5875677f90ac071f2d68d251e92a234b0"} err="failed to get container status \"e8a5e6b5f6ff41d95ccfe47343f422e5875677f90ac071f2d68d251e92a234b0\": rpc error: code = NotFound desc = could not find container \"e8a5e6b5f6ff41d95ccfe47343f422e5875677f90ac071f2d68d251e92a234b0\": container with ID starting with e8a5e6b5f6ff41d95ccfe47343f422e5875677f90ac071f2d68d251e92a234b0 not found: ID does not exist"
Dec 08 09:11:51 crc kubenswrapper[4776]: I1208 09:11:51.098596 4776 scope.go:117] "RemoveContainer" containerID="e8d4d85ae7a43d2800c11a9bfdc7f1c827bdb6628891b9a0c29140bf869307ca"
Dec 08 09:11:51 crc kubenswrapper[4776]: E1208 09:11:51.099022 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8d4d85ae7a43d2800c11a9bfdc7f1c827bdb6628891b9a0c29140bf869307ca\": container with ID starting with e8d4d85ae7a43d2800c11a9bfdc7f1c827bdb6628891b9a0c29140bf869307ca not found: ID does not exist" containerID="e8d4d85ae7a43d2800c11a9bfdc7f1c827bdb6628891b9a0c29140bf869307ca"
Dec 08 09:11:51 crc kubenswrapper[4776]: I1208 09:11:51.099105 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8d4d85ae7a43d2800c11a9bfdc7f1c827bdb6628891b9a0c29140bf869307ca"} err="failed to get container status \"e8d4d85ae7a43d2800c11a9bfdc7f1c827bdb6628891b9a0c29140bf869307ca\": rpc error: code = NotFound desc = could not find container \"e8d4d85ae7a43d2800c11a9bfdc7f1c827bdb6628891b9a0c29140bf869307ca\": container with ID starting with e8d4d85ae7a43d2800c11a9bfdc7f1c827bdb6628891b9a0c29140bf869307ca not found: ID does not exist"
Dec 08 09:11:51 crc kubenswrapper[4776]: I1208 09:11:51.099189 4776 scope.go:117] "RemoveContainer" containerID="a95beb9eadee827797268eb99d707a8782239962485a6f258adea6f3ee994c8b"
Dec 08 09:11:51 crc kubenswrapper[4776]: E1208 09:11:51.099654 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a95beb9eadee827797268eb99d707a8782239962485a6f258adea6f3ee994c8b\": container with ID starting with a95beb9eadee827797268eb99d707a8782239962485a6f258adea6f3ee994c8b not found: ID does not exist" containerID="a95beb9eadee827797268eb99d707a8782239962485a6f258adea6f3ee994c8b"
Dec 08 09:11:51 crc kubenswrapper[4776]: I1208 09:11:51.099695 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a95beb9eadee827797268eb99d707a8782239962485a6f258adea6f3ee994c8b"} err="failed to get container status \"a95beb9eadee827797268eb99d707a8782239962485a6f258adea6f3ee994c8b\": rpc error: code = NotFound desc = could not find container \"a95beb9eadee827797268eb99d707a8782239962485a6f258adea6f3ee994c8b\": container with ID starting with a95beb9eadee827797268eb99d707a8782239962485a6f258adea6f3ee994c8b not found: ID does not exist"
Dec 08 09:11:51 crc kubenswrapper[4776]: I1208 09:11:51.099709 4776 scope.go:117] "RemoveContainer" containerID="3cf78bd635e2922e1ade70a5fb24aebf3845f59d6aae829d8cba7c1c44da75ab"
Dec 08 09:11:51 crc kubenswrapper[4776]: E1208 09:11:51.099961 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cf78bd635e2922e1ade70a5fb24aebf3845f59d6aae829d8cba7c1c44da75ab\": container with ID starting with 3cf78bd635e2922e1ade70a5fb24aebf3845f59d6aae829d8cba7c1c44da75ab not found: ID does not exist" containerID="3cf78bd635e2922e1ade70a5fb24aebf3845f59d6aae829d8cba7c1c44da75ab"
Dec 08 09:11:51 crc kubenswrapper[4776]: I1208 09:11:51.100000 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cf78bd635e2922e1ade70a5fb24aebf3845f59d6aae829d8cba7c1c44da75ab"} err="failed to get container status \"3cf78bd635e2922e1ade70a5fb24aebf3845f59d6aae829d8cba7c1c44da75ab\": rpc error: code = NotFound desc = could not find container \"3cf78bd635e2922e1ade70a5fb24aebf3845f59d6aae829d8cba7c1c44da75ab\": container with ID starting with 3cf78bd635e2922e1ade70a5fb24aebf3845f59d6aae829d8cba7c1c44da75ab not found: ID does not exist"
Dec 08 09:11:51 crc kubenswrapper[4776]: I1208 09:11:51.100027 4776 scope.go:117] "RemoveContainer" containerID="712accfff1a4a51efc4d3d2a90e3e2d6c47eb04f5cd3d72b800e0778f0963b55"
Dec 08 09:11:51 crc kubenswrapper[4776]: E1208 09:11:51.100687 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"712accfff1a4a51efc4d3d2a90e3e2d6c47eb04f5cd3d72b800e0778f0963b55\": container with ID starting with 712accfff1a4a51efc4d3d2a90e3e2d6c47eb04f5cd3d72b800e0778f0963b55 not found: ID does not exist" containerID="712accfff1a4a51efc4d3d2a90e3e2d6c47eb04f5cd3d72b800e0778f0963b55"
Dec 08 09:11:51 crc kubenswrapper[4776]: I1208 09:11:51.100727 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"712accfff1a4a51efc4d3d2a90e3e2d6c47eb04f5cd3d72b800e0778f0963b55"} err="failed to get container status \"712accfff1a4a51efc4d3d2a90e3e2d6c47eb04f5cd3d72b800e0778f0963b55\": rpc error: code = NotFound desc = could not find container \"712accfff1a4a51efc4d3d2a90e3e2d6c47eb04f5cd3d72b800e0778f0963b55\": container with ID starting with 712accfff1a4a51efc4d3d2a90e3e2d6c47eb04f5cd3d72b800e0778f0963b55 not found: ID does not exist"
Dec 08 09:11:51 crc kubenswrapper[4776]: I1208 09:11:51.100747 4776 scope.go:117] "RemoveContainer" containerID="0f688c89d388f271d5313cac8f933d6e43da463b2d6da546151a2bbff7350f8b"
Dec 08 09:11:51 crc kubenswrapper[4776]: E1208 09:11:51.100984 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f688c89d388f271d5313cac8f933d6e43da463b2d6da546151a2bbff7350f8b\": container with ID starting with 0f688c89d388f271d5313cac8f933d6e43da463b2d6da546151a2bbff7350f8b not found: ID does not exist" containerID="0f688c89d388f271d5313cac8f933d6e43da463b2d6da546151a2bbff7350f8b"
Dec 08 09:11:51 crc kubenswrapper[4776]: I1208 09:11:51.101056 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f688c89d388f271d5313cac8f933d6e43da463b2d6da546151a2bbff7350f8b"} err="failed to get container status \"0f688c89d388f271d5313cac8f933d6e43da463b2d6da546151a2bbff7350f8b\": rpc error: code = NotFound desc = could not find container \"0f688c89d388f271d5313cac8f933d6e43da463b2d6da546151a2bbff7350f8b\": container with ID starting with 0f688c89d388f271d5313cac8f933d6e43da463b2d6da546151a2bbff7350f8b not found: ID does not exist"
Dec 08 09:11:51 crc kubenswrapper[4776]: I1208 09:11:51.101122 4776 scope.go:117] "RemoveContainer" containerID="9afb59149e1f7b1462fc1ae949150d5c137905504ba2366db083abd13b4db6fc"
Dec 08 09:11:51 crc kubenswrapper[4776]: E1208 09:11:51.101418 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9afb59149e1f7b1462fc1ae949150d5c137905504ba2366db083abd13b4db6fc\": container with ID starting with 9afb59149e1f7b1462fc1ae949150d5c137905504ba2366db083abd13b4db6fc not found: ID does not exist" containerID="9afb59149e1f7b1462fc1ae949150d5c137905504ba2366db083abd13b4db6fc"
Dec 08 09:11:51 crc kubenswrapper[4776]: I1208 09:11:51.101494 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9afb59149e1f7b1462fc1ae949150d5c137905504ba2366db083abd13b4db6fc"} err="failed to get container status \"9afb59149e1f7b1462fc1ae949150d5c137905504ba2366db083abd13b4db6fc\": rpc error: code = NotFound desc = could not find container \"9afb59149e1f7b1462fc1ae949150d5c137905504ba2366db083abd13b4db6fc\": container with ID starting with 9afb59149e1f7b1462fc1ae949150d5c137905504ba2366db083abd13b4db6fc not found: ID does not exist"
Dec 08 09:11:51 crc kubenswrapper[4776]: I1208 09:11:51.101554 4776 scope.go:117] "RemoveContainer" containerID="9e06c373c06904c837e3c6b4b8d26d2b2e34aa17db8079de76e1e33594d284b2"
Dec 08 09:11:51 crc kubenswrapper[4776]: E1208 09:11:51.101978 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e06c373c06904c837e3c6b4b8d26d2b2e34aa17db8079de76e1e33594d284b2\": container with ID starting with 9e06c373c06904c837e3c6b4b8d26d2b2e34aa17db8079de76e1e33594d284b2 not found: ID does not exist" containerID="9e06c373c06904c837e3c6b4b8d26d2b2e34aa17db8079de76e1e33594d284b2"
Dec 08 09:11:51 crc kubenswrapper[4776]: I1208 09:11:51.102061 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e06c373c06904c837e3c6b4b8d26d2b2e34aa17db8079de76e1e33594d284b2"} err="failed to get container status \"9e06c373c06904c837e3c6b4b8d26d2b2e34aa17db8079de76e1e33594d284b2\": rpc error: code = NotFound desc = could not find container \"9e06c373c06904c837e3c6b4b8d26d2b2e34aa17db8079de76e1e33594d284b2\": container with ID starting with 9e06c373c06904c837e3c6b4b8d26d2b2e34aa17db8079de76e1e33594d284b2 not found: ID does not exist"
Dec 08 09:11:51 crc kubenswrapper[4776]: I1208 09:11:51.102123 4776 scope.go:117] "RemoveContainer" containerID="1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e"
Dec 08 09:11:51 crc kubenswrapper[4776]: E1208 09:11:51.102552 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\": container with ID starting with 1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e not found: ID does not exist" containerID="1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e"
Dec 08 09:11:51 crc kubenswrapper[4776]: I1208 09:11:51.102583 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e"} err="failed to get container status \"1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\": rpc error: code = NotFound desc = could not find container \"1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\": container with ID starting with 1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e not found: ID does not exist"
Dec 08 09:11:51 crc kubenswrapper[4776]: I1208 09:11:51.102598 4776 scope.go:117] "RemoveContainer" containerID="e8a5e6b5f6ff41d95ccfe47343f422e5875677f90ac071f2d68d251e92a234b0"
Dec 08 09:11:51 crc kubenswrapper[4776]: I1208 09:11:51.102883 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8a5e6b5f6ff41d95ccfe47343f422e5875677f90ac071f2d68d251e92a234b0"} err="failed to get container status \"e8a5e6b5f6ff41d95ccfe47343f422e5875677f90ac071f2d68d251e92a234b0\": rpc error: code = NotFound desc = could not find container \"e8a5e6b5f6ff41d95ccfe47343f422e5875677f90ac071f2d68d251e92a234b0\": container with ID starting with e8a5e6b5f6ff41d95ccfe47343f422e5875677f90ac071f2d68d251e92a234b0 not found: ID does not exist"
Dec 08 09:11:51 crc kubenswrapper[4776]: I1208 09:11:51.102901 4776 scope.go:117] "RemoveContainer" containerID="e8d4d85ae7a43d2800c11a9bfdc7f1c827bdb6628891b9a0c29140bf869307ca"
Dec 08 09:11:51 crc kubenswrapper[4776]: I1208 09:11:51.103145 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8d4d85ae7a43d2800c11a9bfdc7f1c827bdb6628891b9a0c29140bf869307ca"} err="failed to get container status \"e8d4d85ae7a43d2800c11a9bfdc7f1c827bdb6628891b9a0c29140bf869307ca\": rpc error: code = NotFound desc = could not find container \"e8d4d85ae7a43d2800c11a9bfdc7f1c827bdb6628891b9a0c29140bf869307ca\": container with ID starting with e8d4d85ae7a43d2800c11a9bfdc7f1c827bdb6628891b9a0c29140bf869307ca not found: ID does not exist"
Dec 08 09:11:51 crc kubenswrapper[4776]: I1208 09:11:51.103246 4776 scope.go:117] "RemoveContainer" containerID="a95beb9eadee827797268eb99d707a8782239962485a6f258adea6f3ee994c8b"
Dec 08 09:11:51 crc kubenswrapper[4776]: I1208 09:11:51.106179 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a95beb9eadee827797268eb99d707a8782239962485a6f258adea6f3ee994c8b"} err="failed to get container status \"a95beb9eadee827797268eb99d707a8782239962485a6f258adea6f3ee994c8b\": rpc error: code = NotFound desc = could not find container \"a95beb9eadee827797268eb99d707a8782239962485a6f258adea6f3ee994c8b\": container with ID starting with a95beb9eadee827797268eb99d707a8782239962485a6f258adea6f3ee994c8b not found: ID does not exist"
Dec 08 09:11:51 crc kubenswrapper[4776]: I1208 09:11:51.106273 4776 scope.go:117] "RemoveContainer" containerID="3cf78bd635e2922e1ade70a5fb24aebf3845f59d6aae829d8cba7c1c44da75ab"
Dec 08 09:11:51 crc kubenswrapper[4776]: I1208 09:11:51.106585 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cf78bd635e2922e1ade70a5fb24aebf3845f59d6aae829d8cba7c1c44da75ab"} err="failed to get container status \"3cf78bd635e2922e1ade70a5fb24aebf3845f59d6aae829d8cba7c1c44da75ab\": rpc error: code = NotFound desc = could not find container \"3cf78bd635e2922e1ade70a5fb24aebf3845f59d6aae829d8cba7c1c44da75ab\": container with ID starting with 3cf78bd635e2922e1ade70a5fb24aebf3845f59d6aae829d8cba7c1c44da75ab not found: ID does not exist"
Dec 08 09:11:51 crc kubenswrapper[4776]: I1208 09:11:51.106607 4776 scope.go:117] "RemoveContainer" containerID="712accfff1a4a51efc4d3d2a90e3e2d6c47eb04f5cd3d72b800e0778f0963b55"
Dec 08 09:11:51 crc kubenswrapper[4776]: I1208 09:11:51.106926 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"712accfff1a4a51efc4d3d2a90e3e2d6c47eb04f5cd3d72b800e0778f0963b55"} err="failed to get container status \"712accfff1a4a51efc4d3d2a90e3e2d6c47eb04f5cd3d72b800e0778f0963b55\": rpc error: code = NotFound desc = could not find container \"712accfff1a4a51efc4d3d2a90e3e2d6c47eb04f5cd3d72b800e0778f0963b55\": container with ID starting with 712accfff1a4a51efc4d3d2a90e3e2d6c47eb04f5cd3d72b800e0778f0963b55 not found: ID does not exist"
Dec 08 09:11:51 crc kubenswrapper[4776]: I1208 09:11:51.106962 4776 scope.go:117] "RemoveContainer" containerID="0f688c89d388f271d5313cac8f933d6e43da463b2d6da546151a2bbff7350f8b"
Dec 08 09:11:51 crc kubenswrapper[4776]: I1208 09:11:51.107306 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f688c89d388f271d5313cac8f933d6e43da463b2d6da546151a2bbff7350f8b"} err="failed to get container status \"0f688c89d388f271d5313cac8f933d6e43da463b2d6da546151a2bbff7350f8b\": rpc error: code = NotFound desc = could not find container \"0f688c89d388f271d5313cac8f933d6e43da463b2d6da546151a2bbff7350f8b\": container with ID starting with 0f688c89d388f271d5313cac8f933d6e43da463b2d6da546151a2bbff7350f8b not found: ID does not exist"
Dec 08 09:11:51 crc kubenswrapper[4776]: I1208 09:11:51.107326 4776 scope.go:117] "RemoveContainer" containerID="9afb59149e1f7b1462fc1ae949150d5c137905504ba2366db083abd13b4db6fc"
Dec 08 09:11:51 crc kubenswrapper[4776]: I1208 09:11:51.107599 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9afb59149e1f7b1462fc1ae949150d5c137905504ba2366db083abd13b4db6fc"} err="failed to get container status \"9afb59149e1f7b1462fc1ae949150d5c137905504ba2366db083abd13b4db6fc\": rpc error: code = NotFound desc = could not find container \"9afb59149e1f7b1462fc1ae949150d5c137905504ba2366db083abd13b4db6fc\": container with ID starting with 9afb59149e1f7b1462fc1ae949150d5c137905504ba2366db083abd13b4db6fc not found: ID does not exist"
Dec 08 09:11:51 crc kubenswrapper[4776]: I1208 09:11:51.107666 4776 scope.go:117] "RemoveContainer" containerID="9e06c373c06904c837e3c6b4b8d26d2b2e34aa17db8079de76e1e33594d284b2"
Dec 08 09:11:51 crc kubenswrapper[4776]: I1208 09:11:51.107975 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e06c373c06904c837e3c6b4b8d26d2b2e34aa17db8079de76e1e33594d284b2"} err="failed to get container status \"9e06c373c06904c837e3c6b4b8d26d2b2e34aa17db8079de76e1e33594d284b2\": rpc error: code = NotFound desc = could not find container \"9e06c373c06904c837e3c6b4b8d26d2b2e34aa17db8079de76e1e33594d284b2\": container with ID starting with 9e06c373c06904c837e3c6b4b8d26d2b2e34aa17db8079de76e1e33594d284b2 not found: ID does not exist"
Dec 08 09:11:51 crc kubenswrapper[4776]: I1208 09:11:51.107998 4776 scope.go:117] "RemoveContainer" containerID="1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e"
Dec 08 09:11:51 crc kubenswrapper[4776]: I1208 09:11:51.108282 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e"} err="failed to get container status \"1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\": rpc error: code = NotFound desc = could not find container \"1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e\": container with ID starting with 1712096eaf2c22101bd0b8235bc2caf40e2f3ab6defba292b3c82fafd028387e not found: ID does not exist"
Dec 08 09:11:51 crc kubenswrapper[4776]: I1208 09:11:51.432359 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wlz88" podUID="57476415-2f48-4b7d-824d-61fd5702c5d6" containerName="registry-server" probeResult="failure" output=<
Dec 08 09:11:51 crc kubenswrapper[4776]: timeout: failed to connect service ":50051" within 1s
Dec 08 09:11:51 crc kubenswrapper[4776]: >
Dec 08 09:11:51 crc kubenswrapper[4776]: I1208 09:11:51.854642 4776 generic.go:334] "Generic (PLEG): container finished" podID="e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5" containerID="f936bdf38dec3696b45d3f993ebf07a5d183e1500830542859dc4daad69a89c3" exitCode=0
Dec 08 09:11:51 crc kubenswrapper[4776]: I1208 09:11:51.854679 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" event={"ID":"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5","Type":"ContainerDied","Data":"f936bdf38dec3696b45d3f993ebf07a5d183e1500830542859dc4daad69a89c3"}
Dec 08 09:11:52 crc kubenswrapper[4776]: I1208 09:11:52.350877 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e518469-5b3b-4055-a0f0-075dc48b1c79" path="/var/lib/kubelet/pods/1e518469-5b3b-4055-a0f0-075dc48b1c79/volumes"
Dec 08 09:11:52 crc kubenswrapper[4776]: I1208 09:11:52.862836 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" event={"ID":"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5","Type":"ContainerStarted","Data":"ae6f84535d961366ff3ce9b577f459a5c906f789a26dee55609428d30e99e67c"}
Dec 08 09:11:52 crc kubenswrapper[4776]: I1208 09:11:52.862881 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" event={"ID":"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5","Type":"ContainerStarted","Data":"77914d50c23fc13a4126e350e69de5fc62819cbecf4067d6bb253168e8d10c00"}
Dec 08 09:11:52 crc kubenswrapper[4776]: I1208 09:11:52.862893 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" event={"ID":"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5","Type":"ContainerStarted","Data":"63214bdf8e68df0d70a02907028060e49f2bab3ce800e417f0e3855310096804"}
Dec 08 09:11:52 crc kubenswrapper[4776]: I1208 09:11:52.862904 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" event={"ID":"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5","Type":"ContainerStarted","Data":"8838b7cf87f4fd9fd1a32b0f375e1d100d0d4543b0550e5ca9521f8c0ddc3077"}
Dec 08 09:11:52 crc kubenswrapper[4776]: I1208 09:11:52.862913 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" event={"ID":"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5","Type":"ContainerStarted","Data":"b24a300c13cfd1507466fd9fd13c8cbef757f49a22fa085b6f242815c0a765a2"}
Dec 08 09:11:53 crc kubenswrapper[4776]: I1208 09:11:53.874353 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" event={"ID":"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5","Type":"ContainerStarted","Data":"e4eb62c6dc5de947dbe95ded969c03fc612c926880be0a8bca577033816e9d8d"}
Dec 08 09:11:55 crc kubenswrapper[4776]: I1208 09:11:55.321724 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-4r72v"]
Dec 08 09:11:55 crc kubenswrapper[4776]: I1208 09:11:55.322915 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-4r72v"
Dec 08 09:11:55 crc kubenswrapper[4776]: I1208 09:11:55.324556 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt"
Dec 08 09:11:55 crc kubenswrapper[4776]: I1208 09:11:55.324575 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt"
Dec 08 09:11:55 crc kubenswrapper[4776]: I1208 09:11:55.337727 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-nls4q"
Dec 08 09:11:55 crc kubenswrapper[4776]: I1208 09:11:55.392851 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vn4r\" (UniqueName: \"kubernetes.io/projected/3ddae09e-bcfe-4e98-bdd1-9ac94218a6d8-kube-api-access-8vn4r\") pod \"obo-prometheus-operator-668cf9dfbb-4r72v\" (UID: \"3ddae09e-bcfe-4e98-bdd1-9ac94218a6d8\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-4r72v"
Dec 08 09:11:55 crc kubenswrapper[4776]: I1208 09:11:55.445160 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7994656576-d6jvv"]
Dec 08 09:11:55 crc kubenswrapper[4776]: I1208 09:11:55.445821 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7994656576-d6jvv"
Dec 08 09:11:55 crc kubenswrapper[4776]: I1208 09:11:55.447473 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert"
Dec 08 09:11:55 crc kubenswrapper[4776]: I1208 09:11:55.447632 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-f5blj"
Dec 08 09:11:55 crc kubenswrapper[4776]: I1208 09:11:55.457393 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7994656576-gzq75"]
Dec 08 09:11:55 crc kubenswrapper[4776]: I1208 09:11:55.461431 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7994656576-gzq75"
Dec 08 09:11:55 crc kubenswrapper[4776]: I1208 09:11:55.494088 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ff9db296-6f02-44bf-810c-48cfb090036e-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7994656576-d6jvv\" (UID: \"ff9db296-6f02-44bf-810c-48cfb090036e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7994656576-d6jvv"
Dec 08 09:11:55 crc kubenswrapper[4776]: I1208 09:11:55.494360 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/968acbdd-ab1d-4aa4-9db9-654170c5fa2d-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7994656576-gzq75\" (UID: \"968acbdd-ab1d-4aa4-9db9-654170c5fa2d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7994656576-gzq75"
Dec 08 09:11:55 crc kubenswrapper[4776]: I1208 09:11:55.494543 4776 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"kube-api-access-8vn4r\" (UniqueName: \"kubernetes.io/projected/3ddae09e-bcfe-4e98-bdd1-9ac94218a6d8-kube-api-access-8vn4r\") pod \"obo-prometheus-operator-668cf9dfbb-4r72v\" (UID: \"3ddae09e-bcfe-4e98-bdd1-9ac94218a6d8\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-4r72v" Dec 08 09:11:55 crc kubenswrapper[4776]: I1208 09:11:55.494699 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ff9db296-6f02-44bf-810c-48cfb090036e-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7994656576-d6jvv\" (UID: \"ff9db296-6f02-44bf-810c-48cfb090036e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7994656576-d6jvv" Dec 08 09:11:55 crc kubenswrapper[4776]: I1208 09:11:55.494847 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/968acbdd-ab1d-4aa4-9db9-654170c5fa2d-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7994656576-gzq75\" (UID: \"968acbdd-ab1d-4aa4-9db9-654170c5fa2d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7994656576-gzq75" Dec 08 09:11:55 crc kubenswrapper[4776]: I1208 09:11:55.517016 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vn4r\" (UniqueName: \"kubernetes.io/projected/3ddae09e-bcfe-4e98-bdd1-9ac94218a6d8-kube-api-access-8vn4r\") pod \"obo-prometheus-operator-668cf9dfbb-4r72v\" (UID: \"3ddae09e-bcfe-4e98-bdd1-9ac94218a6d8\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-4r72v" Dec 08 09:11:55 crc kubenswrapper[4776]: I1208 09:11:55.596740 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ff9db296-6f02-44bf-810c-48cfb090036e-webhook-cert\") pod 
\"obo-prometheus-operator-admission-webhook-7994656576-d6jvv\" (UID: \"ff9db296-6f02-44bf-810c-48cfb090036e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7994656576-d6jvv" Dec 08 09:11:55 crc kubenswrapper[4776]: I1208 09:11:55.596809 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/968acbdd-ab1d-4aa4-9db9-654170c5fa2d-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7994656576-gzq75\" (UID: \"968acbdd-ab1d-4aa4-9db9-654170c5fa2d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7994656576-gzq75" Dec 08 09:11:55 crc kubenswrapper[4776]: I1208 09:11:55.596861 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ff9db296-6f02-44bf-810c-48cfb090036e-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7994656576-d6jvv\" (UID: \"ff9db296-6f02-44bf-810c-48cfb090036e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7994656576-d6jvv" Dec 08 09:11:55 crc kubenswrapper[4776]: I1208 09:11:55.596905 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/968acbdd-ab1d-4aa4-9db9-654170c5fa2d-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7994656576-gzq75\" (UID: \"968acbdd-ab1d-4aa4-9db9-654170c5fa2d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7994656576-gzq75" Dec 08 09:11:55 crc kubenswrapper[4776]: I1208 09:11:55.600969 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ff9db296-6f02-44bf-810c-48cfb090036e-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7994656576-d6jvv\" (UID: \"ff9db296-6f02-44bf-810c-48cfb090036e\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-7994656576-d6jvv" Dec 08 09:11:55 crc kubenswrapper[4776]: I1208 09:11:55.601296 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ff9db296-6f02-44bf-810c-48cfb090036e-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7994656576-d6jvv\" (UID: \"ff9db296-6f02-44bf-810c-48cfb090036e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7994656576-d6jvv" Dec 08 09:11:55 crc kubenswrapper[4776]: I1208 09:11:55.601439 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/968acbdd-ab1d-4aa4-9db9-654170c5fa2d-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7994656576-gzq75\" (UID: \"968acbdd-ab1d-4aa4-9db9-654170c5fa2d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7994656576-gzq75" Dec 08 09:11:55 crc kubenswrapper[4776]: I1208 09:11:55.601809 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/968acbdd-ab1d-4aa4-9db9-654170c5fa2d-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7994656576-gzq75\" (UID: \"968acbdd-ab1d-4aa4-9db9-654170c5fa2d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7994656576-gzq75" Dec 08 09:11:55 crc kubenswrapper[4776]: I1208 09:11:55.637687 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-4r72v" Dec 08 09:11:55 crc kubenswrapper[4776]: E1208 09:11:55.666404 4776 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-4r72v_openshift-operators_3ddae09e-bcfe-4e98-bdd1-9ac94218a6d8_0(07e762f67b784c9843712af3a769fb95c05d7066d86adf3f0e04d2ea6b5f3509): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 08 09:11:55 crc kubenswrapper[4776]: E1208 09:11:55.666471 4776 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-4r72v_openshift-operators_3ddae09e-bcfe-4e98-bdd1-9ac94218a6d8_0(07e762f67b784c9843712af3a769fb95c05d7066d86adf3f0e04d2ea6b5f3509): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-4r72v" Dec 08 09:11:55 crc kubenswrapper[4776]: E1208 09:11:55.666499 4776 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-4r72v_openshift-operators_3ddae09e-bcfe-4e98-bdd1-9ac94218a6d8_0(07e762f67b784c9843712af3a769fb95c05d7066d86adf3f0e04d2ea6b5f3509): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-4r72v" Dec 08 09:11:55 crc kubenswrapper[4776]: E1208 09:11:55.666553 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-668cf9dfbb-4r72v_openshift-operators(3ddae09e-bcfe-4e98-bdd1-9ac94218a6d8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-668cf9dfbb-4r72v_openshift-operators(3ddae09e-bcfe-4e98-bdd1-9ac94218a6d8)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-4r72v_openshift-operators_3ddae09e-bcfe-4e98-bdd1-9ac94218a6d8_0(07e762f67b784c9843712af3a769fb95c05d7066d86adf3f0e04d2ea6b5f3509): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-4r72v" podUID="3ddae09e-bcfe-4e98-bdd1-9ac94218a6d8" Dec 08 09:11:55 crc kubenswrapper[4776]: I1208 09:11:55.770880 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7994656576-d6jvv" Dec 08 09:11:55 crc kubenswrapper[4776]: I1208 09:11:55.786430 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7994656576-gzq75" Dec 08 09:11:55 crc kubenswrapper[4776]: E1208 09:11:55.798266 4776 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7994656576-d6jvv_openshift-operators_ff9db296-6f02-44bf-810c-48cfb090036e_0(502bbe5befb20fee0095933b639e681512e10d1e7a92f69700af569f550b19d5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Dec 08 09:11:55 crc kubenswrapper[4776]: E1208 09:11:55.798331 4776 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7994656576-d6jvv_openshift-operators_ff9db296-6f02-44bf-810c-48cfb090036e_0(502bbe5befb20fee0095933b639e681512e10d1e7a92f69700af569f550b19d5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7994656576-d6jvv" Dec 08 09:11:55 crc kubenswrapper[4776]: E1208 09:11:55.798353 4776 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7994656576-d6jvv_openshift-operators_ff9db296-6f02-44bf-810c-48cfb090036e_0(502bbe5befb20fee0095933b639e681512e10d1e7a92f69700af569f550b19d5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7994656576-d6jvv" Dec 08 09:11:55 crc kubenswrapper[4776]: E1208 09:11:55.798396 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-7994656576-d6jvv_openshift-operators(ff9db296-6f02-44bf-810c-48cfb090036e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-7994656576-d6jvv_openshift-operators(ff9db296-6f02-44bf-810c-48cfb090036e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7994656576-d6jvv_openshift-operators_ff9db296-6f02-44bf-810c-48cfb090036e_0(502bbe5befb20fee0095933b639e681512e10d1e7a92f69700af569f550b19d5): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7994656576-d6jvv" podUID="ff9db296-6f02-44bf-810c-48cfb090036e" Dec 08 09:11:55 crc kubenswrapper[4776]: E1208 09:11:55.810069 4776 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7994656576-gzq75_openshift-operators_968acbdd-ab1d-4aa4-9db9-654170c5fa2d_0(91dd70ab7c72df1f7b9268f8849a68c78ff81c81105148ebe8f6d8ed38018246): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 08 09:11:55 crc kubenswrapper[4776]: E1208 09:11:55.810148 4776 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7994656576-gzq75_openshift-operators_968acbdd-ab1d-4aa4-9db9-654170c5fa2d_0(91dd70ab7c72df1f7b9268f8849a68c78ff81c81105148ebe8f6d8ed38018246): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7994656576-gzq75" Dec 08 09:11:55 crc kubenswrapper[4776]: E1208 09:11:55.810196 4776 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7994656576-gzq75_openshift-operators_968acbdd-ab1d-4aa4-9db9-654170c5fa2d_0(91dd70ab7c72df1f7b9268f8849a68c78ff81c81105148ebe8f6d8ed38018246): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-7994656576-gzq75" Dec 08 09:11:55 crc kubenswrapper[4776]: E1208 09:11:55.810259 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-7994656576-gzq75_openshift-operators(968acbdd-ab1d-4aa4-9db9-654170c5fa2d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-7994656576-gzq75_openshift-operators(968acbdd-ab1d-4aa4-9db9-654170c5fa2d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7994656576-gzq75_openshift-operators_968acbdd-ab1d-4aa4-9db9-654170c5fa2d_0(91dd70ab7c72df1f7b9268f8849a68c78ff81c81105148ebe8f6d8ed38018246): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7994656576-gzq75" podUID="968acbdd-ab1d-4aa4-9db9-654170c5fa2d" Dec 08 09:11:56 crc kubenswrapper[4776]: I1208 09:11:56.376244 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-rkf5k"] Dec 08 09:11:56 crc kubenswrapper[4776]: I1208 09:11:56.376912 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-rkf5k" Dec 08 09:11:56 crc kubenswrapper[4776]: I1208 09:11:56.378754 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Dec 08 09:11:56 crc kubenswrapper[4776]: I1208 09:11:56.379792 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-tzlxq" Dec 08 09:11:56 crc kubenswrapper[4776]: I1208 09:11:56.406772 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/9108512a-718d-41db-b414-02665870be6b-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-rkf5k\" (UID: \"9108512a-718d-41db-b414-02665870be6b\") " pod="openshift-operators/observability-operator-d8bb48f5d-rkf5k" Dec 08 09:11:56 crc kubenswrapper[4776]: I1208 09:11:56.407103 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cjl2\" (UniqueName: \"kubernetes.io/projected/9108512a-718d-41db-b414-02665870be6b-kube-api-access-7cjl2\") pod \"observability-operator-d8bb48f5d-rkf5k\" (UID: \"9108512a-718d-41db-b414-02665870be6b\") " pod="openshift-operators/observability-operator-d8bb48f5d-rkf5k" Dec 08 09:11:56 crc kubenswrapper[4776]: I1208 09:11:56.508468 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/9108512a-718d-41db-b414-02665870be6b-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-rkf5k\" (UID: \"9108512a-718d-41db-b414-02665870be6b\") " pod="openshift-operators/observability-operator-d8bb48f5d-rkf5k" Dec 08 09:11:56 crc kubenswrapper[4776]: I1208 09:11:56.508525 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cjl2\" 
(UniqueName: \"kubernetes.io/projected/9108512a-718d-41db-b414-02665870be6b-kube-api-access-7cjl2\") pod \"observability-operator-d8bb48f5d-rkf5k\" (UID: \"9108512a-718d-41db-b414-02665870be6b\") " pod="openshift-operators/observability-operator-d8bb48f5d-rkf5k" Dec 08 09:11:56 crc kubenswrapper[4776]: I1208 09:11:56.513864 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/9108512a-718d-41db-b414-02665870be6b-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-rkf5k\" (UID: \"9108512a-718d-41db-b414-02665870be6b\") " pod="openshift-operators/observability-operator-d8bb48f5d-rkf5k" Dec 08 09:11:56 crc kubenswrapper[4776]: I1208 09:11:56.527096 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cjl2\" (UniqueName: \"kubernetes.io/projected/9108512a-718d-41db-b414-02665870be6b-kube-api-access-7cjl2\") pod \"observability-operator-d8bb48f5d-rkf5k\" (UID: \"9108512a-718d-41db-b414-02665870be6b\") " pod="openshift-operators/observability-operator-d8bb48f5d-rkf5k" Dec 08 09:11:56 crc kubenswrapper[4776]: I1208 09:11:56.547611 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5446b9c989-bc5qm"] Dec 08 09:11:56 crc kubenswrapper[4776]: I1208 09:11:56.548340 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-bc5qm" Dec 08 09:11:56 crc kubenswrapper[4776]: I1208 09:11:56.551001 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-wrlwh" Dec 08 09:11:56 crc kubenswrapper[4776]: I1208 09:11:56.610382 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cp7l\" (UniqueName: \"kubernetes.io/projected/5691addb-538a-4212-bb5b-bf797ba7172c-kube-api-access-2cp7l\") pod \"perses-operator-5446b9c989-bc5qm\" (UID: \"5691addb-538a-4212-bb5b-bf797ba7172c\") " pod="openshift-operators/perses-operator-5446b9c989-bc5qm" Dec 08 09:11:56 crc kubenswrapper[4776]: I1208 09:11:56.610469 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/5691addb-538a-4212-bb5b-bf797ba7172c-openshift-service-ca\") pod \"perses-operator-5446b9c989-bc5qm\" (UID: \"5691addb-538a-4212-bb5b-bf797ba7172c\") " pod="openshift-operators/perses-operator-5446b9c989-bc5qm" Dec 08 09:11:56 crc kubenswrapper[4776]: I1208 09:11:56.691199 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-rkf5k" Dec 08 09:11:56 crc kubenswrapper[4776]: I1208 09:11:56.712896 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cp7l\" (UniqueName: \"kubernetes.io/projected/5691addb-538a-4212-bb5b-bf797ba7172c-kube-api-access-2cp7l\") pod \"perses-operator-5446b9c989-bc5qm\" (UID: \"5691addb-538a-4212-bb5b-bf797ba7172c\") " pod="openshift-operators/perses-operator-5446b9c989-bc5qm" Dec 08 09:11:56 crc kubenswrapper[4776]: I1208 09:11:56.713006 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/5691addb-538a-4212-bb5b-bf797ba7172c-openshift-service-ca\") pod \"perses-operator-5446b9c989-bc5qm\" (UID: \"5691addb-538a-4212-bb5b-bf797ba7172c\") " pod="openshift-operators/perses-operator-5446b9c989-bc5qm" Dec 08 09:11:56 crc kubenswrapper[4776]: I1208 09:11:56.713902 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/5691addb-538a-4212-bb5b-bf797ba7172c-openshift-service-ca\") pod \"perses-operator-5446b9c989-bc5qm\" (UID: \"5691addb-538a-4212-bb5b-bf797ba7172c\") " pod="openshift-operators/perses-operator-5446b9c989-bc5qm" Dec 08 09:11:56 crc kubenswrapper[4776]: E1208 09:11:56.718443 4776 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-rkf5k_openshift-operators_9108512a-718d-41db-b414-02665870be6b_0(46cc10fd03a46739f0faa543d3a572fe8a0010259c4959747f956ecd27cac405): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Dec 08 09:11:56 crc kubenswrapper[4776]: E1208 09:11:56.718578 4776 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-rkf5k_openshift-operators_9108512a-718d-41db-b414-02665870be6b_0(46cc10fd03a46739f0faa543d3a572fe8a0010259c4959747f956ecd27cac405): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-rkf5k" Dec 08 09:11:56 crc kubenswrapper[4776]: E1208 09:11:56.718612 4776 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-rkf5k_openshift-operators_9108512a-718d-41db-b414-02665870be6b_0(46cc10fd03a46739f0faa543d3a572fe8a0010259c4959747f956ecd27cac405): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-rkf5k" Dec 08 09:11:56 crc kubenswrapper[4776]: E1208 09:11:56.718670 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-d8bb48f5d-rkf5k_openshift-operators(9108512a-718d-41db-b414-02665870be6b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-d8bb48f5d-rkf5k_openshift-operators(9108512a-718d-41db-b414-02665870be6b)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-rkf5k_openshift-operators_9108512a-718d-41db-b414-02665870be6b_0(46cc10fd03a46739f0faa543d3a572fe8a0010259c4959747f956ecd27cac405): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/observability-operator-d8bb48f5d-rkf5k" podUID="9108512a-718d-41db-b414-02665870be6b" Dec 08 09:11:56 crc kubenswrapper[4776]: I1208 09:11:56.748930 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cp7l\" (UniqueName: \"kubernetes.io/projected/5691addb-538a-4212-bb5b-bf797ba7172c-kube-api-access-2cp7l\") pod \"perses-operator-5446b9c989-bc5qm\" (UID: \"5691addb-538a-4212-bb5b-bf797ba7172c\") " pod="openshift-operators/perses-operator-5446b9c989-bc5qm" Dec 08 09:11:56 crc kubenswrapper[4776]: I1208 09:11:56.863578 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-bc5qm" Dec 08 09:11:56 crc kubenswrapper[4776]: E1208 09:11:56.888861 4776 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-bc5qm_openshift-operators_5691addb-538a-4212-bb5b-bf797ba7172c_0(35b689929212b8ab29e94df07992788370cb784eac8673c278c651b65427d16a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 08 09:11:56 crc kubenswrapper[4776]: E1208 09:11:56.888921 4776 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-bc5qm_openshift-operators_5691addb-538a-4212-bb5b-bf797ba7172c_0(35b689929212b8ab29e94df07992788370cb784eac8673c278c651b65427d16a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5446b9c989-bc5qm" Dec 08 09:11:56 crc kubenswrapper[4776]: E1208 09:11:56.888944 4776 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-bc5qm_openshift-operators_5691addb-538a-4212-bb5b-bf797ba7172c_0(35b689929212b8ab29e94df07992788370cb784eac8673c278c651b65427d16a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-bc5qm" Dec 08 09:11:56 crc kubenswrapper[4776]: E1208 09:11:56.888986 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5446b9c989-bc5qm_openshift-operators(5691addb-538a-4212-bb5b-bf797ba7172c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5446b9c989-bc5qm_openshift-operators(5691addb-538a-4212-bb5b-bf797ba7172c)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-bc5qm_openshift-operators_5691addb-538a-4212-bb5b-bf797ba7172c_0(35b689929212b8ab29e94df07992788370cb784eac8673c278c651b65427d16a): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/perses-operator-5446b9c989-bc5qm" podUID="5691addb-538a-4212-bb5b-bf797ba7172c" Dec 08 09:11:56 crc kubenswrapper[4776]: I1208 09:11:56.900294 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" event={"ID":"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5","Type":"ContainerStarted","Data":"fbe94059cb84bacbf4f6e2fa51bad37cabcc090532ca38347088f175425932ff"} Dec 08 09:11:58 crc kubenswrapper[4776]: I1208 09:11:58.931763 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" event={"ID":"e7d51fc9-4de6-49d6-ad0a-dd91c6f914e5","Type":"ContainerStarted","Data":"ebf905f6ebd0be9ba3d63a671036925d1ed42776b6060a4a2edb578507320317"} Dec 08 09:11:58 crc kubenswrapper[4776]: I1208 09:11:58.932953 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" Dec 08 09:11:58 crc kubenswrapper[4776]: I1208 09:11:58.932990 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" Dec 08 09:11:58 crc kubenswrapper[4776]: I1208 09:11:58.933036 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" Dec 08 09:11:58 crc kubenswrapper[4776]: I1208 09:11:58.966920 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" Dec 08 09:11:58 crc kubenswrapper[4776]: I1208 09:11:58.972587 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" podStartSLOduration=8.972571328 podStartE2EDuration="8.972571328s" podCreationTimestamp="2025-12-08 09:11:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:11:58.969215427 +0000 UTC 
m=+795.232440449" watchObservedRunningTime="2025-12-08 09:11:58.972571328 +0000 UTC m=+795.235796350" Dec 08 09:11:59 crc kubenswrapper[4776]: I1208 09:11:59.002211 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" Dec 08 09:11:59 crc kubenswrapper[4776]: I1208 09:11:59.725564 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7994656576-d6jvv"] Dec 08 09:11:59 crc kubenswrapper[4776]: I1208 09:11:59.725945 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7994656576-d6jvv" Dec 08 09:11:59 crc kubenswrapper[4776]: I1208 09:11:59.726342 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7994656576-d6jvv" Dec 08 09:11:59 crc kubenswrapper[4776]: I1208 09:11:59.738110 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-rkf5k"] Dec 08 09:11:59 crc kubenswrapper[4776]: I1208 09:11:59.738229 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-rkf5k" Dec 08 09:11:59 crc kubenswrapper[4776]: I1208 09:11:59.738679 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-rkf5k" Dec 08 09:11:59 crc kubenswrapper[4776]: I1208 09:11:59.754898 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-4r72v"] Dec 08 09:11:59 crc kubenswrapper[4776]: I1208 09:11:59.755028 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-4r72v" Dec 08 09:11:59 crc kubenswrapper[4776]: I1208 09:11:59.755507 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-4r72v" Dec 08 09:11:59 crc kubenswrapper[4776]: I1208 09:11:59.777848 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7994656576-gzq75"] Dec 08 09:11:59 crc kubenswrapper[4776]: I1208 09:11:59.778024 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7994656576-gzq75" Dec 08 09:11:59 crc kubenswrapper[4776]: I1208 09:11:59.778712 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7994656576-gzq75" Dec 08 09:11:59 crc kubenswrapper[4776]: E1208 09:11:59.779180 4776 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7994656576-d6jvv_openshift-operators_ff9db296-6f02-44bf-810c-48cfb090036e_0(2edc555f332e848b3fb0bd267d5a7778935e25bab3a480a310443d49b17f41ce): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 08 09:11:59 crc kubenswrapper[4776]: E1208 09:11:59.779231 4776 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7994656576-d6jvv_openshift-operators_ff9db296-6f02-44bf-810c-48cfb090036e_0(2edc555f332e848b3fb0bd267d5a7778935e25bab3a480a310443d49b17f41ce): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-7994656576-d6jvv" Dec 08 09:11:59 crc kubenswrapper[4776]: E1208 09:11:59.779251 4776 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7994656576-d6jvv_openshift-operators_ff9db296-6f02-44bf-810c-48cfb090036e_0(2edc555f332e848b3fb0bd267d5a7778935e25bab3a480a310443d49b17f41ce): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7994656576-d6jvv" Dec 08 09:11:59 crc kubenswrapper[4776]: E1208 09:11:59.779284 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-7994656576-d6jvv_openshift-operators(ff9db296-6f02-44bf-810c-48cfb090036e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-7994656576-d6jvv_openshift-operators(ff9db296-6f02-44bf-810c-48cfb090036e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7994656576-d6jvv_openshift-operators_ff9db296-6f02-44bf-810c-48cfb090036e_0(2edc555f332e848b3fb0bd267d5a7778935e25bab3a480a310443d49b17f41ce): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7994656576-d6jvv" podUID="ff9db296-6f02-44bf-810c-48cfb090036e" Dec 08 09:11:59 crc kubenswrapper[4776]: E1208 09:11:59.837971 4776 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-rkf5k_openshift-operators_9108512a-718d-41db-b414-02665870be6b_0(a5ed595641b62b3dca8d3da1fd0313362ec86a988eae842eabbd0a42b3f68bec): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 08 09:11:59 crc kubenswrapper[4776]: E1208 09:11:59.838042 4776 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-rkf5k_openshift-operators_9108512a-718d-41db-b414-02665870be6b_0(a5ed595641b62b3dca8d3da1fd0313362ec86a988eae842eabbd0a42b3f68bec): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-rkf5k" Dec 08 09:11:59 crc kubenswrapper[4776]: E1208 09:11:59.838081 4776 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-rkf5k_openshift-operators_9108512a-718d-41db-b414-02665870be6b_0(a5ed595641b62b3dca8d3da1fd0313362ec86a988eae842eabbd0a42b3f68bec): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-d8bb48f5d-rkf5k" Dec 08 09:11:59 crc kubenswrapper[4776]: E1208 09:11:59.838132 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-d8bb48f5d-rkf5k_openshift-operators(9108512a-718d-41db-b414-02665870be6b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-d8bb48f5d-rkf5k_openshift-operators(9108512a-718d-41db-b414-02665870be6b)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-rkf5k_openshift-operators_9108512a-718d-41db-b414-02665870be6b_0(a5ed595641b62b3dca8d3da1fd0313362ec86a988eae842eabbd0a42b3f68bec): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-d8bb48f5d-rkf5k" podUID="9108512a-718d-41db-b414-02665870be6b" Dec 08 09:11:59 crc kubenswrapper[4776]: I1208 09:11:59.845621 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-bc5qm"] Dec 08 09:11:59 crc kubenswrapper[4776]: I1208 09:11:59.845749 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-bc5qm" Dec 08 09:11:59 crc kubenswrapper[4776]: I1208 09:11:59.846206 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-bc5qm" Dec 08 09:11:59 crc kubenswrapper[4776]: E1208 09:11:59.848479 4776 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7994656576-gzq75_openshift-operators_968acbdd-ab1d-4aa4-9db9-654170c5fa2d_0(d1d801c9633601d3de614c8a07bec5591803d0c34d32949159610ca1dbbe173b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Dec 08 09:11:59 crc kubenswrapper[4776]: E1208 09:11:59.848556 4776 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7994656576-gzq75_openshift-operators_968acbdd-ab1d-4aa4-9db9-654170c5fa2d_0(d1d801c9633601d3de614c8a07bec5591803d0c34d32949159610ca1dbbe173b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7994656576-gzq75" Dec 08 09:11:59 crc kubenswrapper[4776]: E1208 09:11:59.848579 4776 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7994656576-gzq75_openshift-operators_968acbdd-ab1d-4aa4-9db9-654170c5fa2d_0(d1d801c9633601d3de614c8a07bec5591803d0c34d32949159610ca1dbbe173b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7994656576-gzq75" Dec 08 09:11:59 crc kubenswrapper[4776]: E1208 09:11:59.848626 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-7994656576-gzq75_openshift-operators(968acbdd-ab1d-4aa4-9db9-654170c5fa2d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-7994656576-gzq75_openshift-operators(968acbdd-ab1d-4aa4-9db9-654170c5fa2d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7994656576-gzq75_openshift-operators_968acbdd-ab1d-4aa4-9db9-654170c5fa2d_0(d1d801c9633601d3de614c8a07bec5591803d0c34d32949159610ca1dbbe173b): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7994656576-gzq75" podUID="968acbdd-ab1d-4aa4-9db9-654170c5fa2d" Dec 08 09:11:59 crc kubenswrapper[4776]: E1208 09:11:59.853481 4776 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-4r72v_openshift-operators_3ddae09e-bcfe-4e98-bdd1-9ac94218a6d8_0(3bb46efd476b80afc1e73bc10d8b592842396305c96784b379a4ea1910702a47): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 08 09:11:59 crc kubenswrapper[4776]: E1208 09:11:59.853544 4776 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-4r72v_openshift-operators_3ddae09e-bcfe-4e98-bdd1-9ac94218a6d8_0(3bb46efd476b80afc1e73bc10d8b592842396305c96784b379a4ea1910702a47): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-4r72v" Dec 08 09:11:59 crc kubenswrapper[4776]: E1208 09:11:59.853566 4776 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-4r72v_openshift-operators_3ddae09e-bcfe-4e98-bdd1-9ac94218a6d8_0(3bb46efd476b80afc1e73bc10d8b592842396305c96784b379a4ea1910702a47): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-4r72v" Dec 08 09:11:59 crc kubenswrapper[4776]: E1208 09:11:59.853616 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-668cf9dfbb-4r72v_openshift-operators(3ddae09e-bcfe-4e98-bdd1-9ac94218a6d8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-668cf9dfbb-4r72v_openshift-operators(3ddae09e-bcfe-4e98-bdd1-9ac94218a6d8)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-4r72v_openshift-operators_3ddae09e-bcfe-4e98-bdd1-9ac94218a6d8_0(3bb46efd476b80afc1e73bc10d8b592842396305c96784b379a4ea1910702a47): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-4r72v" podUID="3ddae09e-bcfe-4e98-bdd1-9ac94218a6d8" Dec 08 09:11:59 crc kubenswrapper[4776]: E1208 09:11:59.913253 4776 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-bc5qm_openshift-operators_5691addb-538a-4212-bb5b-bf797ba7172c_0(70883e8e64c7947776c339c190e6a0e9c3958156adbfb0f36d091b3a0f9a8d33): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 08 09:11:59 crc kubenswrapper[4776]: E1208 09:11:59.913328 4776 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-bc5qm_openshift-operators_5691addb-538a-4212-bb5b-bf797ba7172c_0(70883e8e64c7947776c339c190e6a0e9c3958156adbfb0f36d091b3a0f9a8d33): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5446b9c989-bc5qm" Dec 08 09:11:59 crc kubenswrapper[4776]: E1208 09:11:59.913363 4776 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-bc5qm_openshift-operators_5691addb-538a-4212-bb5b-bf797ba7172c_0(70883e8e64c7947776c339c190e6a0e9c3958156adbfb0f36d091b3a0f9a8d33): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-bc5qm" Dec 08 09:11:59 crc kubenswrapper[4776]: E1208 09:11:59.913405 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5446b9c989-bc5qm_openshift-operators(5691addb-538a-4212-bb5b-bf797ba7172c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5446b9c989-bc5qm_openshift-operators(5691addb-538a-4212-bb5b-bf797ba7172c)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-bc5qm_openshift-operators_5691addb-538a-4212-bb5b-bf797ba7172c_0(70883e8e64c7947776c339c190e6a0e9c3958156adbfb0f36d091b3a0f9a8d33): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/perses-operator-5446b9c989-bc5qm" podUID="5691addb-538a-4212-bb5b-bf797ba7172c" Dec 08 09:12:00 crc kubenswrapper[4776]: I1208 09:12:00.403651 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wlz88" Dec 08 09:12:00 crc kubenswrapper[4776]: I1208 09:12:00.462250 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wlz88" Dec 08 09:12:00 crc kubenswrapper[4776]: I1208 09:12:00.652644 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wlz88"] Dec 08 09:12:01 crc kubenswrapper[4776]: I1208 09:12:01.961847 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wlz88" podUID="57476415-2f48-4b7d-824d-61fd5702c5d6" containerName="registry-server" containerID="cri-o://e0718be14ed2fa3a7747a8348ffe2a6f8a60cbc25874ce3b8836bc4820347fb3" gracePeriod=2 Dec 08 09:12:02 crc kubenswrapper[4776]: I1208 09:12:02.975619 4776 generic.go:334] "Generic (PLEG): container finished" podID="57476415-2f48-4b7d-824d-61fd5702c5d6" containerID="e0718be14ed2fa3a7747a8348ffe2a6f8a60cbc25874ce3b8836bc4820347fb3" exitCode=0 Dec 08 09:12:02 crc kubenswrapper[4776]: I1208 09:12:02.975731 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wlz88" event={"ID":"57476415-2f48-4b7d-824d-61fd5702c5d6","Type":"ContainerDied","Data":"e0718be14ed2fa3a7747a8348ffe2a6f8a60cbc25874ce3b8836bc4820347fb3"} Dec 08 09:12:03 crc kubenswrapper[4776]: I1208 09:12:03.080014 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wlz88" Dec 08 09:12:03 crc kubenswrapper[4776]: I1208 09:12:03.145845 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9hzn\" (UniqueName: \"kubernetes.io/projected/57476415-2f48-4b7d-824d-61fd5702c5d6-kube-api-access-w9hzn\") pod \"57476415-2f48-4b7d-824d-61fd5702c5d6\" (UID: \"57476415-2f48-4b7d-824d-61fd5702c5d6\") " Dec 08 09:12:03 crc kubenswrapper[4776]: I1208 09:12:03.145984 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57476415-2f48-4b7d-824d-61fd5702c5d6-utilities\") pod \"57476415-2f48-4b7d-824d-61fd5702c5d6\" (UID: \"57476415-2f48-4b7d-824d-61fd5702c5d6\") " Dec 08 09:12:03 crc kubenswrapper[4776]: I1208 09:12:03.146041 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57476415-2f48-4b7d-824d-61fd5702c5d6-catalog-content\") pod \"57476415-2f48-4b7d-824d-61fd5702c5d6\" (UID: \"57476415-2f48-4b7d-824d-61fd5702c5d6\") " Dec 08 09:12:03 crc kubenswrapper[4776]: I1208 09:12:03.159134 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57476415-2f48-4b7d-824d-61fd5702c5d6-utilities" (OuterVolumeSpecName: "utilities") pod "57476415-2f48-4b7d-824d-61fd5702c5d6" (UID: "57476415-2f48-4b7d-824d-61fd5702c5d6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:12:03 crc kubenswrapper[4776]: I1208 09:12:03.183045 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57476415-2f48-4b7d-824d-61fd5702c5d6-kube-api-access-w9hzn" (OuterVolumeSpecName: "kube-api-access-w9hzn") pod "57476415-2f48-4b7d-824d-61fd5702c5d6" (UID: "57476415-2f48-4b7d-824d-61fd5702c5d6"). InnerVolumeSpecName "kube-api-access-w9hzn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:12:03 crc kubenswrapper[4776]: I1208 09:12:03.250989 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57476415-2f48-4b7d-824d-61fd5702c5d6-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 09:12:03 crc kubenswrapper[4776]: I1208 09:12:03.251026 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9hzn\" (UniqueName: \"kubernetes.io/projected/57476415-2f48-4b7d-824d-61fd5702c5d6-kube-api-access-w9hzn\") on node \"crc\" DevicePath \"\"" Dec 08 09:12:03 crc kubenswrapper[4776]: I1208 09:12:03.338125 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57476415-2f48-4b7d-824d-61fd5702c5d6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57476415-2f48-4b7d-824d-61fd5702c5d6" (UID: "57476415-2f48-4b7d-824d-61fd5702c5d6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:12:03 crc kubenswrapper[4776]: I1208 09:12:03.352257 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57476415-2f48-4b7d-824d-61fd5702c5d6-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 09:12:03 crc kubenswrapper[4776]: I1208 09:12:03.983991 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wlz88" event={"ID":"57476415-2f48-4b7d-824d-61fd5702c5d6","Type":"ContainerDied","Data":"54d1b2b8e74c11fdfeb361f40ad082b446eb8a9b8d8dd20036c96360664df741"} Dec 08 09:12:03 crc kubenswrapper[4776]: I1208 09:12:03.984043 4776 scope.go:117] "RemoveContainer" containerID="e0718be14ed2fa3a7747a8348ffe2a6f8a60cbc25874ce3b8836bc4820347fb3" Dec 08 09:12:03 crc kubenswrapper[4776]: I1208 09:12:03.984047 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wlz88" Dec 08 09:12:04 crc kubenswrapper[4776]: I1208 09:12:04.002485 4776 scope.go:117] "RemoveContainer" containerID="9dbc6298523724cb1b6943c6aa71c9c8b34e3a73b28e400846c7d8c15cf42512" Dec 08 09:12:04 crc kubenswrapper[4776]: I1208 09:12:04.021690 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wlz88"] Dec 08 09:12:04 crc kubenswrapper[4776]: I1208 09:12:04.026111 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wlz88"] Dec 08 09:12:04 crc kubenswrapper[4776]: I1208 09:12:04.026498 4776 scope.go:117] "RemoveContainer" containerID="c9a46a23dcf7ff8d0c458e79088c78de3b08576b9936d57e453d2414467d20ba" Dec 08 09:12:04 crc kubenswrapper[4776]: I1208 09:12:04.350874 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57476415-2f48-4b7d-824d-61fd5702c5d6" path="/var/lib/kubelet/pods/57476415-2f48-4b7d-824d-61fd5702c5d6/volumes" Dec 08 09:12:10 crc kubenswrapper[4776]: I1208 09:12:10.343562 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-rkf5k" Dec 08 09:12:10 crc kubenswrapper[4776]: I1208 09:12:10.344078 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-rkf5k" Dec 08 09:12:10 crc kubenswrapper[4776]: I1208 09:12:10.621824 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-rkf5k"] Dec 08 09:12:11 crc kubenswrapper[4776]: I1208 09:12:11.022904 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-rkf5k" event={"ID":"9108512a-718d-41db-b414-02665870be6b","Type":"ContainerStarted","Data":"f36090c2ba9fe787ee848a998b94ac2b3f3e7f9a1fb5fad2bb71942b4a3382cc"} Dec 08 09:12:11 crc kubenswrapper[4776]: I1208 09:12:11.343050 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7994656576-gzq75" Dec 08 09:12:11 crc kubenswrapper[4776]: I1208 09:12:11.343833 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7994656576-gzq75" Dec 08 09:12:11 crc kubenswrapper[4776]: I1208 09:12:11.399435 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 09:12:11 crc kubenswrapper[4776]: I1208 09:12:11.399528 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 09:12:11 crc kubenswrapper[4776]: I1208 09:12:11.399596 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" Dec 08 09:12:11 crc kubenswrapper[4776]: I1208 09:12:11.400678 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"abac38e42f2fdbb7423dde9370109f19a92ff63c4313fd19999ad68bdb72ed2b"} pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 08 09:12:11 crc kubenswrapper[4776]: I1208 09:12:11.400766 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" containerID="cri-o://abac38e42f2fdbb7423dde9370109f19a92ff63c4313fd19999ad68bdb72ed2b" gracePeriod=600 Dec 08 09:12:11 crc kubenswrapper[4776]: I1208 09:12:11.563621 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7994656576-gzq75"] Dec 08 09:12:12 crc kubenswrapper[4776]: I1208 09:12:12.062892 4776 generic.go:334] "Generic (PLEG): container finished" podID="c9788ab1-1031-4103-a769-a4b3177c7268" containerID="abac38e42f2fdbb7423dde9370109f19a92ff63c4313fd19999ad68bdb72ed2b" exitCode=0 Dec 08 09:12:12 crc kubenswrapper[4776]: I1208 09:12:12.062979 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" event={"ID":"c9788ab1-1031-4103-a769-a4b3177c7268","Type":"ContainerDied","Data":"abac38e42f2fdbb7423dde9370109f19a92ff63c4313fd19999ad68bdb72ed2b"} Dec 08 09:12:12 crc kubenswrapper[4776]: I1208 09:12:12.063872 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" 
event={"ID":"c9788ab1-1031-4103-a769-a4b3177c7268","Type":"ContainerStarted","Data":"be636680726361907bd5f0d2d58d00dbbd0c77d0144025e4fa0b6101666966a8"} Dec 08 09:12:12 crc kubenswrapper[4776]: I1208 09:12:12.063980 4776 scope.go:117] "RemoveContainer" containerID="60dbb3e7c44241db89caa5cb2272dfcb89d62fdbf75c7153dbb476fd01b77752" Dec 08 09:12:12 crc kubenswrapper[4776]: I1208 09:12:12.067311 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7994656576-gzq75" event={"ID":"968acbdd-ab1d-4aa4-9db9-654170c5fa2d","Type":"ContainerStarted","Data":"e58ed917967951f93d055cc308ac049fb7bd086c0564e9bb15122b451c18fb3d"} Dec 08 09:12:12 crc kubenswrapper[4776]: I1208 09:12:12.343659 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-4r72v" Dec 08 09:12:12 crc kubenswrapper[4776]: I1208 09:12:12.344050 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-4r72v" Dec 08 09:12:12 crc kubenswrapper[4776]: I1208 09:12:12.344347 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7994656576-d6jvv" Dec 08 09:12:12 crc kubenswrapper[4776]: I1208 09:12:12.344535 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7994656576-d6jvv" Dec 08 09:12:12 crc kubenswrapper[4776]: I1208 09:12:12.647730 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7994656576-d6jvv"] Dec 08 09:12:12 crc kubenswrapper[4776]: W1208 09:12:12.667328 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff9db296_6f02_44bf_810c_48cfb090036e.slice/crio-17ce098a324f3ebe901135d2320c2b887803b5bd093ab0b5ebee0db4133ea951 WatchSource:0}: Error finding container 17ce098a324f3ebe901135d2320c2b887803b5bd093ab0b5ebee0db4133ea951: Status 404 returned error can't find the container with id 17ce098a324f3ebe901135d2320c2b887803b5bd093ab0b5ebee0db4133ea951 Dec 08 09:12:12 crc kubenswrapper[4776]: I1208 09:12:12.984953 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-4r72v"] Dec 08 09:12:12 crc kubenswrapper[4776]: W1208 09:12:12.994280 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ddae09e_bcfe_4e98_bdd1_9ac94218a6d8.slice/crio-505d6485a28f528396119e8d943e437ff8b6385a1be93fa683ac629f0a17eaa8 WatchSource:0}: Error finding container 505d6485a28f528396119e8d943e437ff8b6385a1be93fa683ac629f0a17eaa8: Status 404 returned error can't find the container with id 505d6485a28f528396119e8d943e437ff8b6385a1be93fa683ac629f0a17eaa8 Dec 08 09:12:13 crc kubenswrapper[4776]: I1208 09:12:13.079460 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7994656576-d6jvv" event={"ID":"ff9db296-6f02-44bf-810c-48cfb090036e","Type":"ContainerStarted","Data":"17ce098a324f3ebe901135d2320c2b887803b5bd093ab0b5ebee0db4133ea951"} Dec 08 09:12:13 crc kubenswrapper[4776]: I1208 09:12:13.080725 
4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-4r72v" event={"ID":"3ddae09e-bcfe-4e98-bdd1-9ac94218a6d8","Type":"ContainerStarted","Data":"505d6485a28f528396119e8d943e437ff8b6385a1be93fa683ac629f0a17eaa8"} Dec 08 09:12:13 crc kubenswrapper[4776]: I1208 09:12:13.343052 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-bc5qm" Dec 08 09:12:13 crc kubenswrapper[4776]: I1208 09:12:13.343839 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-bc5qm" Dec 08 09:12:13 crc kubenswrapper[4776]: I1208 09:12:13.836143 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-bc5qm"] Dec 08 09:12:13 crc kubenswrapper[4776]: W1208 09:12:13.841940 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5691addb_538a_4212_bb5b_bf797ba7172c.slice/crio-f08ac59ccb83005b24e957646a57d975d71996bafb394523470813fd025068e0 WatchSource:0}: Error finding container f08ac59ccb83005b24e957646a57d975d71996bafb394523470813fd025068e0: Status 404 returned error can't find the container with id f08ac59ccb83005b24e957646a57d975d71996bafb394523470813fd025068e0 Dec 08 09:12:14 crc kubenswrapper[4776]: I1208 09:12:14.098424 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-bc5qm" event={"ID":"5691addb-538a-4212-bb5b-bf797ba7172c","Type":"ContainerStarted","Data":"f08ac59ccb83005b24e957646a57d975d71996bafb394523470813fd025068e0"} Dec 08 09:12:20 crc kubenswrapper[4776]: I1208 09:12:20.788899 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tgwm2" Dec 08 09:12:26 crc kubenswrapper[4776]: I1208 09:12:26.194622 4776 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7994656576-d6jvv" event={"ID":"ff9db296-6f02-44bf-810c-48cfb090036e","Type":"ContainerStarted","Data":"bcf29ef5619f9555dd2beba4540cc5578224f158345f81855551f6943449a97a"}
Dec 08 09:12:26 crc kubenswrapper[4776]: I1208 09:12:26.198407 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-d8bb48f5d-rkf5k"
Dec 08 09:12:26 crc kubenswrapper[4776]: I1208 09:12:26.200222 4776 patch_prober.go:28] interesting pod/observability-operator-d8bb48f5d-rkf5k container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.11:8081/healthz\": dial tcp 10.217.0.11:8081: connect: connection refused" start-of-body=
Dec 08 09:12:26 crc kubenswrapper[4776]: I1208 09:12:26.200273 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-d8bb48f5d-rkf5k" podUID="9108512a-718d-41db-b414-02665870be6b" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.11:8081/healthz\": dial tcp 10.217.0.11:8081: connect: connection refused"
Dec 08 09:12:26 crc kubenswrapper[4776]: I1208 09:12:26.200917 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-bc5qm" event={"ID":"5691addb-538a-4212-bb5b-bf797ba7172c","Type":"ContainerStarted","Data":"a8ef86656eff7115cfa9c6edc00247e6fdd10fe6c8cf002730db2ef1b0cf1443"}
Dec 08 09:12:26 crc kubenswrapper[4776]: I1208 09:12:26.201052 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5446b9c989-bc5qm"
Dec 08 09:12:26 crc kubenswrapper[4776]: I1208 09:12:26.216696 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7994656576-d6jvv" podStartSLOduration=18.025894841 podStartE2EDuration="31.21667514s" podCreationTimestamp="2025-12-08 09:11:55 +0000 UTC" firstStartedPulling="2025-12-08 09:12:12.66887685 +0000 UTC m=+808.932101872" lastFinishedPulling="2025-12-08 09:12:25.859657149 +0000 UTC m=+822.122882171" observedRunningTime="2025-12-08 09:12:26.208253901 +0000 UTC m=+822.471478923" watchObservedRunningTime="2025-12-08 09:12:26.21667514 +0000 UTC m=+822.479900172"
Dec 08 09:12:26 crc kubenswrapper[4776]: I1208 09:12:26.231505 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5446b9c989-bc5qm" podStartSLOduration=18.118039407 podStartE2EDuration="30.231486022s" podCreationTimestamp="2025-12-08 09:11:56 +0000 UTC" firstStartedPulling="2025-12-08 09:12:13.844460691 +0000 UTC m=+810.107685713" lastFinishedPulling="2025-12-08 09:12:25.957907306 +0000 UTC m=+822.221132328" observedRunningTime="2025-12-08 09:12:26.229501608 +0000 UTC m=+822.492726630" watchObservedRunningTime="2025-12-08 09:12:26.231486022 +0000 UTC m=+822.494711034"
Dec 08 09:12:26 crc kubenswrapper[4776]: I1208 09:12:26.251239 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-d8bb48f5d-rkf5k" podStartSLOduration=15.026369044 podStartE2EDuration="30.251221177s" podCreationTimestamp="2025-12-08 09:11:56 +0000 UTC" firstStartedPulling="2025-12-08 09:12:10.634726594 +0000 UTC m=+806.897951616" lastFinishedPulling="2025-12-08 09:12:25.859578717 +0000 UTC m=+822.122803749" observedRunningTime="2025-12-08 09:12:26.246709685 +0000 UTC m=+822.509934707" watchObservedRunningTime="2025-12-08 09:12:26.251221177 +0000 UTC m=+822.514446199"
Dec 08 09:12:26 crc kubenswrapper[4776]: I1208 09:12:26.752112 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-d8bb48f5d-rkf5k"
Dec 08 09:12:27 crc kubenswrapper[4776]: I1208 09:12:27.207471 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-rkf5k" event={"ID":"9108512a-718d-41db-b414-02665870be6b","Type":"ContainerStarted","Data":"a3414d641e2ea6d7fb2edb9b2eb42d2be851ab1aa8911bf5567febeacb485b0d"}
Dec 08 09:12:27 crc kubenswrapper[4776]: I1208 09:12:27.208977 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-4r72v" event={"ID":"3ddae09e-bcfe-4e98-bdd1-9ac94218a6d8","Type":"ContainerStarted","Data":"743939c503ba8ae18416a5d4e3705888d773ddd49f5b6c958431f92b2596ab79"}
Dec 08 09:12:27 crc kubenswrapper[4776]: I1208 09:12:27.210310 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7994656576-gzq75" event={"ID":"968acbdd-ab1d-4aa4-9db9-654170c5fa2d","Type":"ContainerStarted","Data":"1dcd46119fef645b700ff8cedd01f8b9f8b15b60505b7eec8bd38116810140ca"}
Dec 08 09:12:27 crc kubenswrapper[4776]: I1208 09:12:27.225252 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-4r72v" podStartSLOduration=19.362977975 podStartE2EDuration="32.225231577s" podCreationTimestamp="2025-12-08 09:11:55 +0000 UTC" firstStartedPulling="2025-12-08 09:12:12.997087389 +0000 UTC m=+809.260312411" lastFinishedPulling="2025-12-08 09:12:25.859340971 +0000 UTC m=+822.122566013" observedRunningTime="2025-12-08 09:12:27.22460038 +0000 UTC m=+823.487825422" watchObservedRunningTime="2025-12-08 09:12:27.225231577 +0000 UTC m=+823.488456599"
Dec 08 09:12:27 crc kubenswrapper[4776]: I1208 09:12:27.243816 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7994656576-gzq75" podStartSLOduration=17.964616638 podStartE2EDuration="32.243794891s" podCreationTimestamp="2025-12-08 09:11:55 +0000 UTC" firstStartedPulling="2025-12-08 09:12:11.580368453 +0000 UTC m=+807.843593475"
lastFinishedPulling="2025-12-08 09:12:25.859546706 +0000 UTC m=+822.122771728" observedRunningTime="2025-12-08 09:12:27.240560694 +0000 UTC m=+823.503785716" watchObservedRunningTime="2025-12-08 09:12:27.243794891 +0000 UTC m=+823.507019913"
Dec 08 09:12:36 crc kubenswrapper[4776]: I1208 09:12:36.225688 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-6vnhc"]
Dec 08 09:12:36 crc kubenswrapper[4776]: E1208 09:12:36.227269 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57476415-2f48-4b7d-824d-61fd5702c5d6" containerName="extract-utilities"
Dec 08 09:12:36 crc kubenswrapper[4776]: I1208 09:12:36.227359 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="57476415-2f48-4b7d-824d-61fd5702c5d6" containerName="extract-utilities"
Dec 08 09:12:36 crc kubenswrapper[4776]: E1208 09:12:36.227426 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57476415-2f48-4b7d-824d-61fd5702c5d6" containerName="registry-server"
Dec 08 09:12:36 crc kubenswrapper[4776]: I1208 09:12:36.227517 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="57476415-2f48-4b7d-824d-61fd5702c5d6" containerName="registry-server"
Dec 08 09:12:36 crc kubenswrapper[4776]: E1208 09:12:36.227599 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57476415-2f48-4b7d-824d-61fd5702c5d6" containerName="extract-content"
Dec 08 09:12:36 crc kubenswrapper[4776]: I1208 09:12:36.227664 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="57476415-2f48-4b7d-824d-61fd5702c5d6" containerName="extract-content"
Dec 08 09:12:36 crc kubenswrapper[4776]: I1208 09:12:36.227831 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="57476415-2f48-4b7d-824d-61fd5702c5d6" containerName="registry-server"
Dec 08 09:12:36 crc kubenswrapper[4776]: I1208 09:12:36.228323 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-6vnhc"
Dec 08 09:12:36 crc kubenswrapper[4776]: I1208 09:12:36.231033 4776 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-78mxv"
Dec 08 09:12:36 crc kubenswrapper[4776]: I1208 09:12:36.231048 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Dec 08 09:12:36 crc kubenswrapper[4776]: I1208 09:12:36.231091 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Dec 08 09:12:36 crc kubenswrapper[4776]: I1208 09:12:36.235156 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-6vnhc"]
Dec 08 09:12:36 crc kubenswrapper[4776]: I1208 09:12:36.267423 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-hsqbv"]
Dec 08 09:12:36 crc kubenswrapper[4776]: I1208 09:12:36.275644 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-hsqbv"]
Dec 08 09:12:36 crc kubenswrapper[4776]: I1208 09:12:36.275759 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-hsqbv"
Dec 08 09:12:36 crc kubenswrapper[4776]: I1208 09:12:36.281485 4776 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-zwssg"
Dec 08 09:12:36 crc kubenswrapper[4776]: I1208 09:12:36.308364 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-6jjjz"]
Dec 08 09:12:36 crc kubenswrapper[4776]: I1208 09:12:36.309244 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-6jjjz"
Dec 08 09:12:36 crc kubenswrapper[4776]: I1208 09:12:36.311319 4776 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-pk2s8"
Dec 08 09:12:36 crc kubenswrapper[4776]: I1208 09:12:36.314990 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvrzp\" (UniqueName: \"kubernetes.io/projected/8b23f8e2-638b-438a-8363-8daf30f656e6-kube-api-access-wvrzp\") pod \"cert-manager-cainjector-7f985d654d-6vnhc\" (UID: \"8b23f8e2-638b-438a-8363-8daf30f656e6\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-6vnhc"
Dec 08 09:12:36 crc kubenswrapper[4776]: I1208 09:12:36.324109 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-6jjjz"]
Dec 08 09:12:36 crc kubenswrapper[4776]: I1208 09:12:36.416719 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwkqq\" (UniqueName: \"kubernetes.io/projected/69b03c85-8503-44fc-9e71-0357ce0cc56e-kube-api-access-wwkqq\") pod \"cert-manager-5b446d88c5-hsqbv\" (UID: \"69b03c85-8503-44fc-9e71-0357ce0cc56e\") " pod="cert-manager/cert-manager-5b446d88c5-hsqbv"
Dec 08 09:12:36 crc kubenswrapper[4776]: I1208 09:12:36.416804 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvrzp\" (UniqueName: \"kubernetes.io/projected/8b23f8e2-638b-438a-8363-8daf30f656e6-kube-api-access-wvrzp\") pod \"cert-manager-cainjector-7f985d654d-6vnhc\" (UID: \"8b23f8e2-638b-438a-8363-8daf30f656e6\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-6vnhc"
Dec 08 09:12:36 crc kubenswrapper[4776]: I1208 09:12:36.416821 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l9c7\" (UniqueName:
\"kubernetes.io/projected/4f663316-a0ef-44bd-a068-47f3e7d37a5c-kube-api-access-8l9c7\") pod \"cert-manager-webhook-5655c58dd6-6jjjz\" (UID: \"4f663316-a0ef-44bd-a068-47f3e7d37a5c\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-6jjjz"
Dec 08 09:12:36 crc kubenswrapper[4776]: I1208 09:12:36.436150 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvrzp\" (UniqueName: \"kubernetes.io/projected/8b23f8e2-638b-438a-8363-8daf30f656e6-kube-api-access-wvrzp\") pod \"cert-manager-cainjector-7f985d654d-6vnhc\" (UID: \"8b23f8e2-638b-438a-8363-8daf30f656e6\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-6vnhc"
Dec 08 09:12:36 crc kubenswrapper[4776]: I1208 09:12:36.518606 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwkqq\" (UniqueName: \"kubernetes.io/projected/69b03c85-8503-44fc-9e71-0357ce0cc56e-kube-api-access-wwkqq\") pod \"cert-manager-5b446d88c5-hsqbv\" (UID: \"69b03c85-8503-44fc-9e71-0357ce0cc56e\") " pod="cert-manager/cert-manager-5b446d88c5-hsqbv"
Dec 08 09:12:36 crc kubenswrapper[4776]: I1208 09:12:36.518680 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8l9c7\" (UniqueName: \"kubernetes.io/projected/4f663316-a0ef-44bd-a068-47f3e7d37a5c-kube-api-access-8l9c7\") pod \"cert-manager-webhook-5655c58dd6-6jjjz\" (UID: \"4f663316-a0ef-44bd-a068-47f3e7d37a5c\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-6jjjz"
Dec 08 09:12:36 crc kubenswrapper[4776]: I1208 09:12:36.535888 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l9c7\" (UniqueName: \"kubernetes.io/projected/4f663316-a0ef-44bd-a068-47f3e7d37a5c-kube-api-access-8l9c7\") pod \"cert-manager-webhook-5655c58dd6-6jjjz\" (UID: \"4f663316-a0ef-44bd-a068-47f3e7d37a5c\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-6jjjz"
Dec 08 09:12:36 crc kubenswrapper[4776]: I1208 09:12:36.546737 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-6vnhc"
Dec 08 09:12:36 crc kubenswrapper[4776]: I1208 09:12:36.557741 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwkqq\" (UniqueName: \"kubernetes.io/projected/69b03c85-8503-44fc-9e71-0357ce0cc56e-kube-api-access-wwkqq\") pod \"cert-manager-5b446d88c5-hsqbv\" (UID: \"69b03c85-8503-44fc-9e71-0357ce0cc56e\") " pod="cert-manager/cert-manager-5b446d88c5-hsqbv"
Dec 08 09:12:36 crc kubenswrapper[4776]: I1208 09:12:36.600513 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-hsqbv"
Dec 08 09:12:36 crc kubenswrapper[4776]: I1208 09:12:36.632441 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-6jjjz"
Dec 08 09:12:36 crc kubenswrapper[4776]: I1208 09:12:36.867229 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5446b9c989-bc5qm"
Dec 08 09:12:36 crc kubenswrapper[4776]: I1208 09:12:36.941568 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-hsqbv"]
Dec 08 09:12:36 crc kubenswrapper[4776]: I1208 09:12:36.988449 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-6vnhc"]
Dec 08 09:12:36 crc kubenswrapper[4776]: W1208 09:12:36.990614 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b23f8e2_638b_438a_8363_8daf30f656e6.slice/crio-ca5ded1e737f4b1866cad65f5b16db98211bfb96fd0883af57d3c34006b0ded1 WatchSource:0}: Error finding container ca5ded1e737f4b1866cad65f5b16db98211bfb96fd0883af57d3c34006b0ded1: Status 404 returned error can't find the container with id ca5ded1e737f4b1866cad65f5b16db98211bfb96fd0883af57d3c34006b0ded1
Dec 08 09:12:37 crc kubenswrapper[4776]: I1208 09:12:37.246651 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-6jjjz"]
Dec 08 09:12:37 crc kubenswrapper[4776]: W1208 09:12:37.250357 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f663316_a0ef_44bd_a068_47f3e7d37a5c.slice/crio-aac9cba1e097c5ca53dfe0eb38749b1cd5242e5a9de4be9e26381258d074f699 WatchSource:0}: Error finding container aac9cba1e097c5ca53dfe0eb38749b1cd5242e5a9de4be9e26381258d074f699: Status 404 returned error can't find the container with id aac9cba1e097c5ca53dfe0eb38749b1cd5242e5a9de4be9e26381258d074f699
Dec 08 09:12:37 crc kubenswrapper[4776]: I1208 09:12:37.298786 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-6jjjz" event={"ID":"4f663316-a0ef-44bd-a068-47f3e7d37a5c","Type":"ContainerStarted","Data":"aac9cba1e097c5ca53dfe0eb38749b1cd5242e5a9de4be9e26381258d074f699"}
Dec 08 09:12:37 crc kubenswrapper[4776]: I1208 09:12:37.299531 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-6vnhc" event={"ID":"8b23f8e2-638b-438a-8363-8daf30f656e6","Type":"ContainerStarted","Data":"ca5ded1e737f4b1866cad65f5b16db98211bfb96fd0883af57d3c34006b0ded1"}
Dec 08 09:12:37 crc kubenswrapper[4776]: I1208 09:12:37.301341 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-hsqbv" event={"ID":"69b03c85-8503-44fc-9e71-0357ce0cc56e","Type":"ContainerStarted","Data":"247ebfeccb4d38f1210e00d4db28540be18bf01a81e1c75b4a318667fb1c6694"}
Dec 08 09:12:42 crc kubenswrapper[4776]: I1208 09:12:42.332911 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-6vnhc"
event={"ID":"8b23f8e2-638b-438a-8363-8daf30f656e6","Type":"ContainerStarted","Data":"b7d5ce3f74241fb978070c5953659c7dc4540c55d9723dfb5bbc582fa3b8ebef"}
Dec 08 09:12:42 crc kubenswrapper[4776]: I1208 09:12:42.334366 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-hsqbv" event={"ID":"69b03c85-8503-44fc-9e71-0357ce0cc56e","Type":"ContainerStarted","Data":"487f9f115090a53923f8b5937d144813b690a163b688a6c225b87693602bce0c"}
Dec 08 09:12:42 crc kubenswrapper[4776]: I1208 09:12:42.335630 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-6jjjz" event={"ID":"4f663316-a0ef-44bd-a068-47f3e7d37a5c","Type":"ContainerStarted","Data":"29aabf41bee9aab62d1ebd904c6445eee7a3dea3c4218ff2fe2e6ffb15213af6"}
Dec 08 09:12:42 crc kubenswrapper[4776]: I1208 09:12:42.335786 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-6jjjz"
Dec 08 09:12:42 crc kubenswrapper[4776]: I1208 09:12:42.383993 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-6vnhc" podStartSLOduration=1.582733718 podStartE2EDuration="6.383977127s" podCreationTimestamp="2025-12-08 09:12:36 +0000 UTC" firstStartedPulling="2025-12-08 09:12:36.993095133 +0000 UTC m=+833.256320145" lastFinishedPulling="2025-12-08 09:12:41.794338532 +0000 UTC m=+838.057563554" observedRunningTime="2025-12-08 09:12:42.362604087 +0000 UTC m=+838.625829099" watchObservedRunningTime="2025-12-08 09:12:42.383977127 +0000 UTC m=+838.647202139"
Dec 08 09:12:42 crc kubenswrapper[4776]: I1208 09:12:42.384077 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-6jjjz" podStartSLOduration=1.848720577 podStartE2EDuration="6.384074499s" podCreationTimestamp="2025-12-08 09:12:36 +0000 UTC" firstStartedPulling="2025-12-08 09:12:37.252749121 +0000 UTC m=+833.515974143" lastFinishedPulling="2025-12-08 09:12:41.788103043 +0000 UTC m=+838.051328065" observedRunningTime="2025-12-08 09:12:42.380730599 +0000 UTC m=+838.643955631" watchObservedRunningTime="2025-12-08 09:12:42.384074499 +0000 UTC m=+838.647299521"
Dec 08 09:12:42 crc kubenswrapper[4776]: I1208 09:12:42.401911 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-hsqbv" podStartSLOduration=1.577032453 podStartE2EDuration="6.401888833s" podCreationTimestamp="2025-12-08 09:12:36 +0000 UTC" firstStartedPulling="2025-12-08 09:12:36.960040685 +0000 UTC m=+833.223265707" lastFinishedPulling="2025-12-08 09:12:41.784897065 +0000 UTC m=+838.048122087" observedRunningTime="2025-12-08 09:12:42.39735407 +0000 UTC m=+838.660579092" watchObservedRunningTime="2025-12-08 09:12:42.401888833 +0000 UTC m=+838.665113855"
Dec 08 09:12:51 crc kubenswrapper[4776]: I1208 09:12:51.644002 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-6jjjz"
Dec 08 09:13:20 crc kubenswrapper[4776]: I1208 09:13:20.656869 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fr5w6k"]
Dec 08 09:13:20 crc kubenswrapper[4776]: I1208 09:13:20.658537 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fr5w6k"
Dec 08 09:13:20 crc kubenswrapper[4776]: I1208 09:13:20.660776 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Dec 08 09:13:20 crc kubenswrapper[4776]: I1208 09:13:20.676508 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fr5w6k"]
Dec 08 09:13:20 crc kubenswrapper[4776]: I1208 09:13:20.788366 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0b36dfd9-c3b8-4858-b056-70d04434052a-util\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fr5w6k\" (UID: \"0b36dfd9-c3b8-4858-b056-70d04434052a\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fr5w6k"
Dec 08 09:13:20 crc kubenswrapper[4776]: I1208 09:13:20.788907 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq4kk\" (UniqueName: \"kubernetes.io/projected/0b36dfd9-c3b8-4858-b056-70d04434052a-kube-api-access-mq4kk\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fr5w6k\" (UID: \"0b36dfd9-c3b8-4858-b056-70d04434052a\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fr5w6k"
Dec 08 09:13:20 crc kubenswrapper[4776]: I1208 09:13:20.789023 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0b36dfd9-c3b8-4858-b056-70d04434052a-bundle\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fr5w6k\" (UID: \"0b36dfd9-c3b8-4858-b056-70d04434052a\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fr5w6k"
Dec 08 09:13:20 crc kubenswrapper[4776]:
I1208 09:13:20.890132 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0b36dfd9-c3b8-4858-b056-70d04434052a-bundle\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fr5w6k\" (UID: \"0b36dfd9-c3b8-4858-b056-70d04434052a\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fr5w6k"
Dec 08 09:13:20 crc kubenswrapper[4776]: I1208 09:13:20.890253 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0b36dfd9-c3b8-4858-b056-70d04434052a-util\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fr5w6k\" (UID: \"0b36dfd9-c3b8-4858-b056-70d04434052a\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fr5w6k"
Dec 08 09:13:20 crc kubenswrapper[4776]: I1208 09:13:20.890292 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mq4kk\" (UniqueName: \"kubernetes.io/projected/0b36dfd9-c3b8-4858-b056-70d04434052a-kube-api-access-mq4kk\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fr5w6k\" (UID: \"0b36dfd9-c3b8-4858-b056-70d04434052a\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fr5w6k"
Dec 08 09:13:20 crc kubenswrapper[4776]: I1208 09:13:20.890569 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0b36dfd9-c3b8-4858-b056-70d04434052a-bundle\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fr5w6k\" (UID: \"0b36dfd9-c3b8-4858-b056-70d04434052a\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fr5w6k"
Dec 08 09:13:20 crc kubenswrapper[4776]: I1208 09:13:20.890706 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0b36dfd9-c3b8-4858-b056-70d04434052a-util\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fr5w6k\" (UID: \"0b36dfd9-c3b8-4858-b056-70d04434052a\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fr5w6k"
Dec 08 09:13:20 crc kubenswrapper[4776]: I1208 09:13:20.924999 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq4kk\" (UniqueName: \"kubernetes.io/projected/0b36dfd9-c3b8-4858-b056-70d04434052a-kube-api-access-mq4kk\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fr5w6k\" (UID: \"0b36dfd9-c3b8-4858-b056-70d04434052a\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fr5w6k"
Dec 08 09:13:20 crc kubenswrapper[4776]: I1208 09:13:20.979479 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fr5w6k"
Dec 08 09:13:21 crc kubenswrapper[4776]: I1208 09:13:21.075139 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb85pvww"]
Dec 08 09:13:21 crc kubenswrapper[4776]: I1208 09:13:21.076607 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb85pvww"
Dec 08 09:13:21 crc kubenswrapper[4776]: I1208 09:13:21.084781 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb85pvww"]
Dec 08 09:13:21 crc kubenswrapper[4776]: I1208 09:13:21.195271 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wljwv\" (UniqueName: \"kubernetes.io/projected/20cd1aea-6a8d-458a-8697-f9193cfa6058-kube-api-access-wljwv\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb85pvww\" (UID: \"20cd1aea-6a8d-458a-8697-f9193cfa6058\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb85pvww"
Dec 08 09:13:21 crc kubenswrapper[4776]: I1208 09:13:21.195328 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/20cd1aea-6a8d-458a-8697-f9193cfa6058-util\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb85pvww\" (UID: \"20cd1aea-6a8d-458a-8697-f9193cfa6058\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb85pvww"
Dec 08 09:13:21 crc kubenswrapper[4776]: I1208 09:13:21.195376 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/20cd1aea-6a8d-458a-8697-f9193cfa6058-bundle\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb85pvww\" (UID: \"20cd1aea-6a8d-458a-8697-f9193cfa6058\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb85pvww"
Dec 08 09:13:21 crc kubenswrapper[4776]: I1208 09:13:21.296188 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wljwv\" (UniqueName:
\"kubernetes.io/projected/20cd1aea-6a8d-458a-8697-f9193cfa6058-kube-api-access-wljwv\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb85pvww\" (UID: \"20cd1aea-6a8d-458a-8697-f9193cfa6058\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb85pvww"
Dec 08 09:13:21 crc kubenswrapper[4776]: I1208 09:13:21.296479 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/20cd1aea-6a8d-458a-8697-f9193cfa6058-util\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb85pvww\" (UID: \"20cd1aea-6a8d-458a-8697-f9193cfa6058\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb85pvww"
Dec 08 09:13:21 crc kubenswrapper[4776]: I1208 09:13:21.296537 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/20cd1aea-6a8d-458a-8697-f9193cfa6058-bundle\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb85pvww\" (UID: \"20cd1aea-6a8d-458a-8697-f9193cfa6058\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb85pvww"
Dec 08 09:13:21 crc kubenswrapper[4776]: I1208 09:13:21.296899 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/20cd1aea-6a8d-458a-8697-f9193cfa6058-util\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb85pvww\" (UID: \"20cd1aea-6a8d-458a-8697-f9193cfa6058\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb85pvww"
Dec 08 09:13:21 crc kubenswrapper[4776]: I1208 09:13:21.296919 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/20cd1aea-6a8d-458a-8697-f9193cfa6058-bundle\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb85pvww\" (UID: \"20cd1aea-6a8d-458a-8697-f9193cfa6058\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb85pvww"
Dec 08 09:13:21 crc kubenswrapper[4776]: I1208 09:13:21.314238 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wljwv\" (UniqueName: \"kubernetes.io/projected/20cd1aea-6a8d-458a-8697-f9193cfa6058-kube-api-access-wljwv\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb85pvww\" (UID: \"20cd1aea-6a8d-458a-8697-f9193cfa6058\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb85pvww"
Dec 08 09:13:21 crc kubenswrapper[4776]: I1208 09:13:21.403024 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb85pvww"
Dec 08 09:13:21 crc kubenswrapper[4776]: I1208 09:13:21.452681 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fr5w6k"]
Dec 08 09:13:21 crc kubenswrapper[4776]: I1208 09:13:21.598201 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fr5w6k" event={"ID":"0b36dfd9-c3b8-4858-b056-70d04434052a","Type":"ContainerStarted","Data":"b51f68808128df2dd37aa0b220a0aaf5c9b263f4481bec658f230b0ed6cc6532"}
Dec 08 09:13:21 crc kubenswrapper[4776]: I1208 09:13:21.806860 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb85pvww"]
Dec 08 09:13:21 crc kubenswrapper[4776]: W1208 09:13:21.813783 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20cd1aea_6a8d_458a_8697_f9193cfa6058.slice/crio-0078d5a54c587a1236b97f4b733878dc02471fac47d3dc96f2b949ed4d4ade55 WatchSource:0}: Error finding container 0078d5a54c587a1236b97f4b733878dc02471fac47d3dc96f2b949ed4d4ade55: Status 404 returned error can't find the container with id 0078d5a54c587a1236b97f4b733878dc02471fac47d3dc96f2b949ed4d4ade55
Dec 08 09:13:22 crc kubenswrapper[4776]: I1208 09:13:22.604686 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb85pvww" event={"ID":"20cd1aea-6a8d-458a-8697-f9193cfa6058","Type":"ContainerStarted","Data":"0078d5a54c587a1236b97f4b733878dc02471fac47d3dc96f2b949ed4d4ade55"}
Dec 08 09:13:22 crc kubenswrapper[4776]: I1208 09:13:22.605831 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fr5w6k" event={"ID":"0b36dfd9-c3b8-4858-b056-70d04434052a","Type":"ContainerStarted","Data":"58e4de335bbf8d0b895504caef3bf53d063cf1d47cc1751e4ec832707d88026e"}
Dec 08 09:13:22 crc kubenswrapper[4776]: E1208 09:13:22.971957 4776 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b36dfd9_c3b8_4858_b056_70d04434052a.slice/crio-conmon-58e4de335bbf8d0b895504caef3bf53d063cf1d47cc1751e4ec832707d88026e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20cd1aea_6a8d_458a_8697_f9193cfa6058.slice/crio-cd1717862e3aa65c964e329c97fe382c598ebf6c2cda98bba4b8fba3bd8fc11a.scope\": RecentStats: unable to find data in memory cache]"
Dec 08 09:13:23 crc kubenswrapper[4776]: I1208 09:13:23.612950 4776 generic.go:334] "Generic (PLEG): container finished" podID="0b36dfd9-c3b8-4858-b056-70d04434052a" containerID="58e4de335bbf8d0b895504caef3bf53d063cf1d47cc1751e4ec832707d88026e" exitCode=0
Dec 08 09:13:23 crc kubenswrapper[4776]: I1208 09:13:23.613065 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fr5w6k" event={"ID":"0b36dfd9-c3b8-4858-b056-70d04434052a","Type":"ContainerDied","Data":"58e4de335bbf8d0b895504caef3bf53d063cf1d47cc1751e4ec832707d88026e"}
Dec 08 09:13:23 crc kubenswrapper[4776]: I1208 09:13:23.615410 4776 generic.go:334] "Generic (PLEG): container finished" podID="20cd1aea-6a8d-458a-8697-f9193cfa6058" containerID="cd1717862e3aa65c964e329c97fe382c598ebf6c2cda98bba4b8fba3bd8fc11a" exitCode=0
Dec 08 09:13:23 crc kubenswrapper[4776]: I1208 09:13:23.615445 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb85pvww" event={"ID":"20cd1aea-6a8d-458a-8697-f9193cfa6058","Type":"ContainerDied","Data":"cd1717862e3aa65c964e329c97fe382c598ebf6c2cda98bba4b8fba3bd8fc11a"}
Dec 08 09:13:25 crc kubenswrapper[4776]: I1208 09:13:25.647527 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fr5w6k" event={"ID":"0b36dfd9-c3b8-4858-b056-70d04434052a","Type":"ContainerStarted","Data":"807a0e6dbcd155b0bf35cfbe6a35880991f3027574b701100098e410d8b130df"}
Dec 08 09:13:26 crc kubenswrapper[4776]: I1208 09:13:26.655768 4776 generic.go:334] "Generic (PLEG): container finished" podID="0b36dfd9-c3b8-4858-b056-70d04434052a" containerID="807a0e6dbcd155b0bf35cfbe6a35880991f3027574b701100098e410d8b130df" exitCode=0
Dec 08 09:13:26 crc kubenswrapper[4776]: I1208 09:13:26.655812 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fr5w6k" event={"ID":"0b36dfd9-c3b8-4858-b056-70d04434052a","Type":"ContainerDied","Data":"807a0e6dbcd155b0bf35cfbe6a35880991f3027574b701100098e410d8b130df"}
Dec 08 09:13:27 crc kubenswrapper[4776]: I1208 09:13:27.663186 4776 generic.go:334] "Generic (PLEG): container finished" podID="0b36dfd9-c3b8-4858-b056-70d04434052a" containerID="de8b2ca69ef04430e1954c5bbda69a9c6302eaa0d7cf3f22668f5b95b61e2c06" exitCode=0
Dec 08 09:13:27 crc kubenswrapper[4776]: I1208 09:13:27.663234 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fr5w6k" event={"ID":"0b36dfd9-c3b8-4858-b056-70d04434052a","Type":"ContainerDied","Data":"de8b2ca69ef04430e1954c5bbda69a9c6302eaa0d7cf3f22668f5b95b61e2c06"}
Dec 08 09:13:28 crc kubenswrapper[4776]: I1208 09:13:28.907699 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fr5w6k"
Dec 08 09:13:29 crc kubenswrapper[4776]: I1208 09:13:29.007905 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0b36dfd9-c3b8-4858-b056-70d04434052a-util\") pod \"0b36dfd9-c3b8-4858-b056-70d04434052a\" (UID: \"0b36dfd9-c3b8-4858-b056-70d04434052a\") "
Dec 08 09:13:29 crc kubenswrapper[4776]: I1208 09:13:29.007966 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mq4kk\" (UniqueName: \"kubernetes.io/projected/0b36dfd9-c3b8-4858-b056-70d04434052a-kube-api-access-mq4kk\") pod \"0b36dfd9-c3b8-4858-b056-70d04434052a\" (UID: \"0b36dfd9-c3b8-4858-b056-70d04434052a\") "
Dec 08 09:13:29 crc kubenswrapper[4776]: I1208 09:13:29.008040 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0b36dfd9-c3b8-4858-b056-70d04434052a-bundle\") pod \"0b36dfd9-c3b8-4858-b056-70d04434052a\" (UID: \"0b36dfd9-c3b8-4858-b056-70d04434052a\") "
Dec 08 09:13:29 crc kubenswrapper[4776]: I1208 09:13:29.008994 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b36dfd9-c3b8-4858-b056-70d04434052a-bundle" (OuterVolumeSpecName: "bundle") pod "0b36dfd9-c3b8-4858-b056-70d04434052a" (UID: "0b36dfd9-c3b8-4858-b056-70d04434052a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 08 09:13:29 crc kubenswrapper[4776]: I1208 09:13:29.013650 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b36dfd9-c3b8-4858-b056-70d04434052a-kube-api-access-mq4kk" (OuterVolumeSpecName: "kube-api-access-mq4kk") pod "0b36dfd9-c3b8-4858-b056-70d04434052a" (UID: "0b36dfd9-c3b8-4858-b056-70d04434052a"). InnerVolumeSpecName "kube-api-access-mq4kk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 08 09:13:29 crc kubenswrapper[4776]: I1208 09:13:29.017946 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b36dfd9-c3b8-4858-b056-70d04434052a-util" (OuterVolumeSpecName: "util") pod "0b36dfd9-c3b8-4858-b056-70d04434052a" (UID: "0b36dfd9-c3b8-4858-b056-70d04434052a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 08 09:13:29 crc kubenswrapper[4776]: I1208 09:13:29.109596 4776 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0b36dfd9-c3b8-4858-b056-70d04434052a-bundle\") on node \"crc\" DevicePath \"\""
Dec 08 09:13:29 crc kubenswrapper[4776]: I1208 09:13:29.109636 4776 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0b36dfd9-c3b8-4858-b056-70d04434052a-util\") on node \"crc\" DevicePath \"\""
Dec 08 09:13:29 crc kubenswrapper[4776]: I1208 09:13:29.109648 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mq4kk\" (UniqueName: \"kubernetes.io/projected/0b36dfd9-c3b8-4858-b056-70d04434052a-kube-api-access-mq4kk\") on node \"crc\" DevicePath \"\""
Dec 08 09:13:29 crc kubenswrapper[4776]: I1208 09:13:29.677282 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fr5w6k" event={"ID":"0b36dfd9-c3b8-4858-b056-70d04434052a","Type":"ContainerDied","Data":"b51f68808128df2dd37aa0b220a0aaf5c9b263f4481bec658f230b0ed6cc6532"} Dec 08 09:13:29 crc kubenswrapper[4776]: I1208 09:13:29.677319 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b51f68808128df2dd37aa0b220a0aaf5c9b263f4481bec658f230b0ed6cc6532" Dec 08 09:13:29 crc kubenswrapper[4776]: I1208 09:13:29.677369 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fr5w6k" Dec 08 09:13:35 crc kubenswrapper[4776]: I1208 09:13:35.720898 4776 generic.go:334] "Generic (PLEG): container finished" podID="20cd1aea-6a8d-458a-8697-f9193cfa6058" containerID="c0dd49b33d4e72aa6f30e28372e63a5815620161f4b0884dfab6e10297da2614" exitCode=0 Dec 08 09:13:35 crc kubenswrapper[4776]: I1208 09:13:35.720940 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb85pvww" event={"ID":"20cd1aea-6a8d-458a-8697-f9193cfa6058","Type":"ContainerDied","Data":"c0dd49b33d4e72aa6f30e28372e63a5815620161f4b0884dfab6e10297da2614"} Dec 08 09:13:36 crc kubenswrapper[4776]: I1208 09:13:36.572213 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-6bfc99889d-tq44h"] Dec 08 09:13:36 crc kubenswrapper[4776]: E1208 09:13:36.572456 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b36dfd9-c3b8-4858-b056-70d04434052a" containerName="pull" Dec 08 09:13:36 crc kubenswrapper[4776]: I1208 09:13:36.572482 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b36dfd9-c3b8-4858-b056-70d04434052a" containerName="pull" Dec 08 09:13:36 crc kubenswrapper[4776]: E1208 09:13:36.572493 4776 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="0b36dfd9-c3b8-4858-b056-70d04434052a" containerName="extract" Dec 08 09:13:36 crc kubenswrapper[4776]: I1208 09:13:36.572499 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b36dfd9-c3b8-4858-b056-70d04434052a" containerName="extract" Dec 08 09:13:36 crc kubenswrapper[4776]: E1208 09:13:36.572510 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b36dfd9-c3b8-4858-b056-70d04434052a" containerName="util" Dec 08 09:13:36 crc kubenswrapper[4776]: I1208 09:13:36.572516 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b36dfd9-c3b8-4858-b056-70d04434052a" containerName="util" Dec 08 09:13:36 crc kubenswrapper[4776]: I1208 09:13:36.572645 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b36dfd9-c3b8-4858-b056-70d04434052a" containerName="extract" Dec 08 09:13:36 crc kubenswrapper[4776]: I1208 09:13:36.573264 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-6bfc99889d-tq44h" Dec 08 09:13:36 crc kubenswrapper[4776]: I1208 09:13:36.577583 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-service-cert" Dec 08 09:13:36 crc kubenswrapper[4776]: I1208 09:13:36.577653 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-dockercfg-dmdwl" Dec 08 09:13:36 crc kubenswrapper[4776]: I1208 09:13:36.577821 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"loki-operator-manager-config" Dec 08 09:13:36 crc kubenswrapper[4776]: I1208 09:13:36.578071 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"kube-root-ca.crt" Dec 08 09:13:36 crc kubenswrapper[4776]: I1208 09:13:36.578262 4776 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operators-redhat"/"loki-operator-metrics" Dec 08 09:13:36 crc kubenswrapper[4776]: I1208 09:13:36.578330 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"openshift-service-ca.crt" Dec 08 09:13:36 crc kubenswrapper[4776]: I1208 09:13:36.594227 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-6bfc99889d-tq44h"] Dec 08 09:13:36 crc kubenswrapper[4776]: I1208 09:13:36.607253 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzwhz\" (UniqueName: \"kubernetes.io/projected/9ba0d9e5-f1ab-40a6-9490-57ce8566843a-kube-api-access-vzwhz\") pod \"loki-operator-controller-manager-6bfc99889d-tq44h\" (UID: \"9ba0d9e5-f1ab-40a6-9490-57ce8566843a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6bfc99889d-tq44h" Dec 08 09:13:36 crc kubenswrapper[4776]: I1208 09:13:36.607290 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9ba0d9e5-f1ab-40a6-9490-57ce8566843a-webhook-cert\") pod \"loki-operator-controller-manager-6bfc99889d-tq44h\" (UID: \"9ba0d9e5-f1ab-40a6-9490-57ce8566843a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6bfc99889d-tq44h" Dec 08 09:13:36 crc kubenswrapper[4776]: I1208 09:13:36.607334 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9ba0d9e5-f1ab-40a6-9490-57ce8566843a-apiservice-cert\") pod \"loki-operator-controller-manager-6bfc99889d-tq44h\" (UID: \"9ba0d9e5-f1ab-40a6-9490-57ce8566843a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6bfc99889d-tq44h" Dec 08 09:13:36 crc kubenswrapper[4776]: I1208 09:13:36.607352 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/9ba0d9e5-f1ab-40a6-9490-57ce8566843a-manager-config\") pod \"loki-operator-controller-manager-6bfc99889d-tq44h\" (UID: \"9ba0d9e5-f1ab-40a6-9490-57ce8566843a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6bfc99889d-tq44h" Dec 08 09:13:36 crc kubenswrapper[4776]: I1208 09:13:36.607403 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9ba0d9e5-f1ab-40a6-9490-57ce8566843a-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-6bfc99889d-tq44h\" (UID: \"9ba0d9e5-f1ab-40a6-9490-57ce8566843a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6bfc99889d-tq44h" Dec 08 09:13:36 crc kubenswrapper[4776]: I1208 09:13:36.708521 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9ba0d9e5-f1ab-40a6-9490-57ce8566843a-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-6bfc99889d-tq44h\" (UID: \"9ba0d9e5-f1ab-40a6-9490-57ce8566843a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6bfc99889d-tq44h" Dec 08 09:13:36 crc kubenswrapper[4776]: I1208 09:13:36.709604 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzwhz\" (UniqueName: \"kubernetes.io/projected/9ba0d9e5-f1ab-40a6-9490-57ce8566843a-kube-api-access-vzwhz\") pod \"loki-operator-controller-manager-6bfc99889d-tq44h\" (UID: \"9ba0d9e5-f1ab-40a6-9490-57ce8566843a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6bfc99889d-tq44h" Dec 08 09:13:36 crc kubenswrapper[4776]: I1208 09:13:36.709630 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/9ba0d9e5-f1ab-40a6-9490-57ce8566843a-webhook-cert\") pod \"loki-operator-controller-manager-6bfc99889d-tq44h\" (UID: \"9ba0d9e5-f1ab-40a6-9490-57ce8566843a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6bfc99889d-tq44h" Dec 08 09:13:36 crc kubenswrapper[4776]: I1208 09:13:36.709671 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9ba0d9e5-f1ab-40a6-9490-57ce8566843a-apiservice-cert\") pod \"loki-operator-controller-manager-6bfc99889d-tq44h\" (UID: \"9ba0d9e5-f1ab-40a6-9490-57ce8566843a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6bfc99889d-tq44h" Dec 08 09:13:36 crc kubenswrapper[4776]: I1208 09:13:36.709690 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/9ba0d9e5-f1ab-40a6-9490-57ce8566843a-manager-config\") pod \"loki-operator-controller-manager-6bfc99889d-tq44h\" (UID: \"9ba0d9e5-f1ab-40a6-9490-57ce8566843a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6bfc99889d-tq44h" Dec 08 09:13:36 crc kubenswrapper[4776]: I1208 09:13:36.710612 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/9ba0d9e5-f1ab-40a6-9490-57ce8566843a-manager-config\") pod \"loki-operator-controller-manager-6bfc99889d-tq44h\" (UID: \"9ba0d9e5-f1ab-40a6-9490-57ce8566843a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6bfc99889d-tq44h" Dec 08 09:13:36 crc kubenswrapper[4776]: I1208 09:13:36.714840 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9ba0d9e5-f1ab-40a6-9490-57ce8566843a-apiservice-cert\") pod \"loki-operator-controller-manager-6bfc99889d-tq44h\" (UID: \"9ba0d9e5-f1ab-40a6-9490-57ce8566843a\") " 
pod="openshift-operators-redhat/loki-operator-controller-manager-6bfc99889d-tq44h" Dec 08 09:13:36 crc kubenswrapper[4776]: I1208 09:13:36.715601 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9ba0d9e5-f1ab-40a6-9490-57ce8566843a-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-6bfc99889d-tq44h\" (UID: \"9ba0d9e5-f1ab-40a6-9490-57ce8566843a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6bfc99889d-tq44h" Dec 08 09:13:36 crc kubenswrapper[4776]: I1208 09:13:36.719823 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9ba0d9e5-f1ab-40a6-9490-57ce8566843a-webhook-cert\") pod \"loki-operator-controller-manager-6bfc99889d-tq44h\" (UID: \"9ba0d9e5-f1ab-40a6-9490-57ce8566843a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6bfc99889d-tq44h" Dec 08 09:13:36 crc kubenswrapper[4776]: I1208 09:13:36.729214 4776 generic.go:334] "Generic (PLEG): container finished" podID="20cd1aea-6a8d-458a-8697-f9193cfa6058" containerID="c2e3f53424b5f8515e6643197bad2130a24f54d60d5cc815ac3cbb1ac880e781" exitCode=0 Dec 08 09:13:36 crc kubenswrapper[4776]: I1208 09:13:36.729261 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb85pvww" event={"ID":"20cd1aea-6a8d-458a-8697-f9193cfa6058","Type":"ContainerDied","Data":"c2e3f53424b5f8515e6643197bad2130a24f54d60d5cc815ac3cbb1ac880e781"} Dec 08 09:13:36 crc kubenswrapper[4776]: I1208 09:13:36.736864 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzwhz\" (UniqueName: \"kubernetes.io/projected/9ba0d9e5-f1ab-40a6-9490-57ce8566843a-kube-api-access-vzwhz\") pod \"loki-operator-controller-manager-6bfc99889d-tq44h\" (UID: \"9ba0d9e5-f1ab-40a6-9490-57ce8566843a\") " 
pod="openshift-operators-redhat/loki-operator-controller-manager-6bfc99889d-tq44h" Dec 08 09:13:36 crc kubenswrapper[4776]: I1208 09:13:36.888120 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-6bfc99889d-tq44h" Dec 08 09:13:37 crc kubenswrapper[4776]: I1208 09:13:37.215336 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-6bfc99889d-tq44h"] Dec 08 09:13:37 crc kubenswrapper[4776]: W1208 09:13:37.216909 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ba0d9e5_f1ab_40a6_9490_57ce8566843a.slice/crio-b6d5c16a5fdec92f9faf55885ac66758c3c5217658f9eedff0228f3944e29c5d WatchSource:0}: Error finding container b6d5c16a5fdec92f9faf55885ac66758c3c5217658f9eedff0228f3944e29c5d: Status 404 returned error can't find the container with id b6d5c16a5fdec92f9faf55885ac66758c3c5217658f9eedff0228f3944e29c5d Dec 08 09:13:37 crc kubenswrapper[4776]: I1208 09:13:37.736040 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-6bfc99889d-tq44h" event={"ID":"9ba0d9e5-f1ab-40a6-9490-57ce8566843a","Type":"ContainerStarted","Data":"b6d5c16a5fdec92f9faf55885ac66758c3c5217658f9eedff0228f3944e29c5d"} Dec 08 09:13:38 crc kubenswrapper[4776]: I1208 09:13:37.997702 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb85pvww" Dec 08 09:13:38 crc kubenswrapper[4776]: I1208 09:13:38.026703 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/20cd1aea-6a8d-458a-8697-f9193cfa6058-util\") pod \"20cd1aea-6a8d-458a-8697-f9193cfa6058\" (UID: \"20cd1aea-6a8d-458a-8697-f9193cfa6058\") " Dec 08 09:13:38 crc kubenswrapper[4776]: I1208 09:13:38.026752 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/20cd1aea-6a8d-458a-8697-f9193cfa6058-bundle\") pod \"20cd1aea-6a8d-458a-8697-f9193cfa6058\" (UID: \"20cd1aea-6a8d-458a-8697-f9193cfa6058\") " Dec 08 09:13:38 crc kubenswrapper[4776]: I1208 09:13:38.026874 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wljwv\" (UniqueName: \"kubernetes.io/projected/20cd1aea-6a8d-458a-8697-f9193cfa6058-kube-api-access-wljwv\") pod \"20cd1aea-6a8d-458a-8697-f9193cfa6058\" (UID: \"20cd1aea-6a8d-458a-8697-f9193cfa6058\") " Dec 08 09:13:38 crc kubenswrapper[4776]: I1208 09:13:38.027702 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20cd1aea-6a8d-458a-8697-f9193cfa6058-bundle" (OuterVolumeSpecName: "bundle") pod "20cd1aea-6a8d-458a-8697-f9193cfa6058" (UID: "20cd1aea-6a8d-458a-8697-f9193cfa6058"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:13:38 crc kubenswrapper[4776]: I1208 09:13:38.032302 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20cd1aea-6a8d-458a-8697-f9193cfa6058-kube-api-access-wljwv" (OuterVolumeSpecName: "kube-api-access-wljwv") pod "20cd1aea-6a8d-458a-8697-f9193cfa6058" (UID: "20cd1aea-6a8d-458a-8697-f9193cfa6058"). InnerVolumeSpecName "kube-api-access-wljwv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:13:38 crc kubenswrapper[4776]: I1208 09:13:38.037699 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20cd1aea-6a8d-458a-8697-f9193cfa6058-util" (OuterVolumeSpecName: "util") pod "20cd1aea-6a8d-458a-8697-f9193cfa6058" (UID: "20cd1aea-6a8d-458a-8697-f9193cfa6058"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:13:38 crc kubenswrapper[4776]: I1208 09:13:38.128140 4776 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/20cd1aea-6a8d-458a-8697-f9193cfa6058-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:13:38 crc kubenswrapper[4776]: I1208 09:13:38.128193 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wljwv\" (UniqueName: \"kubernetes.io/projected/20cd1aea-6a8d-458a-8697-f9193cfa6058-kube-api-access-wljwv\") on node \"crc\" DevicePath \"\"" Dec 08 09:13:38 crc kubenswrapper[4776]: I1208 09:13:38.128207 4776 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/20cd1aea-6a8d-458a-8697-f9193cfa6058-util\") on node \"crc\" DevicePath \"\"" Dec 08 09:13:38 crc kubenswrapper[4776]: I1208 09:13:38.763309 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb85pvww" event={"ID":"20cd1aea-6a8d-458a-8697-f9193cfa6058","Type":"ContainerDied","Data":"0078d5a54c587a1236b97f4b733878dc02471fac47d3dc96f2b949ed4d4ade55"} Dec 08 09:13:38 crc kubenswrapper[4776]: I1208 09:13:38.763351 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0078d5a54c587a1236b97f4b733878dc02471fac47d3dc96f2b949ed4d4ade55" Dec 08 09:13:38 crc kubenswrapper[4776]: I1208 09:13:38.764287 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb85pvww" Dec 08 09:13:43 crc kubenswrapper[4776]: I1208 09:13:43.793617 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-6bfc99889d-tq44h" event={"ID":"9ba0d9e5-f1ab-40a6-9490-57ce8566843a","Type":"ContainerStarted","Data":"ee8a0659d44d1aeb8be3930b1f86de19aefa043063f5afd7a63a62c886314561"} Dec 08 09:13:45 crc kubenswrapper[4776]: I1208 09:13:45.042761 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-glvzn"] Dec 08 09:13:45 crc kubenswrapper[4776]: E1208 09:13:45.043476 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20cd1aea-6a8d-458a-8697-f9193cfa6058" containerName="extract" Dec 08 09:13:45 crc kubenswrapper[4776]: I1208 09:13:45.043490 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="20cd1aea-6a8d-458a-8697-f9193cfa6058" containerName="extract" Dec 08 09:13:45 crc kubenswrapper[4776]: E1208 09:13:45.043510 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20cd1aea-6a8d-458a-8697-f9193cfa6058" containerName="pull" Dec 08 09:13:45 crc kubenswrapper[4776]: I1208 09:13:45.043518 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="20cd1aea-6a8d-458a-8697-f9193cfa6058" containerName="pull" Dec 08 09:13:45 crc kubenswrapper[4776]: E1208 09:13:45.043531 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20cd1aea-6a8d-458a-8697-f9193cfa6058" containerName="util" Dec 08 09:13:45 crc kubenswrapper[4776]: I1208 09:13:45.043538 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="20cd1aea-6a8d-458a-8697-f9193cfa6058" containerName="util" Dec 08 09:13:45 crc kubenswrapper[4776]: I1208 09:13:45.043704 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="20cd1aea-6a8d-458a-8697-f9193cfa6058" containerName="extract" Dec 08 09:13:45 crc 
kubenswrapper[4776]: I1208 09:13:45.044758 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-glvzn" Dec 08 09:13:45 crc kubenswrapper[4776]: I1208 09:13:45.048871 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-glvzn"] Dec 08 09:13:45 crc kubenswrapper[4776]: I1208 09:13:45.235793 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxn9p\" (UniqueName: \"kubernetes.io/projected/b5d1f89e-cbce-4732-852b-d6a5c7ac9df8-kube-api-access-wxn9p\") pod \"certified-operators-glvzn\" (UID: \"b5d1f89e-cbce-4732-852b-d6a5c7ac9df8\") " pod="openshift-marketplace/certified-operators-glvzn" Dec 08 09:13:45 crc kubenswrapper[4776]: I1208 09:13:45.235872 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5d1f89e-cbce-4732-852b-d6a5c7ac9df8-utilities\") pod \"certified-operators-glvzn\" (UID: \"b5d1f89e-cbce-4732-852b-d6a5c7ac9df8\") " pod="openshift-marketplace/certified-operators-glvzn" Dec 08 09:13:45 crc kubenswrapper[4776]: I1208 09:13:45.235925 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5d1f89e-cbce-4732-852b-d6a5c7ac9df8-catalog-content\") pod \"certified-operators-glvzn\" (UID: \"b5d1f89e-cbce-4732-852b-d6a5c7ac9df8\") " pod="openshift-marketplace/certified-operators-glvzn" Dec 08 09:13:45 crc kubenswrapper[4776]: I1208 09:13:45.337659 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxn9p\" (UniqueName: \"kubernetes.io/projected/b5d1f89e-cbce-4732-852b-d6a5c7ac9df8-kube-api-access-wxn9p\") pod \"certified-operators-glvzn\" (UID: \"b5d1f89e-cbce-4732-852b-d6a5c7ac9df8\") " 
pod="openshift-marketplace/certified-operators-glvzn" Dec 08 09:13:45 crc kubenswrapper[4776]: I1208 09:13:45.338187 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5d1f89e-cbce-4732-852b-d6a5c7ac9df8-utilities\") pod \"certified-operators-glvzn\" (UID: \"b5d1f89e-cbce-4732-852b-d6a5c7ac9df8\") " pod="openshift-marketplace/certified-operators-glvzn" Dec 08 09:13:45 crc kubenswrapper[4776]: I1208 09:13:45.338687 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5d1f89e-cbce-4732-852b-d6a5c7ac9df8-utilities\") pod \"certified-operators-glvzn\" (UID: \"b5d1f89e-cbce-4732-852b-d6a5c7ac9df8\") " pod="openshift-marketplace/certified-operators-glvzn" Dec 08 09:13:45 crc kubenswrapper[4776]: I1208 09:13:45.338753 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5d1f89e-cbce-4732-852b-d6a5c7ac9df8-catalog-content\") pod \"certified-operators-glvzn\" (UID: \"b5d1f89e-cbce-4732-852b-d6a5c7ac9df8\") " pod="openshift-marketplace/certified-operators-glvzn" Dec 08 09:13:45 crc kubenswrapper[4776]: I1208 09:13:45.339154 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5d1f89e-cbce-4732-852b-d6a5c7ac9df8-catalog-content\") pod \"certified-operators-glvzn\" (UID: \"b5d1f89e-cbce-4732-852b-d6a5c7ac9df8\") " pod="openshift-marketplace/certified-operators-glvzn" Dec 08 09:13:45 crc kubenswrapper[4776]: I1208 09:13:45.356258 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxn9p\" (UniqueName: \"kubernetes.io/projected/b5d1f89e-cbce-4732-852b-d6a5c7ac9df8-kube-api-access-wxn9p\") pod \"certified-operators-glvzn\" (UID: \"b5d1f89e-cbce-4732-852b-d6a5c7ac9df8\") " 
pod="openshift-marketplace/certified-operators-glvzn" Dec 08 09:13:45 crc kubenswrapper[4776]: I1208 09:13:45.362667 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-glvzn" Dec 08 09:13:49 crc kubenswrapper[4776]: I1208 09:13:49.461205 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-glvzn"] Dec 08 09:13:49 crc kubenswrapper[4776]: I1208 09:13:49.867750 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-6bfc99889d-tq44h" event={"ID":"9ba0d9e5-f1ab-40a6-9490-57ce8566843a","Type":"ContainerStarted","Data":"33cbeb4de7220d47e1df45a0fee4de0bad87d2fe3dc145f877fa608f38b39fab"} Dec 08 09:13:49 crc kubenswrapper[4776]: I1208 09:13:49.868519 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-6bfc99889d-tq44h" Dec 08 09:13:49 crc kubenswrapper[4776]: I1208 09:13:49.876726 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-6bfc99889d-tq44h" Dec 08 09:13:49 crc kubenswrapper[4776]: I1208 09:13:49.880902 4776 generic.go:334] "Generic (PLEG): container finished" podID="b5d1f89e-cbce-4732-852b-d6a5c7ac9df8" containerID="30ae26813349d9c8166559a4dcad1684bec619d754f993c7f395812af54572a4" exitCode=0 Dec 08 09:13:49 crc kubenswrapper[4776]: I1208 09:13:49.880952 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-glvzn" event={"ID":"b5d1f89e-cbce-4732-852b-d6a5c7ac9df8","Type":"ContainerDied","Data":"30ae26813349d9c8166559a4dcad1684bec619d754f993c7f395812af54572a4"} Dec 08 09:13:49 crc kubenswrapper[4776]: I1208 09:13:49.880982 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-glvzn" 
event={"ID":"b5d1f89e-cbce-4732-852b-d6a5c7ac9df8","Type":"ContainerStarted","Data":"fdc6dd45eca3f433d6c935b01ed2e141738f289a652f7929253e0113f5255874"} Dec 08 09:13:49 crc kubenswrapper[4776]: I1208 09:13:49.916166 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators-redhat/loki-operator-controller-manager-6bfc99889d-tq44h" podStartSLOduration=1.846749575 podStartE2EDuration="13.916151204s" podCreationTimestamp="2025-12-08 09:13:36 +0000 UTC" firstStartedPulling="2025-12-08 09:13:37.224700641 +0000 UTC m=+893.487925663" lastFinishedPulling="2025-12-08 09:13:49.29410228 +0000 UTC m=+905.557327292" observedRunningTime="2025-12-08 09:13:49.912906786 +0000 UTC m=+906.176131808" watchObservedRunningTime="2025-12-08 09:13:49.916151204 +0000 UTC m=+906.179376226" Dec 08 09:13:50 crc kubenswrapper[4776]: I1208 09:13:50.065531 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/cluster-logging-operator-ff9846bd-wwf9p"] Dec 08 09:13:50 crc kubenswrapper[4776]: I1208 09:13:50.066303 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/cluster-logging-operator-ff9846bd-wwf9p" Dec 08 09:13:50 crc kubenswrapper[4776]: I1208 09:13:50.067854 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"kube-root-ca.crt" Dec 08 09:13:50 crc kubenswrapper[4776]: I1208 09:13:50.068097 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"cluster-logging-operator-dockercfg-bms47" Dec 08 09:13:50 crc kubenswrapper[4776]: I1208 09:13:50.068384 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"openshift-service-ca.crt" Dec 08 09:13:50 crc kubenswrapper[4776]: I1208 09:13:50.081237 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-ff9846bd-wwf9p"] Dec 08 09:13:50 crc kubenswrapper[4776]: I1208 09:13:50.229581 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmq7x\" (UniqueName: \"kubernetes.io/projected/3e4917d5-3292-4cd8-b001-6d6bf5609def-kube-api-access-pmq7x\") pod \"cluster-logging-operator-ff9846bd-wwf9p\" (UID: \"3e4917d5-3292-4cd8-b001-6d6bf5609def\") " pod="openshift-logging/cluster-logging-operator-ff9846bd-wwf9p" Dec 08 09:13:50 crc kubenswrapper[4776]: I1208 09:13:50.331373 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmq7x\" (UniqueName: \"kubernetes.io/projected/3e4917d5-3292-4cd8-b001-6d6bf5609def-kube-api-access-pmq7x\") pod \"cluster-logging-operator-ff9846bd-wwf9p\" (UID: \"3e4917d5-3292-4cd8-b001-6d6bf5609def\") " pod="openshift-logging/cluster-logging-operator-ff9846bd-wwf9p" Dec 08 09:13:50 crc kubenswrapper[4776]: I1208 09:13:50.355801 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmq7x\" (UniqueName: \"kubernetes.io/projected/3e4917d5-3292-4cd8-b001-6d6bf5609def-kube-api-access-pmq7x\") pod 
\"cluster-logging-operator-ff9846bd-wwf9p\" (UID: \"3e4917d5-3292-4cd8-b001-6d6bf5609def\") " pod="openshift-logging/cluster-logging-operator-ff9846bd-wwf9p" Dec 08 09:13:50 crc kubenswrapper[4776]: I1208 09:13:50.384951 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/cluster-logging-operator-ff9846bd-wwf9p" Dec 08 09:13:50 crc kubenswrapper[4776]: I1208 09:13:50.649869 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-ff9846bd-wwf9p"] Dec 08 09:13:50 crc kubenswrapper[4776]: W1208 09:13:50.655124 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e4917d5_3292_4cd8_b001_6d6bf5609def.slice/crio-4acdba6fa74ab2ecc1697552339863198bb1d85b176610fc074e65515f928588 WatchSource:0}: Error finding container 4acdba6fa74ab2ecc1697552339863198bb1d85b176610fc074e65515f928588: Status 404 returned error can't find the container with id 4acdba6fa74ab2ecc1697552339863198bb1d85b176610fc074e65515f928588 Dec 08 09:13:50 crc kubenswrapper[4776]: I1208 09:13:50.886761 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-ff9846bd-wwf9p" event={"ID":"3e4917d5-3292-4cd8-b001-6d6bf5609def","Type":"ContainerStarted","Data":"4acdba6fa74ab2ecc1697552339863198bb1d85b176610fc074e65515f928588"} Dec 08 09:13:51 crc kubenswrapper[4776]: I1208 09:13:51.895457 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-glvzn" event={"ID":"b5d1f89e-cbce-4732-852b-d6a5c7ac9df8","Type":"ContainerStarted","Data":"ce0717a1418de3b393c665cb5829296753f6d08e668ed2e310dc33051829fed4"} Dec 08 09:13:52 crc kubenswrapper[4776]: I1208 09:13:52.903800 4776 generic.go:334] "Generic (PLEG): container finished" podID="b5d1f89e-cbce-4732-852b-d6a5c7ac9df8" containerID="ce0717a1418de3b393c665cb5829296753f6d08e668ed2e310dc33051829fed4" exitCode=0 
Dec 08 09:13:52 crc kubenswrapper[4776]: I1208 09:13:52.903838 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-glvzn" event={"ID":"b5d1f89e-cbce-4732-852b-d6a5c7ac9df8","Type":"ContainerDied","Data":"ce0717a1418de3b393c665cb5829296753f6d08e668ed2e310dc33051829fed4"} Dec 08 09:13:53 crc kubenswrapper[4776]: I1208 09:13:53.913992 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-glvzn" event={"ID":"b5d1f89e-cbce-4732-852b-d6a5c7ac9df8","Type":"ContainerStarted","Data":"3d7b64e27373e4eeaff3c6987e5d7d29840d3e2d6c3e1ee89631cf231d613c90"} Dec 08 09:13:53 crc kubenswrapper[4776]: I1208 09:13:53.936053 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-glvzn" podStartSLOduration=5.531085186 podStartE2EDuration="8.936035413s" podCreationTimestamp="2025-12-08 09:13:45 +0000 UTC" firstStartedPulling="2025-12-08 09:13:49.884943472 +0000 UTC m=+906.148168494" lastFinishedPulling="2025-12-08 09:13:53.289893699 +0000 UTC m=+909.553118721" observedRunningTime="2025-12-08 09:13:53.929915938 +0000 UTC m=+910.193140960" watchObservedRunningTime="2025-12-08 09:13:53.936035413 +0000 UTC m=+910.199260435" Dec 08 09:13:55 crc kubenswrapper[4776]: I1208 09:13:55.365325 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-glvzn" Dec 08 09:13:55 crc kubenswrapper[4776]: I1208 09:13:55.365547 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-glvzn" Dec 08 09:13:55 crc kubenswrapper[4776]: I1208 09:13:55.407103 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-glvzn" Dec 08 09:13:55 crc kubenswrapper[4776]: I1208 09:13:55.926382 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-logging/cluster-logging-operator-ff9846bd-wwf9p" event={"ID":"3e4917d5-3292-4cd8-b001-6d6bf5609def","Type":"ContainerStarted","Data":"969d81e227a1ffaaeee967e2b37f336a5bfd4dfdf3f5f2782414d38d344e65ed"} Dec 08 09:13:55 crc kubenswrapper[4776]: I1208 09:13:55.941929 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/cluster-logging-operator-ff9846bd-wwf9p" podStartSLOduration=1.263210649 podStartE2EDuration="5.941909273s" podCreationTimestamp="2025-12-08 09:13:50 +0000 UTC" firstStartedPulling="2025-12-08 09:13:50.657367203 +0000 UTC m=+906.920592225" lastFinishedPulling="2025-12-08 09:13:55.336065827 +0000 UTC m=+911.599290849" observedRunningTime="2025-12-08 09:13:55.941526092 +0000 UTC m=+912.204751114" watchObservedRunningTime="2025-12-08 09:13:55.941909273 +0000 UTC m=+912.205134285" Dec 08 09:13:59 crc kubenswrapper[4776]: I1208 09:13:59.025607 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-z6frj"] Dec 08 09:13:59 crc kubenswrapper[4776]: I1208 09:13:59.027355 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-z6frj" Dec 08 09:13:59 crc kubenswrapper[4776]: I1208 09:13:59.040429 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-z6frj"] Dec 08 09:13:59 crc kubenswrapper[4776]: I1208 09:13:59.169913 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f8de8d8-d342-49bd-9b86-631dd33282b6-utilities\") pod \"community-operators-z6frj\" (UID: \"1f8de8d8-d342-49bd-9b86-631dd33282b6\") " pod="openshift-marketplace/community-operators-z6frj" Dec 08 09:13:59 crc kubenswrapper[4776]: I1208 09:13:59.169986 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csxnb\" (UniqueName: \"kubernetes.io/projected/1f8de8d8-d342-49bd-9b86-631dd33282b6-kube-api-access-csxnb\") pod \"community-operators-z6frj\" (UID: \"1f8de8d8-d342-49bd-9b86-631dd33282b6\") " pod="openshift-marketplace/community-operators-z6frj" Dec 08 09:13:59 crc kubenswrapper[4776]: I1208 09:13:59.170008 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f8de8d8-d342-49bd-9b86-631dd33282b6-catalog-content\") pod \"community-operators-z6frj\" (UID: \"1f8de8d8-d342-49bd-9b86-631dd33282b6\") " pod="openshift-marketplace/community-operators-z6frj" Dec 08 09:13:59 crc kubenswrapper[4776]: I1208 09:13:59.271353 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f8de8d8-d342-49bd-9b86-631dd33282b6-utilities\") pod \"community-operators-z6frj\" (UID: \"1f8de8d8-d342-49bd-9b86-631dd33282b6\") " pod="openshift-marketplace/community-operators-z6frj" Dec 08 09:13:59 crc kubenswrapper[4776]: I1208 09:13:59.271435 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-csxnb\" (UniqueName: \"kubernetes.io/projected/1f8de8d8-d342-49bd-9b86-631dd33282b6-kube-api-access-csxnb\") pod \"community-operators-z6frj\" (UID: \"1f8de8d8-d342-49bd-9b86-631dd33282b6\") " pod="openshift-marketplace/community-operators-z6frj" Dec 08 09:13:59 crc kubenswrapper[4776]: I1208 09:13:59.271466 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f8de8d8-d342-49bd-9b86-631dd33282b6-catalog-content\") pod \"community-operators-z6frj\" (UID: \"1f8de8d8-d342-49bd-9b86-631dd33282b6\") " pod="openshift-marketplace/community-operators-z6frj" Dec 08 09:13:59 crc kubenswrapper[4776]: I1208 09:13:59.271870 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f8de8d8-d342-49bd-9b86-631dd33282b6-utilities\") pod \"community-operators-z6frj\" (UID: \"1f8de8d8-d342-49bd-9b86-631dd33282b6\") " pod="openshift-marketplace/community-operators-z6frj" Dec 08 09:13:59 crc kubenswrapper[4776]: I1208 09:13:59.272019 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f8de8d8-d342-49bd-9b86-631dd33282b6-catalog-content\") pod \"community-operators-z6frj\" (UID: \"1f8de8d8-d342-49bd-9b86-631dd33282b6\") " pod="openshift-marketplace/community-operators-z6frj" Dec 08 09:13:59 crc kubenswrapper[4776]: I1208 09:13:59.292270 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csxnb\" (UniqueName: \"kubernetes.io/projected/1f8de8d8-d342-49bd-9b86-631dd33282b6-kube-api-access-csxnb\") pod \"community-operators-z6frj\" (UID: \"1f8de8d8-d342-49bd-9b86-631dd33282b6\") " pod="openshift-marketplace/community-operators-z6frj" Dec 08 09:13:59 crc kubenswrapper[4776]: I1208 09:13:59.346654 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-z6frj" Dec 08 09:13:59 crc kubenswrapper[4776]: I1208 09:13:59.620480 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-z6frj"] Dec 08 09:13:59 crc kubenswrapper[4776]: I1208 09:13:59.950053 4776 generic.go:334] "Generic (PLEG): container finished" podID="1f8de8d8-d342-49bd-9b86-631dd33282b6" containerID="b597a4c00fd7e7aa18a5fb0913f4689d111d9eced86e43d6362e4ae6c1e517dd" exitCode=0 Dec 08 09:13:59 crc kubenswrapper[4776]: I1208 09:13:59.950089 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z6frj" event={"ID":"1f8de8d8-d342-49bd-9b86-631dd33282b6","Type":"ContainerDied","Data":"b597a4c00fd7e7aa18a5fb0913f4689d111d9eced86e43d6362e4ae6c1e517dd"} Dec 08 09:13:59 crc kubenswrapper[4776]: I1208 09:13:59.950375 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z6frj" event={"ID":"1f8de8d8-d342-49bd-9b86-631dd33282b6","Type":"ContainerStarted","Data":"cd413c55293438e2afe263219c2b092802f37c2c9999d6ba86bd5645e5447a26"} Dec 08 09:14:00 crc kubenswrapper[4776]: I1208 09:14:00.872922 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["minio-dev/minio"] Dec 08 09:14:00 crc kubenswrapper[4776]: I1208 09:14:00.873984 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="minio-dev/minio" Dec 08 09:14:00 crc kubenswrapper[4776]: I1208 09:14:00.875733 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"openshift-service-ca.crt" Dec 08 09:14:00 crc kubenswrapper[4776]: I1208 09:14:00.876115 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"kube-root-ca.crt" Dec 08 09:14:00 crc kubenswrapper[4776]: I1208 09:14:00.881789 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Dec 08 09:14:00 crc kubenswrapper[4776]: I1208 09:14:00.995093 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a5175577-9c07-47d3-a62d-cbcedb5cc27a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a5175577-9c07-47d3-a62d-cbcedb5cc27a\") pod \"minio\" (UID: \"61ff5f66-7733-4770-88fc-28f9344a73f9\") " pod="minio-dev/minio" Dec 08 09:14:00 crc kubenswrapper[4776]: I1208 09:14:00.995244 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5twxt\" (UniqueName: \"kubernetes.io/projected/61ff5f66-7733-4770-88fc-28f9344a73f9-kube-api-access-5twxt\") pod \"minio\" (UID: \"61ff5f66-7733-4770-88fc-28f9344a73f9\") " pod="minio-dev/minio" Dec 08 09:14:01 crc kubenswrapper[4776]: I1208 09:14:01.096291 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5twxt\" (UniqueName: \"kubernetes.io/projected/61ff5f66-7733-4770-88fc-28f9344a73f9-kube-api-access-5twxt\") pod \"minio\" (UID: \"61ff5f66-7733-4770-88fc-28f9344a73f9\") " pod="minio-dev/minio" Dec 08 09:14:01 crc kubenswrapper[4776]: I1208 09:14:01.096409 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a5175577-9c07-47d3-a62d-cbcedb5cc27a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a5175577-9c07-47d3-a62d-cbcedb5cc27a\") pod \"minio\" (UID: 
\"61ff5f66-7733-4770-88fc-28f9344a73f9\") " pod="minio-dev/minio" Dec 08 09:14:01 crc kubenswrapper[4776]: I1208 09:14:01.099154 4776 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 08 09:14:01 crc kubenswrapper[4776]: I1208 09:14:01.099211 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a5175577-9c07-47d3-a62d-cbcedb5cc27a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a5175577-9c07-47d3-a62d-cbcedb5cc27a\") pod \"minio\" (UID: \"61ff5f66-7733-4770-88fc-28f9344a73f9\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/289f614657ffd70c7322cde5a5ab5e6cbd18e2556ec29d78e4c32493204becc1/globalmount\"" pod="minio-dev/minio" Dec 08 09:14:01 crc kubenswrapper[4776]: I1208 09:14:01.118287 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5twxt\" (UniqueName: \"kubernetes.io/projected/61ff5f66-7733-4770-88fc-28f9344a73f9-kube-api-access-5twxt\") pod \"minio\" (UID: \"61ff5f66-7733-4770-88fc-28f9344a73f9\") " pod="minio-dev/minio" Dec 08 09:14:01 crc kubenswrapper[4776]: I1208 09:14:01.122745 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a5175577-9c07-47d3-a62d-cbcedb5cc27a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a5175577-9c07-47d3-a62d-cbcedb5cc27a\") pod \"minio\" (UID: \"61ff5f66-7733-4770-88fc-28f9344a73f9\") " pod="minio-dev/minio" Dec 08 09:14:01 crc kubenswrapper[4776]: I1208 09:14:01.187814 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="minio-dev/minio" Dec 08 09:14:01 crc kubenswrapper[4776]: I1208 09:14:01.643935 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Dec 08 09:14:01 crc kubenswrapper[4776]: W1208 09:14:01.658453 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61ff5f66_7733_4770_88fc_28f9344a73f9.slice/crio-9ca51f33e5755626bf295d77caa4fcb3194dd3db24c5a040e42207e334322351 WatchSource:0}: Error finding container 9ca51f33e5755626bf295d77caa4fcb3194dd3db24c5a040e42207e334322351: Status 404 returned error can't find the container with id 9ca51f33e5755626bf295d77caa4fcb3194dd3db24c5a040e42207e334322351 Dec 08 09:14:01 crc kubenswrapper[4776]: I1208 09:14:01.977461 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z6frj" event={"ID":"1f8de8d8-d342-49bd-9b86-631dd33282b6","Type":"ContainerStarted","Data":"ae583db904253c21579a4e1b9497a3e9196a555e93b6dca0667fafea0da45180"} Dec 08 09:14:01 crc kubenswrapper[4776]: I1208 09:14:01.978487 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"61ff5f66-7733-4770-88fc-28f9344a73f9","Type":"ContainerStarted","Data":"9ca51f33e5755626bf295d77caa4fcb3194dd3db24c5a040e42207e334322351"} Dec 08 09:14:02 crc kubenswrapper[4776]: I1208 09:14:02.418969 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kgfhw"] Dec 08 09:14:02 crc kubenswrapper[4776]: I1208 09:14:02.420616 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kgfhw" Dec 08 09:14:02 crc kubenswrapper[4776]: I1208 09:14:02.428867 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kgfhw"] Dec 08 09:14:02 crc kubenswrapper[4776]: I1208 09:14:02.621322 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9e6333e-c8a3-430c-8cf9-c51ee109a164-catalog-content\") pod \"redhat-marketplace-kgfhw\" (UID: \"c9e6333e-c8a3-430c-8cf9-c51ee109a164\") " pod="openshift-marketplace/redhat-marketplace-kgfhw" Dec 08 09:14:02 crc kubenswrapper[4776]: I1208 09:14:02.621371 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvvpm\" (UniqueName: \"kubernetes.io/projected/c9e6333e-c8a3-430c-8cf9-c51ee109a164-kube-api-access-tvvpm\") pod \"redhat-marketplace-kgfhw\" (UID: \"c9e6333e-c8a3-430c-8cf9-c51ee109a164\") " pod="openshift-marketplace/redhat-marketplace-kgfhw" Dec 08 09:14:02 crc kubenswrapper[4776]: I1208 09:14:02.621435 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9e6333e-c8a3-430c-8cf9-c51ee109a164-utilities\") pod \"redhat-marketplace-kgfhw\" (UID: \"c9e6333e-c8a3-430c-8cf9-c51ee109a164\") " pod="openshift-marketplace/redhat-marketplace-kgfhw" Dec 08 09:14:02 crc kubenswrapper[4776]: I1208 09:14:02.722438 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9e6333e-c8a3-430c-8cf9-c51ee109a164-utilities\") pod \"redhat-marketplace-kgfhw\" (UID: \"c9e6333e-c8a3-430c-8cf9-c51ee109a164\") " pod="openshift-marketplace/redhat-marketplace-kgfhw" Dec 08 09:14:02 crc kubenswrapper[4776]: I1208 09:14:02.722560 4776 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9e6333e-c8a3-430c-8cf9-c51ee109a164-catalog-content\") pod \"redhat-marketplace-kgfhw\" (UID: \"c9e6333e-c8a3-430c-8cf9-c51ee109a164\") " pod="openshift-marketplace/redhat-marketplace-kgfhw" Dec 08 09:14:02 crc kubenswrapper[4776]: I1208 09:14:02.722599 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvvpm\" (UniqueName: \"kubernetes.io/projected/c9e6333e-c8a3-430c-8cf9-c51ee109a164-kube-api-access-tvvpm\") pod \"redhat-marketplace-kgfhw\" (UID: \"c9e6333e-c8a3-430c-8cf9-c51ee109a164\") " pod="openshift-marketplace/redhat-marketplace-kgfhw" Dec 08 09:14:02 crc kubenswrapper[4776]: I1208 09:14:02.723087 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9e6333e-c8a3-430c-8cf9-c51ee109a164-utilities\") pod \"redhat-marketplace-kgfhw\" (UID: \"c9e6333e-c8a3-430c-8cf9-c51ee109a164\") " pod="openshift-marketplace/redhat-marketplace-kgfhw" Dec 08 09:14:02 crc kubenswrapper[4776]: I1208 09:14:02.723146 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9e6333e-c8a3-430c-8cf9-c51ee109a164-catalog-content\") pod \"redhat-marketplace-kgfhw\" (UID: \"c9e6333e-c8a3-430c-8cf9-c51ee109a164\") " pod="openshift-marketplace/redhat-marketplace-kgfhw" Dec 08 09:14:02 crc kubenswrapper[4776]: I1208 09:14:02.752711 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvvpm\" (UniqueName: \"kubernetes.io/projected/c9e6333e-c8a3-430c-8cf9-c51ee109a164-kube-api-access-tvvpm\") pod \"redhat-marketplace-kgfhw\" (UID: \"c9e6333e-c8a3-430c-8cf9-c51ee109a164\") " pod="openshift-marketplace/redhat-marketplace-kgfhw" Dec 08 09:14:02 crc kubenswrapper[4776]: I1208 09:14:02.793828 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kgfhw" Dec 08 09:14:02 crc kubenswrapper[4776]: I1208 09:14:02.990494 4776 generic.go:334] "Generic (PLEG): container finished" podID="1f8de8d8-d342-49bd-9b86-631dd33282b6" containerID="ae583db904253c21579a4e1b9497a3e9196a555e93b6dca0667fafea0da45180" exitCode=0 Dec 08 09:14:02 crc kubenswrapper[4776]: I1208 09:14:02.990813 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z6frj" event={"ID":"1f8de8d8-d342-49bd-9b86-631dd33282b6","Type":"ContainerDied","Data":"ae583db904253c21579a4e1b9497a3e9196a555e93b6dca0667fafea0da45180"} Dec 08 09:14:03 crc kubenswrapper[4776]: I1208 09:14:03.273500 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kgfhw"] Dec 08 09:14:03 crc kubenswrapper[4776]: W1208 09:14:03.297341 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9e6333e_c8a3_430c_8cf9_c51ee109a164.slice/crio-a8df582a99023c6e7b9d1f6ef73677591a9d2372d42c4cd9ecee462f01c7bf3e WatchSource:0}: Error finding container a8df582a99023c6e7b9d1f6ef73677591a9d2372d42c4cd9ecee462f01c7bf3e: Status 404 returned error can't find the container with id a8df582a99023c6e7b9d1f6ef73677591a9d2372d42c4cd9ecee462f01c7bf3e Dec 08 09:14:03 crc kubenswrapper[4776]: I1208 09:14:03.998410 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kgfhw" event={"ID":"c9e6333e-c8a3-430c-8cf9-c51ee109a164","Type":"ContainerStarted","Data":"a8df582a99023c6e7b9d1f6ef73677591a9d2372d42c4cd9ecee462f01c7bf3e"} Dec 08 09:14:05 crc kubenswrapper[4776]: I1208 09:14:05.401287 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-glvzn" Dec 08 09:14:06 crc kubenswrapper[4776]: I1208 09:14:06.015257 4776 generic.go:334] "Generic (PLEG): container 
finished" podID="c9e6333e-c8a3-430c-8cf9-c51ee109a164" containerID="0f501c5464d542b3c85440950e14c2886d9207ad683bacb42ee3b23587f4de1d" exitCode=0 Dec 08 09:14:06 crc kubenswrapper[4776]: I1208 09:14:06.015328 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kgfhw" event={"ID":"c9e6333e-c8a3-430c-8cf9-c51ee109a164","Type":"ContainerDied","Data":"0f501c5464d542b3c85440950e14c2886d9207ad683bacb42ee3b23587f4de1d"} Dec 08 09:14:09 crc kubenswrapper[4776]: I1208 09:14:09.036700 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"61ff5f66-7733-4770-88fc-28f9344a73f9","Type":"ContainerStarted","Data":"cb90cc396d092e38fcf7b8ead91eae88b7bbe6d1a6b4684434810bc84005e895"} Dec 08 09:14:09 crc kubenswrapper[4776]: I1208 09:14:09.039094 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z6frj" event={"ID":"1f8de8d8-d342-49bd-9b86-631dd33282b6","Type":"ContainerStarted","Data":"2cc496802e39d36f585d9bd017dad7bb9de08db666503180027ec0f559ef14a0"} Dec 08 09:14:09 crc kubenswrapper[4776]: I1208 09:14:09.041072 4776 generic.go:334] "Generic (PLEG): container finished" podID="c9e6333e-c8a3-430c-8cf9-c51ee109a164" containerID="4ad4cba424382dd207f6c5f37e27545d4652f605ce78437f3fe7729a5e008bd7" exitCode=0 Dec 08 09:14:09 crc kubenswrapper[4776]: I1208 09:14:09.041113 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kgfhw" event={"ID":"c9e6333e-c8a3-430c-8cf9-c51ee109a164","Type":"ContainerDied","Data":"4ad4cba424382dd207f6c5f37e27545d4652f605ce78437f3fe7729a5e008bd7"} Dec 08 09:14:09 crc kubenswrapper[4776]: I1208 09:14:09.057521 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="minio-dev/minio" podStartSLOduration=4.658173952 podStartE2EDuration="11.057498291s" podCreationTimestamp="2025-12-08 09:13:58 +0000 UTC" firstStartedPulling="2025-12-08 09:14:01.667745959 +0000 UTC 
m=+917.930970981" lastFinishedPulling="2025-12-08 09:14:08.067070278 +0000 UTC m=+924.330295320" observedRunningTime="2025-12-08 09:14:09.052388583 +0000 UTC m=+925.315613605" watchObservedRunningTime="2025-12-08 09:14:09.057498291 +0000 UTC m=+925.320723313" Dec 08 09:14:09 crc kubenswrapper[4776]: I1208 09:14:09.094296 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-z6frj" podStartSLOduration=2.081482981 podStartE2EDuration="10.094274473s" podCreationTimestamp="2025-12-08 09:13:59 +0000 UTC" firstStartedPulling="2025-12-08 09:13:59.951577796 +0000 UTC m=+916.214802818" lastFinishedPulling="2025-12-08 09:14:07.964369278 +0000 UTC m=+924.227594310" observedRunningTime="2025-12-08 09:14:09.090345068 +0000 UTC m=+925.353570110" watchObservedRunningTime="2025-12-08 09:14:09.094274473 +0000 UTC m=+925.357499505" Dec 08 09:14:09 crc kubenswrapper[4776]: I1208 09:14:09.209869 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-glvzn"] Dec 08 09:14:09 crc kubenswrapper[4776]: I1208 09:14:09.210094 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-glvzn" podUID="b5d1f89e-cbce-4732-852b-d6a5c7ac9df8" containerName="registry-server" containerID="cri-o://3d7b64e27373e4eeaff3c6987e5d7d29840d3e2d6c3e1ee89631cf231d613c90" gracePeriod=2 Dec 08 09:14:09 crc kubenswrapper[4776]: I1208 09:14:09.346991 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-z6frj" Dec 08 09:14:09 crc kubenswrapper[4776]: I1208 09:14:09.347036 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-z6frj" Dec 08 09:14:09 crc kubenswrapper[4776]: I1208 09:14:09.624125 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-glvzn" Dec 08 09:14:09 crc kubenswrapper[4776]: I1208 09:14:09.724056 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5d1f89e-cbce-4732-852b-d6a5c7ac9df8-utilities\") pod \"b5d1f89e-cbce-4732-852b-d6a5c7ac9df8\" (UID: \"b5d1f89e-cbce-4732-852b-d6a5c7ac9df8\") " Dec 08 09:14:09 crc kubenswrapper[4776]: I1208 09:14:09.724347 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5d1f89e-cbce-4732-852b-d6a5c7ac9df8-catalog-content\") pod \"b5d1f89e-cbce-4732-852b-d6a5c7ac9df8\" (UID: \"b5d1f89e-cbce-4732-852b-d6a5c7ac9df8\") " Dec 08 09:14:09 crc kubenswrapper[4776]: I1208 09:14:09.724425 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxn9p\" (UniqueName: \"kubernetes.io/projected/b5d1f89e-cbce-4732-852b-d6a5c7ac9df8-kube-api-access-wxn9p\") pod \"b5d1f89e-cbce-4732-852b-d6a5c7ac9df8\" (UID: \"b5d1f89e-cbce-4732-852b-d6a5c7ac9df8\") " Dec 08 09:14:09 crc kubenswrapper[4776]: I1208 09:14:09.725348 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5d1f89e-cbce-4732-852b-d6a5c7ac9df8-utilities" (OuterVolumeSpecName: "utilities") pod "b5d1f89e-cbce-4732-852b-d6a5c7ac9df8" (UID: "b5d1f89e-cbce-4732-852b-d6a5c7ac9df8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:14:09 crc kubenswrapper[4776]: I1208 09:14:09.745397 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5d1f89e-cbce-4732-852b-d6a5c7ac9df8-kube-api-access-wxn9p" (OuterVolumeSpecName: "kube-api-access-wxn9p") pod "b5d1f89e-cbce-4732-852b-d6a5c7ac9df8" (UID: "b5d1f89e-cbce-4732-852b-d6a5c7ac9df8"). InnerVolumeSpecName "kube-api-access-wxn9p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:14:09 crc kubenswrapper[4776]: I1208 09:14:09.777203 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5d1f89e-cbce-4732-852b-d6a5c7ac9df8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b5d1f89e-cbce-4732-852b-d6a5c7ac9df8" (UID: "b5d1f89e-cbce-4732-852b-d6a5c7ac9df8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:14:09 crc kubenswrapper[4776]: I1208 09:14:09.829342 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5d1f89e-cbce-4732-852b-d6a5c7ac9df8-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 09:14:09 crc kubenswrapper[4776]: I1208 09:14:09.829373 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5d1f89e-cbce-4732-852b-d6a5c7ac9df8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 09:14:09 crc kubenswrapper[4776]: I1208 09:14:09.829384 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxn9p\" (UniqueName: \"kubernetes.io/projected/b5d1f89e-cbce-4732-852b-d6a5c7ac9df8-kube-api-access-wxn9p\") on node \"crc\" DevicePath \"\"" Dec 08 09:14:10 crc kubenswrapper[4776]: I1208 09:14:10.051109 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kgfhw" event={"ID":"c9e6333e-c8a3-430c-8cf9-c51ee109a164","Type":"ContainerStarted","Data":"83577b6a40faa3652f4a81c9995618b81e0b35db8f8aa2d4e20fb498102967b8"} Dec 08 09:14:10 crc kubenswrapper[4776]: I1208 09:14:10.053279 4776 generic.go:334] "Generic (PLEG): container finished" podID="b5d1f89e-cbce-4732-852b-d6a5c7ac9df8" containerID="3d7b64e27373e4eeaff3c6987e5d7d29840d3e2d6c3e1ee89631cf231d613c90" exitCode=0 Dec 08 09:14:10 crc kubenswrapper[4776]: I1208 09:14:10.053347 4776 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-glvzn" Dec 08 09:14:10 crc kubenswrapper[4776]: I1208 09:14:10.053382 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-glvzn" event={"ID":"b5d1f89e-cbce-4732-852b-d6a5c7ac9df8","Type":"ContainerDied","Data":"3d7b64e27373e4eeaff3c6987e5d7d29840d3e2d6c3e1ee89631cf231d613c90"} Dec 08 09:14:10 crc kubenswrapper[4776]: I1208 09:14:10.053437 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-glvzn" event={"ID":"b5d1f89e-cbce-4732-852b-d6a5c7ac9df8","Type":"ContainerDied","Data":"fdc6dd45eca3f433d6c935b01ed2e141738f289a652f7929253e0113f5255874"} Dec 08 09:14:10 crc kubenswrapper[4776]: I1208 09:14:10.053465 4776 scope.go:117] "RemoveContainer" containerID="3d7b64e27373e4eeaff3c6987e5d7d29840d3e2d6c3e1ee89631cf231d613c90" Dec 08 09:14:10 crc kubenswrapper[4776]: I1208 09:14:10.069322 4776 scope.go:117] "RemoveContainer" containerID="ce0717a1418de3b393c665cb5829296753f6d08e668ed2e310dc33051829fed4" Dec 08 09:14:10 crc kubenswrapper[4776]: I1208 09:14:10.076374 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kgfhw" podStartSLOduration=4.654066334 podStartE2EDuration="8.07635817s" podCreationTimestamp="2025-12-08 09:14:02 +0000 UTC" firstStartedPulling="2025-12-08 09:14:06.109283966 +0000 UTC m=+922.372508988" lastFinishedPulling="2025-12-08 09:14:09.531575802 +0000 UTC m=+925.794800824" observedRunningTime="2025-12-08 09:14:10.072333502 +0000 UTC m=+926.335558534" watchObservedRunningTime="2025-12-08 09:14:10.07635817 +0000 UTC m=+926.339583192" Dec 08 09:14:10 crc kubenswrapper[4776]: I1208 09:14:10.087354 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-glvzn"] Dec 08 09:14:10 crc kubenswrapper[4776]: I1208 09:14:10.097439 4776 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-marketplace/certified-operators-glvzn"] Dec 08 09:14:10 crc kubenswrapper[4776]: I1208 09:14:10.098297 4776 scope.go:117] "RemoveContainer" containerID="30ae26813349d9c8166559a4dcad1684bec619d754f993c7f395812af54572a4" Dec 08 09:14:10 crc kubenswrapper[4776]: I1208 09:14:10.121603 4776 scope.go:117] "RemoveContainer" containerID="3d7b64e27373e4eeaff3c6987e5d7d29840d3e2d6c3e1ee89631cf231d613c90" Dec 08 09:14:10 crc kubenswrapper[4776]: E1208 09:14:10.122019 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d7b64e27373e4eeaff3c6987e5d7d29840d3e2d6c3e1ee89631cf231d613c90\": container with ID starting with 3d7b64e27373e4eeaff3c6987e5d7d29840d3e2d6c3e1ee89631cf231d613c90 not found: ID does not exist" containerID="3d7b64e27373e4eeaff3c6987e5d7d29840d3e2d6c3e1ee89631cf231d613c90" Dec 08 09:14:10 crc kubenswrapper[4776]: I1208 09:14:10.122054 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d7b64e27373e4eeaff3c6987e5d7d29840d3e2d6c3e1ee89631cf231d613c90"} err="failed to get container status \"3d7b64e27373e4eeaff3c6987e5d7d29840d3e2d6c3e1ee89631cf231d613c90\": rpc error: code = NotFound desc = could not find container \"3d7b64e27373e4eeaff3c6987e5d7d29840d3e2d6c3e1ee89631cf231d613c90\": container with ID starting with 3d7b64e27373e4eeaff3c6987e5d7d29840d3e2d6c3e1ee89631cf231d613c90 not found: ID does not exist" Dec 08 09:14:10 crc kubenswrapper[4776]: I1208 09:14:10.122077 4776 scope.go:117] "RemoveContainer" containerID="ce0717a1418de3b393c665cb5829296753f6d08e668ed2e310dc33051829fed4" Dec 08 09:14:10 crc kubenswrapper[4776]: E1208 09:14:10.122354 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce0717a1418de3b393c665cb5829296753f6d08e668ed2e310dc33051829fed4\": container with ID starting with 
ce0717a1418de3b393c665cb5829296753f6d08e668ed2e310dc33051829fed4 not found: ID does not exist" containerID="ce0717a1418de3b393c665cb5829296753f6d08e668ed2e310dc33051829fed4" Dec 08 09:14:10 crc kubenswrapper[4776]: I1208 09:14:10.122380 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce0717a1418de3b393c665cb5829296753f6d08e668ed2e310dc33051829fed4"} err="failed to get container status \"ce0717a1418de3b393c665cb5829296753f6d08e668ed2e310dc33051829fed4\": rpc error: code = NotFound desc = could not find container \"ce0717a1418de3b393c665cb5829296753f6d08e668ed2e310dc33051829fed4\": container with ID starting with ce0717a1418de3b393c665cb5829296753f6d08e668ed2e310dc33051829fed4 not found: ID does not exist" Dec 08 09:14:10 crc kubenswrapper[4776]: I1208 09:14:10.122397 4776 scope.go:117] "RemoveContainer" containerID="30ae26813349d9c8166559a4dcad1684bec619d754f993c7f395812af54572a4" Dec 08 09:14:10 crc kubenswrapper[4776]: E1208 09:14:10.122812 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30ae26813349d9c8166559a4dcad1684bec619d754f993c7f395812af54572a4\": container with ID starting with 30ae26813349d9c8166559a4dcad1684bec619d754f993c7f395812af54572a4 not found: ID does not exist" containerID="30ae26813349d9c8166559a4dcad1684bec619d754f993c7f395812af54572a4" Dec 08 09:14:10 crc kubenswrapper[4776]: I1208 09:14:10.122928 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30ae26813349d9c8166559a4dcad1684bec619d754f993c7f395812af54572a4"} err="failed to get container status \"30ae26813349d9c8166559a4dcad1684bec619d754f993c7f395812af54572a4\": rpc error: code = NotFound desc = could not find container \"30ae26813349d9c8166559a4dcad1684bec619d754f993c7f395812af54572a4\": container with ID starting with 30ae26813349d9c8166559a4dcad1684bec619d754f993c7f395812af54572a4 not found: ID does not 
exist" Dec 08 09:14:10 crc kubenswrapper[4776]: I1208 09:14:10.352778 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5d1f89e-cbce-4732-852b-d6a5c7ac9df8" path="/var/lib/kubelet/pods/b5d1f89e-cbce-4732-852b-d6a5c7ac9df8/volumes" Dec 08 09:14:10 crc kubenswrapper[4776]: I1208 09:14:10.405302 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-z6frj" podUID="1f8de8d8-d342-49bd-9b86-631dd33282b6" containerName="registry-server" probeResult="failure" output=< Dec 08 09:14:10 crc kubenswrapper[4776]: timeout: failed to connect service ":50051" within 1s Dec 08 09:14:10 crc kubenswrapper[4776]: > Dec 08 09:14:11 crc kubenswrapper[4776]: I1208 09:14:11.398745 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 09:14:11 crc kubenswrapper[4776]: I1208 09:14:11.399010 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 09:14:12 crc kubenswrapper[4776]: I1208 09:14:12.794458 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kgfhw" Dec 08 09:14:12 crc kubenswrapper[4776]: I1208 09:14:12.794512 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kgfhw" Dec 08 09:14:12 crc kubenswrapper[4776]: I1208 09:14:12.835237 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kgfhw" Dec 08 
09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.152531 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-distributor-76cc67bf56-w6292"] Dec 08 09:14:13 crc kubenswrapper[4776]: E1208 09:14:13.153036 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5d1f89e-cbce-4732-852b-d6a5c7ac9df8" containerName="extract-content" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.153048 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5d1f89e-cbce-4732-852b-d6a5c7ac9df8" containerName="extract-content" Dec 08 09:14:13 crc kubenswrapper[4776]: E1208 09:14:13.153060 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5d1f89e-cbce-4732-852b-d6a5c7ac9df8" containerName="extract-utilities" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.153066 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5d1f89e-cbce-4732-852b-d6a5c7ac9df8" containerName="extract-utilities" Dec 08 09:14:13 crc kubenswrapper[4776]: E1208 09:14:13.153090 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5d1f89e-cbce-4732-852b-d6a5c7ac9df8" containerName="registry-server" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.153096 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5d1f89e-cbce-4732-852b-d6a5c7ac9df8" containerName="registry-server" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.153223 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5d1f89e-cbce-4732-852b-d6a5c7ac9df8" containerName="registry-server" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.153626 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-distributor-76cc67bf56-w6292" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.156073 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-dockercfg-k48fl" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.156359 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-http" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.156465 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-config" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.156602 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-ca-bundle" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.160886 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-grpc" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.176962 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-76cc67bf56-w6292"] Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.288746 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2j9pq\" (UniqueName: \"kubernetes.io/projected/9edcc5bd-cefb-4c32-89e3-24ff105358b2-kube-api-access-2j9pq\") pod \"logging-loki-distributor-76cc67bf56-w6292\" (UID: \"9edcc5bd-cefb-4c32-89e3-24ff105358b2\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-w6292" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.288799 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9edcc5bd-cefb-4c32-89e3-24ff105358b2-config\") pod \"logging-loki-distributor-76cc67bf56-w6292\" (UID: \"9edcc5bd-cefb-4c32-89e3-24ff105358b2\") " 
pod="openshift-logging/logging-loki-distributor-76cc67bf56-w6292" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.288852 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/9edcc5bd-cefb-4c32-89e3-24ff105358b2-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-76cc67bf56-w6292\" (UID: \"9edcc5bd-cefb-4c32-89e3-24ff105358b2\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-w6292" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.288885 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9edcc5bd-cefb-4c32-89e3-24ff105358b2-logging-loki-ca-bundle\") pod \"logging-loki-distributor-76cc67bf56-w6292\" (UID: \"9edcc5bd-cefb-4c32-89e3-24ff105358b2\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-w6292" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.288904 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/9edcc5bd-cefb-4c32-89e3-24ff105358b2-logging-loki-distributor-http\") pod \"logging-loki-distributor-76cc67bf56-w6292\" (UID: \"9edcc5bd-cefb-4c32-89e3-24ff105358b2\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-w6292" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.334708 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-querier-5895d59bb8-hwrdk"] Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.335571 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-querier-5895d59bb8-hwrdk" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.337212 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-s3" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.337295 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-grpc" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.337519 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-http" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.352453 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-5895d59bb8-hwrdk"] Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.389880 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9edcc5bd-cefb-4c32-89e3-24ff105358b2-logging-loki-ca-bundle\") pod \"logging-loki-distributor-76cc67bf56-w6292\" (UID: \"9edcc5bd-cefb-4c32-89e3-24ff105358b2\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-w6292" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.389916 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/9edcc5bd-cefb-4c32-89e3-24ff105358b2-logging-loki-distributor-http\") pod \"logging-loki-distributor-76cc67bf56-w6292\" (UID: \"9edcc5bd-cefb-4c32-89e3-24ff105358b2\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-w6292" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.390002 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2j9pq\" (UniqueName: \"kubernetes.io/projected/9edcc5bd-cefb-4c32-89e3-24ff105358b2-kube-api-access-2j9pq\") pod 
\"logging-loki-distributor-76cc67bf56-w6292\" (UID: \"9edcc5bd-cefb-4c32-89e3-24ff105358b2\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-w6292" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.390023 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9edcc5bd-cefb-4c32-89e3-24ff105358b2-config\") pod \"logging-loki-distributor-76cc67bf56-w6292\" (UID: \"9edcc5bd-cefb-4c32-89e3-24ff105358b2\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-w6292" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.390096 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/9edcc5bd-cefb-4c32-89e3-24ff105358b2-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-76cc67bf56-w6292\" (UID: \"9edcc5bd-cefb-4c32-89e3-24ff105358b2\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-w6292" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.391061 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9edcc5bd-cefb-4c32-89e3-24ff105358b2-logging-loki-ca-bundle\") pod \"logging-loki-distributor-76cc67bf56-w6292\" (UID: \"9edcc5bd-cefb-4c32-89e3-24ff105358b2\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-w6292" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.391127 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9edcc5bd-cefb-4c32-89e3-24ff105358b2-config\") pod \"logging-loki-distributor-76cc67bf56-w6292\" (UID: \"9edcc5bd-cefb-4c32-89e3-24ff105358b2\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-w6292" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.406319 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/9edcc5bd-cefb-4c32-89e3-24ff105358b2-logging-loki-distributor-http\") pod \"logging-loki-distributor-76cc67bf56-w6292\" (UID: \"9edcc5bd-cefb-4c32-89e3-24ff105358b2\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-w6292" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.414869 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/9edcc5bd-cefb-4c32-89e3-24ff105358b2-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-76cc67bf56-w6292\" (UID: \"9edcc5bd-cefb-4c32-89e3-24ff105358b2\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-w6292" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.435909 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2j9pq\" (UniqueName: \"kubernetes.io/projected/9edcc5bd-cefb-4c32-89e3-24ff105358b2-kube-api-access-2j9pq\") pod \"logging-loki-distributor-76cc67bf56-w6292\" (UID: \"9edcc5bd-cefb-4c32-89e3-24ff105358b2\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-w6292" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.480573 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-distributor-76cc67bf56-w6292" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.494845 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71eade59-504f-4431-8cd8-531883c1eba7-logging-loki-ca-bundle\") pod \"logging-loki-querier-5895d59bb8-hwrdk\" (UID: \"71eade59-504f-4431-8cd8-531883c1eba7\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-hwrdk" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.494930 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71eade59-504f-4431-8cd8-531883c1eba7-config\") pod \"logging-loki-querier-5895d59bb8-hwrdk\" (UID: \"71eade59-504f-4431-8cd8-531883c1eba7\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-hwrdk" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.494947 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/71eade59-504f-4431-8cd8-531883c1eba7-logging-loki-querier-grpc\") pod \"logging-loki-querier-5895d59bb8-hwrdk\" (UID: \"71eade59-504f-4431-8cd8-531883c1eba7\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-hwrdk" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.494992 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6sz9\" (UniqueName: \"kubernetes.io/projected/71eade59-504f-4431-8cd8-531883c1eba7-kube-api-access-f6sz9\") pod \"logging-loki-querier-5895d59bb8-hwrdk\" (UID: \"71eade59-504f-4431-8cd8-531883c1eba7\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-hwrdk" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.495012 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/71eade59-504f-4431-8cd8-531883c1eba7-logging-loki-s3\") pod \"logging-loki-querier-5895d59bb8-hwrdk\" (UID: \"71eade59-504f-4431-8cd8-531883c1eba7\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-hwrdk" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.495048 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/71eade59-504f-4431-8cd8-531883c1eba7-logging-loki-querier-http\") pod \"logging-loki-querier-5895d59bb8-hwrdk\" (UID: \"71eade59-504f-4431-8cd8-531883c1eba7\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-hwrdk" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.563695 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-query-frontend-84558f7c9f-fcz8j"] Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.564491 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-fcz8j" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.569650 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-grpc" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.569964 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-http" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.575488 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-84558f7c9f-fcz8j"] Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.597202 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71eade59-504f-4431-8cd8-531883c1eba7-config\") pod \"logging-loki-querier-5895d59bb8-hwrdk\" (UID: \"71eade59-504f-4431-8cd8-531883c1eba7\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-hwrdk" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.597244 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/71eade59-504f-4431-8cd8-531883c1eba7-logging-loki-querier-grpc\") pod \"logging-loki-querier-5895d59bb8-hwrdk\" (UID: \"71eade59-504f-4431-8cd8-531883c1eba7\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-hwrdk" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.597286 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6sz9\" (UniqueName: \"kubernetes.io/projected/71eade59-504f-4431-8cd8-531883c1eba7-kube-api-access-f6sz9\") pod \"logging-loki-querier-5895d59bb8-hwrdk\" (UID: \"71eade59-504f-4431-8cd8-531883c1eba7\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-hwrdk" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.597307 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/71eade59-504f-4431-8cd8-531883c1eba7-logging-loki-s3\") pod \"logging-loki-querier-5895d59bb8-hwrdk\" (UID: \"71eade59-504f-4431-8cd8-531883c1eba7\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-hwrdk" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.597337 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/71eade59-504f-4431-8cd8-531883c1eba7-logging-loki-querier-http\") pod \"logging-loki-querier-5895d59bb8-hwrdk\" (UID: \"71eade59-504f-4431-8cd8-531883c1eba7\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-hwrdk" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.597381 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71eade59-504f-4431-8cd8-531883c1eba7-logging-loki-ca-bundle\") pod \"logging-loki-querier-5895d59bb8-hwrdk\" (UID: \"71eade59-504f-4431-8cd8-531883c1eba7\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-hwrdk" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.599073 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71eade59-504f-4431-8cd8-531883c1eba7-logging-loki-ca-bundle\") pod \"logging-loki-querier-5895d59bb8-hwrdk\" (UID: \"71eade59-504f-4431-8cd8-531883c1eba7\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-hwrdk" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.603643 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71eade59-504f-4431-8cd8-531883c1eba7-config\") pod \"logging-loki-querier-5895d59bb8-hwrdk\" (UID: \"71eade59-504f-4431-8cd8-531883c1eba7\") " 
pod="openshift-logging/logging-loki-querier-5895d59bb8-hwrdk" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.611688 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/71eade59-504f-4431-8cd8-531883c1eba7-logging-loki-querier-grpc\") pod \"logging-loki-querier-5895d59bb8-hwrdk\" (UID: \"71eade59-504f-4431-8cd8-531883c1eba7\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-hwrdk" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.634995 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/71eade59-504f-4431-8cd8-531883c1eba7-logging-loki-s3\") pod \"logging-loki-querier-5895d59bb8-hwrdk\" (UID: \"71eade59-504f-4431-8cd8-531883c1eba7\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-hwrdk" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.635941 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6sz9\" (UniqueName: \"kubernetes.io/projected/71eade59-504f-4431-8cd8-531883c1eba7-kube-api-access-f6sz9\") pod \"logging-loki-querier-5895d59bb8-hwrdk\" (UID: \"71eade59-504f-4431-8cd8-531883c1eba7\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-hwrdk" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.640700 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/71eade59-504f-4431-8cd8-531883c1eba7-logging-loki-querier-http\") pod \"logging-loki-querier-5895d59bb8-hwrdk\" (UID: \"71eade59-504f-4431-8cd8-531883c1eba7\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-hwrdk" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.651584 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-querier-5895d59bb8-hwrdk" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.699104 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2knp\" (UniqueName: \"kubernetes.io/projected/aed8f23a-7437-4eab-8dae-6ff17f9a5aa0-kube-api-access-t2knp\") pod \"logging-loki-query-frontend-84558f7c9f-fcz8j\" (UID: \"aed8f23a-7437-4eab-8dae-6ff17f9a5aa0\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-fcz8j" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.699203 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/aed8f23a-7437-4eab-8dae-6ff17f9a5aa0-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-84558f7c9f-fcz8j\" (UID: \"aed8f23a-7437-4eab-8dae-6ff17f9a5aa0\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-fcz8j" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.699239 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aed8f23a-7437-4eab-8dae-6ff17f9a5aa0-config\") pod \"logging-loki-query-frontend-84558f7c9f-fcz8j\" (UID: \"aed8f23a-7437-4eab-8dae-6ff17f9a5aa0\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-fcz8j" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.699370 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/aed8f23a-7437-4eab-8dae-6ff17f9a5aa0-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-84558f7c9f-fcz8j\" (UID: \"aed8f23a-7437-4eab-8dae-6ff17f9a5aa0\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-fcz8j" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 
09:14:13.700012 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aed8f23a-7437-4eab-8dae-6ff17f9a5aa0-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-84558f7c9f-fcz8j\" (UID: \"aed8f23a-7437-4eab-8dae-6ff17f9a5aa0\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-fcz8j" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.724654 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-54b997fdcc-nmww6"] Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.733318 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-54b997fdcc-nmww6" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.739527 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.739600 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-client-http" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.739749 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway-ca-bundle" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.739856 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-http" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.740028 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.765357 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-54b997fdcc-nmww6"] Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.779549 4776 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-logging/logging-loki-gateway-54b997fdcc-9dpbh"] Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.780708 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-54b997fdcc-9dpbh" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.790439 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-dockercfg-xdcl2" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.798141 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-54b997fdcc-9dpbh"] Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.805024 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aed8f23a-7437-4eab-8dae-6ff17f9a5aa0-config\") pod \"logging-loki-query-frontend-84558f7c9f-fcz8j\" (UID: \"aed8f23a-7437-4eab-8dae-6ff17f9a5aa0\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-fcz8j" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.805075 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/aed8f23a-7437-4eab-8dae-6ff17f9a5aa0-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-84558f7c9f-fcz8j\" (UID: \"aed8f23a-7437-4eab-8dae-6ff17f9a5aa0\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-fcz8j" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.805091 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aed8f23a-7437-4eab-8dae-6ff17f9a5aa0-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-84558f7c9f-fcz8j\" (UID: \"aed8f23a-7437-4eab-8dae-6ff17f9a5aa0\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-fcz8j" Dec 08 09:14:13 crc kubenswrapper[4776]: 
I1208 09:14:13.805164 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2knp\" (UniqueName: \"kubernetes.io/projected/aed8f23a-7437-4eab-8dae-6ff17f9a5aa0-kube-api-access-t2knp\") pod \"logging-loki-query-frontend-84558f7c9f-fcz8j\" (UID: \"aed8f23a-7437-4eab-8dae-6ff17f9a5aa0\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-fcz8j" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.805269 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/aed8f23a-7437-4eab-8dae-6ff17f9a5aa0-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-84558f7c9f-fcz8j\" (UID: \"aed8f23a-7437-4eab-8dae-6ff17f9a5aa0\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-fcz8j" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.806659 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aed8f23a-7437-4eab-8dae-6ff17f9a5aa0-config\") pod \"logging-loki-query-frontend-84558f7c9f-fcz8j\" (UID: \"aed8f23a-7437-4eab-8dae-6ff17f9a5aa0\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-fcz8j" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.806611 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aed8f23a-7437-4eab-8dae-6ff17f9a5aa0-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-84558f7c9f-fcz8j\" (UID: \"aed8f23a-7437-4eab-8dae-6ff17f9a5aa0\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-fcz8j" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.816223 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/aed8f23a-7437-4eab-8dae-6ff17f9a5aa0-logging-loki-query-frontend-http\") pod 
\"logging-loki-query-frontend-84558f7c9f-fcz8j\" (UID: \"aed8f23a-7437-4eab-8dae-6ff17f9a5aa0\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-fcz8j" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.821278 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/aed8f23a-7437-4eab-8dae-6ff17f9a5aa0-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-84558f7c9f-fcz8j\" (UID: \"aed8f23a-7437-4eab-8dae-6ff17f9a5aa0\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-fcz8j" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.824309 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2knp\" (UniqueName: \"kubernetes.io/projected/aed8f23a-7437-4eab-8dae-6ff17f9a5aa0-kube-api-access-t2knp\") pod \"logging-loki-query-frontend-84558f7c9f-fcz8j\" (UID: \"aed8f23a-7437-4eab-8dae-6ff17f9a5aa0\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-fcz8j" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.907234 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/d64b61be-4212-49da-9497-f567efa53a45-rbac\") pod \"logging-loki-gateway-54b997fdcc-nmww6\" (UID: \"d64b61be-4212-49da-9497-f567efa53a45\") " pod="openshift-logging/logging-loki-gateway-54b997fdcc-nmww6" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.907301 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/11400c14-964a-494f-80da-d878c6d2a50d-tls-secret\") pod \"logging-loki-gateway-54b997fdcc-9dpbh\" (UID: \"11400c14-964a-494f-80da-d878c6d2a50d\") " pod="openshift-logging/logging-loki-gateway-54b997fdcc-9dpbh" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.907335 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/d64b61be-4212-49da-9497-f567efa53a45-tls-secret\") pod \"logging-loki-gateway-54b997fdcc-nmww6\" (UID: \"d64b61be-4212-49da-9497-f567efa53a45\") " pod="openshift-logging/logging-loki-gateway-54b997fdcc-nmww6" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.907361 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhn4q\" (UniqueName: \"kubernetes.io/projected/11400c14-964a-494f-80da-d878c6d2a50d-kube-api-access-qhn4q\") pod \"logging-loki-gateway-54b997fdcc-9dpbh\" (UID: \"11400c14-964a-494f-80da-d878c6d2a50d\") " pod="openshift-logging/logging-loki-gateway-54b997fdcc-9dpbh" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.907386 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/d64b61be-4212-49da-9497-f567efa53a45-tenants\") pod \"logging-loki-gateway-54b997fdcc-nmww6\" (UID: \"d64b61be-4212-49da-9497-f567efa53a45\") " pod="openshift-logging/logging-loki-gateway-54b997fdcc-nmww6" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.907413 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11400c14-964a-494f-80da-d878c6d2a50d-logging-loki-ca-bundle\") pod \"logging-loki-gateway-54b997fdcc-9dpbh\" (UID: \"11400c14-964a-494f-80da-d878c6d2a50d\") " pod="openshift-logging/logging-loki-gateway-54b997fdcc-9dpbh" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.907433 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/11400c14-964a-494f-80da-d878c6d2a50d-tenants\") pod \"logging-loki-gateway-54b997fdcc-9dpbh\" (UID: \"11400c14-964a-494f-80da-d878c6d2a50d\") " 
pod="openshift-logging/logging-loki-gateway-54b997fdcc-9dpbh" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.907459 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d64b61be-4212-49da-9497-f567efa53a45-logging-loki-ca-bundle\") pod \"logging-loki-gateway-54b997fdcc-nmww6\" (UID: \"d64b61be-4212-49da-9497-f567efa53a45\") " pod="openshift-logging/logging-loki-gateway-54b997fdcc-nmww6" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.907488 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/11400c14-964a-494f-80da-d878c6d2a50d-rbac\") pod \"logging-loki-gateway-54b997fdcc-9dpbh\" (UID: \"11400c14-964a-494f-80da-d878c6d2a50d\") " pod="openshift-logging/logging-loki-gateway-54b997fdcc-9dpbh" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.907530 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/d64b61be-4212-49da-9497-f567efa53a45-lokistack-gateway\") pod \"logging-loki-gateway-54b997fdcc-nmww6\" (UID: \"d64b61be-4212-49da-9497-f567efa53a45\") " pod="openshift-logging/logging-loki-gateway-54b997fdcc-nmww6" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.907562 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/11400c14-964a-494f-80da-d878c6d2a50d-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-54b997fdcc-9dpbh\" (UID: \"11400c14-964a-494f-80da-d878c6d2a50d\") " pod="openshift-logging/logging-loki-gateway-54b997fdcc-9dpbh" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.907588 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d64b61be-4212-49da-9497-f567efa53a45-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-54b997fdcc-nmww6\" (UID: \"d64b61be-4212-49da-9497-f567efa53a45\") " pod="openshift-logging/logging-loki-gateway-54b997fdcc-nmww6" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.907609 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvtlc\" (UniqueName: \"kubernetes.io/projected/d64b61be-4212-49da-9497-f567efa53a45-kube-api-access-wvtlc\") pod \"logging-loki-gateway-54b997fdcc-nmww6\" (UID: \"d64b61be-4212-49da-9497-f567efa53a45\") " pod="openshift-logging/logging-loki-gateway-54b997fdcc-nmww6" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.907638 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11400c14-964a-494f-80da-d878c6d2a50d-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-54b997fdcc-9dpbh\" (UID: \"11400c14-964a-494f-80da-d878c6d2a50d\") " pod="openshift-logging/logging-loki-gateway-54b997fdcc-9dpbh" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.907663 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/11400c14-964a-494f-80da-d878c6d2a50d-lokistack-gateway\") pod \"logging-loki-gateway-54b997fdcc-9dpbh\" (UID: \"11400c14-964a-494f-80da-d878c6d2a50d\") " pod="openshift-logging/logging-loki-gateway-54b997fdcc-9dpbh" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.907691 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/d64b61be-4212-49da-9497-f567efa53a45-logging-loki-gateway-client-http\") pod 
\"logging-loki-gateway-54b997fdcc-nmww6\" (UID: \"d64b61be-4212-49da-9497-f567efa53a45\") " pod="openshift-logging/logging-loki-gateway-54b997fdcc-nmww6" Dec 08 09:14:13 crc kubenswrapper[4776]: I1208 09:14:13.947714 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-fcz8j" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.009383 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/d64b61be-4212-49da-9497-f567efa53a45-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-54b997fdcc-nmww6\" (UID: \"d64b61be-4212-49da-9497-f567efa53a45\") " pod="openshift-logging/logging-loki-gateway-54b997fdcc-nmww6" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.009439 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/d64b61be-4212-49da-9497-f567efa53a45-rbac\") pod \"logging-loki-gateway-54b997fdcc-nmww6\" (UID: \"d64b61be-4212-49da-9497-f567efa53a45\") " pod="openshift-logging/logging-loki-gateway-54b997fdcc-nmww6" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.009469 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/11400c14-964a-494f-80da-d878c6d2a50d-tls-secret\") pod \"logging-loki-gateway-54b997fdcc-9dpbh\" (UID: \"11400c14-964a-494f-80da-d878c6d2a50d\") " pod="openshift-logging/logging-loki-gateway-54b997fdcc-9dpbh" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.009493 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/d64b61be-4212-49da-9497-f567efa53a45-tls-secret\") pod \"logging-loki-gateway-54b997fdcc-nmww6\" (UID: \"d64b61be-4212-49da-9497-f567efa53a45\") " 
pod="openshift-logging/logging-loki-gateway-54b997fdcc-nmww6" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.009512 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhn4q\" (UniqueName: \"kubernetes.io/projected/11400c14-964a-494f-80da-d878c6d2a50d-kube-api-access-qhn4q\") pod \"logging-loki-gateway-54b997fdcc-9dpbh\" (UID: \"11400c14-964a-494f-80da-d878c6d2a50d\") " pod="openshift-logging/logging-loki-gateway-54b997fdcc-9dpbh" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.009529 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/d64b61be-4212-49da-9497-f567efa53a45-tenants\") pod \"logging-loki-gateway-54b997fdcc-nmww6\" (UID: \"d64b61be-4212-49da-9497-f567efa53a45\") " pod="openshift-logging/logging-loki-gateway-54b997fdcc-nmww6" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.009550 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11400c14-964a-494f-80da-d878c6d2a50d-logging-loki-ca-bundle\") pod \"logging-loki-gateway-54b997fdcc-9dpbh\" (UID: \"11400c14-964a-494f-80da-d878c6d2a50d\") " pod="openshift-logging/logging-loki-gateway-54b997fdcc-9dpbh" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.009568 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/11400c14-964a-494f-80da-d878c6d2a50d-tenants\") pod \"logging-loki-gateway-54b997fdcc-9dpbh\" (UID: \"11400c14-964a-494f-80da-d878c6d2a50d\") " pod="openshift-logging/logging-loki-gateway-54b997fdcc-9dpbh" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.009589 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d64b61be-4212-49da-9497-f567efa53a45-logging-loki-ca-bundle\") pod 
\"logging-loki-gateway-54b997fdcc-nmww6\" (UID: \"d64b61be-4212-49da-9497-f567efa53a45\") " pod="openshift-logging/logging-loki-gateway-54b997fdcc-nmww6" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.009618 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/11400c14-964a-494f-80da-d878c6d2a50d-rbac\") pod \"logging-loki-gateway-54b997fdcc-9dpbh\" (UID: \"11400c14-964a-494f-80da-d878c6d2a50d\") " pod="openshift-logging/logging-loki-gateway-54b997fdcc-9dpbh" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.009640 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/d64b61be-4212-49da-9497-f567efa53a45-lokistack-gateway\") pod \"logging-loki-gateway-54b997fdcc-nmww6\" (UID: \"d64b61be-4212-49da-9497-f567efa53a45\") " pod="openshift-logging/logging-loki-gateway-54b997fdcc-nmww6" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.009664 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/11400c14-964a-494f-80da-d878c6d2a50d-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-54b997fdcc-9dpbh\" (UID: \"11400c14-964a-494f-80da-d878c6d2a50d\") " pod="openshift-logging/logging-loki-gateway-54b997fdcc-9dpbh" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.009681 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvtlc\" (UniqueName: \"kubernetes.io/projected/d64b61be-4212-49da-9497-f567efa53a45-kube-api-access-wvtlc\") pod \"logging-loki-gateway-54b997fdcc-nmww6\" (UID: \"d64b61be-4212-49da-9497-f567efa53a45\") " pod="openshift-logging/logging-loki-gateway-54b997fdcc-nmww6" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.009697 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d64b61be-4212-49da-9497-f567efa53a45-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-54b997fdcc-nmww6\" (UID: \"d64b61be-4212-49da-9497-f567efa53a45\") " pod="openshift-logging/logging-loki-gateway-54b997fdcc-nmww6" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.009719 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11400c14-964a-494f-80da-d878c6d2a50d-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-54b997fdcc-9dpbh\" (UID: \"11400c14-964a-494f-80da-d878c6d2a50d\") " pod="openshift-logging/logging-loki-gateway-54b997fdcc-9dpbh" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.009739 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/11400c14-964a-494f-80da-d878c6d2a50d-lokistack-gateway\") pod \"logging-loki-gateway-54b997fdcc-9dpbh\" (UID: \"11400c14-964a-494f-80da-d878c6d2a50d\") " pod="openshift-logging/logging-loki-gateway-54b997fdcc-9dpbh" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.011576 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d64b61be-4212-49da-9497-f567efa53a45-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-54b997fdcc-nmww6\" (UID: \"d64b61be-4212-49da-9497-f567efa53a45\") " pod="openshift-logging/logging-loki-gateway-54b997fdcc-nmww6" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.012450 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d64b61be-4212-49da-9497-f567efa53a45-logging-loki-ca-bundle\") pod \"logging-loki-gateway-54b997fdcc-nmww6\" (UID: \"d64b61be-4212-49da-9497-f567efa53a45\") " 
pod="openshift-logging/logging-loki-gateway-54b997fdcc-nmww6" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.012880 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/11400c14-964a-494f-80da-d878c6d2a50d-rbac\") pod \"logging-loki-gateway-54b997fdcc-9dpbh\" (UID: \"11400c14-964a-494f-80da-d878c6d2a50d\") " pod="openshift-logging/logging-loki-gateway-54b997fdcc-9dpbh" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.012882 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/d64b61be-4212-49da-9497-f567efa53a45-rbac\") pod \"logging-loki-gateway-54b997fdcc-nmww6\" (UID: \"d64b61be-4212-49da-9497-f567efa53a45\") " pod="openshift-logging/logging-loki-gateway-54b997fdcc-nmww6" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.013416 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11400c14-964a-494f-80da-d878c6d2a50d-logging-loki-ca-bundle\") pod \"logging-loki-gateway-54b997fdcc-9dpbh\" (UID: \"11400c14-964a-494f-80da-d878c6d2a50d\") " pod="openshift-logging/logging-loki-gateway-54b997fdcc-9dpbh" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.013741 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/d64b61be-4212-49da-9497-f567efa53a45-lokistack-gateway\") pod \"logging-loki-gateway-54b997fdcc-nmww6\" (UID: \"d64b61be-4212-49da-9497-f567efa53a45\") " pod="openshift-logging/logging-loki-gateway-54b997fdcc-nmww6" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.013939 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11400c14-964a-494f-80da-d878c6d2a50d-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-54b997fdcc-9dpbh\" 
(UID: \"11400c14-964a-494f-80da-d878c6d2a50d\") " pod="openshift-logging/logging-loki-gateway-54b997fdcc-9dpbh" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.014141 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/11400c14-964a-494f-80da-d878c6d2a50d-lokistack-gateway\") pod \"logging-loki-gateway-54b997fdcc-9dpbh\" (UID: \"11400c14-964a-494f-80da-d878c6d2a50d\") " pod="openshift-logging/logging-loki-gateway-54b997fdcc-9dpbh" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.016143 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/11400c14-964a-494f-80da-d878c6d2a50d-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-54b997fdcc-9dpbh\" (UID: \"11400c14-964a-494f-80da-d878c6d2a50d\") " pod="openshift-logging/logging-loki-gateway-54b997fdcc-9dpbh" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.017629 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/d64b61be-4212-49da-9497-f567efa53a45-tls-secret\") pod \"logging-loki-gateway-54b997fdcc-nmww6\" (UID: \"d64b61be-4212-49da-9497-f567efa53a45\") " pod="openshift-logging/logging-loki-gateway-54b997fdcc-nmww6" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.017763 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/d64b61be-4212-49da-9497-f567efa53a45-tenants\") pod \"logging-loki-gateway-54b997fdcc-nmww6\" (UID: \"d64b61be-4212-49da-9497-f567efa53a45\") " pod="openshift-logging/logging-loki-gateway-54b997fdcc-nmww6" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.019035 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/11400c14-964a-494f-80da-d878c6d2a50d-tls-secret\") pod 
\"logging-loki-gateway-54b997fdcc-9dpbh\" (UID: \"11400c14-964a-494f-80da-d878c6d2a50d\") " pod="openshift-logging/logging-loki-gateway-54b997fdcc-9dpbh" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.023938 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/d64b61be-4212-49da-9497-f567efa53a45-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-54b997fdcc-nmww6\" (UID: \"d64b61be-4212-49da-9497-f567efa53a45\") " pod="openshift-logging/logging-loki-gateway-54b997fdcc-nmww6" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.028197 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvtlc\" (UniqueName: \"kubernetes.io/projected/d64b61be-4212-49da-9497-f567efa53a45-kube-api-access-wvtlc\") pod \"logging-loki-gateway-54b997fdcc-nmww6\" (UID: \"d64b61be-4212-49da-9497-f567efa53a45\") " pod="openshift-logging/logging-loki-gateway-54b997fdcc-nmww6" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.030814 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/11400c14-964a-494f-80da-d878c6d2a50d-tenants\") pod \"logging-loki-gateway-54b997fdcc-9dpbh\" (UID: \"11400c14-964a-494f-80da-d878c6d2a50d\") " pod="openshift-logging/logging-loki-gateway-54b997fdcc-9dpbh" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.035684 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhn4q\" (UniqueName: \"kubernetes.io/projected/11400c14-964a-494f-80da-d878c6d2a50d-kube-api-access-qhn4q\") pod \"logging-loki-gateway-54b997fdcc-9dpbh\" (UID: \"11400c14-964a-494f-80da-d878c6d2a50d\") " pod="openshift-logging/logging-loki-gateway-54b997fdcc-9dpbh" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.080021 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-gateway-54b997fdcc-nmww6" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.098680 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-54b997fdcc-9dpbh" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.188459 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-76cc67bf56-w6292"] Dec 08 09:14:14 crc kubenswrapper[4776]: W1208 09:14:14.188799 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9edcc5bd_cefb_4c32_89e3_24ff105358b2.slice/crio-8d014b82ce6f05827db68893742e539d5050437aa990826d1df3748e346b4a7a WatchSource:0}: Error finding container 8d014b82ce6f05827db68893742e539d5050437aa990826d1df3748e346b4a7a: Status 404 returned error can't find the container with id 8d014b82ce6f05827db68893742e539d5050437aa990826d1df3748e346b4a7a Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.251796 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-5895d59bb8-hwrdk"] Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.331837 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.335903 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-ingester-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.344019 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-grpc" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.344397 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-http" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.370078 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.445649 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-84558f7c9f-fcz8j"] Dec 08 09:14:14 crc kubenswrapper[4776]: W1208 09:14:14.445722 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaed8f23a_7437_4eab_8dae_6ff17f9a5aa0.slice/crio-c960a6db27a80e56f40be5be6b87fdffa07d5b74291b8dc86b0eec59bb580e2c WatchSource:0}: Error finding container c960a6db27a80e56f40be5be6b87fdffa07d5b74291b8dc86b0eec59bb580e2c: Status 404 returned error can't find the container with id c960a6db27a80e56f40be5be6b87fdffa07d5b74291b8dc86b0eec59bb580e2c Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.506062 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.507019 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-compactor-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.509000 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-http" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.509650 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-grpc" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.522592 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.527969 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-6e565a79-b5fb-4e7a-88ba-d782b5b47322\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6e565a79-b5fb-4e7a-88ba-d782b5b47322\") pod \"logging-loki-ingester-0\" (UID: \"c27c1242-5109-4547-8276-2dea60fad775\") " pod="openshift-logging/logging-loki-ingester-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.528022 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3315c615-f930-4f82-a423-f6da171a2385\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3315c615-f930-4f82-a423-f6da171a2385\") pod \"logging-loki-ingester-0\" (UID: \"c27c1242-5109-4547-8276-2dea60fad775\") " pod="openshift-logging/logging-loki-ingester-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.528060 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/c27c1242-5109-4547-8276-2dea60fad775-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"c27c1242-5109-4547-8276-2dea60fad775\") " pod="openshift-logging/logging-loki-ingester-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.528127 4776 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c27c1242-5109-4547-8276-2dea60fad775-config\") pod \"logging-loki-ingester-0\" (UID: \"c27c1242-5109-4547-8276-2dea60fad775\") " pod="openshift-logging/logging-loki-ingester-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.528144 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c27c1242-5109-4547-8276-2dea60fad775-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"c27c1242-5109-4547-8276-2dea60fad775\") " pod="openshift-logging/logging-loki-ingester-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.528197 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/c27c1242-5109-4547-8276-2dea60fad775-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"c27c1242-5109-4547-8276-2dea60fad775\") " pod="openshift-logging/logging-loki-ingester-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.528512 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/c27c1242-5109-4547-8276-2dea60fad775-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"c27c1242-5109-4547-8276-2dea60fad775\") " pod="openshift-logging/logging-loki-ingester-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.528539 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd6rm\" (UniqueName: \"kubernetes.io/projected/c27c1242-5109-4547-8276-2dea60fad775-kube-api-access-xd6rm\") pod \"logging-loki-ingester-0\" (UID: \"c27c1242-5109-4547-8276-2dea60fad775\") " pod="openshift-logging/logging-loki-ingester-0" Dec 
08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.594933 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-54b997fdcc-nmww6"] Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.629722 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ngl6\" (UniqueName: \"kubernetes.io/projected/62fa460d-4457-4db0-8be1-d7fa62fd7144-kube-api-access-6ngl6\") pod \"logging-loki-compactor-0\" (UID: \"62fa460d-4457-4db0-8be1-d7fa62fd7144\") " pod="openshift-logging/logging-loki-compactor-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.629794 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/c27c1242-5109-4547-8276-2dea60fad775-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"c27c1242-5109-4547-8276-2dea60fad775\") " pod="openshift-logging/logging-loki-ingester-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.630018 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xd6rm\" (UniqueName: \"kubernetes.io/projected/c27c1242-5109-4547-8276-2dea60fad775-kube-api-access-xd6rm\") pod \"logging-loki-ingester-0\" (UID: \"c27c1242-5109-4547-8276-2dea60fad775\") " pod="openshift-logging/logging-loki-ingester-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.630099 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6e565a79-b5fb-4e7a-88ba-d782b5b47322\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6e565a79-b5fb-4e7a-88ba-d782b5b47322\") pod \"logging-loki-ingester-0\" (UID: \"c27c1242-5109-4547-8276-2dea60fad775\") " pod="openshift-logging/logging-loki-ingester-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.630124 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/62fa460d-4457-4db0-8be1-d7fa62fd7144-config\") pod \"logging-loki-compactor-0\" (UID: \"62fa460d-4457-4db0-8be1-d7fa62fd7144\") " pod="openshift-logging/logging-loki-compactor-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.630142 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/62fa460d-4457-4db0-8be1-d7fa62fd7144-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"62fa460d-4457-4db0-8be1-d7fa62fd7144\") " pod="openshift-logging/logging-loki-compactor-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.630205 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3315c615-f930-4f82-a423-f6da171a2385\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3315c615-f930-4f82-a423-f6da171a2385\") pod \"logging-loki-ingester-0\" (UID: \"c27c1242-5109-4547-8276-2dea60fad775\") " pod="openshift-logging/logging-loki-ingester-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.630228 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62fa460d-4457-4db0-8be1-d7fa62fd7144-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"62fa460d-4457-4db0-8be1-d7fa62fd7144\") " pod="openshift-logging/logging-loki-compactor-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.630247 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/c27c1242-5109-4547-8276-2dea60fad775-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"c27c1242-5109-4547-8276-2dea60fad775\") " pod="openshift-logging/logging-loki-ingester-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.630274 4776 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a706924a-84c8-43b7-99e5-45524ffcfa11\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a706924a-84c8-43b7-99e5-45524ffcfa11\") pod \"logging-loki-compactor-0\" (UID: \"62fa460d-4457-4db0-8be1-d7fa62fd7144\") " pod="openshift-logging/logging-loki-compactor-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.630303 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/62fa460d-4457-4db0-8be1-d7fa62fd7144-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"62fa460d-4457-4db0-8be1-d7fa62fd7144\") " pod="openshift-logging/logging-loki-compactor-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.630326 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c27c1242-5109-4547-8276-2dea60fad775-config\") pod \"logging-loki-ingester-0\" (UID: \"c27c1242-5109-4547-8276-2dea60fad775\") " pod="openshift-logging/logging-loki-ingester-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.630342 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c27c1242-5109-4547-8276-2dea60fad775-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"c27c1242-5109-4547-8276-2dea60fad775\") " pod="openshift-logging/logging-loki-ingester-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.630368 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/c27c1242-5109-4547-8276-2dea60fad775-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"c27c1242-5109-4547-8276-2dea60fad775\") " pod="openshift-logging/logging-loki-ingester-0" Dec 
08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.630388 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/62fa460d-4457-4db0-8be1-d7fa62fd7144-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"62fa460d-4457-4db0-8be1-d7fa62fd7144\") " pod="openshift-logging/logging-loki-compactor-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.631980 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c27c1242-5109-4547-8276-2dea60fad775-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"c27c1242-5109-4547-8276-2dea60fad775\") " pod="openshift-logging/logging-loki-ingester-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.632029 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c27c1242-5109-4547-8276-2dea60fad775-config\") pod \"logging-loki-ingester-0\" (UID: \"c27c1242-5109-4547-8276-2dea60fad775\") " pod="openshift-logging/logging-loki-ingester-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.635650 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/c27c1242-5109-4547-8276-2dea60fad775-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"c27c1242-5109-4547-8276-2dea60fad775\") " pod="openshift-logging/logging-loki-ingester-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.637305 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/c27c1242-5109-4547-8276-2dea60fad775-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"c27c1242-5109-4547-8276-2dea60fad775\") " pod="openshift-logging/logging-loki-ingester-0" Dec 08 09:14:14 crc 
kubenswrapper[4776]: I1208 09:14:14.637560 4776 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.637608 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6e565a79-b5fb-4e7a-88ba-d782b5b47322\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6e565a79-b5fb-4e7a-88ba-d782b5b47322\") pod \"logging-loki-ingester-0\" (UID: \"c27c1242-5109-4547-8276-2dea60fad775\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/739478cb4e4c7006f2774d162c279604413e1139463db725b8e4304a47588446/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.637626 4776 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.637661 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3315c615-f930-4f82-a423-f6da171a2385\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3315c615-f930-4f82-a423-f6da171a2385\") pod \"logging-loki-ingester-0\" (UID: \"c27c1242-5109-4547-8276-2dea60fad775\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6372643099481417621353687f33c7ddf29c02afa6b26e6ae196eae2a8b498b8/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.640373 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/c27c1242-5109-4547-8276-2dea60fad775-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"c27c1242-5109-4547-8276-2dea60fad775\") " pod="openshift-logging/logging-loki-ingester-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 
09:14:14.648413 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd6rm\" (UniqueName: \"kubernetes.io/projected/c27c1242-5109-4547-8276-2dea60fad775-kube-api-access-xd6rm\") pod \"logging-loki-ingester-0\" (UID: \"c27c1242-5109-4547-8276-2dea60fad775\") " pod="openshift-logging/logging-loki-ingester-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.670816 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6e565a79-b5fb-4e7a-88ba-d782b5b47322\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6e565a79-b5fb-4e7a-88ba-d782b5b47322\") pod \"logging-loki-ingester-0\" (UID: \"c27c1242-5109-4547-8276-2dea60fad775\") " pod="openshift-logging/logging-loki-ingester-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.671346 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3315c615-f930-4f82-a423-f6da171a2385\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3315c615-f930-4f82-a423-f6da171a2385\") pod \"logging-loki-ingester-0\" (UID: \"c27c1242-5109-4547-8276-2dea60fad775\") " pod="openshift-logging/logging-loki-ingester-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.731712 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/62fa460d-4457-4db0-8be1-d7fa62fd7144-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"62fa460d-4457-4db0-8be1-d7fa62fd7144\") " pod="openshift-logging/logging-loki-compactor-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.731772 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ngl6\" (UniqueName: \"kubernetes.io/projected/62fa460d-4457-4db0-8be1-d7fa62fd7144-kube-api-access-6ngl6\") pod \"logging-loki-compactor-0\" (UID: \"62fa460d-4457-4db0-8be1-d7fa62fd7144\") " pod="openshift-logging/logging-loki-compactor-0" Dec 08 09:14:14 crc 
kubenswrapper[4776]: I1208 09:14:14.731850 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62fa460d-4457-4db0-8be1-d7fa62fd7144-config\") pod \"logging-loki-compactor-0\" (UID: \"62fa460d-4457-4db0-8be1-d7fa62fd7144\") " pod="openshift-logging/logging-loki-compactor-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.731896 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/62fa460d-4457-4db0-8be1-d7fa62fd7144-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"62fa460d-4457-4db0-8be1-d7fa62fd7144\") " pod="openshift-logging/logging-loki-compactor-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.731924 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62fa460d-4457-4db0-8be1-d7fa62fd7144-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"62fa460d-4457-4db0-8be1-d7fa62fd7144\") " pod="openshift-logging/logging-loki-compactor-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.731951 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a706924a-84c8-43b7-99e5-45524ffcfa11\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a706924a-84c8-43b7-99e5-45524ffcfa11\") pod \"logging-loki-compactor-0\" (UID: \"62fa460d-4457-4db0-8be1-d7fa62fd7144\") " pod="openshift-logging/logging-loki-compactor-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.731975 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/62fa460d-4457-4db0-8be1-d7fa62fd7144-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"62fa460d-4457-4db0-8be1-d7fa62fd7144\") " 
pod="openshift-logging/logging-loki-compactor-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.733100 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62fa460d-4457-4db0-8be1-d7fa62fd7144-config\") pod \"logging-loki-compactor-0\" (UID: \"62fa460d-4457-4db0-8be1-d7fa62fd7144\") " pod="openshift-logging/logging-loki-compactor-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.736671 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62fa460d-4457-4db0-8be1-d7fa62fd7144-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"62fa460d-4457-4db0-8be1-d7fa62fd7144\") " pod="openshift-logging/logging-loki-compactor-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.737817 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/62fa460d-4457-4db0-8be1-d7fa62fd7144-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"62fa460d-4457-4db0-8be1-d7fa62fd7144\") " pod="openshift-logging/logging-loki-compactor-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.737820 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/62fa460d-4457-4db0-8be1-d7fa62fd7144-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"62fa460d-4457-4db0-8be1-d7fa62fd7144\") " pod="openshift-logging/logging-loki-compactor-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.738314 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/62fa460d-4457-4db0-8be1-d7fa62fd7144-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"62fa460d-4457-4db0-8be1-d7fa62fd7144\") " pod="openshift-logging/logging-loki-compactor-0" 
Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.754167 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Dec 08 09:14:14 crc kubenswrapper[4776]: W1208 09:14:14.754261 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11400c14_964a_494f_80da_d878c6d2a50d.slice/crio-0d6de6bf5aa964fd5d5a22ea1af6c97c256f3d5dc50daa946dc7f952e5447003 WatchSource:0}: Error finding container 0d6de6bf5aa964fd5d5a22ea1af6c97c256f3d5dc50daa946dc7f952e5447003: Status 404 returned error can't find the container with id 0d6de6bf5aa964fd5d5a22ea1af6c97c256f3d5dc50daa946dc7f952e5447003 Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.755089 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.756908 4776 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.756954 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a706924a-84c8-43b7-99e5-45524ffcfa11\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a706924a-84c8-43b7-99e5-45524ffcfa11\") pod \"logging-loki-compactor-0\" (UID: \"62fa460d-4457-4db0-8be1-d7fa62fd7144\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7bda5f1080136a2427ab19835283b74ecbe7a78865afe864d82fe32a7b1c9f45/globalmount\"" pod="openshift-logging/logging-loki-compactor-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.757780 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-http" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.758259 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-grpc" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.766208 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ngl6\" (UniqueName: \"kubernetes.io/projected/62fa460d-4457-4db0-8be1-d7fa62fd7144-kube-api-access-6ngl6\") pod \"logging-loki-compactor-0\" (UID: \"62fa460d-4457-4db0-8be1-d7fa62fd7144\") " pod="openshift-logging/logging-loki-compactor-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.770970 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-54b997fdcc-9dpbh"] Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.780032 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.800794 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a706924a-84c8-43b7-99e5-45524ffcfa11\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a706924a-84c8-43b7-99e5-45524ffcfa11\") pod \"logging-loki-compactor-0\" (UID: \"62fa460d-4457-4db0-8be1-d7fa62fd7144\") " pod="openshift-logging/logging-loki-compactor-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.831600 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-compactor-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.833323 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-cd052b7f-1bc5-470f-9a5f-d21cb3b4b18f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd052b7f-1bc5-470f-9a5f-d21cb3b4b18f\") pod \"logging-loki-index-gateway-0\" (UID: \"1e9dd934-eb37-463c-890d-1021bbdc4e3f\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.833384 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/1e9dd934-eb37-463c-890d-1021bbdc4e3f-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"1e9dd934-eb37-463c-890d-1021bbdc4e3f\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.833418 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e9dd934-eb37-463c-890d-1021bbdc4e3f-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"1e9dd934-eb37-463c-890d-1021bbdc4e3f\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.833464 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: 
\"kubernetes.io/secret/1e9dd934-eb37-463c-890d-1021bbdc4e3f-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"1e9dd934-eb37-463c-890d-1021bbdc4e3f\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.833640 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/1e9dd934-eb37-463c-890d-1021bbdc4e3f-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"1e9dd934-eb37-463c-890d-1021bbdc4e3f\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.833909 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swx6x\" (UniqueName: \"kubernetes.io/projected/1e9dd934-eb37-463c-890d-1021bbdc4e3f-kube-api-access-swx6x\") pod \"logging-loki-index-gateway-0\" (UID: \"1e9dd934-eb37-463c-890d-1021bbdc4e3f\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.834023 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e9dd934-eb37-463c-890d-1021bbdc4e3f-config\") pod \"logging-loki-index-gateway-0\" (UID: \"1e9dd934-eb37-463c-890d-1021bbdc4e3f\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.936591 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/1e9dd934-eb37-463c-890d-1021bbdc4e3f-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"1e9dd934-eb37-463c-890d-1021bbdc4e3f\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.937207 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swx6x\" (UniqueName: \"kubernetes.io/projected/1e9dd934-eb37-463c-890d-1021bbdc4e3f-kube-api-access-swx6x\") pod \"logging-loki-index-gateway-0\" (UID: \"1e9dd934-eb37-463c-890d-1021bbdc4e3f\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.937243 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e9dd934-eb37-463c-890d-1021bbdc4e3f-config\") pod \"logging-loki-index-gateway-0\" (UID: \"1e9dd934-eb37-463c-890d-1021bbdc4e3f\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.937374 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-cd052b7f-1bc5-470f-9a5f-d21cb3b4b18f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd052b7f-1bc5-470f-9a5f-d21cb3b4b18f\") pod \"logging-loki-index-gateway-0\" (UID: \"1e9dd934-eb37-463c-890d-1021bbdc4e3f\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.937421 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/1e9dd934-eb37-463c-890d-1021bbdc4e3f-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"1e9dd934-eb37-463c-890d-1021bbdc4e3f\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.937471 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e9dd934-eb37-463c-890d-1021bbdc4e3f-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"1e9dd934-eb37-463c-890d-1021bbdc4e3f\") " 
pod="openshift-logging/logging-loki-index-gateway-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.937540 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/1e9dd934-eb37-463c-890d-1021bbdc4e3f-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"1e9dd934-eb37-463c-890d-1021bbdc4e3f\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.938550 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e9dd934-eb37-463c-890d-1021bbdc4e3f-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"1e9dd934-eb37-463c-890d-1021bbdc4e3f\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.938742 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e9dd934-eb37-463c-890d-1021bbdc4e3f-config\") pod \"logging-loki-index-gateway-0\" (UID: \"1e9dd934-eb37-463c-890d-1021bbdc4e3f\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.940411 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/1e9dd934-eb37-463c-890d-1021bbdc4e3f-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"1e9dd934-eb37-463c-890d-1021bbdc4e3f\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.941402 4776 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.941432 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-cd052b7f-1bc5-470f-9a5f-d21cb3b4b18f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd052b7f-1bc5-470f-9a5f-d21cb3b4b18f\") pod \"logging-loki-index-gateway-0\" (UID: \"1e9dd934-eb37-463c-890d-1021bbdc4e3f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/87c07f49aac9df28ba14060b4eb167b5a7b01e47c898319b5d8600fbe4c5207d/globalmount\"" pod="openshift-logging/logging-loki-index-gateway-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.941443 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/1e9dd934-eb37-463c-890d-1021bbdc4e3f-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"1e9dd934-eb37-463c-890d-1021bbdc4e3f\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.941451 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/1e9dd934-eb37-463c-890d-1021bbdc4e3f-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"1e9dd934-eb37-463c-890d-1021bbdc4e3f\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.959989 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swx6x\" (UniqueName: \"kubernetes.io/projected/1e9dd934-eb37-463c-890d-1021bbdc4e3f-kube-api-access-swx6x\") pod \"logging-loki-index-gateway-0\" (UID: \"1e9dd934-eb37-463c-890d-1021bbdc4e3f\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.968129 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-ingester-0" Dec 08 09:14:14 crc kubenswrapper[4776]: I1208 09:14:14.981057 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-cd052b7f-1bc5-470f-9a5f-d21cb3b4b18f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd052b7f-1bc5-470f-9a5f-d21cb3b4b18f\") pod \"logging-loki-index-gateway-0\" (UID: \"1e9dd934-eb37-463c-890d-1021bbdc4e3f\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 08 09:14:15 crc kubenswrapper[4776]: I1208 09:14:15.034216 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Dec 08 09:14:15 crc kubenswrapper[4776]: I1208 09:14:15.090659 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Dec 08 09:14:15 crc kubenswrapper[4776]: I1208 09:14:15.099874 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"62fa460d-4457-4db0-8be1-d7fa62fd7144","Type":"ContainerStarted","Data":"c90305f04b8708cdd01a10514faf4729670fb9384095d8393339ea24586a8e37"} Dec 08 09:14:15 crc kubenswrapper[4776]: I1208 09:14:15.101206 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-76cc67bf56-w6292" event={"ID":"9edcc5bd-cefb-4c32-89e3-24ff105358b2","Type":"ContainerStarted","Data":"8d014b82ce6f05827db68893742e539d5050437aa990826d1df3748e346b4a7a"} Dec 08 09:14:15 crc kubenswrapper[4776]: I1208 09:14:15.102310 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-54b997fdcc-nmww6" event={"ID":"d64b61be-4212-49da-9497-f567efa53a45","Type":"ContainerStarted","Data":"8f391d0ccae515e979cf0e97549a30624d394a7315b58fea4c608e351ff42054"} Dec 08 09:14:15 crc kubenswrapper[4776]: I1208 09:14:15.103765 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-logging/logging-loki-gateway-54b997fdcc-9dpbh" event={"ID":"11400c14-964a-494f-80da-d878c6d2a50d","Type":"ContainerStarted","Data":"0d6de6bf5aa964fd5d5a22ea1af6c97c256f3d5dc50daa946dc7f952e5447003"} Dec 08 09:14:15 crc kubenswrapper[4776]: I1208 09:14:15.105248 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-fcz8j" event={"ID":"aed8f23a-7437-4eab-8dae-6ff17f9a5aa0","Type":"ContainerStarted","Data":"c960a6db27a80e56f40be5be6b87fdffa07d5b74291b8dc86b0eec59bb580e2c"} Dec 08 09:14:15 crc kubenswrapper[4776]: I1208 09:14:15.106097 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-5895d59bb8-hwrdk" event={"ID":"71eade59-504f-4431-8cd8-531883c1eba7","Type":"ContainerStarted","Data":"e760110c738953749e4bbd3c8d2e3258e8a60e37d0e1ee86266d2b0f473c23da"} Dec 08 09:14:15 crc kubenswrapper[4776]: I1208 09:14:15.397552 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Dec 08 09:14:15 crc kubenswrapper[4776]: W1208 09:14:15.413515 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc27c1242_5109_4547_8276_2dea60fad775.slice/crio-56a5f8216f7a5f3d84f18239e50cb24b0cdbdd90f96b22ed1e89b53c60da9a6a WatchSource:0}: Error finding container 56a5f8216f7a5f3d84f18239e50cb24b0cdbdd90f96b22ed1e89b53c60da9a6a: Status 404 returned error can't find the container with id 56a5f8216f7a5f3d84f18239e50cb24b0cdbdd90f96b22ed1e89b53c60da9a6a Dec 08 09:14:15 crc kubenswrapper[4776]: I1208 09:14:15.575938 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Dec 08 09:14:16 crc kubenswrapper[4776]: I1208 09:14:16.115905 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" 
event={"ID":"1e9dd934-eb37-463c-890d-1021bbdc4e3f","Type":"ContainerStarted","Data":"0aada85ddc15c6ebb28ef49368ce61b48a30c6508b68e92bcc91917a934dcdab"} Dec 08 09:14:16 crc kubenswrapper[4776]: I1208 09:14:16.117269 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"c27c1242-5109-4547-8276-2dea60fad775","Type":"ContainerStarted","Data":"56a5f8216f7a5f3d84f18239e50cb24b0cdbdd90f96b22ed1e89b53c60da9a6a"} Dec 08 09:14:19 crc kubenswrapper[4776]: I1208 09:14:19.139996 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-54b997fdcc-9dpbh" event={"ID":"11400c14-964a-494f-80da-d878c6d2a50d","Type":"ContainerStarted","Data":"488256f0d28967acadf8e1679b726c64f50df848656f1d69841af01fe22a974e"} Dec 08 09:14:19 crc kubenswrapper[4776]: I1208 09:14:19.142108 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-5895d59bb8-hwrdk" event={"ID":"71eade59-504f-4431-8cd8-531883c1eba7","Type":"ContainerStarted","Data":"00c1397cecd5e9b6cf993e7665b8ac5f5456fb6cd070f48213d2f6fcacda9d7f"} Dec 08 09:14:19 crc kubenswrapper[4776]: I1208 09:14:19.143483 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"1e9dd934-eb37-463c-890d-1021bbdc4e3f","Type":"ContainerStarted","Data":"8020cb2f12ad8c9c3d0f2ac0329830129a12435ec9dfcddb6beccfb669e647e1"} Dec 08 09:14:19 crc kubenswrapper[4776]: I1208 09:14:19.143569 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-index-gateway-0" Dec 08 09:14:19 crc kubenswrapper[4776]: I1208 09:14:19.144771 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"62fa460d-4457-4db0-8be1-d7fa62fd7144","Type":"ContainerStarted","Data":"d1857f3ecea722b69a6f15266827dc95ac7b255351f7dd625ff5d57c6ab4322c"} Dec 08 09:14:19 crc 
kubenswrapper[4776]: I1208 09:14:19.144852 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-compactor-0" Dec 08 09:14:19 crc kubenswrapper[4776]: I1208 09:14:19.147005 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-76cc67bf56-w6292" event={"ID":"9edcc5bd-cefb-4c32-89e3-24ff105358b2","Type":"ContainerStarted","Data":"64d8cc9e6b45b4f60f407c2312259e3bba2cd377ade52f57238034be1208ccf1"} Dec 08 09:14:19 crc kubenswrapper[4776]: I1208 09:14:19.148449 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-54b997fdcc-nmww6" event={"ID":"d64b61be-4212-49da-9497-f567efa53a45","Type":"ContainerStarted","Data":"cf788a117f3726a90c248d522e086ce0c3258e3a867a19f006ae60b84c994ead"} Dec 08 09:14:19 crc kubenswrapper[4776]: I1208 09:14:19.149803 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-fcz8j" event={"ID":"aed8f23a-7437-4eab-8dae-6ff17f9a5aa0","Type":"ContainerStarted","Data":"f32c8e15dd70b5e7ab9a40a6a4d0f831955dbff8e79a9e5b9ddcbeb8259ff1b4"} Dec 08 09:14:19 crc kubenswrapper[4776]: I1208 09:14:19.150300 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-fcz8j" Dec 08 09:14:19 crc kubenswrapper[4776]: I1208 09:14:19.151481 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"c27c1242-5109-4547-8276-2dea60fad775","Type":"ContainerStarted","Data":"ead7897226628479377e375f996e9ecfe64f12a010d4f668ea9dd716712be922"} Dec 08 09:14:19 crc kubenswrapper[4776]: I1208 09:14:19.151964 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-ingester-0" Dec 08 09:14:19 crc kubenswrapper[4776]: I1208 09:14:19.191959 4776 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-logging/logging-loki-querier-5895d59bb8-hwrdk" podStartSLOduration=2.521537964 podStartE2EDuration="6.191937754s" podCreationTimestamp="2025-12-08 09:14:13 +0000 UTC" firstStartedPulling="2025-12-08 09:14:14.270299976 +0000 UTC m=+930.533524998" lastFinishedPulling="2025-12-08 09:14:17.940699766 +0000 UTC m=+934.203924788" observedRunningTime="2025-12-08 09:14:19.172964063 +0000 UTC m=+935.436189095" watchObservedRunningTime="2025-12-08 09:14:19.191937754 +0000 UTC m=+935.455162776" Dec 08 09:14:19 crc kubenswrapper[4776]: I1208 09:14:19.193236 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-distributor-76cc67bf56-w6292" podStartSLOduration=2.435612177 podStartE2EDuration="6.19322607s" podCreationTimestamp="2025-12-08 09:14:13 +0000 UTC" firstStartedPulling="2025-12-08 09:14:14.192742583 +0000 UTC m=+930.455967605" lastFinishedPulling="2025-12-08 09:14:17.950356476 +0000 UTC m=+934.213581498" observedRunningTime="2025-12-08 09:14:19.18470745 +0000 UTC m=+935.447932482" watchObservedRunningTime="2025-12-08 09:14:19.19322607 +0000 UTC m=+935.456451102" Dec 08 09:14:19 crc kubenswrapper[4776]: I1208 09:14:19.232963 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-compactor-0" podStartSLOduration=3.405626168 podStartE2EDuration="6.232940521s" podCreationTimestamp="2025-12-08 09:14:13 +0000 UTC" firstStartedPulling="2025-12-08 09:14:15.058850861 +0000 UTC m=+931.322075883" lastFinishedPulling="2025-12-08 09:14:17.886165224 +0000 UTC m=+934.149390236" observedRunningTime="2025-12-08 09:14:19.216389904 +0000 UTC m=+935.479614936" watchObservedRunningTime="2025-12-08 09:14:19.232940521 +0000 UTC m=+935.496165543" Dec 08 09:14:19 crc kubenswrapper[4776]: I1208 09:14:19.233547 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-ingester-0" podStartSLOduration=3.696587148 
podStartE2EDuration="6.233541417s" podCreationTimestamp="2025-12-08 09:14:13 +0000 UTC" firstStartedPulling="2025-12-08 09:14:15.415302218 +0000 UTC m=+931.678527240" lastFinishedPulling="2025-12-08 09:14:17.952256487 +0000 UTC m=+934.215481509" observedRunningTime="2025-12-08 09:14:19.229759295 +0000 UTC m=+935.492984337" watchObservedRunningTime="2025-12-08 09:14:19.233541417 +0000 UTC m=+935.496766439" Dec 08 09:14:19 crc kubenswrapper[4776]: I1208 09:14:19.253380 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-fcz8j" podStartSLOduration=2.799157525 podStartE2EDuration="6.253363342s" podCreationTimestamp="2025-12-08 09:14:13 +0000 UTC" firstStartedPulling="2025-12-08 09:14:14.447060625 +0000 UTC m=+930.710285647" lastFinishedPulling="2025-12-08 09:14:17.901266442 +0000 UTC m=+934.164491464" observedRunningTime="2025-12-08 09:14:19.247658928 +0000 UTC m=+935.510883950" watchObservedRunningTime="2025-12-08 09:14:19.253363342 +0000 UTC m=+935.516588364" Dec 08 09:14:19 crc kubenswrapper[4776]: I1208 09:14:19.277132 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-index-gateway-0" podStartSLOduration=3.917270114 podStartE2EDuration="6.277114553s" podCreationTimestamp="2025-12-08 09:14:13 +0000 UTC" firstStartedPulling="2025-12-08 09:14:15.592366347 +0000 UTC m=+931.855591369" lastFinishedPulling="2025-12-08 09:14:17.952210776 +0000 UTC m=+934.215435808" observedRunningTime="2025-12-08 09:14:19.270834324 +0000 UTC m=+935.534059356" watchObservedRunningTime="2025-12-08 09:14:19.277114553 +0000 UTC m=+935.540339575" Dec 08 09:14:19 crc kubenswrapper[4776]: I1208 09:14:19.389902 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-z6frj" Dec 08 09:14:19 crc kubenswrapper[4776]: I1208 09:14:19.433380 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/community-operators-z6frj" Dec 08 09:14:19 crc kubenswrapper[4776]: I1208 09:14:19.619724 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-z6frj"] Dec 08 09:14:20 crc kubenswrapper[4776]: I1208 09:14:20.160077 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-querier-5895d59bb8-hwrdk" Dec 08 09:14:20 crc kubenswrapper[4776]: I1208 09:14:20.160132 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-distributor-76cc67bf56-w6292" Dec 08 09:14:21 crc kubenswrapper[4776]: I1208 09:14:21.164948 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-54b997fdcc-nmww6" event={"ID":"d64b61be-4212-49da-9497-f567efa53a45","Type":"ContainerStarted","Data":"78445c7c308bc9570889f6080660eaac4bca3da456e6acce31e686717be4da6a"} Dec 08 09:14:21 crc kubenswrapper[4776]: I1208 09:14:21.165389 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-54b997fdcc-nmww6" Dec 08 09:14:21 crc kubenswrapper[4776]: I1208 09:14:21.165414 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-54b997fdcc-nmww6" Dec 08 09:14:21 crc kubenswrapper[4776]: I1208 09:14:21.168536 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-54b997fdcc-9dpbh" event={"ID":"11400c14-964a-494f-80da-d878c6d2a50d","Type":"ContainerStarted","Data":"60703f41bfe5c474153d768c3d2b1c34b84f82b8c208cc97bd5a99f683b7e8df"} Dec 08 09:14:21 crc kubenswrapper[4776]: I1208 09:14:21.168680 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-z6frj" podUID="1f8de8d8-d342-49bd-9b86-631dd33282b6" containerName="registry-server" 
containerID="cri-o://2cc496802e39d36f585d9bd017dad7bb9de08db666503180027ec0f559ef14a0" gracePeriod=2 Dec 08 09:14:21 crc kubenswrapper[4776]: I1208 09:14:21.180516 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-54b997fdcc-nmww6" Dec 08 09:14:21 crc kubenswrapper[4776]: I1208 09:14:21.187950 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-54b997fdcc-nmww6" podStartSLOduration=2.316140133 podStartE2EDuration="8.187928828s" podCreationTimestamp="2025-12-08 09:14:13 +0000 UTC" firstStartedPulling="2025-12-08 09:14:14.598998404 +0000 UTC m=+930.862223416" lastFinishedPulling="2025-12-08 09:14:20.470787099 +0000 UTC m=+936.734012111" observedRunningTime="2025-12-08 09:14:21.18540885 +0000 UTC m=+937.448633882" watchObservedRunningTime="2025-12-08 09:14:21.187928828 +0000 UTC m=+937.451153860" Dec 08 09:14:21 crc kubenswrapper[4776]: I1208 09:14:21.193011 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-54b997fdcc-nmww6" Dec 08 09:14:21 crc kubenswrapper[4776]: I1208 09:14:21.235702 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-54b997fdcc-9dpbh" podStartSLOduration=2.53319292 podStartE2EDuration="8.235684736s" podCreationTimestamp="2025-12-08 09:14:13 +0000 UTC" firstStartedPulling="2025-12-08 09:14:14.760533123 +0000 UTC m=+931.023758145" lastFinishedPulling="2025-12-08 09:14:20.463024939 +0000 UTC m=+936.726249961" observedRunningTime="2025-12-08 09:14:21.23211239 +0000 UTC m=+937.495337422" watchObservedRunningTime="2025-12-08 09:14:21.235684736 +0000 UTC m=+937.498909758" Dec 08 09:14:21 crc kubenswrapper[4776]: I1208 09:14:21.695421 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-z6frj" Dec 08 09:14:21 crc kubenswrapper[4776]: I1208 09:14:21.782345 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f8de8d8-d342-49bd-9b86-631dd33282b6-catalog-content\") pod \"1f8de8d8-d342-49bd-9b86-631dd33282b6\" (UID: \"1f8de8d8-d342-49bd-9b86-631dd33282b6\") " Dec 08 09:14:21 crc kubenswrapper[4776]: I1208 09:14:21.782500 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f8de8d8-d342-49bd-9b86-631dd33282b6-utilities\") pod \"1f8de8d8-d342-49bd-9b86-631dd33282b6\" (UID: \"1f8de8d8-d342-49bd-9b86-631dd33282b6\") " Dec 08 09:14:21 crc kubenswrapper[4776]: I1208 09:14:21.782578 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csxnb\" (UniqueName: \"kubernetes.io/projected/1f8de8d8-d342-49bd-9b86-631dd33282b6-kube-api-access-csxnb\") pod \"1f8de8d8-d342-49bd-9b86-631dd33282b6\" (UID: \"1f8de8d8-d342-49bd-9b86-631dd33282b6\") " Dec 08 09:14:21 crc kubenswrapper[4776]: I1208 09:14:21.784157 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f8de8d8-d342-49bd-9b86-631dd33282b6-utilities" (OuterVolumeSpecName: "utilities") pod "1f8de8d8-d342-49bd-9b86-631dd33282b6" (UID: "1f8de8d8-d342-49bd-9b86-631dd33282b6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:14:21 crc kubenswrapper[4776]: I1208 09:14:21.790153 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f8de8d8-d342-49bd-9b86-631dd33282b6-kube-api-access-csxnb" (OuterVolumeSpecName: "kube-api-access-csxnb") pod "1f8de8d8-d342-49bd-9b86-631dd33282b6" (UID: "1f8de8d8-d342-49bd-9b86-631dd33282b6"). InnerVolumeSpecName "kube-api-access-csxnb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:14:21 crc kubenswrapper[4776]: I1208 09:14:21.832273 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f8de8d8-d342-49bd-9b86-631dd33282b6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1f8de8d8-d342-49bd-9b86-631dd33282b6" (UID: "1f8de8d8-d342-49bd-9b86-631dd33282b6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:14:21 crc kubenswrapper[4776]: I1208 09:14:21.884372 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csxnb\" (UniqueName: \"kubernetes.io/projected/1f8de8d8-d342-49bd-9b86-631dd33282b6-kube-api-access-csxnb\") on node \"crc\" DevicePath \"\"" Dec 08 09:14:21 crc kubenswrapper[4776]: I1208 09:14:21.884417 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f8de8d8-d342-49bd-9b86-631dd33282b6-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 09:14:21 crc kubenswrapper[4776]: I1208 09:14:21.884427 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f8de8d8-d342-49bd-9b86-631dd33282b6-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 09:14:22 crc kubenswrapper[4776]: I1208 09:14:22.177201 4776 generic.go:334] "Generic (PLEG): container finished" podID="1f8de8d8-d342-49bd-9b86-631dd33282b6" containerID="2cc496802e39d36f585d9bd017dad7bb9de08db666503180027ec0f559ef14a0" exitCode=0 Dec 08 09:14:22 crc kubenswrapper[4776]: I1208 09:14:22.177362 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z6frj" event={"ID":"1f8de8d8-d342-49bd-9b86-631dd33282b6","Type":"ContainerDied","Data":"2cc496802e39d36f585d9bd017dad7bb9de08db666503180027ec0f559ef14a0"} Dec 08 09:14:22 crc kubenswrapper[4776]: I1208 09:14:22.177446 4776 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-z6frj" Dec 08 09:14:22 crc kubenswrapper[4776]: I1208 09:14:22.178357 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z6frj" event={"ID":"1f8de8d8-d342-49bd-9b86-631dd33282b6","Type":"ContainerDied","Data":"cd413c55293438e2afe263219c2b092802f37c2c9999d6ba86bd5645e5447a26"} Dec 08 09:14:22 crc kubenswrapper[4776]: I1208 09:14:22.178387 4776 scope.go:117] "RemoveContainer" containerID="2cc496802e39d36f585d9bd017dad7bb9de08db666503180027ec0f559ef14a0" Dec 08 09:14:22 crc kubenswrapper[4776]: I1208 09:14:22.178534 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-54b997fdcc-9dpbh" Dec 08 09:14:22 crc kubenswrapper[4776]: I1208 09:14:22.178836 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-54b997fdcc-9dpbh" Dec 08 09:14:22 crc kubenswrapper[4776]: I1208 09:14:22.188298 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-54b997fdcc-9dpbh" Dec 08 09:14:22 crc kubenswrapper[4776]: I1208 09:14:22.190443 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-54b997fdcc-9dpbh" Dec 08 09:14:22 crc kubenswrapper[4776]: I1208 09:14:22.197548 4776 scope.go:117] "RemoveContainer" containerID="ae583db904253c21579a4e1b9497a3e9196a555e93b6dca0667fafea0da45180" Dec 08 09:14:22 crc kubenswrapper[4776]: I1208 09:14:22.233301 4776 scope.go:117] "RemoveContainer" containerID="b597a4c00fd7e7aa18a5fb0913f4689d111d9eced86e43d6362e4ae6c1e517dd" Dec 08 09:14:22 crc kubenswrapper[4776]: I1208 09:14:22.261201 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-z6frj"] Dec 08 09:14:22 crc kubenswrapper[4776]: I1208 09:14:22.266033 4776 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-z6frj"] Dec 08 09:14:22 crc kubenswrapper[4776]: I1208 09:14:22.280098 4776 scope.go:117] "RemoveContainer" containerID="2cc496802e39d36f585d9bd017dad7bb9de08db666503180027ec0f559ef14a0" Dec 08 09:14:22 crc kubenswrapper[4776]: E1208 09:14:22.280484 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cc496802e39d36f585d9bd017dad7bb9de08db666503180027ec0f559ef14a0\": container with ID starting with 2cc496802e39d36f585d9bd017dad7bb9de08db666503180027ec0f559ef14a0 not found: ID does not exist" containerID="2cc496802e39d36f585d9bd017dad7bb9de08db666503180027ec0f559ef14a0" Dec 08 09:14:22 crc kubenswrapper[4776]: I1208 09:14:22.280513 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cc496802e39d36f585d9bd017dad7bb9de08db666503180027ec0f559ef14a0"} err="failed to get container status \"2cc496802e39d36f585d9bd017dad7bb9de08db666503180027ec0f559ef14a0\": rpc error: code = NotFound desc = could not find container \"2cc496802e39d36f585d9bd017dad7bb9de08db666503180027ec0f559ef14a0\": container with ID starting with 2cc496802e39d36f585d9bd017dad7bb9de08db666503180027ec0f559ef14a0 not found: ID does not exist" Dec 08 09:14:22 crc kubenswrapper[4776]: I1208 09:14:22.280534 4776 scope.go:117] "RemoveContainer" containerID="ae583db904253c21579a4e1b9497a3e9196a555e93b6dca0667fafea0da45180" Dec 08 09:14:22 crc kubenswrapper[4776]: E1208 09:14:22.280701 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae583db904253c21579a4e1b9497a3e9196a555e93b6dca0667fafea0da45180\": container with ID starting with ae583db904253c21579a4e1b9497a3e9196a555e93b6dca0667fafea0da45180 not found: ID does not exist" containerID="ae583db904253c21579a4e1b9497a3e9196a555e93b6dca0667fafea0da45180" Dec 08 09:14:22 crc 
kubenswrapper[4776]: I1208 09:14:22.280728 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae583db904253c21579a4e1b9497a3e9196a555e93b6dca0667fafea0da45180"} err="failed to get container status \"ae583db904253c21579a4e1b9497a3e9196a555e93b6dca0667fafea0da45180\": rpc error: code = NotFound desc = could not find container \"ae583db904253c21579a4e1b9497a3e9196a555e93b6dca0667fafea0da45180\": container with ID starting with ae583db904253c21579a4e1b9497a3e9196a555e93b6dca0667fafea0da45180 not found: ID does not exist" Dec 08 09:14:22 crc kubenswrapper[4776]: I1208 09:14:22.280740 4776 scope.go:117] "RemoveContainer" containerID="b597a4c00fd7e7aa18a5fb0913f4689d111d9eced86e43d6362e4ae6c1e517dd" Dec 08 09:14:22 crc kubenswrapper[4776]: E1208 09:14:22.280901 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b597a4c00fd7e7aa18a5fb0913f4689d111d9eced86e43d6362e4ae6c1e517dd\": container with ID starting with b597a4c00fd7e7aa18a5fb0913f4689d111d9eced86e43d6362e4ae6c1e517dd not found: ID does not exist" containerID="b597a4c00fd7e7aa18a5fb0913f4689d111d9eced86e43d6362e4ae6c1e517dd" Dec 08 09:14:22 crc kubenswrapper[4776]: I1208 09:14:22.280921 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b597a4c00fd7e7aa18a5fb0913f4689d111d9eced86e43d6362e4ae6c1e517dd"} err="failed to get container status \"b597a4c00fd7e7aa18a5fb0913f4689d111d9eced86e43d6362e4ae6c1e517dd\": rpc error: code = NotFound desc = could not find container \"b597a4c00fd7e7aa18a5fb0913f4689d111d9eced86e43d6362e4ae6c1e517dd\": container with ID starting with b597a4c00fd7e7aa18a5fb0913f4689d111d9eced86e43d6362e4ae6c1e517dd not found: ID does not exist" Dec 08 09:14:22 crc kubenswrapper[4776]: I1208 09:14:22.359324 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f8de8d8-d342-49bd-9b86-631dd33282b6" 
path="/var/lib/kubelet/pods/1f8de8d8-d342-49bd-9b86-631dd33282b6/volumes" Dec 08 09:14:22 crc kubenswrapper[4776]: I1208 09:14:22.836633 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kgfhw" Dec 08 09:14:25 crc kubenswrapper[4776]: I1208 09:14:25.022023 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kgfhw"] Dec 08 09:14:25 crc kubenswrapper[4776]: I1208 09:14:25.022560 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kgfhw" podUID="c9e6333e-c8a3-430c-8cf9-c51ee109a164" containerName="registry-server" containerID="cri-o://83577b6a40faa3652f4a81c9995618b81e0b35db8f8aa2d4e20fb498102967b8" gracePeriod=2 Dec 08 09:14:25 crc kubenswrapper[4776]: I1208 09:14:25.205871 4776 generic.go:334] "Generic (PLEG): container finished" podID="c9e6333e-c8a3-430c-8cf9-c51ee109a164" containerID="83577b6a40faa3652f4a81c9995618b81e0b35db8f8aa2d4e20fb498102967b8" exitCode=0 Dec 08 09:14:25 crc kubenswrapper[4776]: I1208 09:14:25.206257 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kgfhw" event={"ID":"c9e6333e-c8a3-430c-8cf9-c51ee109a164","Type":"ContainerDied","Data":"83577b6a40faa3652f4a81c9995618b81e0b35db8f8aa2d4e20fb498102967b8"} Dec 08 09:14:25 crc kubenswrapper[4776]: I1208 09:14:25.429127 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kgfhw" Dec 08 09:14:25 crc kubenswrapper[4776]: I1208 09:14:25.545166 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvvpm\" (UniqueName: \"kubernetes.io/projected/c9e6333e-c8a3-430c-8cf9-c51ee109a164-kube-api-access-tvvpm\") pod \"c9e6333e-c8a3-430c-8cf9-c51ee109a164\" (UID: \"c9e6333e-c8a3-430c-8cf9-c51ee109a164\") " Dec 08 09:14:25 crc kubenswrapper[4776]: I1208 09:14:25.545316 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9e6333e-c8a3-430c-8cf9-c51ee109a164-catalog-content\") pod \"c9e6333e-c8a3-430c-8cf9-c51ee109a164\" (UID: \"c9e6333e-c8a3-430c-8cf9-c51ee109a164\") " Dec 08 09:14:25 crc kubenswrapper[4776]: I1208 09:14:25.545362 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9e6333e-c8a3-430c-8cf9-c51ee109a164-utilities\") pod \"c9e6333e-c8a3-430c-8cf9-c51ee109a164\" (UID: \"c9e6333e-c8a3-430c-8cf9-c51ee109a164\") " Dec 08 09:14:25 crc kubenswrapper[4776]: I1208 09:14:25.546198 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9e6333e-c8a3-430c-8cf9-c51ee109a164-utilities" (OuterVolumeSpecName: "utilities") pod "c9e6333e-c8a3-430c-8cf9-c51ee109a164" (UID: "c9e6333e-c8a3-430c-8cf9-c51ee109a164"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:14:25 crc kubenswrapper[4776]: I1208 09:14:25.553370 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9e6333e-c8a3-430c-8cf9-c51ee109a164-kube-api-access-tvvpm" (OuterVolumeSpecName: "kube-api-access-tvvpm") pod "c9e6333e-c8a3-430c-8cf9-c51ee109a164" (UID: "c9e6333e-c8a3-430c-8cf9-c51ee109a164"). InnerVolumeSpecName "kube-api-access-tvvpm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:14:25 crc kubenswrapper[4776]: I1208 09:14:25.577476 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9e6333e-c8a3-430c-8cf9-c51ee109a164-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c9e6333e-c8a3-430c-8cf9-c51ee109a164" (UID: "c9e6333e-c8a3-430c-8cf9-c51ee109a164"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:14:25 crc kubenswrapper[4776]: I1208 09:14:25.647439 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvvpm\" (UniqueName: \"kubernetes.io/projected/c9e6333e-c8a3-430c-8cf9-c51ee109a164-kube-api-access-tvvpm\") on node \"crc\" DevicePath \"\"" Dec 08 09:14:25 crc kubenswrapper[4776]: I1208 09:14:25.647484 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9e6333e-c8a3-430c-8cf9-c51ee109a164-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 09:14:25 crc kubenswrapper[4776]: I1208 09:14:25.647494 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9e6333e-c8a3-430c-8cf9-c51ee109a164-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 09:14:26 crc kubenswrapper[4776]: I1208 09:14:26.213889 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kgfhw" event={"ID":"c9e6333e-c8a3-430c-8cf9-c51ee109a164","Type":"ContainerDied","Data":"a8df582a99023c6e7b9d1f6ef73677591a9d2372d42c4cd9ecee462f01c7bf3e"} Dec 08 09:14:26 crc kubenswrapper[4776]: I1208 09:14:26.213937 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kgfhw" Dec 08 09:14:26 crc kubenswrapper[4776]: I1208 09:14:26.213966 4776 scope.go:117] "RemoveContainer" containerID="83577b6a40faa3652f4a81c9995618b81e0b35db8f8aa2d4e20fb498102967b8" Dec 08 09:14:26 crc kubenswrapper[4776]: I1208 09:14:26.240815 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kgfhw"] Dec 08 09:14:26 crc kubenswrapper[4776]: I1208 09:14:26.242349 4776 scope.go:117] "RemoveContainer" containerID="4ad4cba424382dd207f6c5f37e27545d4652f605ce78437f3fe7729a5e008bd7" Dec 08 09:14:26 crc kubenswrapper[4776]: I1208 09:14:26.248873 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kgfhw"] Dec 08 09:14:26 crc kubenswrapper[4776]: I1208 09:14:26.279403 4776 scope.go:117] "RemoveContainer" containerID="0f501c5464d542b3c85440950e14c2886d9207ad683bacb42ee3b23587f4de1d" Dec 08 09:14:26 crc kubenswrapper[4776]: I1208 09:14:26.352308 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9e6333e-c8a3-430c-8cf9-c51ee109a164" path="/var/lib/kubelet/pods/c9e6333e-c8a3-430c-8cf9-c51ee109a164/volumes" Dec 08 09:14:33 crc kubenswrapper[4776]: I1208 09:14:33.489940 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-distributor-76cc67bf56-w6292" Dec 08 09:14:33 crc kubenswrapper[4776]: I1208 09:14:33.660973 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-querier-5895d59bb8-hwrdk" Dec 08 09:14:33 crc kubenswrapper[4776]: I1208 09:14:33.954136 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-fcz8j" Dec 08 09:14:34 crc kubenswrapper[4776]: I1208 09:14:34.836845 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-logging/logging-loki-compactor-0" Dec 08 09:14:34 crc kubenswrapper[4776]: I1208 09:14:34.975661 4776 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens Dec 08 09:14:34 crc kubenswrapper[4776]: I1208 09:14:34.975746 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="c27c1242-5109-4547-8276-2dea60fad775" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 08 09:14:35 crc kubenswrapper[4776]: I1208 09:14:35.096323 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-index-gateway-0" Dec 08 09:14:41 crc kubenswrapper[4776]: I1208 09:14:41.399373 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 09:14:41 crc kubenswrapper[4776]: I1208 09:14:41.400017 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 09:14:44 crc kubenswrapper[4776]: I1208 09:14:44.975231 4776 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens Dec 08 09:14:44 crc kubenswrapper[4776]: I1208 
09:14:44.975537 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="c27c1242-5109-4547-8276-2dea60fad775" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 08 09:14:54 crc kubenswrapper[4776]: I1208 09:14:54.972082 4776 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready Dec 08 09:14:54 crc kubenswrapper[4776]: I1208 09:14:54.972496 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="c27c1242-5109-4547-8276-2dea60fad775" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 08 09:15:00 crc kubenswrapper[4776]: I1208 09:15:00.168725 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29419755-kfkjv"] Dec 08 09:15:00 crc kubenswrapper[4776]: E1208 09:15:00.171222 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9e6333e-c8a3-430c-8cf9-c51ee109a164" containerName="extract-content" Dec 08 09:15:00 crc kubenswrapper[4776]: I1208 09:15:00.171370 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9e6333e-c8a3-430c-8cf9-c51ee109a164" containerName="extract-content" Dec 08 09:15:00 crc kubenswrapper[4776]: E1208 09:15:00.171495 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f8de8d8-d342-49bd-9b86-631dd33282b6" containerName="extract-content" Dec 08 09:15:00 crc kubenswrapper[4776]: I1208 09:15:00.171643 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f8de8d8-d342-49bd-9b86-631dd33282b6" containerName="extract-content" Dec 08 09:15:00 crc kubenswrapper[4776]: E1208 09:15:00.171770 4776 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="c9e6333e-c8a3-430c-8cf9-c51ee109a164" containerName="registry-server" Dec 08 09:15:00 crc kubenswrapper[4776]: I1208 09:15:00.171880 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9e6333e-c8a3-430c-8cf9-c51ee109a164" containerName="registry-server" Dec 08 09:15:00 crc kubenswrapper[4776]: E1208 09:15:00.172006 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9e6333e-c8a3-430c-8cf9-c51ee109a164" containerName="extract-utilities" Dec 08 09:15:00 crc kubenswrapper[4776]: I1208 09:15:00.172118 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9e6333e-c8a3-430c-8cf9-c51ee109a164" containerName="extract-utilities" Dec 08 09:15:00 crc kubenswrapper[4776]: E1208 09:15:00.172279 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f8de8d8-d342-49bd-9b86-631dd33282b6" containerName="extract-utilities" Dec 08 09:15:00 crc kubenswrapper[4776]: I1208 09:15:00.172406 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f8de8d8-d342-49bd-9b86-631dd33282b6" containerName="extract-utilities" Dec 08 09:15:00 crc kubenswrapper[4776]: E1208 09:15:00.172536 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f8de8d8-d342-49bd-9b86-631dd33282b6" containerName="registry-server" Dec 08 09:15:00 crc kubenswrapper[4776]: I1208 09:15:00.172647 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f8de8d8-d342-49bd-9b86-631dd33282b6" containerName="registry-server" Dec 08 09:15:00 crc kubenswrapper[4776]: I1208 09:15:00.172995 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9e6333e-c8a3-430c-8cf9-c51ee109a164" containerName="registry-server" Dec 08 09:15:00 crc kubenswrapper[4776]: I1208 09:15:00.173138 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f8de8d8-d342-49bd-9b86-631dd33282b6" containerName="registry-server" Dec 08 09:15:00 crc kubenswrapper[4776]: I1208 09:15:00.174086 4776 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29419755-kfkjv" Dec 08 09:15:00 crc kubenswrapper[4776]: I1208 09:15:00.177028 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 08 09:15:00 crc kubenswrapper[4776]: I1208 09:15:00.177997 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 08 09:15:00 crc kubenswrapper[4776]: I1208 09:15:00.179304 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29419755-kfkjv"] Dec 08 09:15:00 crc kubenswrapper[4776]: I1208 09:15:00.335143 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0ce3381a-0bc7-4098-9c55-87bba4519ad8-secret-volume\") pod \"collect-profiles-29419755-kfkjv\" (UID: \"0ce3381a-0bc7-4098-9c55-87bba4519ad8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419755-kfkjv" Dec 08 09:15:00 crc kubenswrapper[4776]: I1208 09:15:00.335561 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ce3381a-0bc7-4098-9c55-87bba4519ad8-config-volume\") pod \"collect-profiles-29419755-kfkjv\" (UID: \"0ce3381a-0bc7-4098-9c55-87bba4519ad8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419755-kfkjv" Dec 08 09:15:00 crc kubenswrapper[4776]: I1208 09:15:00.335704 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gllhn\" (UniqueName: \"kubernetes.io/projected/0ce3381a-0bc7-4098-9c55-87bba4519ad8-kube-api-access-gllhn\") pod \"collect-profiles-29419755-kfkjv\" (UID: \"0ce3381a-0bc7-4098-9c55-87bba4519ad8\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29419755-kfkjv" Dec 08 09:15:00 crc kubenswrapper[4776]: I1208 09:15:00.437184 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0ce3381a-0bc7-4098-9c55-87bba4519ad8-secret-volume\") pod \"collect-profiles-29419755-kfkjv\" (UID: \"0ce3381a-0bc7-4098-9c55-87bba4519ad8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419755-kfkjv" Dec 08 09:15:00 crc kubenswrapper[4776]: I1208 09:15:00.437311 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ce3381a-0bc7-4098-9c55-87bba4519ad8-config-volume\") pod \"collect-profiles-29419755-kfkjv\" (UID: \"0ce3381a-0bc7-4098-9c55-87bba4519ad8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419755-kfkjv" Dec 08 09:15:00 crc kubenswrapper[4776]: I1208 09:15:00.437353 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gllhn\" (UniqueName: \"kubernetes.io/projected/0ce3381a-0bc7-4098-9c55-87bba4519ad8-kube-api-access-gllhn\") pod \"collect-profiles-29419755-kfkjv\" (UID: \"0ce3381a-0bc7-4098-9c55-87bba4519ad8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419755-kfkjv" Dec 08 09:15:00 crc kubenswrapper[4776]: I1208 09:15:00.438624 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ce3381a-0bc7-4098-9c55-87bba4519ad8-config-volume\") pod \"collect-profiles-29419755-kfkjv\" (UID: \"0ce3381a-0bc7-4098-9c55-87bba4519ad8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419755-kfkjv" Dec 08 09:15:00 crc kubenswrapper[4776]: I1208 09:15:00.450102 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/0ce3381a-0bc7-4098-9c55-87bba4519ad8-secret-volume\") pod \"collect-profiles-29419755-kfkjv\" (UID: \"0ce3381a-0bc7-4098-9c55-87bba4519ad8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419755-kfkjv" Dec 08 09:15:00 crc kubenswrapper[4776]: I1208 09:15:00.460373 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gllhn\" (UniqueName: \"kubernetes.io/projected/0ce3381a-0bc7-4098-9c55-87bba4519ad8-kube-api-access-gllhn\") pod \"collect-profiles-29419755-kfkjv\" (UID: \"0ce3381a-0bc7-4098-9c55-87bba4519ad8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419755-kfkjv" Dec 08 09:15:00 crc kubenswrapper[4776]: I1208 09:15:00.551271 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29419755-kfkjv" Dec 08 09:15:00 crc kubenswrapper[4776]: I1208 09:15:00.974263 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29419755-kfkjv"] Dec 08 09:15:01 crc kubenswrapper[4776]: I1208 09:15:01.476379 4776 generic.go:334] "Generic (PLEG): container finished" podID="0ce3381a-0bc7-4098-9c55-87bba4519ad8" containerID="65dc4c88e5462f7bacac85606e7d58240f6fb6053e8cd91d2dd0b98b06880905" exitCode=0 Dec 08 09:15:01 crc kubenswrapper[4776]: I1208 09:15:01.476416 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29419755-kfkjv" event={"ID":"0ce3381a-0bc7-4098-9c55-87bba4519ad8","Type":"ContainerDied","Data":"65dc4c88e5462f7bacac85606e7d58240f6fb6053e8cd91d2dd0b98b06880905"} Dec 08 09:15:01 crc kubenswrapper[4776]: I1208 09:15:01.476639 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29419755-kfkjv" 
event={"ID":"0ce3381a-0bc7-4098-9c55-87bba4519ad8","Type":"ContainerStarted","Data":"05c5b16bacf66b5ee1e48946466022567b2f2deae9aaa509f6a52b877675b6aa"} Dec 08 09:15:02 crc kubenswrapper[4776]: I1208 09:15:02.752118 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29419755-kfkjv" Dec 08 09:15:02 crc kubenswrapper[4776]: I1208 09:15:02.781074 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gllhn\" (UniqueName: \"kubernetes.io/projected/0ce3381a-0bc7-4098-9c55-87bba4519ad8-kube-api-access-gllhn\") pod \"0ce3381a-0bc7-4098-9c55-87bba4519ad8\" (UID: \"0ce3381a-0bc7-4098-9c55-87bba4519ad8\") " Dec 08 09:15:02 crc kubenswrapper[4776]: I1208 09:15:02.781165 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ce3381a-0bc7-4098-9c55-87bba4519ad8-config-volume\") pod \"0ce3381a-0bc7-4098-9c55-87bba4519ad8\" (UID: \"0ce3381a-0bc7-4098-9c55-87bba4519ad8\") " Dec 08 09:15:02 crc kubenswrapper[4776]: I1208 09:15:02.781245 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0ce3381a-0bc7-4098-9c55-87bba4519ad8-secret-volume\") pod \"0ce3381a-0bc7-4098-9c55-87bba4519ad8\" (UID: \"0ce3381a-0bc7-4098-9c55-87bba4519ad8\") " Dec 08 09:15:02 crc kubenswrapper[4776]: I1208 09:15:02.782030 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ce3381a-0bc7-4098-9c55-87bba4519ad8-config-volume" (OuterVolumeSpecName: "config-volume") pod "0ce3381a-0bc7-4098-9c55-87bba4519ad8" (UID: "0ce3381a-0bc7-4098-9c55-87bba4519ad8"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:15:02 crc kubenswrapper[4776]: I1208 09:15:02.784853 4776 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ce3381a-0bc7-4098-9c55-87bba4519ad8-config-volume\") on node \"crc\" DevicePath \"\"" Dec 08 09:15:02 crc kubenswrapper[4776]: I1208 09:15:02.788021 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ce3381a-0bc7-4098-9c55-87bba4519ad8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0ce3381a-0bc7-4098-9c55-87bba4519ad8" (UID: "0ce3381a-0bc7-4098-9c55-87bba4519ad8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:15:02 crc kubenswrapper[4776]: I1208 09:15:02.788062 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ce3381a-0bc7-4098-9c55-87bba4519ad8-kube-api-access-gllhn" (OuterVolumeSpecName: "kube-api-access-gllhn") pod "0ce3381a-0bc7-4098-9c55-87bba4519ad8" (UID: "0ce3381a-0bc7-4098-9c55-87bba4519ad8"). InnerVolumeSpecName "kube-api-access-gllhn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:15:02 crc kubenswrapper[4776]: I1208 09:15:02.886088 4776 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0ce3381a-0bc7-4098-9c55-87bba4519ad8-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 08 09:15:02 crc kubenswrapper[4776]: I1208 09:15:02.886122 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gllhn\" (UniqueName: \"kubernetes.io/projected/0ce3381a-0bc7-4098-9c55-87bba4519ad8-kube-api-access-gllhn\") on node \"crc\" DevicePath \"\"" Dec 08 09:15:03 crc kubenswrapper[4776]: I1208 09:15:03.495247 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29419755-kfkjv" event={"ID":"0ce3381a-0bc7-4098-9c55-87bba4519ad8","Type":"ContainerDied","Data":"05c5b16bacf66b5ee1e48946466022567b2f2deae9aaa509f6a52b877675b6aa"} Dec 08 09:15:03 crc kubenswrapper[4776]: I1208 09:15:03.495879 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05c5b16bacf66b5ee1e48946466022567b2f2deae9aaa509f6a52b877675b6aa" Dec 08 09:15:03 crc kubenswrapper[4776]: I1208 09:15:03.495855 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29419755-kfkjv" Dec 08 09:15:04 crc kubenswrapper[4776]: I1208 09:15:04.975664 4776 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready Dec 08 09:15:04 crc kubenswrapper[4776]: I1208 09:15:04.975743 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="c27c1242-5109-4547-8276-2dea60fad775" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 08 09:15:11 crc kubenswrapper[4776]: I1208 09:15:11.399544 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 09:15:11 crc kubenswrapper[4776]: I1208 09:15:11.400323 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 09:15:11 crc kubenswrapper[4776]: I1208 09:15:11.400393 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" Dec 08 09:15:11 crc kubenswrapper[4776]: I1208 09:15:11.401347 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"be636680726361907bd5f0d2d58d00dbbd0c77d0144025e4fa0b6101666966a8"} 
pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 08 09:15:11 crc kubenswrapper[4776]: I1208 09:15:11.401444 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" containerID="cri-o://be636680726361907bd5f0d2d58d00dbbd0c77d0144025e4fa0b6101666966a8" gracePeriod=600 Dec 08 09:15:11 crc kubenswrapper[4776]: I1208 09:15:11.562884 4776 generic.go:334] "Generic (PLEG): container finished" podID="c9788ab1-1031-4103-a769-a4b3177c7268" containerID="be636680726361907bd5f0d2d58d00dbbd0c77d0144025e4fa0b6101666966a8" exitCode=0 Dec 08 09:15:11 crc kubenswrapper[4776]: I1208 09:15:11.562980 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" event={"ID":"c9788ab1-1031-4103-a769-a4b3177c7268","Type":"ContainerDied","Data":"be636680726361907bd5f0d2d58d00dbbd0c77d0144025e4fa0b6101666966a8"} Dec 08 09:15:11 crc kubenswrapper[4776]: I1208 09:15:11.563022 4776 scope.go:117] "RemoveContainer" containerID="abac38e42f2fdbb7423dde9370109f19a92ff63c4313fd19999ad68bdb72ed2b" Dec 08 09:15:12 crc kubenswrapper[4776]: I1208 09:15:12.574423 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" event={"ID":"c9788ab1-1031-4103-a769-a4b3177c7268","Type":"ContainerStarted","Data":"409acf0371b6644dc04fbcc1653de1b5f75319f5fcc98f856ec232671ed68b71"} Dec 08 09:15:14 crc kubenswrapper[4776]: I1208 09:15:14.978839 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-ingester-0" Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.151456 4776 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-logging/collector-jpzq9"] Dec 08 09:15:34 crc kubenswrapper[4776]: E1208 09:15:34.152506 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ce3381a-0bc7-4098-9c55-87bba4519ad8" containerName="collect-profiles" Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.152527 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ce3381a-0bc7-4098-9c55-87bba4519ad8" containerName="collect-profiles" Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.152745 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ce3381a-0bc7-4098-9c55-87bba4519ad8" containerName="collect-profiles" Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.153651 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-jpzq9" Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.156997 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-gwsb5" Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.162428 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.162790 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.162969 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.164938 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.170898 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.174955 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-logging/collector-jpzq9"] Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.207249 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-jpzq9"] Dec 08 09:15:34 crc kubenswrapper[4776]: E1208 09:15:34.207770 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[collector-syslog-receiver collector-token config config-openshift-service-cacrt datadir entrypoint kube-api-access-ttfcw metrics sa-token tmp trusted-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-logging/collector-jpzq9" podUID="4619629f-223c-470d-a27b-25000aff6b8a" Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.259370 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4619629f-223c-470d-a27b-25000aff6b8a-trusted-ca\") pod \"collector-jpzq9\" (UID: \"4619629f-223c-470d-a27b-25000aff6b8a\") " pod="openshift-logging/collector-jpzq9" Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.259456 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/4619629f-223c-470d-a27b-25000aff6b8a-collector-token\") pod \"collector-jpzq9\" (UID: \"4619629f-223c-470d-a27b-25000aff6b8a\") " pod="openshift-logging/collector-jpzq9" Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.259477 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4619629f-223c-470d-a27b-25000aff6b8a-tmp\") pod \"collector-jpzq9\" (UID: \"4619629f-223c-470d-a27b-25000aff6b8a\") " pod="openshift-logging/collector-jpzq9" Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.259499 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: 
\"kubernetes.io/host-path/4619629f-223c-470d-a27b-25000aff6b8a-datadir\") pod \"collector-jpzq9\" (UID: \"4619629f-223c-470d-a27b-25000aff6b8a\") " pod="openshift-logging/collector-jpzq9" Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.259544 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4619629f-223c-470d-a27b-25000aff6b8a-config\") pod \"collector-jpzq9\" (UID: \"4619629f-223c-470d-a27b-25000aff6b8a\") " pod="openshift-logging/collector-jpzq9" Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.259562 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/4619629f-223c-470d-a27b-25000aff6b8a-sa-token\") pod \"collector-jpzq9\" (UID: \"4619629f-223c-470d-a27b-25000aff6b8a\") " pod="openshift-logging/collector-jpzq9" Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.259760 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttfcw\" (UniqueName: \"kubernetes.io/projected/4619629f-223c-470d-a27b-25000aff6b8a-kube-api-access-ttfcw\") pod \"collector-jpzq9\" (UID: \"4619629f-223c-470d-a27b-25000aff6b8a\") " pod="openshift-logging/collector-jpzq9" Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.259860 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/4619629f-223c-470d-a27b-25000aff6b8a-collector-syslog-receiver\") pod \"collector-jpzq9\" (UID: \"4619629f-223c-470d-a27b-25000aff6b8a\") " pod="openshift-logging/collector-jpzq9" Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.259903 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: 
\"kubernetes.io/secret/4619629f-223c-470d-a27b-25000aff6b8a-metrics\") pod \"collector-jpzq9\" (UID: \"4619629f-223c-470d-a27b-25000aff6b8a\") " pod="openshift-logging/collector-jpzq9" Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.259932 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/4619629f-223c-470d-a27b-25000aff6b8a-config-openshift-service-cacrt\") pod \"collector-jpzq9\" (UID: \"4619629f-223c-470d-a27b-25000aff6b8a\") " pod="openshift-logging/collector-jpzq9" Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.259973 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/4619629f-223c-470d-a27b-25000aff6b8a-entrypoint\") pod \"collector-jpzq9\" (UID: \"4619629f-223c-470d-a27b-25000aff6b8a\") " pod="openshift-logging/collector-jpzq9" Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.361759 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4619629f-223c-470d-a27b-25000aff6b8a-trusted-ca\") pod \"collector-jpzq9\" (UID: \"4619629f-223c-470d-a27b-25000aff6b8a\") " pod="openshift-logging/collector-jpzq9" Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.362097 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/4619629f-223c-470d-a27b-25000aff6b8a-collector-token\") pod \"collector-jpzq9\" (UID: \"4619629f-223c-470d-a27b-25000aff6b8a\") " pod="openshift-logging/collector-jpzq9" Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.362122 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4619629f-223c-470d-a27b-25000aff6b8a-tmp\") pod \"collector-jpzq9\" (UID: 
\"4619629f-223c-470d-a27b-25000aff6b8a\") " pod="openshift-logging/collector-jpzq9" Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.362143 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/4619629f-223c-470d-a27b-25000aff6b8a-datadir\") pod \"collector-jpzq9\" (UID: \"4619629f-223c-470d-a27b-25000aff6b8a\") " pod="openshift-logging/collector-jpzq9" Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.362204 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4619629f-223c-470d-a27b-25000aff6b8a-config\") pod \"collector-jpzq9\" (UID: \"4619629f-223c-470d-a27b-25000aff6b8a\") " pod="openshift-logging/collector-jpzq9" Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.362224 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/4619629f-223c-470d-a27b-25000aff6b8a-sa-token\") pod \"collector-jpzq9\" (UID: \"4619629f-223c-470d-a27b-25000aff6b8a\") " pod="openshift-logging/collector-jpzq9" Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.362243 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttfcw\" (UniqueName: \"kubernetes.io/projected/4619629f-223c-470d-a27b-25000aff6b8a-kube-api-access-ttfcw\") pod \"collector-jpzq9\" (UID: \"4619629f-223c-470d-a27b-25000aff6b8a\") " pod="openshift-logging/collector-jpzq9" Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.362245 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/4619629f-223c-470d-a27b-25000aff6b8a-datadir\") pod \"collector-jpzq9\" (UID: \"4619629f-223c-470d-a27b-25000aff6b8a\") " pod="openshift-logging/collector-jpzq9" Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.362266 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/4619629f-223c-470d-a27b-25000aff6b8a-collector-syslog-receiver\") pod \"collector-jpzq9\" (UID: \"4619629f-223c-470d-a27b-25000aff6b8a\") " pod="openshift-logging/collector-jpzq9" Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.362287 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/4619629f-223c-470d-a27b-25000aff6b8a-metrics\") pod \"collector-jpzq9\" (UID: \"4619629f-223c-470d-a27b-25000aff6b8a\") " pod="openshift-logging/collector-jpzq9" Dec 08 09:15:34 crc kubenswrapper[4776]: E1208 09:15:34.362357 4776 secret.go:188] Couldn't get secret openshift-logging/collector-syslog-receiver: secret "collector-syslog-receiver" not found Dec 08 09:15:34 crc kubenswrapper[4776]: E1208 09:15:34.362425 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4619629f-223c-470d-a27b-25000aff6b8a-collector-syslog-receiver podName:4619629f-223c-470d-a27b-25000aff6b8a nodeName:}" failed. No retries permitted until 2025-12-08 09:15:34.862406707 +0000 UTC m=+1011.125631729 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "collector-syslog-receiver" (UniqueName: "kubernetes.io/secret/4619629f-223c-470d-a27b-25000aff6b8a-collector-syslog-receiver") pod "collector-jpzq9" (UID: "4619629f-223c-470d-a27b-25000aff6b8a") : secret "collector-syslog-receiver" not found Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.362663 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/4619629f-223c-470d-a27b-25000aff6b8a-config-openshift-service-cacrt\") pod \"collector-jpzq9\" (UID: \"4619629f-223c-470d-a27b-25000aff6b8a\") " pod="openshift-logging/collector-jpzq9" Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.362704 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/4619629f-223c-470d-a27b-25000aff6b8a-entrypoint\") pod \"collector-jpzq9\" (UID: \"4619629f-223c-470d-a27b-25000aff6b8a\") " pod="openshift-logging/collector-jpzq9" Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.363266 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4619629f-223c-470d-a27b-25000aff6b8a-trusted-ca\") pod \"collector-jpzq9\" (UID: \"4619629f-223c-470d-a27b-25000aff6b8a\") " pod="openshift-logging/collector-jpzq9" Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.363293 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4619629f-223c-470d-a27b-25000aff6b8a-config\") pod \"collector-jpzq9\" (UID: \"4619629f-223c-470d-a27b-25000aff6b8a\") " pod="openshift-logging/collector-jpzq9" Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.363479 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: 
\"kubernetes.io/configmap/4619629f-223c-470d-a27b-25000aff6b8a-config-openshift-service-cacrt\") pod \"collector-jpzq9\" (UID: \"4619629f-223c-470d-a27b-25000aff6b8a\") " pod="openshift-logging/collector-jpzq9" Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.363652 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/4619629f-223c-470d-a27b-25000aff6b8a-entrypoint\") pod \"collector-jpzq9\" (UID: \"4619629f-223c-470d-a27b-25000aff6b8a\") " pod="openshift-logging/collector-jpzq9" Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.368621 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4619629f-223c-470d-a27b-25000aff6b8a-tmp\") pod \"collector-jpzq9\" (UID: \"4619629f-223c-470d-a27b-25000aff6b8a\") " pod="openshift-logging/collector-jpzq9" Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.371303 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/4619629f-223c-470d-a27b-25000aff6b8a-metrics\") pod \"collector-jpzq9\" (UID: \"4619629f-223c-470d-a27b-25000aff6b8a\") " pod="openshift-logging/collector-jpzq9" Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.375791 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/4619629f-223c-470d-a27b-25000aff6b8a-collector-token\") pod \"collector-jpzq9\" (UID: \"4619629f-223c-470d-a27b-25000aff6b8a\") " pod="openshift-logging/collector-jpzq9" Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.386675 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/4619629f-223c-470d-a27b-25000aff6b8a-sa-token\") pod \"collector-jpzq9\" (UID: \"4619629f-223c-470d-a27b-25000aff6b8a\") " pod="openshift-logging/collector-jpzq9" Dec 08 09:15:34 crc 
kubenswrapper[4776]: I1208 09:15:34.392525 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttfcw\" (UniqueName: \"kubernetes.io/projected/4619629f-223c-470d-a27b-25000aff6b8a-kube-api-access-ttfcw\") pod \"collector-jpzq9\" (UID: \"4619629f-223c-470d-a27b-25000aff6b8a\") " pod="openshift-logging/collector-jpzq9" Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.769482 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-jpzq9" Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.780634 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-jpzq9" Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.870921 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/4619629f-223c-470d-a27b-25000aff6b8a-collector-syslog-receiver\") pod \"collector-jpzq9\" (UID: \"4619629f-223c-470d-a27b-25000aff6b8a\") " pod="openshift-logging/collector-jpzq9" Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.874107 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/4619629f-223c-470d-a27b-25000aff6b8a-collector-syslog-receiver\") pod \"collector-jpzq9\" (UID: \"4619629f-223c-470d-a27b-25000aff6b8a\") " pod="openshift-logging/collector-jpzq9" Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.971749 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttfcw\" (UniqueName: \"kubernetes.io/projected/4619629f-223c-470d-a27b-25000aff6b8a-kube-api-access-ttfcw\") pod \"4619629f-223c-470d-a27b-25000aff6b8a\" (UID: \"4619629f-223c-470d-a27b-25000aff6b8a\") " Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.971801 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"metrics\" (UniqueName: \"kubernetes.io/secret/4619629f-223c-470d-a27b-25000aff6b8a-metrics\") pod \"4619629f-223c-470d-a27b-25000aff6b8a\" (UID: \"4619629f-223c-470d-a27b-25000aff6b8a\") " Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.971865 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/4619629f-223c-470d-a27b-25000aff6b8a-sa-token\") pod \"4619629f-223c-470d-a27b-25000aff6b8a\" (UID: \"4619629f-223c-470d-a27b-25000aff6b8a\") " Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.971889 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/4619629f-223c-470d-a27b-25000aff6b8a-collector-syslog-receiver\") pod \"4619629f-223c-470d-a27b-25000aff6b8a\" (UID: \"4619629f-223c-470d-a27b-25000aff6b8a\") " Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.971918 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/4619629f-223c-470d-a27b-25000aff6b8a-datadir\") pod \"4619629f-223c-470d-a27b-25000aff6b8a\" (UID: \"4619629f-223c-470d-a27b-25000aff6b8a\") " Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.971938 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4619629f-223c-470d-a27b-25000aff6b8a-config\") pod \"4619629f-223c-470d-a27b-25000aff6b8a\" (UID: \"4619629f-223c-470d-a27b-25000aff6b8a\") " Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.971984 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4619629f-223c-470d-a27b-25000aff6b8a-trusted-ca\") pod \"4619629f-223c-470d-a27b-25000aff6b8a\" (UID: \"4619629f-223c-470d-a27b-25000aff6b8a\") " Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.972025 
4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4619629f-223c-470d-a27b-25000aff6b8a-tmp\") pod \"4619629f-223c-470d-a27b-25000aff6b8a\" (UID: \"4619629f-223c-470d-a27b-25000aff6b8a\") "
Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.972058 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/4619629f-223c-470d-a27b-25000aff6b8a-config-openshift-service-cacrt\") pod \"4619629f-223c-470d-a27b-25000aff6b8a\" (UID: \"4619629f-223c-470d-a27b-25000aff6b8a\") "
Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.972104 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/4619629f-223c-470d-a27b-25000aff6b8a-collector-token\") pod \"4619629f-223c-470d-a27b-25000aff6b8a\" (UID: \"4619629f-223c-470d-a27b-25000aff6b8a\") "
Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.972116 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4619629f-223c-470d-a27b-25000aff6b8a-datadir" (OuterVolumeSpecName: "datadir") pod "4619629f-223c-470d-a27b-25000aff6b8a" (UID: "4619629f-223c-470d-a27b-25000aff6b8a"). InnerVolumeSpecName "datadir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.972195 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/4619629f-223c-470d-a27b-25000aff6b8a-entrypoint\") pod \"4619629f-223c-470d-a27b-25000aff6b8a\" (UID: \"4619629f-223c-470d-a27b-25000aff6b8a\") "
Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.972571 4776 reconciler_common.go:293] "Volume detached for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/4619629f-223c-470d-a27b-25000aff6b8a-datadir\") on node \"crc\" DevicePath \"\""
Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.973557 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4619629f-223c-470d-a27b-25000aff6b8a-config-openshift-service-cacrt" (OuterVolumeSpecName: "config-openshift-service-cacrt") pod "4619629f-223c-470d-a27b-25000aff6b8a" (UID: "4619629f-223c-470d-a27b-25000aff6b8a"). InnerVolumeSpecName "config-openshift-service-cacrt". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.973660 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4619629f-223c-470d-a27b-25000aff6b8a-config" (OuterVolumeSpecName: "config") pod "4619629f-223c-470d-a27b-25000aff6b8a" (UID: "4619629f-223c-470d-a27b-25000aff6b8a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.973678 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4619629f-223c-470d-a27b-25000aff6b8a-entrypoint" (OuterVolumeSpecName: "entrypoint") pod "4619629f-223c-470d-a27b-25000aff6b8a" (UID: "4619629f-223c-470d-a27b-25000aff6b8a"). InnerVolumeSpecName "entrypoint". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.973698 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4619629f-223c-470d-a27b-25000aff6b8a-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "4619629f-223c-470d-a27b-25000aff6b8a" (UID: "4619629f-223c-470d-a27b-25000aff6b8a"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.975632 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4619629f-223c-470d-a27b-25000aff6b8a-sa-token" (OuterVolumeSpecName: "sa-token") pod "4619629f-223c-470d-a27b-25000aff6b8a" (UID: "4619629f-223c-470d-a27b-25000aff6b8a"). InnerVolumeSpecName "sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.975760 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4619629f-223c-470d-a27b-25000aff6b8a-metrics" (OuterVolumeSpecName: "metrics") pod "4619629f-223c-470d-a27b-25000aff6b8a" (UID: "4619629f-223c-470d-a27b-25000aff6b8a"). InnerVolumeSpecName "metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.976212 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4619629f-223c-470d-a27b-25000aff6b8a-collector-token" (OuterVolumeSpecName: "collector-token") pod "4619629f-223c-470d-a27b-25000aff6b8a" (UID: "4619629f-223c-470d-a27b-25000aff6b8a"). InnerVolumeSpecName "collector-token". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.976232 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4619629f-223c-470d-a27b-25000aff6b8a-collector-syslog-receiver" (OuterVolumeSpecName: "collector-syslog-receiver") pod "4619629f-223c-470d-a27b-25000aff6b8a" (UID: "4619629f-223c-470d-a27b-25000aff6b8a"). InnerVolumeSpecName "collector-syslog-receiver". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.976855 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4619629f-223c-470d-a27b-25000aff6b8a-kube-api-access-ttfcw" (OuterVolumeSpecName: "kube-api-access-ttfcw") pod "4619629f-223c-470d-a27b-25000aff6b8a" (UID: "4619629f-223c-470d-a27b-25000aff6b8a"). InnerVolumeSpecName "kube-api-access-ttfcw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 08 09:15:34 crc kubenswrapper[4776]: I1208 09:15:34.976995 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4619629f-223c-470d-a27b-25000aff6b8a-tmp" (OuterVolumeSpecName: "tmp") pod "4619629f-223c-470d-a27b-25000aff6b8a" (UID: "4619629f-223c-470d-a27b-25000aff6b8a"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 08 09:15:35 crc kubenswrapper[4776]: I1208 09:15:35.074185 4776 reconciler_common.go:293] "Volume detached for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/4619629f-223c-470d-a27b-25000aff6b8a-sa-token\") on node \"crc\" DevicePath \"\""
Dec 08 09:15:35 crc kubenswrapper[4776]: I1208 09:15:35.074216 4776 reconciler_common.go:293] "Volume detached for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/4619629f-223c-470d-a27b-25000aff6b8a-collector-syslog-receiver\") on node \"crc\" DevicePath \"\""
Dec 08 09:15:35 crc kubenswrapper[4776]: I1208 09:15:35.074230 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4619629f-223c-470d-a27b-25000aff6b8a-config\") on node \"crc\" DevicePath \"\""
Dec 08 09:15:35 crc kubenswrapper[4776]: I1208 09:15:35.074240 4776 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4619629f-223c-470d-a27b-25000aff6b8a-trusted-ca\") on node \"crc\" DevicePath \"\""
Dec 08 09:15:35 crc kubenswrapper[4776]: I1208 09:15:35.074248 4776 reconciler_common.go:293] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4619629f-223c-470d-a27b-25000aff6b8a-tmp\") on node \"crc\" DevicePath \"\""
Dec 08 09:15:35 crc kubenswrapper[4776]: I1208 09:15:35.074256 4776 reconciler_common.go:293] "Volume detached for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/4619629f-223c-470d-a27b-25000aff6b8a-config-openshift-service-cacrt\") on node \"crc\" DevicePath \"\""
Dec 08 09:15:35 crc kubenswrapper[4776]: I1208 09:15:35.074265 4776 reconciler_common.go:293] "Volume detached for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/4619629f-223c-470d-a27b-25000aff6b8a-collector-token\") on node \"crc\" DevicePath \"\""
Dec 08 09:15:35 crc kubenswrapper[4776]: I1208 09:15:35.074276 4776 reconciler_common.go:293] "Volume detached for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/4619629f-223c-470d-a27b-25000aff6b8a-entrypoint\") on node \"crc\" DevicePath \"\""
Dec 08 09:15:35 crc kubenswrapper[4776]: I1208 09:15:35.074286 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttfcw\" (UniqueName: \"kubernetes.io/projected/4619629f-223c-470d-a27b-25000aff6b8a-kube-api-access-ttfcw\") on node \"crc\" DevicePath \"\""
Dec 08 09:15:35 crc kubenswrapper[4776]: I1208 09:15:35.074297 4776 reconciler_common.go:293] "Volume detached for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/4619629f-223c-470d-a27b-25000aff6b8a-metrics\") on node \"crc\" DevicePath \"\""
Dec 08 09:15:35 crc kubenswrapper[4776]: I1208 09:15:35.779764 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-jpzq9"
Dec 08 09:15:35 crc kubenswrapper[4776]: I1208 09:15:35.871832 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-jpzq9"]
Dec 08 09:15:35 crc kubenswrapper[4776]: I1208 09:15:35.882622 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-logging/collector-jpzq9"]
Dec 08 09:15:35 crc kubenswrapper[4776]: I1208 09:15:35.889369 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-2xfbc"]
Dec 08 09:15:35 crc kubenswrapper[4776]: I1208 09:15:35.890593 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-2xfbc"
Dec 08 09:15:35 crc kubenswrapper[4776]: I1208 09:15:35.895103 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics"
Dec 08 09:15:35 crc kubenswrapper[4776]: I1208 09:15:35.895558 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-gwsb5"
Dec 08 09:15:35 crc kubenswrapper[4776]: I1208 09:15:35.895798 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token"
Dec 08 09:15:35 crc kubenswrapper[4776]: I1208 09:15:35.895959 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config"
Dec 08 09:15:35 crc kubenswrapper[4776]: I1208 09:15:35.896237 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver"
Dec 08 09:15:35 crc kubenswrapper[4776]: I1208 09:15:35.896287 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-2xfbc"]
Dec 08 09:15:35 crc kubenswrapper[4776]: I1208 09:15:35.903421 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle"
Dec 08 09:15:35 crc kubenswrapper[4776]: I1208 09:15:35.988149 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/847cd111-98c5-4c39-bc29-1ba2bcdf570c-entrypoint\") pod \"collector-2xfbc\" (UID: \"847cd111-98c5-4c39-bc29-1ba2bcdf570c\") " pod="openshift-logging/collector-2xfbc"
Dec 08 09:15:35 crc kubenswrapper[4776]: I1208 09:15:35.988212 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/847cd111-98c5-4c39-bc29-1ba2bcdf570c-config-openshift-service-cacrt\") pod \"collector-2xfbc\" (UID: \"847cd111-98c5-4c39-bc29-1ba2bcdf570c\") " pod="openshift-logging/collector-2xfbc"
Dec 08 09:15:35 crc kubenswrapper[4776]: I1208 09:15:35.988334 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/847cd111-98c5-4c39-bc29-1ba2bcdf570c-tmp\") pod \"collector-2xfbc\" (UID: \"847cd111-98c5-4c39-bc29-1ba2bcdf570c\") " pod="openshift-logging/collector-2xfbc"
Dec 08 09:15:35 crc kubenswrapper[4776]: I1208 09:15:35.988428 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/847cd111-98c5-4c39-bc29-1ba2bcdf570c-metrics\") pod \"collector-2xfbc\" (UID: \"847cd111-98c5-4c39-bc29-1ba2bcdf570c\") " pod="openshift-logging/collector-2xfbc"
Dec 08 09:15:35 crc kubenswrapper[4776]: I1208 09:15:35.988450 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/847cd111-98c5-4c39-bc29-1ba2bcdf570c-datadir\") pod \"collector-2xfbc\" (UID: \"847cd111-98c5-4c39-bc29-1ba2bcdf570c\") " pod="openshift-logging/collector-2xfbc"
Dec 08 09:15:35 crc kubenswrapper[4776]: I1208 09:15:35.988473 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/847cd111-98c5-4c39-bc29-1ba2bcdf570c-collector-syslog-receiver\") pod \"collector-2xfbc\" (UID: \"847cd111-98c5-4c39-bc29-1ba2bcdf570c\") " pod="openshift-logging/collector-2xfbc"
Dec 08 09:15:35 crc kubenswrapper[4776]: I1208 09:15:35.988491 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/847cd111-98c5-4c39-bc29-1ba2bcdf570c-config\") pod \"collector-2xfbc\" (UID: \"847cd111-98c5-4c39-bc29-1ba2bcdf570c\") " pod="openshift-logging/collector-2xfbc"
Dec 08 09:15:35 crc kubenswrapper[4776]: I1208 09:15:35.988682 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbbdc\" (UniqueName: \"kubernetes.io/projected/847cd111-98c5-4c39-bc29-1ba2bcdf570c-kube-api-access-xbbdc\") pod \"collector-2xfbc\" (UID: \"847cd111-98c5-4c39-bc29-1ba2bcdf570c\") " pod="openshift-logging/collector-2xfbc"
Dec 08 09:15:35 crc kubenswrapper[4776]: I1208 09:15:35.988846 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/847cd111-98c5-4c39-bc29-1ba2bcdf570c-trusted-ca\") pod \"collector-2xfbc\" (UID: \"847cd111-98c5-4c39-bc29-1ba2bcdf570c\") " pod="openshift-logging/collector-2xfbc"
Dec 08 09:15:35 crc kubenswrapper[4776]: I1208 09:15:35.988876 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/847cd111-98c5-4c39-bc29-1ba2bcdf570c-collector-token\") pod \"collector-2xfbc\" (UID: \"847cd111-98c5-4c39-bc29-1ba2bcdf570c\") " pod="openshift-logging/collector-2xfbc"
Dec 08 09:15:35 crc kubenswrapper[4776]: I1208 09:15:35.988917 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/847cd111-98c5-4c39-bc29-1ba2bcdf570c-sa-token\") pod \"collector-2xfbc\" (UID: \"847cd111-98c5-4c39-bc29-1ba2bcdf570c\") " pod="openshift-logging/collector-2xfbc"
Dec 08 09:15:36 crc kubenswrapper[4776]: I1208 09:15:36.090053 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/847cd111-98c5-4c39-bc29-1ba2bcdf570c-trusted-ca\") pod \"collector-2xfbc\" (UID: \"847cd111-98c5-4c39-bc29-1ba2bcdf570c\") " pod="openshift-logging/collector-2xfbc"
Dec 08 09:15:36 crc kubenswrapper[4776]: I1208 09:15:36.090396 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/847cd111-98c5-4c39-bc29-1ba2bcdf570c-collector-token\") pod \"collector-2xfbc\" (UID: \"847cd111-98c5-4c39-bc29-1ba2bcdf570c\") " pod="openshift-logging/collector-2xfbc"
Dec 08 09:15:36 crc kubenswrapper[4776]: I1208 09:15:36.090487 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/847cd111-98c5-4c39-bc29-1ba2bcdf570c-sa-token\") pod \"collector-2xfbc\" (UID: \"847cd111-98c5-4c39-bc29-1ba2bcdf570c\") " pod="openshift-logging/collector-2xfbc"
Dec 08 09:15:36 crc kubenswrapper[4776]: I1208 09:15:36.090519 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/847cd111-98c5-4c39-bc29-1ba2bcdf570c-entrypoint\") pod \"collector-2xfbc\" (UID: \"847cd111-98c5-4c39-bc29-1ba2bcdf570c\") " pod="openshift-logging/collector-2xfbc"
Dec 08 09:15:36 crc kubenswrapper[4776]: I1208 09:15:36.090545 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/847cd111-98c5-4c39-bc29-1ba2bcdf570c-config-openshift-service-cacrt\") pod \"collector-2xfbc\" (UID: \"847cd111-98c5-4c39-bc29-1ba2bcdf570c\") " pod="openshift-logging/collector-2xfbc"
Dec 08 09:15:36 crc kubenswrapper[4776]: I1208 09:15:36.090578 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/847cd111-98c5-4c39-bc29-1ba2bcdf570c-tmp\") pod \"collector-2xfbc\" (UID: \"847cd111-98c5-4c39-bc29-1ba2bcdf570c\") " pod="openshift-logging/collector-2xfbc"
Dec 08 09:15:36 crc kubenswrapper[4776]: I1208 09:15:36.090611 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/847cd111-98c5-4c39-bc29-1ba2bcdf570c-metrics\") pod \"collector-2xfbc\" (UID: \"847cd111-98c5-4c39-bc29-1ba2bcdf570c\") " pod="openshift-logging/collector-2xfbc"
Dec 08 09:15:36 crc kubenswrapper[4776]: I1208 09:15:36.090636 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/847cd111-98c5-4c39-bc29-1ba2bcdf570c-datadir\") pod \"collector-2xfbc\" (UID: \"847cd111-98c5-4c39-bc29-1ba2bcdf570c\") " pod="openshift-logging/collector-2xfbc"
Dec 08 09:15:36 crc kubenswrapper[4776]: I1208 09:15:36.090664 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/847cd111-98c5-4c39-bc29-1ba2bcdf570c-collector-syslog-receiver\") pod \"collector-2xfbc\" (UID: \"847cd111-98c5-4c39-bc29-1ba2bcdf570c\") " pod="openshift-logging/collector-2xfbc"
Dec 08 09:15:36 crc kubenswrapper[4776]: I1208 09:15:36.090684 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/847cd111-98c5-4c39-bc29-1ba2bcdf570c-config\") pod \"collector-2xfbc\" (UID: \"847cd111-98c5-4c39-bc29-1ba2bcdf570c\") " pod="openshift-logging/collector-2xfbc"
Dec 08 09:15:36 crc kubenswrapper[4776]: I1208 09:15:36.090746 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbbdc\" (UniqueName: \"kubernetes.io/projected/847cd111-98c5-4c39-bc29-1ba2bcdf570c-kube-api-access-xbbdc\") pod \"collector-2xfbc\" (UID: \"847cd111-98c5-4c39-bc29-1ba2bcdf570c\") " pod="openshift-logging/collector-2xfbc"
Dec 08 09:15:36 crc kubenswrapper[4776]: I1208 09:15:36.091011 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/847cd111-98c5-4c39-bc29-1ba2bcdf570c-trusted-ca\") pod \"collector-2xfbc\" (UID: \"847cd111-98c5-4c39-bc29-1ba2bcdf570c\") " pod="openshift-logging/collector-2xfbc"
Dec 08 09:15:36 crc kubenswrapper[4776]: I1208 09:15:36.091009 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/847cd111-98c5-4c39-bc29-1ba2bcdf570c-datadir\") pod \"collector-2xfbc\" (UID: \"847cd111-98c5-4c39-bc29-1ba2bcdf570c\") " pod="openshift-logging/collector-2xfbc"
Dec 08 09:15:36 crc kubenswrapper[4776]: I1208 09:15:36.091117 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/847cd111-98c5-4c39-bc29-1ba2bcdf570c-config-openshift-service-cacrt\") pod \"collector-2xfbc\" (UID: \"847cd111-98c5-4c39-bc29-1ba2bcdf570c\") " pod="openshift-logging/collector-2xfbc"
Dec 08 09:15:36 crc kubenswrapper[4776]: I1208 09:15:36.091853 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/847cd111-98c5-4c39-bc29-1ba2bcdf570c-entrypoint\") pod \"collector-2xfbc\" (UID: \"847cd111-98c5-4c39-bc29-1ba2bcdf570c\") " pod="openshift-logging/collector-2xfbc"
Dec 08 09:15:36 crc kubenswrapper[4776]: I1208 09:15:36.092402 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/847cd111-98c5-4c39-bc29-1ba2bcdf570c-config\") pod \"collector-2xfbc\" (UID: \"847cd111-98c5-4c39-bc29-1ba2bcdf570c\") " pod="openshift-logging/collector-2xfbc"
Dec 08 09:15:36 crc kubenswrapper[4776]: I1208 09:15:36.094371 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/847cd111-98c5-4c39-bc29-1ba2bcdf570c-tmp\") pod \"collector-2xfbc\" (UID: \"847cd111-98c5-4c39-bc29-1ba2bcdf570c\") " pod="openshift-logging/collector-2xfbc"
Dec 08 09:15:36 crc kubenswrapper[4776]: I1208 09:15:36.094447 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/847cd111-98c5-4c39-bc29-1ba2bcdf570c-metrics\") pod \"collector-2xfbc\" (UID: \"847cd111-98c5-4c39-bc29-1ba2bcdf570c\") " pod="openshift-logging/collector-2xfbc"
Dec 08 09:15:36 crc kubenswrapper[4776]: I1208 09:15:36.095247 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/847cd111-98c5-4c39-bc29-1ba2bcdf570c-collector-syslog-receiver\") pod \"collector-2xfbc\" (UID: \"847cd111-98c5-4c39-bc29-1ba2bcdf570c\") " pod="openshift-logging/collector-2xfbc"
Dec 08 09:15:36 crc kubenswrapper[4776]: I1208 09:15:36.095454 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/847cd111-98c5-4c39-bc29-1ba2bcdf570c-collector-token\") pod \"collector-2xfbc\" (UID: \"847cd111-98c5-4c39-bc29-1ba2bcdf570c\") " pod="openshift-logging/collector-2xfbc"
Dec 08 09:15:36 crc kubenswrapper[4776]: I1208 09:15:36.118262 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbbdc\" (UniqueName: \"kubernetes.io/projected/847cd111-98c5-4c39-bc29-1ba2bcdf570c-kube-api-access-xbbdc\") pod \"collector-2xfbc\" (UID: \"847cd111-98c5-4c39-bc29-1ba2bcdf570c\") " pod="openshift-logging/collector-2xfbc"
Dec 08 09:15:36 crc kubenswrapper[4776]: I1208 09:15:36.119516 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/847cd111-98c5-4c39-bc29-1ba2bcdf570c-sa-token\") pod \"collector-2xfbc\" (UID: \"847cd111-98c5-4c39-bc29-1ba2bcdf570c\") " pod="openshift-logging/collector-2xfbc"
Dec 08 09:15:36 crc kubenswrapper[4776]: I1208 09:15:36.208949 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-2xfbc"
Dec 08 09:15:36 crc kubenswrapper[4776]: I1208 09:15:36.378952 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4619629f-223c-470d-a27b-25000aff6b8a" path="/var/lib/kubelet/pods/4619629f-223c-470d-a27b-25000aff6b8a/volumes"
Dec 08 09:15:36 crc kubenswrapper[4776]: I1208 09:15:36.702106 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-2xfbc"]
Dec 08 09:15:36 crc kubenswrapper[4776]: I1208 09:15:36.787538 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-2xfbc" event={"ID":"847cd111-98c5-4c39-bc29-1ba2bcdf570c","Type":"ContainerStarted","Data":"0674d97592a93cf8a7689c2217dccea1d2943e941b9d8c6da69d5c6b494a6536"}
Dec 08 09:15:43 crc kubenswrapper[4776]: I1208 09:15:43.837764 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-2xfbc" event={"ID":"847cd111-98c5-4c39-bc29-1ba2bcdf570c","Type":"ContainerStarted","Data":"e2c46aadd12db1aad3728195746b1b3432fb7c2e1a292fa6b3fe062608502891"}
Dec 08 09:15:43 crc kubenswrapper[4776]: I1208 09:15:43.874204 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/collector-2xfbc" podStartSLOduration=2.687998694 podStartE2EDuration="8.874149192s" podCreationTimestamp="2025-12-08 09:15:35 +0000 UTC" firstStartedPulling="2025-12-08 09:15:36.72308962 +0000 UTC m=+1012.986314682" lastFinishedPulling="2025-12-08 09:15:42.909240148 +0000 UTC m=+1019.172465180" observedRunningTime="2025-12-08 09:15:43.872080186 +0000 UTC m=+1020.135305508" watchObservedRunningTime="2025-12-08 09:15:43.874149192 +0000 UTC m=+1020.137374254"
Dec 08 09:16:14 crc kubenswrapper[4776]: I1208 09:16:14.482584 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9x89q"]
Dec 08 09:16:14 crc kubenswrapper[4776]: I1208 09:16:14.485218 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9x89q"
Dec 08 09:16:14 crc kubenswrapper[4776]: I1208 09:16:14.487140 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Dec 08 09:16:14 crc kubenswrapper[4776]: I1208 09:16:14.491555 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9x89q"]
Dec 08 09:16:14 crc kubenswrapper[4776]: I1208 09:16:14.526254 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8567f1db-9f8a-49aa-8864-e18aef8b18e7-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9x89q\" (UID: \"8567f1db-9f8a-49aa-8864-e18aef8b18e7\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9x89q"
Dec 08 09:16:14 crc kubenswrapper[4776]: I1208 09:16:14.526325 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8567f1db-9f8a-49aa-8864-e18aef8b18e7-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9x89q\" (UID: \"8567f1db-9f8a-49aa-8864-e18aef8b18e7\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9x89q"
Dec 08 09:16:14 crc kubenswrapper[4776]: I1208 09:16:14.526502 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nld8z\" (UniqueName: \"kubernetes.io/projected/8567f1db-9f8a-49aa-8864-e18aef8b18e7-kube-api-access-nld8z\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9x89q\" (UID: \"8567f1db-9f8a-49aa-8864-e18aef8b18e7\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9x89q"
Dec 08 09:16:14 crc kubenswrapper[4776]: I1208 09:16:14.628371 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8567f1db-9f8a-49aa-8864-e18aef8b18e7-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9x89q\" (UID: \"8567f1db-9f8a-49aa-8864-e18aef8b18e7\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9x89q"
Dec 08 09:16:14 crc kubenswrapper[4776]: I1208 09:16:14.628443 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8567f1db-9f8a-49aa-8864-e18aef8b18e7-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9x89q\" (UID: \"8567f1db-9f8a-49aa-8864-e18aef8b18e7\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9x89q"
Dec 08 09:16:14 crc kubenswrapper[4776]: I1208 09:16:14.628506 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nld8z\" (UniqueName: \"kubernetes.io/projected/8567f1db-9f8a-49aa-8864-e18aef8b18e7-kube-api-access-nld8z\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9x89q\" (UID: \"8567f1db-9f8a-49aa-8864-e18aef8b18e7\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9x89q"
Dec 08 09:16:14 crc kubenswrapper[4776]: I1208 09:16:14.629315 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8567f1db-9f8a-49aa-8864-e18aef8b18e7-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9x89q\" (UID: \"8567f1db-9f8a-49aa-8864-e18aef8b18e7\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9x89q"
Dec 08 09:16:14 crc kubenswrapper[4776]: I1208 09:16:14.629571 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8567f1db-9f8a-49aa-8864-e18aef8b18e7-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9x89q\" (UID: \"8567f1db-9f8a-49aa-8864-e18aef8b18e7\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9x89q"
Dec 08 09:16:14 crc kubenswrapper[4776]: I1208 09:16:14.650027 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nld8z\" (UniqueName: \"kubernetes.io/projected/8567f1db-9f8a-49aa-8864-e18aef8b18e7-kube-api-access-nld8z\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9x89q\" (UID: \"8567f1db-9f8a-49aa-8864-e18aef8b18e7\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9x89q"
Dec 08 09:16:14 crc kubenswrapper[4776]: I1208 09:16:14.807357 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9x89q"
Dec 08 09:16:15 crc kubenswrapper[4776]: I1208 09:16:15.314410 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9x89q"]
Dec 08 09:16:16 crc kubenswrapper[4776]: I1208 09:16:16.113017 4776 generic.go:334] "Generic (PLEG): container finished" podID="8567f1db-9f8a-49aa-8864-e18aef8b18e7" containerID="931ab4b538fa825bdd263675ea9a41c1b5440a23dd3ed1be4290de1473f4f1a4" exitCode=0
Dec 08 09:16:16 crc kubenswrapper[4776]: I1208 09:16:16.113066 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9x89q" event={"ID":"8567f1db-9f8a-49aa-8864-e18aef8b18e7","Type":"ContainerDied","Data":"931ab4b538fa825bdd263675ea9a41c1b5440a23dd3ed1be4290de1473f4f1a4"}
Dec 08 09:16:16 crc kubenswrapper[4776]: I1208 09:16:16.113096 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9x89q" event={"ID":"8567f1db-9f8a-49aa-8864-e18aef8b18e7","Type":"ContainerStarted","Data":"d0fd06517c17553688377f3d7d44a70a4cc236d4fdd0c9d2af4a06d76de09e6f"}
Dec 08 09:16:18 crc kubenswrapper[4776]: I1208 09:16:18.132100 4776 generic.go:334] "Generic (PLEG): container finished" podID="8567f1db-9f8a-49aa-8864-e18aef8b18e7" containerID="d03e67c2de001317f3d74b3d5ccba2757c6d797d42f5670f454e02774db24f02" exitCode=0
Dec 08 09:16:18 crc kubenswrapper[4776]: I1208 09:16:18.132806 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9x89q" event={"ID":"8567f1db-9f8a-49aa-8864-e18aef8b18e7","Type":"ContainerDied","Data":"d03e67c2de001317f3d74b3d5ccba2757c6d797d42f5670f454e02774db24f02"}
Dec 08 09:16:19 crc kubenswrapper[4776]: I1208 09:16:19.140560 4776 generic.go:334] "Generic (PLEG): container finished" podID="8567f1db-9f8a-49aa-8864-e18aef8b18e7" containerID="d671025fdbe2a1bab5547d7bc5701188624344c7381a04012e3e81795e55a16a" exitCode=0
Dec 08 09:16:19 crc kubenswrapper[4776]: I1208 09:16:19.140631 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9x89q" event={"ID":"8567f1db-9f8a-49aa-8864-e18aef8b18e7","Type":"ContainerDied","Data":"d671025fdbe2a1bab5547d7bc5701188624344c7381a04012e3e81795e55a16a"}
Dec 08 09:16:20 crc kubenswrapper[4776]: I1208 09:16:20.440497 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9x89q"
Dec 08 09:16:20 crc kubenswrapper[4776]: I1208 09:16:20.523426 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8567f1db-9f8a-49aa-8864-e18aef8b18e7-bundle\") pod \"8567f1db-9f8a-49aa-8864-e18aef8b18e7\" (UID: \"8567f1db-9f8a-49aa-8864-e18aef8b18e7\") "
Dec 08 09:16:20 crc kubenswrapper[4776]: I1208 09:16:20.523549 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8567f1db-9f8a-49aa-8864-e18aef8b18e7-util\") pod \"8567f1db-9f8a-49aa-8864-e18aef8b18e7\" (UID: \"8567f1db-9f8a-49aa-8864-e18aef8b18e7\") "
Dec 08 09:16:20 crc kubenswrapper[4776]: I1208 09:16:20.523609 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nld8z\" (UniqueName: \"kubernetes.io/projected/8567f1db-9f8a-49aa-8864-e18aef8b18e7-kube-api-access-nld8z\") pod \"8567f1db-9f8a-49aa-8864-e18aef8b18e7\" (UID: \"8567f1db-9f8a-49aa-8864-e18aef8b18e7\") "
Dec 08 09:16:20 crc kubenswrapper[4776]: I1208 09:16:20.523970 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8567f1db-9f8a-49aa-8864-e18aef8b18e7-bundle" (OuterVolumeSpecName: "bundle") pod "8567f1db-9f8a-49aa-8864-e18aef8b18e7" (UID: "8567f1db-9f8a-49aa-8864-e18aef8b18e7"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 08 09:16:20 crc kubenswrapper[4776]: I1208 09:16:20.524227 4776 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8567f1db-9f8a-49aa-8864-e18aef8b18e7-bundle\") on node \"crc\" DevicePath \"\""
Dec 08 09:16:20 crc kubenswrapper[4776]: I1208 09:16:20.528448 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8567f1db-9f8a-49aa-8864-e18aef8b18e7-kube-api-access-nld8z" (OuterVolumeSpecName: "kube-api-access-nld8z") pod "8567f1db-9f8a-49aa-8864-e18aef8b18e7" (UID: "8567f1db-9f8a-49aa-8864-e18aef8b18e7"). InnerVolumeSpecName "kube-api-access-nld8z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 08 09:16:20 crc kubenswrapper[4776]: I1208 09:16:20.538856 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8567f1db-9f8a-49aa-8864-e18aef8b18e7-util" (OuterVolumeSpecName: "util") pod "8567f1db-9f8a-49aa-8864-e18aef8b18e7" (UID: "8567f1db-9f8a-49aa-8864-e18aef8b18e7"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 08 09:16:20 crc kubenswrapper[4776]: I1208 09:16:20.643668 4776 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8567f1db-9f8a-49aa-8864-e18aef8b18e7-util\") on node \"crc\" DevicePath \"\""
Dec 08 09:16:20 crc kubenswrapper[4776]: I1208 09:16:20.643720 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nld8z\" (UniqueName: \"kubernetes.io/projected/8567f1db-9f8a-49aa-8864-e18aef8b18e7-kube-api-access-nld8z\") on node \"crc\" DevicePath \"\""
Dec 08 09:16:21 crc kubenswrapper[4776]: I1208 09:16:21.160353 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9x89q" event={"ID":"8567f1db-9f8a-49aa-8864-e18aef8b18e7","Type":"ContainerDied","Data":"d0fd06517c17553688377f3d7d44a70a4cc236d4fdd0c9d2af4a06d76de09e6f"}
Dec 08 09:16:21 crc kubenswrapper[4776]: I1208 09:16:21.160643 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0fd06517c17553688377f3d7d44a70a4cc236d4fdd0c9d2af4a06d76de09e6f"
Dec 08 09:16:21 crc kubenswrapper[4776]: I1208 09:16:21.160517 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9x89q"
Dec 08 09:16:26 crc kubenswrapper[4776]: I1208 09:16:26.424454 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-n9gb5"]
Dec 08 09:16:26 crc kubenswrapper[4776]: E1208 09:16:26.425338 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8567f1db-9f8a-49aa-8864-e18aef8b18e7" containerName="util"
Dec 08 09:16:26 crc kubenswrapper[4776]: I1208 09:16:26.425354 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="8567f1db-9f8a-49aa-8864-e18aef8b18e7" containerName="util"
Dec 08 09:16:26 crc kubenswrapper[4776]: E1208 09:16:26.425371 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8567f1db-9f8a-49aa-8864-e18aef8b18e7" containerName="extract"
Dec 08 09:16:26 crc kubenswrapper[4776]: I1208 09:16:26.425378 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="8567f1db-9f8a-49aa-8864-e18aef8b18e7" containerName="extract"
Dec 08 09:16:26 crc kubenswrapper[4776]: E1208 09:16:26.425403 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8567f1db-9f8a-49aa-8864-e18aef8b18e7" containerName="pull"
Dec 08 09:16:26 crc kubenswrapper[4776]: I1208 09:16:26.425411 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="8567f1db-9f8a-49aa-8864-e18aef8b18e7" containerName="pull"
Dec 08 09:16:26 crc kubenswrapper[4776]: I1208 09:16:26.425567 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="8567f1db-9f8a-49aa-8864-e18aef8b18e7" containerName="extract"
Dec 08 09:16:26 crc kubenswrapper[4776]: I1208 09:16:26.426169 4776 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-n9gb5" Dec 08 09:16:26 crc kubenswrapper[4776]: I1208 09:16:26.428350 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-gxq6r" Dec 08 09:16:26 crc kubenswrapper[4776]: I1208 09:16:26.429862 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 08 09:16:26 crc kubenswrapper[4776]: I1208 09:16:26.434331 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 08 09:16:26 crc kubenswrapper[4776]: I1208 09:16:26.440912 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-n9gb5"] Dec 08 09:16:26 crc kubenswrapper[4776]: I1208 09:16:26.548096 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvzjv\" (UniqueName: \"kubernetes.io/projected/5322a22f-cb6b-45df-af5f-395b2180a64b-kube-api-access-cvzjv\") pod \"nmstate-operator-5b5b58f5c8-n9gb5\" (UID: \"5322a22f-cb6b-45df-af5f-395b2180a64b\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-n9gb5" Dec 08 09:16:26 crc kubenswrapper[4776]: I1208 09:16:26.649805 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvzjv\" (UniqueName: \"kubernetes.io/projected/5322a22f-cb6b-45df-af5f-395b2180a64b-kube-api-access-cvzjv\") pod \"nmstate-operator-5b5b58f5c8-n9gb5\" (UID: \"5322a22f-cb6b-45df-af5f-395b2180a64b\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-n9gb5" Dec 08 09:16:26 crc kubenswrapper[4776]: I1208 09:16:26.687491 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvzjv\" (UniqueName: \"kubernetes.io/projected/5322a22f-cb6b-45df-af5f-395b2180a64b-kube-api-access-cvzjv\") pod \"nmstate-operator-5b5b58f5c8-n9gb5\" (UID: 
\"5322a22f-cb6b-45df-af5f-395b2180a64b\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-n9gb5" Dec 08 09:16:26 crc kubenswrapper[4776]: I1208 09:16:26.746773 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-n9gb5" Dec 08 09:16:27 crc kubenswrapper[4776]: I1208 09:16:27.254560 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-n9gb5"] Dec 08 09:16:28 crc kubenswrapper[4776]: I1208 09:16:28.205574 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-n9gb5" event={"ID":"5322a22f-cb6b-45df-af5f-395b2180a64b","Type":"ContainerStarted","Data":"577bfcf079efaaec12584402a69a05347e79651c60e2845c5e35149fe3ea985b"} Dec 08 09:16:30 crc kubenswrapper[4776]: I1208 09:16:30.221093 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-n9gb5" event={"ID":"5322a22f-cb6b-45df-af5f-395b2180a64b","Type":"ContainerStarted","Data":"1abde7a6196e0ad8f65c5e09849e6c4636e8c5f4fbbfdfaf61f4b306c743361c"} Dec 08 09:16:30 crc kubenswrapper[4776]: I1208 09:16:30.245044 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-n9gb5" podStartSLOduration=2.065260173 podStartE2EDuration="4.245021835s" podCreationTimestamp="2025-12-08 09:16:26 +0000 UTC" firstStartedPulling="2025-12-08 09:16:27.263983784 +0000 UTC m=+1063.527208806" lastFinishedPulling="2025-12-08 09:16:29.443745446 +0000 UTC m=+1065.706970468" observedRunningTime="2025-12-08 09:16:30.240496142 +0000 UTC m=+1066.503721164" watchObservedRunningTime="2025-12-08 09:16:30.245021835 +0000 UTC m=+1066.508246857" Dec 08 09:16:35 crc kubenswrapper[4776]: I1208 09:16:35.703827 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-lpbj6"] Dec 08 09:16:35 crc kubenswrapper[4776]: I1208 
09:16:35.707667 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-lpbj6" Dec 08 09:16:35 crc kubenswrapper[4776]: I1208 09:16:35.708818 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-69dpx"] Dec 08 09:16:35 crc kubenswrapper[4776]: I1208 09:16:35.709706 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-69dpx" Dec 08 09:16:35 crc kubenswrapper[4776]: I1208 09:16:35.710359 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-7c5dr" Dec 08 09:16:35 crc kubenswrapper[4776]: I1208 09:16:35.719458 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 08 09:16:35 crc kubenswrapper[4776]: I1208 09:16:35.722303 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-lpbj6"] Dec 08 09:16:35 crc kubenswrapper[4776]: I1208 09:16:35.733354 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-69dpx"] Dec 08 09:16:35 crc kubenswrapper[4776]: I1208 09:16:35.762776 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-b8ckr"] Dec 08 09:16:35 crc kubenswrapper[4776]: I1208 09:16:35.764298 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-b8ckr" Dec 08 09:16:35 crc kubenswrapper[4776]: I1208 09:16:35.815524 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd6j2\" (UniqueName: \"kubernetes.io/projected/b26e1190-f68a-487b-a2de-e0116525ab64-kube-api-access-gd6j2\") pod \"nmstate-metrics-7f946cbc9-lpbj6\" (UID: \"b26e1190-f68a-487b-a2de-e0116525ab64\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-lpbj6" Dec 08 09:16:35 crc kubenswrapper[4776]: I1208 09:16:35.815797 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dngg\" (UniqueName: \"kubernetes.io/projected/e59b99c1-c9b8-4127-93db-933dddb3ebab-kube-api-access-9dngg\") pod \"nmstate-webhook-5f6d4c5ccb-69dpx\" (UID: \"e59b99c1-c9b8-4127-93db-933dddb3ebab\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-69dpx" Dec 08 09:16:35 crc kubenswrapper[4776]: I1208 09:16:35.815953 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e59b99c1-c9b8-4127-93db-933dddb3ebab-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-69dpx\" (UID: \"e59b99c1-c9b8-4127-93db-933dddb3ebab\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-69dpx" Dec 08 09:16:35 crc kubenswrapper[4776]: I1208 09:16:35.855559 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-8ds4l"] Dec 08 09:16:35 crc kubenswrapper[4776]: I1208 09:16:35.857674 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-8ds4l" Dec 08 09:16:35 crc kubenswrapper[4776]: I1208 09:16:35.859605 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-dpmr2" Dec 08 09:16:35 crc kubenswrapper[4776]: I1208 09:16:35.860263 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 08 09:16:35 crc kubenswrapper[4776]: I1208 09:16:35.860417 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 08 09:16:35 crc kubenswrapper[4776]: I1208 09:16:35.870289 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-8ds4l"] Dec 08 09:16:35 crc kubenswrapper[4776]: I1208 09:16:35.918215 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5e0ef761-506d-4695-b58a-128a6f5f7957-nmstate-lock\") pod \"nmstate-handler-b8ckr\" (UID: \"5e0ef761-506d-4695-b58a-128a6f5f7957\") " pod="openshift-nmstate/nmstate-handler-b8ckr" Dec 08 09:16:35 crc kubenswrapper[4776]: I1208 09:16:35.918265 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5e0ef761-506d-4695-b58a-128a6f5f7957-dbus-socket\") pod \"nmstate-handler-b8ckr\" (UID: \"5e0ef761-506d-4695-b58a-128a6f5f7957\") " pod="openshift-nmstate/nmstate-handler-b8ckr" Dec 08 09:16:35 crc kubenswrapper[4776]: I1208 09:16:35.918295 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd6j2\" (UniqueName: \"kubernetes.io/projected/b26e1190-f68a-487b-a2de-e0116525ab64-kube-api-access-gd6j2\") pod \"nmstate-metrics-7f946cbc9-lpbj6\" (UID: \"b26e1190-f68a-487b-a2de-e0116525ab64\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-lpbj6" 
Dec 08 09:16:35 crc kubenswrapper[4776]: I1208 09:16:35.918319 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5e0ef761-506d-4695-b58a-128a6f5f7957-ovs-socket\") pod \"nmstate-handler-b8ckr\" (UID: \"5e0ef761-506d-4695-b58a-128a6f5f7957\") " pod="openshift-nmstate/nmstate-handler-b8ckr" Dec 08 09:16:35 crc kubenswrapper[4776]: I1208 09:16:35.918345 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dngg\" (UniqueName: \"kubernetes.io/projected/e59b99c1-c9b8-4127-93db-933dddb3ebab-kube-api-access-9dngg\") pod \"nmstate-webhook-5f6d4c5ccb-69dpx\" (UID: \"e59b99c1-c9b8-4127-93db-933dddb3ebab\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-69dpx" Dec 08 09:16:35 crc kubenswrapper[4776]: I1208 09:16:35.918370 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e59b99c1-c9b8-4127-93db-933dddb3ebab-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-69dpx\" (UID: \"e59b99c1-c9b8-4127-93db-933dddb3ebab\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-69dpx" Dec 08 09:16:35 crc kubenswrapper[4776]: I1208 09:16:35.918420 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grhkp\" (UniqueName: \"kubernetes.io/projected/5e0ef761-506d-4695-b58a-128a6f5f7957-kube-api-access-grhkp\") pod \"nmstate-handler-b8ckr\" (UID: \"5e0ef761-506d-4695-b58a-128a6f5f7957\") " pod="openshift-nmstate/nmstate-handler-b8ckr" Dec 08 09:16:35 crc kubenswrapper[4776]: I1208 09:16:35.924993 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e59b99c1-c9b8-4127-93db-933dddb3ebab-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-69dpx\" (UID: \"e59b99c1-c9b8-4127-93db-933dddb3ebab\") " 
pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-69dpx" Dec 08 09:16:35 crc kubenswrapper[4776]: I1208 09:16:35.937729 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dngg\" (UniqueName: \"kubernetes.io/projected/e59b99c1-c9b8-4127-93db-933dddb3ebab-kube-api-access-9dngg\") pod \"nmstate-webhook-5f6d4c5ccb-69dpx\" (UID: \"e59b99c1-c9b8-4127-93db-933dddb3ebab\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-69dpx" Dec 08 09:16:35 crc kubenswrapper[4776]: I1208 09:16:35.939184 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd6j2\" (UniqueName: \"kubernetes.io/projected/b26e1190-f68a-487b-a2de-e0116525ab64-kube-api-access-gd6j2\") pod \"nmstate-metrics-7f946cbc9-lpbj6\" (UID: \"b26e1190-f68a-487b-a2de-e0116525ab64\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-lpbj6" Dec 08 09:16:36 crc kubenswrapper[4776]: I1208 09:16:36.020223 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5e0ef761-506d-4695-b58a-128a6f5f7957-nmstate-lock\") pod \"nmstate-handler-b8ckr\" (UID: \"5e0ef761-506d-4695-b58a-128a6f5f7957\") " pod="openshift-nmstate/nmstate-handler-b8ckr" Dec 08 09:16:36 crc kubenswrapper[4776]: I1208 09:16:36.020269 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5e0ef761-506d-4695-b58a-128a6f5f7957-dbus-socket\") pod \"nmstate-handler-b8ckr\" (UID: \"5e0ef761-506d-4695-b58a-128a6f5f7957\") " pod="openshift-nmstate/nmstate-handler-b8ckr" Dec 08 09:16:36 crc kubenswrapper[4776]: I1208 09:16:36.020306 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c2ba126f-aa28-4cfa-8aed-a9221e094a58-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-8ds4l\" (UID: \"c2ba126f-aa28-4cfa-8aed-a9221e094a58\") " 
pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-8ds4l" Dec 08 09:16:36 crc kubenswrapper[4776]: I1208 09:16:36.020326 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5e0ef761-506d-4695-b58a-128a6f5f7957-ovs-socket\") pod \"nmstate-handler-b8ckr\" (UID: \"5e0ef761-506d-4695-b58a-128a6f5f7957\") " pod="openshift-nmstate/nmstate-handler-b8ckr" Dec 08 09:16:36 crc kubenswrapper[4776]: I1208 09:16:36.020353 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67fdh\" (UniqueName: \"kubernetes.io/projected/c2ba126f-aa28-4cfa-8aed-a9221e094a58-kube-api-access-67fdh\") pod \"nmstate-console-plugin-7fbb5f6569-8ds4l\" (UID: \"c2ba126f-aa28-4cfa-8aed-a9221e094a58\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-8ds4l" Dec 08 09:16:36 crc kubenswrapper[4776]: I1208 09:16:36.020394 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/c2ba126f-aa28-4cfa-8aed-a9221e094a58-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-8ds4l\" (UID: \"c2ba126f-aa28-4cfa-8aed-a9221e094a58\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-8ds4l" Dec 08 09:16:36 crc kubenswrapper[4776]: I1208 09:16:36.020389 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5e0ef761-506d-4695-b58a-128a6f5f7957-nmstate-lock\") pod \"nmstate-handler-b8ckr\" (UID: \"5e0ef761-506d-4695-b58a-128a6f5f7957\") " pod="openshift-nmstate/nmstate-handler-b8ckr" Dec 08 09:16:36 crc kubenswrapper[4776]: I1208 09:16:36.020426 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grhkp\" (UniqueName: \"kubernetes.io/projected/5e0ef761-506d-4695-b58a-128a6f5f7957-kube-api-access-grhkp\") pod 
\"nmstate-handler-b8ckr\" (UID: \"5e0ef761-506d-4695-b58a-128a6f5f7957\") " pod="openshift-nmstate/nmstate-handler-b8ckr" Dec 08 09:16:36 crc kubenswrapper[4776]: I1208 09:16:36.020462 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5e0ef761-506d-4695-b58a-128a6f5f7957-ovs-socket\") pod \"nmstate-handler-b8ckr\" (UID: \"5e0ef761-506d-4695-b58a-128a6f5f7957\") " pod="openshift-nmstate/nmstate-handler-b8ckr" Dec 08 09:16:36 crc kubenswrapper[4776]: I1208 09:16:36.020695 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5e0ef761-506d-4695-b58a-128a6f5f7957-dbus-socket\") pod \"nmstate-handler-b8ckr\" (UID: \"5e0ef761-506d-4695-b58a-128a6f5f7957\") " pod="openshift-nmstate/nmstate-handler-b8ckr" Dec 08 09:16:36 crc kubenswrapper[4776]: I1208 09:16:36.036007 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-lpbj6" Dec 08 09:16:36 crc kubenswrapper[4776]: I1208 09:16:36.043122 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grhkp\" (UniqueName: \"kubernetes.io/projected/5e0ef761-506d-4695-b58a-128a6f5f7957-kube-api-access-grhkp\") pod \"nmstate-handler-b8ckr\" (UID: \"5e0ef761-506d-4695-b58a-128a6f5f7957\") " pod="openshift-nmstate/nmstate-handler-b8ckr" Dec 08 09:16:36 crc kubenswrapper[4776]: I1208 09:16:36.046628 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-69dpx" Dec 08 09:16:36 crc kubenswrapper[4776]: I1208 09:16:36.050087 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-588757d595-b54s9"] Dec 08 09:16:36 crc kubenswrapper[4776]: I1208 09:16:36.050983 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-588757d595-b54s9" Dec 08 09:16:36 crc kubenswrapper[4776]: I1208 09:16:36.070892 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-588757d595-b54s9"] Dec 08 09:16:36 crc kubenswrapper[4776]: I1208 09:16:36.082574 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-b8ckr" Dec 08 09:16:36 crc kubenswrapper[4776]: I1208 09:16:36.121929 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c2ba126f-aa28-4cfa-8aed-a9221e094a58-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-8ds4l\" (UID: \"c2ba126f-aa28-4cfa-8aed-a9221e094a58\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-8ds4l" Dec 08 09:16:36 crc kubenswrapper[4776]: I1208 09:16:36.122294 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67fdh\" (UniqueName: \"kubernetes.io/projected/c2ba126f-aa28-4cfa-8aed-a9221e094a58-kube-api-access-67fdh\") pod \"nmstate-console-plugin-7fbb5f6569-8ds4l\" (UID: \"c2ba126f-aa28-4cfa-8aed-a9221e094a58\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-8ds4l" Dec 08 09:16:36 crc kubenswrapper[4776]: I1208 09:16:36.122332 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6afc1e0-d554-4135-a114-4cc735150c43-trusted-ca-bundle\") pod \"console-588757d595-b54s9\" (UID: \"f6afc1e0-d554-4135-a114-4cc735150c43\") " pod="openshift-console/console-588757d595-b54s9" Dec 08 09:16:36 crc kubenswrapper[4776]: I1208 09:16:36.122366 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/c2ba126f-aa28-4cfa-8aed-a9221e094a58-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-8ds4l\" 
(UID: \"c2ba126f-aa28-4cfa-8aed-a9221e094a58\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-8ds4l" Dec 08 09:16:36 crc kubenswrapper[4776]: I1208 09:16:36.122394 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f6afc1e0-d554-4135-a114-4cc735150c43-oauth-serving-cert\") pod \"console-588757d595-b54s9\" (UID: \"f6afc1e0-d554-4135-a114-4cc735150c43\") " pod="openshift-console/console-588757d595-b54s9" Dec 08 09:16:36 crc kubenswrapper[4776]: I1208 09:16:36.122411 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm7nd\" (UniqueName: \"kubernetes.io/projected/f6afc1e0-d554-4135-a114-4cc735150c43-kube-api-access-cm7nd\") pod \"console-588757d595-b54s9\" (UID: \"f6afc1e0-d554-4135-a114-4cc735150c43\") " pod="openshift-console/console-588757d595-b54s9" Dec 08 09:16:36 crc kubenswrapper[4776]: I1208 09:16:36.122441 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f6afc1e0-d554-4135-a114-4cc735150c43-console-config\") pod \"console-588757d595-b54s9\" (UID: \"f6afc1e0-d554-4135-a114-4cc735150c43\") " pod="openshift-console/console-588757d595-b54s9" Dec 08 09:16:36 crc kubenswrapper[4776]: I1208 09:16:36.122472 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f6afc1e0-d554-4135-a114-4cc735150c43-console-serving-cert\") pod \"console-588757d595-b54s9\" (UID: \"f6afc1e0-d554-4135-a114-4cc735150c43\") " pod="openshift-console/console-588757d595-b54s9" Dec 08 09:16:36 crc kubenswrapper[4776]: I1208 09:16:36.122515 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/f6afc1e0-d554-4135-a114-4cc735150c43-console-oauth-config\") pod \"console-588757d595-b54s9\" (UID: \"f6afc1e0-d554-4135-a114-4cc735150c43\") " pod="openshift-console/console-588757d595-b54s9" Dec 08 09:16:36 crc kubenswrapper[4776]: I1208 09:16:36.122533 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f6afc1e0-d554-4135-a114-4cc735150c43-service-ca\") pod \"console-588757d595-b54s9\" (UID: \"f6afc1e0-d554-4135-a114-4cc735150c43\") " pod="openshift-console/console-588757d595-b54s9" Dec 08 09:16:36 crc kubenswrapper[4776]: I1208 09:16:36.124326 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c2ba126f-aa28-4cfa-8aed-a9221e094a58-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-8ds4l\" (UID: \"c2ba126f-aa28-4cfa-8aed-a9221e094a58\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-8ds4l" Dec 08 09:16:36 crc kubenswrapper[4776]: I1208 09:16:36.134883 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/c2ba126f-aa28-4cfa-8aed-a9221e094a58-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-8ds4l\" (UID: \"c2ba126f-aa28-4cfa-8aed-a9221e094a58\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-8ds4l" Dec 08 09:16:36 crc kubenswrapper[4776]: I1208 09:16:36.142537 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67fdh\" (UniqueName: \"kubernetes.io/projected/c2ba126f-aa28-4cfa-8aed-a9221e094a58-kube-api-access-67fdh\") pod \"nmstate-console-plugin-7fbb5f6569-8ds4l\" (UID: \"c2ba126f-aa28-4cfa-8aed-a9221e094a58\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-8ds4l" Dec 08 09:16:36 crc kubenswrapper[4776]: I1208 09:16:36.173764 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-8ds4l" Dec 08 09:16:36 crc kubenswrapper[4776]: I1208 09:16:36.224450 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f6afc1e0-d554-4135-a114-4cc735150c43-service-ca\") pod \"console-588757d595-b54s9\" (UID: \"f6afc1e0-d554-4135-a114-4cc735150c43\") " pod="openshift-console/console-588757d595-b54s9" Dec 08 09:16:36 crc kubenswrapper[4776]: I1208 09:16:36.226019 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6afc1e0-d554-4135-a114-4cc735150c43-trusted-ca-bundle\") pod \"console-588757d595-b54s9\" (UID: \"f6afc1e0-d554-4135-a114-4cc735150c43\") " pod="openshift-console/console-588757d595-b54s9" Dec 08 09:16:36 crc kubenswrapper[4776]: I1208 09:16:36.226256 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f6afc1e0-d554-4135-a114-4cc735150c43-oauth-serving-cert\") pod \"console-588757d595-b54s9\" (UID: \"f6afc1e0-d554-4135-a114-4cc735150c43\") " pod="openshift-console/console-588757d595-b54s9" Dec 08 09:16:36 crc kubenswrapper[4776]: I1208 09:16:36.226277 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm7nd\" (UniqueName: \"kubernetes.io/projected/f6afc1e0-d554-4135-a114-4cc735150c43-kube-api-access-cm7nd\") pod \"console-588757d595-b54s9\" (UID: \"f6afc1e0-d554-4135-a114-4cc735150c43\") " pod="openshift-console/console-588757d595-b54s9" Dec 08 09:16:36 crc kubenswrapper[4776]: I1208 09:16:36.226315 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f6afc1e0-d554-4135-a114-4cc735150c43-console-config\") pod \"console-588757d595-b54s9\" (UID: 
\"f6afc1e0-d554-4135-a114-4cc735150c43\") " pod="openshift-console/console-588757d595-b54s9" Dec 08 09:16:36 crc kubenswrapper[4776]: I1208 09:16:36.226357 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f6afc1e0-d554-4135-a114-4cc735150c43-console-serving-cert\") pod \"console-588757d595-b54s9\" (UID: \"f6afc1e0-d554-4135-a114-4cc735150c43\") " pod="openshift-console/console-588757d595-b54s9" Dec 08 09:16:36 crc kubenswrapper[4776]: I1208 09:16:36.226408 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f6afc1e0-d554-4135-a114-4cc735150c43-console-oauth-config\") pod \"console-588757d595-b54s9\" (UID: \"f6afc1e0-d554-4135-a114-4cc735150c43\") " pod="openshift-console/console-588757d595-b54s9" Dec 08 09:16:36 crc kubenswrapper[4776]: I1208 09:16:36.227476 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f6afc1e0-d554-4135-a114-4cc735150c43-oauth-serving-cert\") pod \"console-588757d595-b54s9\" (UID: \"f6afc1e0-d554-4135-a114-4cc735150c43\") " pod="openshift-console/console-588757d595-b54s9" Dec 08 09:16:36 crc kubenswrapper[4776]: I1208 09:16:36.227498 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f6afc1e0-d554-4135-a114-4cc735150c43-console-config\") pod \"console-588757d595-b54s9\" (UID: \"f6afc1e0-d554-4135-a114-4cc735150c43\") " pod="openshift-console/console-588757d595-b54s9" Dec 08 09:16:36 crc kubenswrapper[4776]: I1208 09:16:36.228288 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6afc1e0-d554-4135-a114-4cc735150c43-trusted-ca-bundle\") pod \"console-588757d595-b54s9\" (UID: \"f6afc1e0-d554-4135-a114-4cc735150c43\") " 
pod="openshift-console/console-588757d595-b54s9" Dec 08 09:16:36 crc kubenswrapper[4776]: I1208 09:16:36.230302 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f6afc1e0-d554-4135-a114-4cc735150c43-service-ca\") pod \"console-588757d595-b54s9\" (UID: \"f6afc1e0-d554-4135-a114-4cc735150c43\") " pod="openshift-console/console-588757d595-b54s9" Dec 08 09:16:36 crc kubenswrapper[4776]: I1208 09:16:36.238960 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f6afc1e0-d554-4135-a114-4cc735150c43-console-oauth-config\") pod \"console-588757d595-b54s9\" (UID: \"f6afc1e0-d554-4135-a114-4cc735150c43\") " pod="openshift-console/console-588757d595-b54s9" Dec 08 09:16:36 crc kubenswrapper[4776]: I1208 09:16:36.241681 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f6afc1e0-d554-4135-a114-4cc735150c43-console-serving-cert\") pod \"console-588757d595-b54s9\" (UID: \"f6afc1e0-d554-4135-a114-4cc735150c43\") " pod="openshift-console/console-588757d595-b54s9" Dec 08 09:16:36 crc kubenswrapper[4776]: I1208 09:16:36.255795 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm7nd\" (UniqueName: \"kubernetes.io/projected/f6afc1e0-d554-4135-a114-4cc735150c43-kube-api-access-cm7nd\") pod \"console-588757d595-b54s9\" (UID: \"f6afc1e0-d554-4135-a114-4cc735150c43\") " pod="openshift-console/console-588757d595-b54s9" Dec 08 09:16:36 crc kubenswrapper[4776]: I1208 09:16:36.315354 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-b8ckr" event={"ID":"5e0ef761-506d-4695-b58a-128a6f5f7957","Type":"ContainerStarted","Data":"443bff7bfc4e6610cc358560d940e29f8c3e474576fc37fc3f091bdf70879fec"} Dec 08 09:16:36 crc kubenswrapper[4776]: I1208 09:16:36.429222 4776 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-69dpx"] Dec 08 09:16:36 crc kubenswrapper[4776]: W1208 09:16:36.435539 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode59b99c1_c9b8_4127_93db_933dddb3ebab.slice/crio-2c8784a2d4fef8389f987887c10918eafecee6bcc3f18c2e3810d7fe8071a0ca WatchSource:0}: Error finding container 2c8784a2d4fef8389f987887c10918eafecee6bcc3f18c2e3810d7fe8071a0ca: Status 404 returned error can't find the container with id 2c8784a2d4fef8389f987887c10918eafecee6bcc3f18c2e3810d7fe8071a0ca Dec 08 09:16:36 crc kubenswrapper[4776]: I1208 09:16:36.457642 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-588757d595-b54s9" Dec 08 09:16:36 crc kubenswrapper[4776]: I1208 09:16:36.495259 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-lpbj6"] Dec 08 09:16:36 crc kubenswrapper[4776]: W1208 09:16:36.501392 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb26e1190_f68a_487b_a2de_e0116525ab64.slice/crio-8e20162b277f33f6381dcab5da4a9010f73031b104a431f3126303caba41672d WatchSource:0}: Error finding container 8e20162b277f33f6381dcab5da4a9010f73031b104a431f3126303caba41672d: Status 404 returned error can't find the container with id 8e20162b277f33f6381dcab5da4a9010f73031b104a431f3126303caba41672d Dec 08 09:16:36 crc kubenswrapper[4776]: I1208 09:16:36.740856 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-8ds4l"] Dec 08 09:16:36 crc kubenswrapper[4776]: W1208 09:16:36.746330 4776 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2ba126f_aa28_4cfa_8aed_a9221e094a58.slice/crio-f0d3c1c5320ba862d71c5744d6bebb6ad546296b988a2fa44ad3730d2b162961 WatchSource:0}: Error finding container f0d3c1c5320ba862d71c5744d6bebb6ad546296b988a2fa44ad3730d2b162961: Status 404 returned error can't find the container with id f0d3c1c5320ba862d71c5744d6bebb6ad546296b988a2fa44ad3730d2b162961 Dec 08 09:16:36 crc kubenswrapper[4776]: I1208 09:16:36.903553 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-588757d595-b54s9"] Dec 08 09:16:36 crc kubenswrapper[4776]: W1208 09:16:36.907499 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6afc1e0_d554_4135_a114_4cc735150c43.slice/crio-0bfb45fca9e0916bc267d5cff8e5432583aa7aeab826768222041f483c3383a8 WatchSource:0}: Error finding container 0bfb45fca9e0916bc267d5cff8e5432583aa7aeab826768222041f483c3383a8: Status 404 returned error can't find the container with id 0bfb45fca9e0916bc267d5cff8e5432583aa7aeab826768222041f483c3383a8 Dec 08 09:16:37 crc kubenswrapper[4776]: I1208 09:16:37.333016 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-69dpx" event={"ID":"e59b99c1-c9b8-4127-93db-933dddb3ebab","Type":"ContainerStarted","Data":"2c8784a2d4fef8389f987887c10918eafecee6bcc3f18c2e3810d7fe8071a0ca"} Dec 08 09:16:37 crc kubenswrapper[4776]: I1208 09:16:37.334923 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-lpbj6" event={"ID":"b26e1190-f68a-487b-a2de-e0116525ab64","Type":"ContainerStarted","Data":"8e20162b277f33f6381dcab5da4a9010f73031b104a431f3126303caba41672d"} Dec 08 09:16:37 crc kubenswrapper[4776]: I1208 09:16:37.336822 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-588757d595-b54s9" 
event={"ID":"f6afc1e0-d554-4135-a114-4cc735150c43","Type":"ContainerStarted","Data":"32928cbe8a4b53150016bc031db50b46bfbdeedf8ebddfdde86638d7d5774545"} Dec 08 09:16:37 crc kubenswrapper[4776]: I1208 09:16:37.336883 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-588757d595-b54s9" event={"ID":"f6afc1e0-d554-4135-a114-4cc735150c43","Type":"ContainerStarted","Data":"0bfb45fca9e0916bc267d5cff8e5432583aa7aeab826768222041f483c3383a8"} Dec 08 09:16:37 crc kubenswrapper[4776]: I1208 09:16:37.339724 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-8ds4l" event={"ID":"c2ba126f-aa28-4cfa-8aed-a9221e094a58","Type":"ContainerStarted","Data":"f0d3c1c5320ba862d71c5744d6bebb6ad546296b988a2fa44ad3730d2b162961"} Dec 08 09:16:37 crc kubenswrapper[4776]: I1208 09:16:37.363286 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-588757d595-b54s9" podStartSLOduration=1.363054116 podStartE2EDuration="1.363054116s" podCreationTimestamp="2025-12-08 09:16:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:16:37.36059909 +0000 UTC m=+1073.623824102" watchObservedRunningTime="2025-12-08 09:16:37.363054116 +0000 UTC m=+1073.626279138" Dec 08 09:16:39 crc kubenswrapper[4776]: I1208 09:16:39.889522 4776 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 08 09:16:40 crc kubenswrapper[4776]: I1208 09:16:40.374007 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-69dpx" event={"ID":"e59b99c1-c9b8-4127-93db-933dddb3ebab","Type":"ContainerStarted","Data":"921e4ad54c71c0e5292e63ee3e9c90e42fa193c33cf0a7282514fb21295d9858"} Dec 08 09:16:40 crc kubenswrapper[4776]: I1208 09:16:40.374355 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-69dpx" Dec 08 09:16:40 crc kubenswrapper[4776]: I1208 09:16:40.375285 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-lpbj6" event={"ID":"b26e1190-f68a-487b-a2de-e0116525ab64","Type":"ContainerStarted","Data":"7d166280d25c5606a81caf83d141d14a69a9ec32dd4da1cc1a9cb19eff40ae9f"} Dec 08 09:16:40 crc kubenswrapper[4776]: I1208 09:16:40.376409 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-b8ckr" event={"ID":"5e0ef761-506d-4695-b58a-128a6f5f7957","Type":"ContainerStarted","Data":"103200332d926a71a05fec047267261a6e47cafe4e36acd3158bd3b598c571ad"} Dec 08 09:16:40 crc kubenswrapper[4776]: I1208 09:16:40.376473 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-b8ckr" Dec 08 09:16:40 crc kubenswrapper[4776]: I1208 09:16:40.377475 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-8ds4l" event={"ID":"c2ba126f-aa28-4cfa-8aed-a9221e094a58","Type":"ContainerStarted","Data":"00426d2a557435f15556716fef3c2533cb8713cb9d78d1b774285e9ee44fe237"} Dec 08 09:16:40 crc kubenswrapper[4776]: I1208 09:16:40.394535 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-69dpx" podStartSLOduration=2.179571064 podStartE2EDuration="5.394511826s" podCreationTimestamp="2025-12-08 09:16:35 +0000 UTC" firstStartedPulling="2025-12-08 09:16:36.442131128 +0000 UTC m=+1072.705356150" lastFinishedPulling="2025-12-08 09:16:39.6570719 +0000 UTC m=+1075.920296912" observedRunningTime="2025-12-08 09:16:40.389863401 +0000 UTC m=+1076.653088423" watchObservedRunningTime="2025-12-08 09:16:40.394511826 +0000 UTC m=+1076.657736849" Dec 08 09:16:40 crc kubenswrapper[4776]: I1208 09:16:40.423267 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-nmstate/nmstate-handler-b8ckr" podStartSLOduration=1.912093027 podStartE2EDuration="5.423250992s" podCreationTimestamp="2025-12-08 09:16:35 +0000 UTC" firstStartedPulling="2025-12-08 09:16:36.137028526 +0000 UTC m=+1072.400253558" lastFinishedPulling="2025-12-08 09:16:39.648186501 +0000 UTC m=+1075.911411523" observedRunningTime="2025-12-08 09:16:40.417457025 +0000 UTC m=+1076.680682047" watchObservedRunningTime="2025-12-08 09:16:40.423250992 +0000 UTC m=+1076.686476014" Dec 08 09:16:43 crc kubenswrapper[4776]: I1208 09:16:43.403526 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-lpbj6" event={"ID":"b26e1190-f68a-487b-a2de-e0116525ab64","Type":"ContainerStarted","Data":"0f289a9edcd44aaca7c799b819a54f6672aa7bc355e28449dc12057e0632dc63"} Dec 08 09:16:43 crc kubenswrapper[4776]: I1208 09:16:43.423164 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-lpbj6" podStartSLOduration=2.171181329 podStartE2EDuration="8.423144492s" podCreationTimestamp="2025-12-08 09:16:35 +0000 UTC" firstStartedPulling="2025-12-08 09:16:36.507164723 +0000 UTC m=+1072.770389745" lastFinishedPulling="2025-12-08 09:16:42.759127896 +0000 UTC m=+1079.022352908" observedRunningTime="2025-12-08 09:16:43.420580363 +0000 UTC m=+1079.683805425" watchObservedRunningTime="2025-12-08 09:16:43.423144492 +0000 UTC m=+1079.686369534" Dec 08 09:16:43 crc kubenswrapper[4776]: I1208 09:16:43.424068 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-8ds4l" podStartSLOduration=5.519211981 podStartE2EDuration="8.424060716s" podCreationTimestamp="2025-12-08 09:16:35 +0000 UTC" firstStartedPulling="2025-12-08 09:16:36.749033899 +0000 UTC m=+1073.012258921" lastFinishedPulling="2025-12-08 09:16:39.653882614 +0000 UTC m=+1075.917107656" observedRunningTime="2025-12-08 09:16:40.432387579 +0000 UTC 
m=+1076.695612601" watchObservedRunningTime="2025-12-08 09:16:43.424060716 +0000 UTC m=+1079.687285738" Dec 08 09:16:46 crc kubenswrapper[4776]: I1208 09:16:46.118938 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-b8ckr" Dec 08 09:16:46 crc kubenswrapper[4776]: I1208 09:16:46.459228 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-588757d595-b54s9" Dec 08 09:16:46 crc kubenswrapper[4776]: I1208 09:16:46.459622 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-588757d595-b54s9" Dec 08 09:16:46 crc kubenswrapper[4776]: I1208 09:16:46.465631 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-588757d595-b54s9" Dec 08 09:16:47 crc kubenswrapper[4776]: I1208 09:16:47.443629 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-588757d595-b54s9" Dec 08 09:16:47 crc kubenswrapper[4776]: I1208 09:16:47.496508 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-796dfb4c97-zckps"] Dec 08 09:16:56 crc kubenswrapper[4776]: I1208 09:16:56.055728 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-69dpx" Dec 08 09:17:11 crc kubenswrapper[4776]: I1208 09:17:11.399687 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 09:17:11 crc kubenswrapper[4776]: I1208 09:17:11.400285 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 09:17:12 crc kubenswrapper[4776]: I1208 09:17:12.564854 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-796dfb4c97-zckps" podUID="aa22c984-c7d5-497f-b165-484bd945318c" containerName="console" containerID="cri-o://f1853047fafe7bea95051435c12647845ee744a4c7c2672f21a97874c721b856" gracePeriod=15 Dec 08 09:17:12 crc kubenswrapper[4776]: I1208 09:17:12.955094 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-796dfb4c97-zckps_aa22c984-c7d5-497f-b165-484bd945318c/console/0.log" Dec 08 09:17:12 crc kubenswrapper[4776]: I1208 09:17:12.955192 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-796dfb4c97-zckps" Dec 08 09:17:13 crc kubenswrapper[4776]: I1208 09:17:13.081835 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aa22c984-c7d5-497f-b165-484bd945318c-oauth-serving-cert\") pod \"aa22c984-c7d5-497f-b165-484bd945318c\" (UID: \"aa22c984-c7d5-497f-b165-484bd945318c\") " Dec 08 09:17:13 crc kubenswrapper[4776]: I1208 09:17:13.082268 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa22c984-c7d5-497f-b165-484bd945318c-console-serving-cert\") pod \"aa22c984-c7d5-497f-b165-484bd945318c\" (UID: \"aa22c984-c7d5-497f-b165-484bd945318c\") " Dec 08 09:17:13 crc kubenswrapper[4776]: I1208 09:17:13.082305 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aa22c984-c7d5-497f-b165-484bd945318c-service-ca\") pod \"aa22c984-c7d5-497f-b165-484bd945318c\" (UID: 
\"aa22c984-c7d5-497f-b165-484bd945318c\") " Dec 08 09:17:13 crc kubenswrapper[4776]: I1208 09:17:13.082376 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aa22c984-c7d5-497f-b165-484bd945318c-console-config\") pod \"aa22c984-c7d5-497f-b165-484bd945318c\" (UID: \"aa22c984-c7d5-497f-b165-484bd945318c\") " Dec 08 09:17:13 crc kubenswrapper[4776]: I1208 09:17:13.082405 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa22c984-c7d5-497f-b165-484bd945318c-trusted-ca-bundle\") pod \"aa22c984-c7d5-497f-b165-484bd945318c\" (UID: \"aa22c984-c7d5-497f-b165-484bd945318c\") " Dec 08 09:17:13 crc kubenswrapper[4776]: I1208 09:17:13.082487 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzmm7\" (UniqueName: \"kubernetes.io/projected/aa22c984-c7d5-497f-b165-484bd945318c-kube-api-access-wzmm7\") pod \"aa22c984-c7d5-497f-b165-484bd945318c\" (UID: \"aa22c984-c7d5-497f-b165-484bd945318c\") " Dec 08 09:17:13 crc kubenswrapper[4776]: I1208 09:17:13.082548 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aa22c984-c7d5-497f-b165-484bd945318c-console-oauth-config\") pod \"aa22c984-c7d5-497f-b165-484bd945318c\" (UID: \"aa22c984-c7d5-497f-b165-484bd945318c\") " Dec 08 09:17:13 crc kubenswrapper[4776]: I1208 09:17:13.083011 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa22c984-c7d5-497f-b165-484bd945318c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "aa22c984-c7d5-497f-b165-484bd945318c" (UID: "aa22c984-c7d5-497f-b165-484bd945318c"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:17:13 crc kubenswrapper[4776]: I1208 09:17:13.084348 4776 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aa22c984-c7d5-497f-b165-484bd945318c-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 08 09:17:13 crc kubenswrapper[4776]: I1208 09:17:13.084659 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa22c984-c7d5-497f-b165-484bd945318c-console-config" (OuterVolumeSpecName: "console-config") pod "aa22c984-c7d5-497f-b165-484bd945318c" (UID: "aa22c984-c7d5-497f-b165-484bd945318c"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:17:13 crc kubenswrapper[4776]: I1208 09:17:13.084918 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa22c984-c7d5-497f-b165-484bd945318c-service-ca" (OuterVolumeSpecName: "service-ca") pod "aa22c984-c7d5-497f-b165-484bd945318c" (UID: "aa22c984-c7d5-497f-b165-484bd945318c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:17:13 crc kubenswrapper[4776]: I1208 09:17:13.085252 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa22c984-c7d5-497f-b165-484bd945318c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "aa22c984-c7d5-497f-b165-484bd945318c" (UID: "aa22c984-c7d5-497f-b165-484bd945318c"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:17:13 crc kubenswrapper[4776]: I1208 09:17:13.091079 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa22c984-c7d5-497f-b165-484bd945318c-kube-api-access-wzmm7" (OuterVolumeSpecName: "kube-api-access-wzmm7") pod "aa22c984-c7d5-497f-b165-484bd945318c" (UID: "aa22c984-c7d5-497f-b165-484bd945318c"). InnerVolumeSpecName "kube-api-access-wzmm7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:17:13 crc kubenswrapper[4776]: I1208 09:17:13.102853 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa22c984-c7d5-497f-b165-484bd945318c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "aa22c984-c7d5-497f-b165-484bd945318c" (UID: "aa22c984-c7d5-497f-b165-484bd945318c"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:17:13 crc kubenswrapper[4776]: I1208 09:17:13.103399 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa22c984-c7d5-497f-b165-484bd945318c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "aa22c984-c7d5-497f-b165-484bd945318c" (UID: "aa22c984-c7d5-497f-b165-484bd945318c"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:17:13 crc kubenswrapper[4776]: I1208 09:17:13.185442 4776 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aa22c984-c7d5-497f-b165-484bd945318c-service-ca\") on node \"crc\" DevicePath \"\"" Dec 08 09:17:13 crc kubenswrapper[4776]: I1208 09:17:13.185481 4776 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aa22c984-c7d5-497f-b165-484bd945318c-console-config\") on node \"crc\" DevicePath \"\"" Dec 08 09:17:13 crc kubenswrapper[4776]: I1208 09:17:13.185492 4776 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa22c984-c7d5-497f-b165-484bd945318c-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:17:13 crc kubenswrapper[4776]: I1208 09:17:13.185501 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzmm7\" (UniqueName: \"kubernetes.io/projected/aa22c984-c7d5-497f-b165-484bd945318c-kube-api-access-wzmm7\") on node \"crc\" DevicePath \"\"" Dec 08 09:17:13 crc kubenswrapper[4776]: I1208 09:17:13.185510 4776 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aa22c984-c7d5-497f-b165-484bd945318c-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 08 09:17:13 crc kubenswrapper[4776]: I1208 09:17:13.185518 4776 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa22c984-c7d5-497f-b165-484bd945318c-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 08 09:17:13 crc kubenswrapper[4776]: I1208 09:17:13.638785 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-796dfb4c97-zckps_aa22c984-c7d5-497f-b165-484bd945318c/console/0.log" Dec 08 09:17:13 crc kubenswrapper[4776]: I1208 09:17:13.638831 4776 generic.go:334] 
"Generic (PLEG): container finished" podID="aa22c984-c7d5-497f-b165-484bd945318c" containerID="f1853047fafe7bea95051435c12647845ee744a4c7c2672f21a97874c721b856" exitCode=2 Dec 08 09:17:13 crc kubenswrapper[4776]: I1208 09:17:13.638862 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-796dfb4c97-zckps" event={"ID":"aa22c984-c7d5-497f-b165-484bd945318c","Type":"ContainerDied","Data":"f1853047fafe7bea95051435c12647845ee744a4c7c2672f21a97874c721b856"} Dec 08 09:17:13 crc kubenswrapper[4776]: I1208 09:17:13.638887 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-796dfb4c97-zckps" event={"ID":"aa22c984-c7d5-497f-b165-484bd945318c","Type":"ContainerDied","Data":"d0853a1efba37437854e67d0918eb49f6d3ab215a6a2d31e375fbace68e731a9"} Dec 08 09:17:13 crc kubenswrapper[4776]: I1208 09:17:13.638902 4776 scope.go:117] "RemoveContainer" containerID="f1853047fafe7bea95051435c12647845ee744a4c7c2672f21a97874c721b856" Dec 08 09:17:13 crc kubenswrapper[4776]: I1208 09:17:13.639005 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-796dfb4c97-zckps" Dec 08 09:17:13 crc kubenswrapper[4776]: I1208 09:17:13.658419 4776 scope.go:117] "RemoveContainer" containerID="f1853047fafe7bea95051435c12647845ee744a4c7c2672f21a97874c721b856" Dec 08 09:17:13 crc kubenswrapper[4776]: E1208 09:17:13.658921 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1853047fafe7bea95051435c12647845ee744a4c7c2672f21a97874c721b856\": container with ID starting with f1853047fafe7bea95051435c12647845ee744a4c7c2672f21a97874c721b856 not found: ID does not exist" containerID="f1853047fafe7bea95051435c12647845ee744a4c7c2672f21a97874c721b856" Dec 08 09:17:13 crc kubenswrapper[4776]: I1208 09:17:13.658981 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1853047fafe7bea95051435c12647845ee744a4c7c2672f21a97874c721b856"} err="failed to get container status \"f1853047fafe7bea95051435c12647845ee744a4c7c2672f21a97874c721b856\": rpc error: code = NotFound desc = could not find container \"f1853047fafe7bea95051435c12647845ee744a4c7c2672f21a97874c721b856\": container with ID starting with f1853047fafe7bea95051435c12647845ee744a4c7c2672f21a97874c721b856 not found: ID does not exist" Dec 08 09:17:13 crc kubenswrapper[4776]: I1208 09:17:13.675965 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-796dfb4c97-zckps"] Dec 08 09:17:13 crc kubenswrapper[4776]: I1208 09:17:13.684021 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-796dfb4c97-zckps"] Dec 08 09:17:14 crc kubenswrapper[4776]: I1208 09:17:14.353199 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa22c984-c7d5-497f-b165-484bd945318c" path="/var/lib/kubelet/pods/aa22c984-c7d5-497f-b165-484bd945318c/volumes" Dec 08 09:17:29 crc kubenswrapper[4776]: I1208 09:17:29.039270 4776 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nx2vl"] Dec 08 09:17:29 crc kubenswrapper[4776]: E1208 09:17:29.040090 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa22c984-c7d5-497f-b165-484bd945318c" containerName="console" Dec 08 09:17:29 crc kubenswrapper[4776]: I1208 09:17:29.040106 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa22c984-c7d5-497f-b165-484bd945318c" containerName="console" Dec 08 09:17:29 crc kubenswrapper[4776]: I1208 09:17:29.040540 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa22c984-c7d5-497f-b165-484bd945318c" containerName="console" Dec 08 09:17:29 crc kubenswrapper[4776]: I1208 09:17:29.042749 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nx2vl" Dec 08 09:17:29 crc kubenswrapper[4776]: I1208 09:17:29.047406 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 08 09:17:29 crc kubenswrapper[4776]: I1208 09:17:29.060216 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nx2vl"] Dec 08 09:17:29 crc kubenswrapper[4776]: I1208 09:17:29.243113 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtmzr\" (UniqueName: \"kubernetes.io/projected/ecb04392-c8da-4ee9-ae5f-aa7212b963e9-kube-api-access-jtmzr\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nx2vl\" (UID: \"ecb04392-c8da-4ee9-ae5f-aa7212b963e9\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nx2vl" Dec 08 09:17:29 crc kubenswrapper[4776]: I1208 09:17:29.243208 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/ecb04392-c8da-4ee9-ae5f-aa7212b963e9-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nx2vl\" (UID: \"ecb04392-c8da-4ee9-ae5f-aa7212b963e9\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nx2vl" Dec 08 09:17:29 crc kubenswrapper[4776]: I1208 09:17:29.243261 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ecb04392-c8da-4ee9-ae5f-aa7212b963e9-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nx2vl\" (UID: \"ecb04392-c8da-4ee9-ae5f-aa7212b963e9\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nx2vl" Dec 08 09:17:29 crc kubenswrapper[4776]: I1208 09:17:29.345456 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ecb04392-c8da-4ee9-ae5f-aa7212b963e9-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nx2vl\" (UID: \"ecb04392-c8da-4ee9-ae5f-aa7212b963e9\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nx2vl" Dec 08 09:17:29 crc kubenswrapper[4776]: I1208 09:17:29.345585 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ecb04392-c8da-4ee9-ae5f-aa7212b963e9-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nx2vl\" (UID: \"ecb04392-c8da-4ee9-ae5f-aa7212b963e9\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nx2vl" Dec 08 09:17:29 crc kubenswrapper[4776]: I1208 09:17:29.345718 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtmzr\" (UniqueName: \"kubernetes.io/projected/ecb04392-c8da-4ee9-ae5f-aa7212b963e9-kube-api-access-jtmzr\") pod 
\"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nx2vl\" (UID: \"ecb04392-c8da-4ee9-ae5f-aa7212b963e9\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nx2vl" Dec 08 09:17:29 crc kubenswrapper[4776]: I1208 09:17:29.346075 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ecb04392-c8da-4ee9-ae5f-aa7212b963e9-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nx2vl\" (UID: \"ecb04392-c8da-4ee9-ae5f-aa7212b963e9\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nx2vl" Dec 08 09:17:29 crc kubenswrapper[4776]: I1208 09:17:29.346305 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ecb04392-c8da-4ee9-ae5f-aa7212b963e9-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nx2vl\" (UID: \"ecb04392-c8da-4ee9-ae5f-aa7212b963e9\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nx2vl" Dec 08 09:17:29 crc kubenswrapper[4776]: I1208 09:17:29.370347 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtmzr\" (UniqueName: \"kubernetes.io/projected/ecb04392-c8da-4ee9-ae5f-aa7212b963e9-kube-api-access-jtmzr\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nx2vl\" (UID: \"ecb04392-c8da-4ee9-ae5f-aa7212b963e9\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nx2vl" Dec 08 09:17:29 crc kubenswrapper[4776]: I1208 09:17:29.370778 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nx2vl" Dec 08 09:17:29 crc kubenswrapper[4776]: I1208 09:17:29.850463 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nx2vl"] Dec 08 09:17:30 crc kubenswrapper[4776]: I1208 09:17:30.778381 4776 generic.go:334] "Generic (PLEG): container finished" podID="ecb04392-c8da-4ee9-ae5f-aa7212b963e9" containerID="61b27cc21ecd2fc8d3b34522df2ef390b5ed5486219ea1c4035b82f51d2318f9" exitCode=0 Dec 08 09:17:30 crc kubenswrapper[4776]: I1208 09:17:30.778609 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nx2vl" event={"ID":"ecb04392-c8da-4ee9-ae5f-aa7212b963e9","Type":"ContainerDied","Data":"61b27cc21ecd2fc8d3b34522df2ef390b5ed5486219ea1c4035b82f51d2318f9"} Dec 08 09:17:30 crc kubenswrapper[4776]: I1208 09:17:30.779292 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nx2vl" event={"ID":"ecb04392-c8da-4ee9-ae5f-aa7212b963e9","Type":"ContainerStarted","Data":"31a8e57f62812e2ab971d2b6d9048e8df58a5dafd20f5f485d9f3567097fd4de"} Dec 08 09:17:32 crc kubenswrapper[4776]: I1208 09:17:32.794569 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nx2vl" event={"ID":"ecb04392-c8da-4ee9-ae5f-aa7212b963e9","Type":"ContainerDied","Data":"ba0e4dd78f1ed2382654be665176479dbff22c0bcd0d16cae0fc05797f3f455c"} Dec 08 09:17:32 crc kubenswrapper[4776]: I1208 09:17:32.794436 4776 generic.go:334] "Generic (PLEG): container finished" podID="ecb04392-c8da-4ee9-ae5f-aa7212b963e9" containerID="ba0e4dd78f1ed2382654be665176479dbff22c0bcd0d16cae0fc05797f3f455c" exitCode=0 Dec 08 09:17:33 crc kubenswrapper[4776]: I1208 09:17:33.808825 4776 
generic.go:334] "Generic (PLEG): container finished" podID="ecb04392-c8da-4ee9-ae5f-aa7212b963e9" containerID="e76013c895428e7a042d0bf857a48d30b759ea9ec12a0c32f1edd80db02e6b25" exitCode=0
Dec 08 09:17:33 crc kubenswrapper[4776]: I1208 09:17:33.808893 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nx2vl" event={"ID":"ecb04392-c8da-4ee9-ae5f-aa7212b963e9","Type":"ContainerDied","Data":"e76013c895428e7a042d0bf857a48d30b759ea9ec12a0c32f1edd80db02e6b25"}
Dec 08 09:17:35 crc kubenswrapper[4776]: I1208 09:17:35.149668 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nx2vl"
Dec 08 09:17:35 crc kubenswrapper[4776]: I1208 09:17:35.352898 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ecb04392-c8da-4ee9-ae5f-aa7212b963e9-bundle\") pod \"ecb04392-c8da-4ee9-ae5f-aa7212b963e9\" (UID: \"ecb04392-c8da-4ee9-ae5f-aa7212b963e9\") "
Dec 08 09:17:35 crc kubenswrapper[4776]: I1208 09:17:35.352982 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtmzr\" (UniqueName: \"kubernetes.io/projected/ecb04392-c8da-4ee9-ae5f-aa7212b963e9-kube-api-access-jtmzr\") pod \"ecb04392-c8da-4ee9-ae5f-aa7212b963e9\" (UID: \"ecb04392-c8da-4ee9-ae5f-aa7212b963e9\") "
Dec 08 09:17:35 crc kubenswrapper[4776]: I1208 09:17:35.353522 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ecb04392-c8da-4ee9-ae5f-aa7212b963e9-util\") pod \"ecb04392-c8da-4ee9-ae5f-aa7212b963e9\" (UID: \"ecb04392-c8da-4ee9-ae5f-aa7212b963e9\") "
Dec 08 09:17:35 crc kubenswrapper[4776]: I1208 09:17:35.354793 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecb04392-c8da-4ee9-ae5f-aa7212b963e9-bundle" (OuterVolumeSpecName: "bundle") pod "ecb04392-c8da-4ee9-ae5f-aa7212b963e9" (UID: "ecb04392-c8da-4ee9-ae5f-aa7212b963e9"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 08 09:17:35 crc kubenswrapper[4776]: I1208 09:17:35.363324 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecb04392-c8da-4ee9-ae5f-aa7212b963e9-kube-api-access-jtmzr" (OuterVolumeSpecName: "kube-api-access-jtmzr") pod "ecb04392-c8da-4ee9-ae5f-aa7212b963e9" (UID: "ecb04392-c8da-4ee9-ae5f-aa7212b963e9"). InnerVolumeSpecName "kube-api-access-jtmzr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 08 09:17:35 crc kubenswrapper[4776]: I1208 09:17:35.455577 4776 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ecb04392-c8da-4ee9-ae5f-aa7212b963e9-bundle\") on node \"crc\" DevicePath \"\""
Dec 08 09:17:35 crc kubenswrapper[4776]: I1208 09:17:35.455604 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtmzr\" (UniqueName: \"kubernetes.io/projected/ecb04392-c8da-4ee9-ae5f-aa7212b963e9-kube-api-access-jtmzr\") on node \"crc\" DevicePath \"\""
Dec 08 09:17:35 crc kubenswrapper[4776]: I1208 09:17:35.752150 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecb04392-c8da-4ee9-ae5f-aa7212b963e9-util" (OuterVolumeSpecName: "util") pod "ecb04392-c8da-4ee9-ae5f-aa7212b963e9" (UID: "ecb04392-c8da-4ee9-ae5f-aa7212b963e9"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 08 09:17:35 crc kubenswrapper[4776]: I1208 09:17:35.760751 4776 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ecb04392-c8da-4ee9-ae5f-aa7212b963e9-util\") on node \"crc\" DevicePath \"\""
Dec 08 09:17:35 crc kubenswrapper[4776]: I1208 09:17:35.823358 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nx2vl" event={"ID":"ecb04392-c8da-4ee9-ae5f-aa7212b963e9","Type":"ContainerDied","Data":"31a8e57f62812e2ab971d2b6d9048e8df58a5dafd20f5f485d9f3567097fd4de"}
Dec 08 09:17:35 crc kubenswrapper[4776]: I1208 09:17:35.823403 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31a8e57f62812e2ab971d2b6d9048e8df58a5dafd20f5f485d9f3567097fd4de"
Dec 08 09:17:35 crc kubenswrapper[4776]: I1208 09:17:35.823423 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nx2vl"
Dec 08 09:17:41 crc kubenswrapper[4776]: I1208 09:17:41.398705 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 08 09:17:41 crc kubenswrapper[4776]: I1208 09:17:41.399505 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 08 09:17:47 crc kubenswrapper[4776]: I1208 09:17:47.522153 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7567df7f9b-ctl76"]
Dec 08 09:17:47 crc kubenswrapper[4776]: E1208 09:17:47.522834 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecb04392-c8da-4ee9-ae5f-aa7212b963e9" containerName="pull"
Dec 08 09:17:47 crc kubenswrapper[4776]: I1208 09:17:47.522851 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecb04392-c8da-4ee9-ae5f-aa7212b963e9" containerName="pull"
Dec 08 09:17:47 crc kubenswrapper[4776]: E1208 09:17:47.522869 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecb04392-c8da-4ee9-ae5f-aa7212b963e9" containerName="util"
Dec 08 09:17:47 crc kubenswrapper[4776]: I1208 09:17:47.522878 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecb04392-c8da-4ee9-ae5f-aa7212b963e9" containerName="util"
Dec 08 09:17:47 crc kubenswrapper[4776]: E1208 09:17:47.522905 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecb04392-c8da-4ee9-ae5f-aa7212b963e9" containerName="extract"
Dec 08 09:17:47 crc kubenswrapper[4776]: I1208 09:17:47.522913 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecb04392-c8da-4ee9-ae5f-aa7212b963e9" containerName="extract"
Dec 08 09:17:47 crc kubenswrapper[4776]: I1208 09:17:47.523072 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecb04392-c8da-4ee9-ae5f-aa7212b963e9" containerName="extract"
Dec 08 09:17:47 crc kubenswrapper[4776]: I1208 09:17:47.523769 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7567df7f9b-ctl76"
Dec 08 09:17:47 crc kubenswrapper[4776]: I1208 09:17:47.526270 4776 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Dec 08 09:17:47 crc kubenswrapper[4776]: I1208 09:17:47.527259 4776 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Dec 08 09:17:47 crc kubenswrapper[4776]: I1208 09:17:47.528912 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Dec 08 09:17:47 crc kubenswrapper[4776]: I1208 09:17:47.529083 4776 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-kf86s"
Dec 08 09:17:47 crc kubenswrapper[4776]: I1208 09:17:47.533013 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Dec 08 09:17:47 crc kubenswrapper[4776]: I1208 09:17:47.560663 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7567df7f9b-ctl76"]
Dec 08 09:17:47 crc kubenswrapper[4776]: I1208 09:17:47.647202 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/579d6f99-9917-455f-b0cf-350c24bae128-webhook-cert\") pod \"metallb-operator-controller-manager-7567df7f9b-ctl76\" (UID: \"579d6f99-9917-455f-b0cf-350c24bae128\") " pod="metallb-system/metallb-operator-controller-manager-7567df7f9b-ctl76"
Dec 08 09:17:47 crc kubenswrapper[4776]: I1208 09:17:47.647290 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2kp5\" (UniqueName: \"kubernetes.io/projected/579d6f99-9917-455f-b0cf-350c24bae128-kube-api-access-k2kp5\") pod \"metallb-operator-controller-manager-7567df7f9b-ctl76\" (UID: \"579d6f99-9917-455f-b0cf-350c24bae128\") " pod="metallb-system/metallb-operator-controller-manager-7567df7f9b-ctl76"
Dec 08 09:17:47 crc kubenswrapper[4776]: I1208 09:17:47.647382 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/579d6f99-9917-455f-b0cf-350c24bae128-apiservice-cert\") pod \"metallb-operator-controller-manager-7567df7f9b-ctl76\" (UID: \"579d6f99-9917-455f-b0cf-350c24bae128\") " pod="metallb-system/metallb-operator-controller-manager-7567df7f9b-ctl76"
Dec 08 09:17:47 crc kubenswrapper[4776]: I1208 09:17:47.748869 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/579d6f99-9917-455f-b0cf-350c24bae128-webhook-cert\") pod \"metallb-operator-controller-manager-7567df7f9b-ctl76\" (UID: \"579d6f99-9917-455f-b0cf-350c24bae128\") " pod="metallb-system/metallb-operator-controller-manager-7567df7f9b-ctl76"
Dec 08 09:17:47 crc kubenswrapper[4776]: I1208 09:17:47.749129 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2kp5\" (UniqueName: \"kubernetes.io/projected/579d6f99-9917-455f-b0cf-350c24bae128-kube-api-access-k2kp5\") pod \"metallb-operator-controller-manager-7567df7f9b-ctl76\" (UID: \"579d6f99-9917-455f-b0cf-350c24bae128\") " pod="metallb-system/metallb-operator-controller-manager-7567df7f9b-ctl76"
Dec 08 09:17:47 crc kubenswrapper[4776]: I1208 09:17:47.749250 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/579d6f99-9917-455f-b0cf-350c24bae128-apiservice-cert\") pod \"metallb-operator-controller-manager-7567df7f9b-ctl76\" (UID: \"579d6f99-9917-455f-b0cf-350c24bae128\") " pod="metallb-system/metallb-operator-controller-manager-7567df7f9b-ctl76"
Dec 08 09:17:47 crc kubenswrapper[4776]: I1208 09:17:47.754902 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/579d6f99-9917-455f-b0cf-350c24bae128-apiservice-cert\") pod \"metallb-operator-controller-manager-7567df7f9b-ctl76\" (UID: \"579d6f99-9917-455f-b0cf-350c24bae128\") " pod="metallb-system/metallb-operator-controller-manager-7567df7f9b-ctl76"
Dec 08 09:17:47 crc kubenswrapper[4776]: I1208 09:17:47.756468 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/579d6f99-9917-455f-b0cf-350c24bae128-webhook-cert\") pod \"metallb-operator-controller-manager-7567df7f9b-ctl76\" (UID: \"579d6f99-9917-455f-b0cf-350c24bae128\") " pod="metallb-system/metallb-operator-controller-manager-7567df7f9b-ctl76"
Dec 08 09:17:47 crc kubenswrapper[4776]: I1208 09:17:47.769085 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2kp5\" (UniqueName: \"kubernetes.io/projected/579d6f99-9917-455f-b0cf-350c24bae128-kube-api-access-k2kp5\") pod \"metallb-operator-controller-manager-7567df7f9b-ctl76\" (UID: \"579d6f99-9917-455f-b0cf-350c24bae128\") " pod="metallb-system/metallb-operator-controller-manager-7567df7f9b-ctl76"
Dec 08 09:17:47 crc kubenswrapper[4776]: I1208 09:17:47.789706 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7f595f4d5-92ttw"]
Dec 08 09:17:47 crc kubenswrapper[4776]: I1208 09:17:47.790697 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7f595f4d5-92ttw"
Dec 08 09:17:47 crc kubenswrapper[4776]: I1208 09:17:47.792439 4776 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-6dljd"
Dec 08 09:17:47 crc kubenswrapper[4776]: I1208 09:17:47.792753 4776 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Dec 08 09:17:47 crc kubenswrapper[4776]: I1208 09:17:47.792764 4776 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert"
Dec 08 09:17:47 crc kubenswrapper[4776]: I1208 09:17:47.844032 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7567df7f9b-ctl76"
Dec 08 09:17:47 crc kubenswrapper[4776]: I1208 09:17:47.857447 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7f595f4d5-92ttw"]
Dec 08 09:17:47 crc kubenswrapper[4776]: I1208 09:17:47.953816 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhfkq\" (UniqueName: \"kubernetes.io/projected/ea83e974-be12-4152-bd97-0d699c0e13b2-kube-api-access-bhfkq\") pod \"metallb-operator-webhook-server-7f595f4d5-92ttw\" (UID: \"ea83e974-be12-4152-bd97-0d699c0e13b2\") " pod="metallb-system/metallb-operator-webhook-server-7f595f4d5-92ttw"
Dec 08 09:17:47 crc kubenswrapper[4776]: I1208 09:17:47.955720 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ea83e974-be12-4152-bd97-0d699c0e13b2-apiservice-cert\") pod \"metallb-operator-webhook-server-7f595f4d5-92ttw\" (UID: \"ea83e974-be12-4152-bd97-0d699c0e13b2\") " pod="metallb-system/metallb-operator-webhook-server-7f595f4d5-92ttw"
Dec 08 09:17:47 crc kubenswrapper[4776]: I1208 09:17:47.955844 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ea83e974-be12-4152-bd97-0d699c0e13b2-webhook-cert\") pod \"metallb-operator-webhook-server-7f595f4d5-92ttw\" (UID: \"ea83e974-be12-4152-bd97-0d699c0e13b2\") " pod="metallb-system/metallb-operator-webhook-server-7f595f4d5-92ttw"
Dec 08 09:17:48 crc kubenswrapper[4776]: I1208 09:17:48.057265 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ea83e974-be12-4152-bd97-0d699c0e13b2-apiservice-cert\") pod \"metallb-operator-webhook-server-7f595f4d5-92ttw\" (UID: \"ea83e974-be12-4152-bd97-0d699c0e13b2\") " pod="metallb-system/metallb-operator-webhook-server-7f595f4d5-92ttw"
Dec 08 09:17:48 crc kubenswrapper[4776]: I1208 09:17:48.057331 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ea83e974-be12-4152-bd97-0d699c0e13b2-webhook-cert\") pod \"metallb-operator-webhook-server-7f595f4d5-92ttw\" (UID: \"ea83e974-be12-4152-bd97-0d699c0e13b2\") " pod="metallb-system/metallb-operator-webhook-server-7f595f4d5-92ttw"
Dec 08 09:17:48 crc kubenswrapper[4776]: I1208 09:17:48.057403 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhfkq\" (UniqueName: \"kubernetes.io/projected/ea83e974-be12-4152-bd97-0d699c0e13b2-kube-api-access-bhfkq\") pod \"metallb-operator-webhook-server-7f595f4d5-92ttw\" (UID: \"ea83e974-be12-4152-bd97-0d699c0e13b2\") " pod="metallb-system/metallb-operator-webhook-server-7f595f4d5-92ttw"
Dec 08 09:17:48 crc kubenswrapper[4776]: I1208 09:17:48.064887 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ea83e974-be12-4152-bd97-0d699c0e13b2-apiservice-cert\") pod \"metallb-operator-webhook-server-7f595f4d5-92ttw\" (UID: \"ea83e974-be12-4152-bd97-0d699c0e13b2\") " pod="metallb-system/metallb-operator-webhook-server-7f595f4d5-92ttw"
Dec 08 09:17:48 crc kubenswrapper[4776]: I1208 09:17:48.081191 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhfkq\" (UniqueName: \"kubernetes.io/projected/ea83e974-be12-4152-bd97-0d699c0e13b2-kube-api-access-bhfkq\") pod \"metallb-operator-webhook-server-7f595f4d5-92ttw\" (UID: \"ea83e974-be12-4152-bd97-0d699c0e13b2\") " pod="metallb-system/metallb-operator-webhook-server-7f595f4d5-92ttw"
Dec 08 09:17:48 crc kubenswrapper[4776]: I1208 09:17:48.082194 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ea83e974-be12-4152-bd97-0d699c0e13b2-webhook-cert\") pod \"metallb-operator-webhook-server-7f595f4d5-92ttw\" (UID: \"ea83e974-be12-4152-bd97-0d699c0e13b2\") " pod="metallb-system/metallb-operator-webhook-server-7f595f4d5-92ttw"
Dec 08 09:17:48 crc kubenswrapper[4776]: I1208 09:17:48.121508 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7f595f4d5-92ttw"
Dec 08 09:17:48 crc kubenswrapper[4776]: I1208 09:17:48.320446 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7567df7f9b-ctl76"]
Dec 08 09:17:48 crc kubenswrapper[4776]: I1208 09:17:48.575240 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7f595f4d5-92ttw"]
Dec 08 09:17:48 crc kubenswrapper[4776]: I1208 09:17:48.925370 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7567df7f9b-ctl76" event={"ID":"579d6f99-9917-455f-b0cf-350c24bae128","Type":"ContainerStarted","Data":"3eea73db86b204f80153a8436d9fa21bd5e1ca282f4df50a2f7242641bef7540"}
Dec 08 09:17:48 crc kubenswrapper[4776]: I1208 09:17:48.926544 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7f595f4d5-92ttw" event={"ID":"ea83e974-be12-4152-bd97-0d699c0e13b2","Type":"ContainerStarted","Data":"36cf96cc152cc8cdeb2d351ecccafa55795b1e0a0e6d821b260e42f6e2709e28"}
Dec 08 09:17:51 crc kubenswrapper[4776]: I1208 09:17:51.951187 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7567df7f9b-ctl76" event={"ID":"579d6f99-9917-455f-b0cf-350c24bae128","Type":"ContainerStarted","Data":"d67f04a45774bd12c4cd630602ff94de3a2b11f856e97237b496bc7561328c8f"}
Dec 08 09:17:51 crc kubenswrapper[4776]: I1208 09:17:51.951740 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7567df7f9b-ctl76"
Dec 08 09:17:51 crc kubenswrapper[4776]: I1208 09:17:51.977149 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7567df7f9b-ctl76" podStartSLOduration=2.207195757 podStartE2EDuration="4.977125396s" podCreationTimestamp="2025-12-08 09:17:47 +0000 UTC" firstStartedPulling="2025-12-08 09:17:48.330259009 +0000 UTC m=+1144.593484031" lastFinishedPulling="2025-12-08 09:17:51.100188648 +0000 UTC m=+1147.363413670" observedRunningTime="2025-12-08 09:17:51.975609945 +0000 UTC m=+1148.238834967" watchObservedRunningTime="2025-12-08 09:17:51.977125396 +0000 UTC m=+1148.240350418"
Dec 08 09:17:53 crc kubenswrapper[4776]: I1208 09:17:53.967633 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7f595f4d5-92ttw" event={"ID":"ea83e974-be12-4152-bd97-0d699c0e13b2","Type":"ContainerStarted","Data":"f18a9dd1d4279d3c050b4209aea317c72a3fbcf24af4806fb8bf8a6c7c7ecba5"}
Dec 08 09:17:53 crc kubenswrapper[4776]: I1208 09:17:53.968035 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7f595f4d5-92ttw"
Dec 08 09:17:53 crc kubenswrapper[4776]: I1208 09:17:53.998778 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7f595f4d5-92ttw" podStartSLOduration=2.696896459 podStartE2EDuration="6.99874731s" podCreationTimestamp="2025-12-08 09:17:47 +0000 UTC" firstStartedPulling="2025-12-08 09:17:48.577757491 +0000 UTC m=+1144.840982513" lastFinishedPulling="2025-12-08 09:17:52.879608342 +0000 UTC m=+1149.142833364" observedRunningTime="2025-12-08 09:17:53.988262908 +0000 UTC m=+1150.251487980" watchObservedRunningTime="2025-12-08 09:17:53.99874731 +0000 UTC m=+1150.261972352"
Dec 08 09:18:08 crc kubenswrapper[4776]: I1208 09:18:08.127209 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7f595f4d5-92ttw"
Dec 08 09:18:11 crc kubenswrapper[4776]: I1208 09:18:11.398896 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 08 09:18:11 crc kubenswrapper[4776]: I1208 09:18:11.399275 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 08 09:18:11 crc kubenswrapper[4776]: I1208 09:18:11.399316 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn"
Dec 08 09:18:11 crc kubenswrapper[4776]: I1208 09:18:11.399945 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"409acf0371b6644dc04fbcc1653de1b5f75319f5fcc98f856ec232671ed68b71"} pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 08 09:18:11 crc kubenswrapper[4776]: I1208 09:18:11.399992 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" containerID="cri-o://409acf0371b6644dc04fbcc1653de1b5f75319f5fcc98f856ec232671ed68b71" gracePeriod=600
Dec 08 09:18:12 crc kubenswrapper[4776]: I1208 09:18:12.116595 4776 generic.go:334] "Generic (PLEG): container finished" podID="c9788ab1-1031-4103-a769-a4b3177c7268" containerID="409acf0371b6644dc04fbcc1653de1b5f75319f5fcc98f856ec232671ed68b71" exitCode=0
Dec 08 09:18:12 crc kubenswrapper[4776]: I1208 09:18:12.116880 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" event={"ID":"c9788ab1-1031-4103-a769-a4b3177c7268","Type":"ContainerDied","Data":"409acf0371b6644dc04fbcc1653de1b5f75319f5fcc98f856ec232671ed68b71"}
Dec 08 09:18:12 crc kubenswrapper[4776]: I1208 09:18:12.116917 4776 scope.go:117] "RemoveContainer" containerID="be636680726361907bd5f0d2d58d00dbbd0c77d0144025e4fa0b6101666966a8"
Dec 08 09:18:13 crc kubenswrapper[4776]: I1208 09:18:13.126513 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" event={"ID":"c9788ab1-1031-4103-a769-a4b3177c7268","Type":"ContainerStarted","Data":"6a5351febb0de8fddebf4555b73007dffb77eb52f317fae03ed23b485a212557"}
Dec 08 09:18:27 crc kubenswrapper[4776]: I1208 09:18:27.847016 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7567df7f9b-ctl76"
Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.569605 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-ph98m"]
Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.572801 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-ph98m"
Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.575318 4776 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-24q8z"
Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.576109 4776 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret"
Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.576478 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup"
Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.577556 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-c68cp"]
Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.578795 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-c68cp"
Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.582530 4776 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert"
Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.585972 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-c68cp"]
Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.635116 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/12fb1453-ed3a-4c22-b33b-8c8e5402de93-reloader\") pod \"frr-k8s-ph98m\" (UID: \"12fb1453-ed3a-4c22-b33b-8c8e5402de93\") " pod="metallb-system/frr-k8s-ph98m"
Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.635228 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ns75d\" (UniqueName: \"kubernetes.io/projected/640e92fd-0236-408e-95ba-a5aacfe784d4-kube-api-access-ns75d\") pod \"frr-k8s-webhook-server-7fcb986d4-c68cp\" (UID: \"640e92fd-0236-408e-95ba-a5aacfe784d4\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-c68cp"
Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.635254 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/12fb1453-ed3a-4c22-b33b-8c8e5402de93-frr-sockets\") pod \"frr-k8s-ph98m\" (UID: \"12fb1453-ed3a-4c22-b33b-8c8e5402de93\") " pod="metallb-system/frr-k8s-ph98m"
Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.635275 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/12fb1453-ed3a-4c22-b33b-8c8e5402de93-metrics\") pod \"frr-k8s-ph98m\" (UID: \"12fb1453-ed3a-4c22-b33b-8c8e5402de93\") " pod="metallb-system/frr-k8s-ph98m"
Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.635362 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/640e92fd-0236-408e-95ba-a5aacfe784d4-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-c68cp\" (UID: \"640e92fd-0236-408e-95ba-a5aacfe784d4\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-c68cp"
Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.635424 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/12fb1453-ed3a-4c22-b33b-8c8e5402de93-frr-startup\") pod \"frr-k8s-ph98m\" (UID: \"12fb1453-ed3a-4c22-b33b-8c8e5402de93\") " pod="metallb-system/frr-k8s-ph98m"
Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.635449 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/12fb1453-ed3a-4c22-b33b-8c8e5402de93-frr-conf\") pod \"frr-k8s-ph98m\" (UID: \"12fb1453-ed3a-4c22-b33b-8c8e5402de93\") " pod="metallb-system/frr-k8s-ph98m"
Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.635467 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nx6xc\" (UniqueName: \"kubernetes.io/projected/12fb1453-ed3a-4c22-b33b-8c8e5402de93-kube-api-access-nx6xc\") pod \"frr-k8s-ph98m\" (UID: \"12fb1453-ed3a-4c22-b33b-8c8e5402de93\") " pod="metallb-system/frr-k8s-ph98m"
Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.635495 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/12fb1453-ed3a-4c22-b33b-8c8e5402de93-metrics-certs\") pod \"frr-k8s-ph98m\" (UID: \"12fb1453-ed3a-4c22-b33b-8c8e5402de93\") " pod="metallb-system/frr-k8s-ph98m"
Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.658568 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-gt7wq"]
Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.659821 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-gt7wq"
Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.662743 4776 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-vbndw"
Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.665620 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2"
Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.674472 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-gfgfc"]
Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.677702 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-gfgfc"
Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.678048 4776 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret"
Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.678101 4776 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist"
Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.679683 4776 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret"
Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.700054 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-gfgfc"]
Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.741441 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/12fb1453-ed3a-4c22-b33b-8c8e5402de93-frr-startup\") pod \"frr-k8s-ph98m\" (UID: \"12fb1453-ed3a-4c22-b33b-8c8e5402de93\") " pod="metallb-system/frr-k8s-ph98m"
Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.741496 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/12fb1453-ed3a-4c22-b33b-8c8e5402de93-frr-conf\") pod \"frr-k8s-ph98m\" (UID: \"12fb1453-ed3a-4c22-b33b-8c8e5402de93\") " pod="metallb-system/frr-k8s-ph98m"
Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.741516 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nx6xc\" (UniqueName: \"kubernetes.io/projected/12fb1453-ed3a-4c22-b33b-8c8e5402de93-kube-api-access-nx6xc\") pod \"frr-k8s-ph98m\" (UID: \"12fb1453-ed3a-4c22-b33b-8c8e5402de93\") " pod="metallb-system/frr-k8s-ph98m"
Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.741541 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f696bcbd-7230-43d3-b09e-645d489eacf3-cert\") pod \"controller-f8648f98b-gfgfc\" (UID: \"f696bcbd-7230-43d3-b09e-645d489eacf3\") " pod="metallb-system/controller-f8648f98b-gfgfc"
Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.741560 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/12fb1453-ed3a-4c22-b33b-8c8e5402de93-metrics-certs\") pod \"frr-k8s-ph98m\" (UID: \"12fb1453-ed3a-4c22-b33b-8c8e5402de93\") " pod="metallb-system/frr-k8s-ph98m"
Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.741582 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a15dc76-0fc2-4f38-ae64-d7efa5ee29a6-metrics-certs\") pod \"speaker-gt7wq\" (UID: \"4a15dc76-0fc2-4f38-ae64-d7efa5ee29a6\") " pod="metallb-system/speaker-gt7wq"
Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.741641 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f696bcbd-7230-43d3-b09e-645d489eacf3-metrics-certs\") pod \"controller-f8648f98b-gfgfc\" (UID: \"f696bcbd-7230-43d3-b09e-645d489eacf3\") " pod="metallb-system/controller-f8648f98b-gfgfc"
Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.741668 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/12fb1453-ed3a-4c22-b33b-8c8e5402de93-reloader\") pod \"frr-k8s-ph98m\" (UID: \"12fb1453-ed3a-4c22-b33b-8c8e5402de93\") " pod="metallb-system/frr-k8s-ph98m"
Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.741707 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khqt5\" (UniqueName: \"kubernetes.io/projected/4a15dc76-0fc2-4f38-ae64-d7efa5ee29a6-kube-api-access-khqt5\") pod \"speaker-gt7wq\" (UID: \"4a15dc76-0fc2-4f38-ae64-d7efa5ee29a6\") " pod="metallb-system/speaker-gt7wq"
Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.741733 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ns75d\" (UniqueName: \"kubernetes.io/projected/640e92fd-0236-408e-95ba-a5aacfe784d4-kube-api-access-ns75d\") pod \"frr-k8s-webhook-server-7fcb986d4-c68cp\" (UID: \"640e92fd-0236-408e-95ba-a5aacfe784d4\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-c68cp"
Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.741758 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4a15dc76-0fc2-4f38-ae64-d7efa5ee29a6-memberlist\") pod \"speaker-gt7wq\" (UID: \"4a15dc76-0fc2-4f38-ae64-d7efa5ee29a6\") " pod="metallb-system/speaker-gt7wq"
Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.741781 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/12fb1453-ed3a-4c22-b33b-8c8e5402de93-frr-sockets\") pod \"frr-k8s-ph98m\" (UID: \"12fb1453-ed3a-4c22-b33b-8c8e5402de93\") " pod="metallb-system/frr-k8s-ph98m"
Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.741803 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/12fb1453-ed3a-4c22-b33b-8c8e5402de93-metrics\") pod \"frr-k8s-ph98m\" (UID: \"12fb1453-ed3a-4c22-b33b-8c8e5402de93\") " pod="metallb-system/frr-k8s-ph98m"
Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.741845 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/4a15dc76-0fc2-4f38-ae64-d7efa5ee29a6-metallb-excludel2\") pod \"speaker-gt7wq\" (UID: \"4a15dc76-0fc2-4f38-ae64-d7efa5ee29a6\") " pod="metallb-system/speaker-gt7wq"
Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.741868 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tls5x\" (UniqueName: \"kubernetes.io/projected/f696bcbd-7230-43d3-b09e-645d489eacf3-kube-api-access-tls5x\") pod \"controller-f8648f98b-gfgfc\" (UID: \"f696bcbd-7230-43d3-b09e-645d489eacf3\") " pod="metallb-system/controller-f8648f98b-gfgfc"
Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.741898 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/640e92fd-0236-408e-95ba-a5aacfe784d4-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-c68cp\" (UID: \"640e92fd-0236-408e-95ba-a5aacfe784d4\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-c68cp"
Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.743099 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/12fb1453-ed3a-4c22-b33b-8c8e5402de93-reloader\") pod \"frr-k8s-ph98m\" (UID: \"12fb1453-ed3a-4c22-b33b-8c8e5402de93\") " pod="metallb-system/frr-k8s-ph98m"
Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.743138 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/12fb1453-ed3a-4c22-b33b-8c8e5402de93-frr-sockets\") pod \"frr-k8s-ph98m\" (UID: \"12fb1453-ed3a-4c22-b33b-8c8e5402de93\") " pod="metallb-system/frr-k8s-ph98m"
Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.743526 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/12fb1453-ed3a-4c22-b33b-8c8e5402de93-metrics\") pod \"frr-k8s-ph98m\" (UID: \"12fb1453-ed3a-4c22-b33b-8c8e5402de93\") " pod="metallb-system/frr-k8s-ph98m"
Dec 08 09:18:28 crc kubenswrapper[4776]: I1208
09:18:28.744534 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/12fb1453-ed3a-4c22-b33b-8c8e5402de93-frr-conf\") pod \"frr-k8s-ph98m\" (UID: \"12fb1453-ed3a-4c22-b33b-8c8e5402de93\") " pod="metallb-system/frr-k8s-ph98m" Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.745031 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/12fb1453-ed3a-4c22-b33b-8c8e5402de93-frr-startup\") pod \"frr-k8s-ph98m\" (UID: \"12fb1453-ed3a-4c22-b33b-8c8e5402de93\") " pod="metallb-system/frr-k8s-ph98m" Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.748952 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/12fb1453-ed3a-4c22-b33b-8c8e5402de93-metrics-certs\") pod \"frr-k8s-ph98m\" (UID: \"12fb1453-ed3a-4c22-b33b-8c8e5402de93\") " pod="metallb-system/frr-k8s-ph98m" Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.760295 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/640e92fd-0236-408e-95ba-a5aacfe784d4-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-c68cp\" (UID: \"640e92fd-0236-408e-95ba-a5aacfe784d4\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-c68cp" Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.761241 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ns75d\" (UniqueName: \"kubernetes.io/projected/640e92fd-0236-408e-95ba-a5aacfe784d4-kube-api-access-ns75d\") pod \"frr-k8s-webhook-server-7fcb986d4-c68cp\" (UID: \"640e92fd-0236-408e-95ba-a5aacfe784d4\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-c68cp" Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.766269 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nx6xc\" (UniqueName: 
\"kubernetes.io/projected/12fb1453-ed3a-4c22-b33b-8c8e5402de93-kube-api-access-nx6xc\") pod \"frr-k8s-ph98m\" (UID: \"12fb1453-ed3a-4c22-b33b-8c8e5402de93\") " pod="metallb-system/frr-k8s-ph98m" Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.843033 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f696bcbd-7230-43d3-b09e-645d489eacf3-cert\") pod \"controller-f8648f98b-gfgfc\" (UID: \"f696bcbd-7230-43d3-b09e-645d489eacf3\") " pod="metallb-system/controller-f8648f98b-gfgfc" Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.843098 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a15dc76-0fc2-4f38-ae64-d7efa5ee29a6-metrics-certs\") pod \"speaker-gt7wq\" (UID: \"4a15dc76-0fc2-4f38-ae64-d7efa5ee29a6\") " pod="metallb-system/speaker-gt7wq" Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.843163 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f696bcbd-7230-43d3-b09e-645d489eacf3-metrics-certs\") pod \"controller-f8648f98b-gfgfc\" (UID: \"f696bcbd-7230-43d3-b09e-645d489eacf3\") " pod="metallb-system/controller-f8648f98b-gfgfc" Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.843224 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khqt5\" (UniqueName: \"kubernetes.io/projected/4a15dc76-0fc2-4f38-ae64-d7efa5ee29a6-kube-api-access-khqt5\") pod \"speaker-gt7wq\" (UID: \"4a15dc76-0fc2-4f38-ae64-d7efa5ee29a6\") " pod="metallb-system/speaker-gt7wq" Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.843248 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4a15dc76-0fc2-4f38-ae64-d7efa5ee29a6-memberlist\") pod \"speaker-gt7wq\" (UID: \"4a15dc76-0fc2-4f38-ae64-d7efa5ee29a6\") 
" pod="metallb-system/speaker-gt7wq" Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.843287 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/4a15dc76-0fc2-4f38-ae64-d7efa5ee29a6-metallb-excludel2\") pod \"speaker-gt7wq\" (UID: \"4a15dc76-0fc2-4f38-ae64-d7efa5ee29a6\") " pod="metallb-system/speaker-gt7wq" Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.843311 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tls5x\" (UniqueName: \"kubernetes.io/projected/f696bcbd-7230-43d3-b09e-645d489eacf3-kube-api-access-tls5x\") pod \"controller-f8648f98b-gfgfc\" (UID: \"f696bcbd-7230-43d3-b09e-645d489eacf3\") " pod="metallb-system/controller-f8648f98b-gfgfc" Dec 08 09:18:28 crc kubenswrapper[4776]: E1208 09:18:28.844930 4776 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 08 09:18:28 crc kubenswrapper[4776]: E1208 09:18:28.845097 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a15dc76-0fc2-4f38-ae64-d7efa5ee29a6-memberlist podName:4a15dc76-0fc2-4f38-ae64-d7efa5ee29a6 nodeName:}" failed. No retries permitted until 2025-12-08 09:18:29.345071804 +0000 UTC m=+1185.608296826 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/4a15dc76-0fc2-4f38-ae64-d7efa5ee29a6-memberlist") pod "speaker-gt7wq" (UID: "4a15dc76-0fc2-4f38-ae64-d7efa5ee29a6") : secret "metallb-memberlist" not found Dec 08 09:18:28 crc kubenswrapper[4776]: E1208 09:18:28.844952 4776 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Dec 08 09:18:28 crc kubenswrapper[4776]: E1208 09:18:28.845344 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f696bcbd-7230-43d3-b09e-645d489eacf3-metrics-certs podName:f696bcbd-7230-43d3-b09e-645d489eacf3 nodeName:}" failed. No retries permitted until 2025-12-08 09:18:29.34533201 +0000 UTC m=+1185.608557032 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f696bcbd-7230-43d3-b09e-645d489eacf3-metrics-certs") pod "controller-f8648f98b-gfgfc" (UID: "f696bcbd-7230-43d3-b09e-645d489eacf3") : secret "controller-certs-secret" not found Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.845790 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/4a15dc76-0fc2-4f38-ae64-d7efa5ee29a6-metallb-excludel2\") pod \"speaker-gt7wq\" (UID: \"4a15dc76-0fc2-4f38-ae64-d7efa5ee29a6\") " pod="metallb-system/speaker-gt7wq" Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.849786 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f696bcbd-7230-43d3-b09e-645d489eacf3-cert\") pod \"controller-f8648f98b-gfgfc\" (UID: \"f696bcbd-7230-43d3-b09e-645d489eacf3\") " pod="metallb-system/controller-f8648f98b-gfgfc" Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.850787 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/4a15dc76-0fc2-4f38-ae64-d7efa5ee29a6-metrics-certs\") pod \"speaker-gt7wq\" (UID: \"4a15dc76-0fc2-4f38-ae64-d7efa5ee29a6\") " pod="metallb-system/speaker-gt7wq" Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.865283 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khqt5\" (UniqueName: \"kubernetes.io/projected/4a15dc76-0fc2-4f38-ae64-d7efa5ee29a6-kube-api-access-khqt5\") pod \"speaker-gt7wq\" (UID: \"4a15dc76-0fc2-4f38-ae64-d7efa5ee29a6\") " pod="metallb-system/speaker-gt7wq" Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.868374 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tls5x\" (UniqueName: \"kubernetes.io/projected/f696bcbd-7230-43d3-b09e-645d489eacf3-kube-api-access-tls5x\") pod \"controller-f8648f98b-gfgfc\" (UID: \"f696bcbd-7230-43d3-b09e-645d489eacf3\") " pod="metallb-system/controller-f8648f98b-gfgfc" Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.907609 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-ph98m" Dec 08 09:18:28 crc kubenswrapper[4776]: I1208 09:18:28.908813 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-c68cp" Dec 08 09:18:29 crc kubenswrapper[4776]: I1208 09:18:29.243561 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ph98m" event={"ID":"12fb1453-ed3a-4c22-b33b-8c8e5402de93","Type":"ContainerStarted","Data":"de7cd451241eb9af2f3fd154400f2c872e8cbdee1a6ef18104bc1ed85091e509"} Dec 08 09:18:29 crc kubenswrapper[4776]: W1208 09:18:29.349320 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod640e92fd_0236_408e_95ba_a5aacfe784d4.slice/crio-d352804f685d928234e4e28fe74bc40390144ab0b7997dc114bcd7bf99311f4b WatchSource:0}: Error finding container d352804f685d928234e4e28fe74bc40390144ab0b7997dc114bcd7bf99311f4b: Status 404 returned error can't find the container with id d352804f685d928234e4e28fe74bc40390144ab0b7997dc114bcd7bf99311f4b Dec 08 09:18:29 crc kubenswrapper[4776]: I1208 09:18:29.353801 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-c68cp"] Dec 08 09:18:29 crc kubenswrapper[4776]: I1208 09:18:29.355149 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f696bcbd-7230-43d3-b09e-645d489eacf3-metrics-certs\") pod \"controller-f8648f98b-gfgfc\" (UID: \"f696bcbd-7230-43d3-b09e-645d489eacf3\") " pod="metallb-system/controller-f8648f98b-gfgfc" Dec 08 09:18:29 crc kubenswrapper[4776]: I1208 09:18:29.355236 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4a15dc76-0fc2-4f38-ae64-d7efa5ee29a6-memberlist\") pod \"speaker-gt7wq\" (UID: \"4a15dc76-0fc2-4f38-ae64-d7efa5ee29a6\") " pod="metallb-system/speaker-gt7wq" Dec 08 09:18:29 crc kubenswrapper[4776]: E1208 09:18:29.355446 4776 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret 
"metallb-memberlist" not found Dec 08 09:18:29 crc kubenswrapper[4776]: E1208 09:18:29.355529 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a15dc76-0fc2-4f38-ae64-d7efa5ee29a6-memberlist podName:4a15dc76-0fc2-4f38-ae64-d7efa5ee29a6 nodeName:}" failed. No retries permitted until 2025-12-08 09:18:30.355510055 +0000 UTC m=+1186.618735077 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/4a15dc76-0fc2-4f38-ae64-d7efa5ee29a6-memberlist") pod "speaker-gt7wq" (UID: "4a15dc76-0fc2-4f38-ae64-d7efa5ee29a6") : secret "metallb-memberlist" not found Dec 08 09:18:29 crc kubenswrapper[4776]: I1208 09:18:29.363671 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f696bcbd-7230-43d3-b09e-645d489eacf3-metrics-certs\") pod \"controller-f8648f98b-gfgfc\" (UID: \"f696bcbd-7230-43d3-b09e-645d489eacf3\") " pod="metallb-system/controller-f8648f98b-gfgfc" Dec 08 09:18:29 crc kubenswrapper[4776]: I1208 09:18:29.594373 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-gfgfc" Dec 08 09:18:29 crc kubenswrapper[4776]: I1208 09:18:29.994202 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-gfgfc"] Dec 08 09:18:30 crc kubenswrapper[4776]: W1208 09:18:30.000325 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf696bcbd_7230_43d3_b09e_645d489eacf3.slice/crio-ad006f9965ae4b5a300ef387b313b7b07a2da6ca39e220b77fe7cc23483359bc WatchSource:0}: Error finding container ad006f9965ae4b5a300ef387b313b7b07a2da6ca39e220b77fe7cc23483359bc: Status 404 returned error can't find the container with id ad006f9965ae4b5a300ef387b313b7b07a2da6ca39e220b77fe7cc23483359bc Dec 08 09:18:30 crc kubenswrapper[4776]: I1208 09:18:30.260806 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-gfgfc" event={"ID":"f696bcbd-7230-43d3-b09e-645d489eacf3","Type":"ContainerStarted","Data":"e90a27d30f6e08da3caf8fd35270ef2f5196f79b2a73e71a044975e5b7980c6e"} Dec 08 09:18:30 crc kubenswrapper[4776]: I1208 09:18:30.261341 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-gfgfc" event={"ID":"f696bcbd-7230-43d3-b09e-645d489eacf3","Type":"ContainerStarted","Data":"ad006f9965ae4b5a300ef387b313b7b07a2da6ca39e220b77fe7cc23483359bc"} Dec 08 09:18:30 crc kubenswrapper[4776]: I1208 09:18:30.268061 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-c68cp" event={"ID":"640e92fd-0236-408e-95ba-a5aacfe784d4","Type":"ContainerStarted","Data":"d352804f685d928234e4e28fe74bc40390144ab0b7997dc114bcd7bf99311f4b"} Dec 08 09:18:30 crc kubenswrapper[4776]: I1208 09:18:30.372041 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4a15dc76-0fc2-4f38-ae64-d7efa5ee29a6-memberlist\") pod 
\"speaker-gt7wq\" (UID: \"4a15dc76-0fc2-4f38-ae64-d7efa5ee29a6\") " pod="metallb-system/speaker-gt7wq" Dec 08 09:18:30 crc kubenswrapper[4776]: I1208 09:18:30.381131 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4a15dc76-0fc2-4f38-ae64-d7efa5ee29a6-memberlist\") pod \"speaker-gt7wq\" (UID: \"4a15dc76-0fc2-4f38-ae64-d7efa5ee29a6\") " pod="metallb-system/speaker-gt7wq" Dec 08 09:18:30 crc kubenswrapper[4776]: I1208 09:18:30.473194 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-gt7wq" Dec 08 09:18:31 crc kubenswrapper[4776]: I1208 09:18:31.292374 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-gt7wq" event={"ID":"4a15dc76-0fc2-4f38-ae64-d7efa5ee29a6","Type":"ContainerStarted","Data":"816a4f1a1f56b71192dda33d587e778ce39481410ec3647c6cbbbad74958203e"} Dec 08 09:18:31 crc kubenswrapper[4776]: I1208 09:18:31.292785 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-gt7wq" event={"ID":"4a15dc76-0fc2-4f38-ae64-d7efa5ee29a6","Type":"ContainerStarted","Data":"fe0c3547f7d047e5b44b7b62a33991d33520ac20bcdf2ff2d36d8dc832faae4d"} Dec 08 09:18:31 crc kubenswrapper[4776]: I1208 09:18:31.292804 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-gt7wq" event={"ID":"4a15dc76-0fc2-4f38-ae64-d7efa5ee29a6","Type":"ContainerStarted","Data":"1dd7b7425e8c70d8dcb7af20f4b944dd824265873efccbd4803424d5e0168972"} Dec 08 09:18:31 crc kubenswrapper[4776]: I1208 09:18:31.293011 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-gt7wq" Dec 08 09:18:31 crc kubenswrapper[4776]: I1208 09:18:31.303578 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-gfgfc" 
event={"ID":"f696bcbd-7230-43d3-b09e-645d489eacf3","Type":"ContainerStarted","Data":"b408271fdb416ec4004b47fd4a2e302c925bc4ad9ab41659d5de7241f65776fe"} Dec 08 09:18:31 crc kubenswrapper[4776]: I1208 09:18:31.304014 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-gfgfc" Dec 08 09:18:31 crc kubenswrapper[4776]: I1208 09:18:31.346597 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-gfgfc" podStartSLOduration=3.346575767 podStartE2EDuration="3.346575767s" podCreationTimestamp="2025-12-08 09:18:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:18:31.339719033 +0000 UTC m=+1187.602944075" watchObservedRunningTime="2025-12-08 09:18:31.346575767 +0000 UTC m=+1187.609800799" Dec 08 09:18:31 crc kubenswrapper[4776]: I1208 09:18:31.348647 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-gt7wq" podStartSLOduration=3.348634152 podStartE2EDuration="3.348634152s" podCreationTimestamp="2025-12-08 09:18:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:18:31.324645466 +0000 UTC m=+1187.587870508" watchObservedRunningTime="2025-12-08 09:18:31.348634152 +0000 UTC m=+1187.611859174" Dec 08 09:18:37 crc kubenswrapper[4776]: I1208 09:18:37.368003 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-c68cp" event={"ID":"640e92fd-0236-408e-95ba-a5aacfe784d4","Type":"ContainerStarted","Data":"1ed8d16b4a0a13b8adb32ee8188e2823b226424e40cfee0e04fc335ddb2614ab"} Dec 08 09:18:37 crc kubenswrapper[4776]: I1208 09:18:37.369401 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-c68cp" Dec 08 
09:18:37 crc kubenswrapper[4776]: I1208 09:18:37.371827 4776 generic.go:334] "Generic (PLEG): container finished" podID="12fb1453-ed3a-4c22-b33b-8c8e5402de93" containerID="6a4ea989fc9a6973881a775f9ac915b10f16bf7ce34a3169a1a5ee3ba83e5cfb" exitCode=0 Dec 08 09:18:37 crc kubenswrapper[4776]: I1208 09:18:37.371937 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ph98m" event={"ID":"12fb1453-ed3a-4c22-b33b-8c8e5402de93","Type":"ContainerDied","Data":"6a4ea989fc9a6973881a775f9ac915b10f16bf7ce34a3169a1a5ee3ba83e5cfb"} Dec 08 09:18:37 crc kubenswrapper[4776]: I1208 09:18:37.397321 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-c68cp" podStartSLOduration=2.177461753 podStartE2EDuration="9.397260418s" podCreationTimestamp="2025-12-08 09:18:28 +0000 UTC" firstStartedPulling="2025-12-08 09:18:29.351870797 +0000 UTC m=+1185.615095819" lastFinishedPulling="2025-12-08 09:18:36.571669462 +0000 UTC m=+1192.834894484" observedRunningTime="2025-12-08 09:18:37.387621999 +0000 UTC m=+1193.650847051" watchObservedRunningTime="2025-12-08 09:18:37.397260418 +0000 UTC m=+1193.660485460" Dec 08 09:18:38 crc kubenswrapper[4776]: I1208 09:18:38.382743 4776 generic.go:334] "Generic (PLEG): container finished" podID="12fb1453-ed3a-4c22-b33b-8c8e5402de93" containerID="9253b728d070407da704ddb36100f73edfee477a4c1d2921bfd0b56950cd1193" exitCode=0 Dec 08 09:18:38 crc kubenswrapper[4776]: I1208 09:18:38.382832 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ph98m" event={"ID":"12fb1453-ed3a-4c22-b33b-8c8e5402de93","Type":"ContainerDied","Data":"9253b728d070407da704ddb36100f73edfee477a4c1d2921bfd0b56950cd1193"} Dec 08 09:18:39 crc kubenswrapper[4776]: I1208 09:18:39.391927 4776 generic.go:334] "Generic (PLEG): container finished" podID="12fb1453-ed3a-4c22-b33b-8c8e5402de93" containerID="6a7b1fab34cd1da452151834ba4c2a9e7b0ba374100001b13e55acf95a8a8c87" 
exitCode=0 Dec 08 09:18:39 crc kubenswrapper[4776]: I1208 09:18:39.392041 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ph98m" event={"ID":"12fb1453-ed3a-4c22-b33b-8c8e5402de93","Type":"ContainerDied","Data":"6a7b1fab34cd1da452151834ba4c2a9e7b0ba374100001b13e55acf95a8a8c87"} Dec 08 09:18:40 crc kubenswrapper[4776]: I1208 09:18:40.405667 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ph98m" event={"ID":"12fb1453-ed3a-4c22-b33b-8c8e5402de93","Type":"ContainerStarted","Data":"7f4e23da126ae1743bf7d81e5b55770dfe9a29e8cbd0aab4ee764ca2a7b9a7ef"} Dec 08 09:18:40 crc kubenswrapper[4776]: I1208 09:18:40.406034 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ph98m" event={"ID":"12fb1453-ed3a-4c22-b33b-8c8e5402de93","Type":"ContainerStarted","Data":"68228c3b9d3cccd2ec3a10f8608d8ef171893ae2169221e54591df74ff5c2ce6"} Dec 08 09:18:40 crc kubenswrapper[4776]: I1208 09:18:40.406047 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ph98m" event={"ID":"12fb1453-ed3a-4c22-b33b-8c8e5402de93","Type":"ContainerStarted","Data":"c1540a356b19275655b4d74d72a2ec2df7e9663579b9d454dd99615ed4d7cc0b"} Dec 08 09:18:40 crc kubenswrapper[4776]: I1208 09:18:40.406056 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ph98m" event={"ID":"12fb1453-ed3a-4c22-b33b-8c8e5402de93","Type":"ContainerStarted","Data":"e85609e15017c8d03a45ba73a78db9f245c286934cef05a5a6f4776cac9d7a1e"} Dec 08 09:18:40 crc kubenswrapper[4776]: I1208 09:18:40.406068 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ph98m" event={"ID":"12fb1453-ed3a-4c22-b33b-8c8e5402de93","Type":"ContainerStarted","Data":"bc847521fc500aa95a153c95c54d9b119b8206626082f24b12b7a74f1b09a74b"} Dec 08 09:18:40 crc kubenswrapper[4776]: I1208 09:18:40.477819 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/speaker-gt7wq" Dec 08 09:18:41 crc kubenswrapper[4776]: I1208 09:18:41.418136 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ph98m" event={"ID":"12fb1453-ed3a-4c22-b33b-8c8e5402de93","Type":"ContainerStarted","Data":"41366494e10f31ccbf2aa771bfe95f68e2eacfcbb2168d48595c86ab991d4887"} Dec 08 09:18:41 crc kubenswrapper[4776]: I1208 09:18:41.418671 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-ph98m" Dec 08 09:18:41 crc kubenswrapper[4776]: I1208 09:18:41.441158 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-ph98m" podStartSLOduration=5.932309078 podStartE2EDuration="13.441139395s" podCreationTimestamp="2025-12-08 09:18:28 +0000 UTC" firstStartedPulling="2025-12-08 09:18:29.057824901 +0000 UTC m=+1185.321049923" lastFinishedPulling="2025-12-08 09:18:36.566655218 +0000 UTC m=+1192.829880240" observedRunningTime="2025-12-08 09:18:41.437206678 +0000 UTC m=+1197.700431710" watchObservedRunningTime="2025-12-08 09:18:41.441139395 +0000 UTC m=+1197.704364417" Dec 08 09:18:43 crc kubenswrapper[4776]: I1208 09:18:43.409317 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-w45br"] Dec 08 09:18:43 crc kubenswrapper[4776]: I1208 09:18:43.410735 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-w45br" Dec 08 09:18:43 crc kubenswrapper[4776]: I1208 09:18:43.413576 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-5b25x" Dec 08 09:18:43 crc kubenswrapper[4776]: I1208 09:18:43.413906 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 08 09:18:43 crc kubenswrapper[4776]: I1208 09:18:43.414068 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 08 09:18:43 crc kubenswrapper[4776]: I1208 09:18:43.436402 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-w45br"] Dec 08 09:18:43 crc kubenswrapper[4776]: I1208 09:18:43.496613 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmnlm\" (UniqueName: \"kubernetes.io/projected/c0ba1530-87bd-47d4-8fdf-6aafed54d72d-kube-api-access-dmnlm\") pod \"openstack-operator-index-w45br\" (UID: \"c0ba1530-87bd-47d4-8fdf-6aafed54d72d\") " pod="openstack-operators/openstack-operator-index-w45br" Dec 08 09:18:43 crc kubenswrapper[4776]: I1208 09:18:43.598533 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmnlm\" (UniqueName: \"kubernetes.io/projected/c0ba1530-87bd-47d4-8fdf-6aafed54d72d-kube-api-access-dmnlm\") pod \"openstack-operator-index-w45br\" (UID: \"c0ba1530-87bd-47d4-8fdf-6aafed54d72d\") " pod="openstack-operators/openstack-operator-index-w45br" Dec 08 09:18:43 crc kubenswrapper[4776]: I1208 09:18:43.625603 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmnlm\" (UniqueName: \"kubernetes.io/projected/c0ba1530-87bd-47d4-8fdf-6aafed54d72d-kube-api-access-dmnlm\") pod \"openstack-operator-index-w45br\" (UID: 
\"c0ba1530-87bd-47d4-8fdf-6aafed54d72d\") " pod="openstack-operators/openstack-operator-index-w45br" Dec 08 09:18:43 crc kubenswrapper[4776]: I1208 09:18:43.736900 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-w45br" Dec 08 09:18:43 crc kubenswrapper[4776]: I1208 09:18:43.908569 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-ph98m" Dec 08 09:18:43 crc kubenswrapper[4776]: I1208 09:18:43.963841 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-ph98m" Dec 08 09:18:44 crc kubenswrapper[4776]: I1208 09:18:44.298391 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-w45br"] Dec 08 09:18:44 crc kubenswrapper[4776]: I1208 09:18:44.444557 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-w45br" event={"ID":"c0ba1530-87bd-47d4-8fdf-6aafed54d72d","Type":"ContainerStarted","Data":"41e4d8e4256476bbb8099d64a98ceed08482d42bda7de9ebebb11a48b935ce4d"} Dec 08 09:18:45 crc kubenswrapper[4776]: I1208 09:18:45.804873 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-w45br"] Dec 08 09:18:46 crc kubenswrapper[4776]: I1208 09:18:46.399074 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-ndvrw"] Dec 08 09:18:46 crc kubenswrapper[4776]: I1208 09:18:46.400912 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-ndvrw" Dec 08 09:18:46 crc kubenswrapper[4776]: I1208 09:18:46.411590 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-ndvrw"] Dec 08 09:18:46 crc kubenswrapper[4776]: I1208 09:18:46.466130 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dh7v\" (UniqueName: \"kubernetes.io/projected/4bb0bbd1-4377-4f99-b0f3-e657e4c2a792-kube-api-access-8dh7v\") pod \"openstack-operator-index-ndvrw\" (UID: \"4bb0bbd1-4377-4f99-b0f3-e657e4c2a792\") " pod="openstack-operators/openstack-operator-index-ndvrw" Dec 08 09:18:46 crc kubenswrapper[4776]: I1208 09:18:46.568117 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dh7v\" (UniqueName: \"kubernetes.io/projected/4bb0bbd1-4377-4f99-b0f3-e657e4c2a792-kube-api-access-8dh7v\") pod \"openstack-operator-index-ndvrw\" (UID: \"4bb0bbd1-4377-4f99-b0f3-e657e4c2a792\") " pod="openstack-operators/openstack-operator-index-ndvrw" Dec 08 09:18:46 crc kubenswrapper[4776]: I1208 09:18:46.587562 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dh7v\" (UniqueName: \"kubernetes.io/projected/4bb0bbd1-4377-4f99-b0f3-e657e4c2a792-kube-api-access-8dh7v\") pod \"openstack-operator-index-ndvrw\" (UID: \"4bb0bbd1-4377-4f99-b0f3-e657e4c2a792\") " pod="openstack-operators/openstack-operator-index-ndvrw" Dec 08 09:18:46 crc kubenswrapper[4776]: I1208 09:18:46.722485 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-ndvrw" Dec 08 09:18:48 crc kubenswrapper[4776]: I1208 09:18:48.913165 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-c68cp" Dec 08 09:18:49 crc kubenswrapper[4776]: I1208 09:18:49.599041 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-gfgfc" Dec 08 09:18:50 crc kubenswrapper[4776]: I1208 09:18:50.491405 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-w45br" event={"ID":"c0ba1530-87bd-47d4-8fdf-6aafed54d72d","Type":"ContainerStarted","Data":"8e795b195cefd9c15b612021992c31d44000a6bcad65231067ba34aa2216dfd4"} Dec 08 09:18:50 crc kubenswrapper[4776]: I1208 09:18:50.491533 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-w45br" podUID="c0ba1530-87bd-47d4-8fdf-6aafed54d72d" containerName="registry-server" containerID="cri-o://8e795b195cefd9c15b612021992c31d44000a6bcad65231067ba34aa2216dfd4" gracePeriod=2 Dec 08 09:18:50 crc kubenswrapper[4776]: I1208 09:18:50.511750 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-w45br" podStartSLOduration=1.628778408 podStartE2EDuration="7.511731555s" podCreationTimestamp="2025-12-08 09:18:43 +0000 UTC" firstStartedPulling="2025-12-08 09:18:44.307089439 +0000 UTC m=+1200.570314481" lastFinishedPulling="2025-12-08 09:18:50.190042606 +0000 UTC m=+1206.453267628" observedRunningTime="2025-12-08 09:18:50.50670606 +0000 UTC m=+1206.769931102" watchObservedRunningTime="2025-12-08 09:18:50.511731555 +0000 UTC m=+1206.774956577" Dec 08 09:18:50 crc kubenswrapper[4776]: I1208 09:18:50.552081 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-ndvrw"] Dec 08 09:18:51 crc 
kubenswrapper[4776]: I1208 09:18:51.208962 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-w45br" Dec 08 09:18:51 crc kubenswrapper[4776]: I1208 09:18:51.351998 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmnlm\" (UniqueName: \"kubernetes.io/projected/c0ba1530-87bd-47d4-8fdf-6aafed54d72d-kube-api-access-dmnlm\") pod \"c0ba1530-87bd-47d4-8fdf-6aafed54d72d\" (UID: \"c0ba1530-87bd-47d4-8fdf-6aafed54d72d\") " Dec 08 09:18:51 crc kubenswrapper[4776]: I1208 09:18:51.357109 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0ba1530-87bd-47d4-8fdf-6aafed54d72d-kube-api-access-dmnlm" (OuterVolumeSpecName: "kube-api-access-dmnlm") pod "c0ba1530-87bd-47d4-8fdf-6aafed54d72d" (UID: "c0ba1530-87bd-47d4-8fdf-6aafed54d72d"). InnerVolumeSpecName "kube-api-access-dmnlm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:18:51 crc kubenswrapper[4776]: I1208 09:18:51.454765 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmnlm\" (UniqueName: \"kubernetes.io/projected/c0ba1530-87bd-47d4-8fdf-6aafed54d72d-kube-api-access-dmnlm\") on node \"crc\" DevicePath \"\"" Dec 08 09:18:51 crc kubenswrapper[4776]: I1208 09:18:51.501457 4776 generic.go:334] "Generic (PLEG): container finished" podID="c0ba1530-87bd-47d4-8fdf-6aafed54d72d" containerID="8e795b195cefd9c15b612021992c31d44000a6bcad65231067ba34aa2216dfd4" exitCode=0 Dec 08 09:18:51 crc kubenswrapper[4776]: I1208 09:18:51.501504 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-w45br" Dec 08 09:18:51 crc kubenswrapper[4776]: I1208 09:18:51.501547 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-w45br" event={"ID":"c0ba1530-87bd-47d4-8fdf-6aafed54d72d","Type":"ContainerDied","Data":"8e795b195cefd9c15b612021992c31d44000a6bcad65231067ba34aa2216dfd4"} Dec 08 09:18:51 crc kubenswrapper[4776]: I1208 09:18:51.501602 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-w45br" event={"ID":"c0ba1530-87bd-47d4-8fdf-6aafed54d72d","Type":"ContainerDied","Data":"41e4d8e4256476bbb8099d64a98ceed08482d42bda7de9ebebb11a48b935ce4d"} Dec 08 09:18:51 crc kubenswrapper[4776]: I1208 09:18:51.501623 4776 scope.go:117] "RemoveContainer" containerID="8e795b195cefd9c15b612021992c31d44000a6bcad65231067ba34aa2216dfd4" Dec 08 09:18:51 crc kubenswrapper[4776]: I1208 09:18:51.502736 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ndvrw" event={"ID":"4bb0bbd1-4377-4f99-b0f3-e657e4c2a792","Type":"ContainerStarted","Data":"9788b924ad1aa82d86fdb747e8c4d542feb0b82a7089dbed8cfc34d80bd3eb82"} Dec 08 09:18:51 crc kubenswrapper[4776]: I1208 09:18:51.526725 4776 scope.go:117] "RemoveContainer" containerID="8e795b195cefd9c15b612021992c31d44000a6bcad65231067ba34aa2216dfd4" Dec 08 09:18:51 crc kubenswrapper[4776]: E1208 09:18:51.527200 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e795b195cefd9c15b612021992c31d44000a6bcad65231067ba34aa2216dfd4\": container with ID starting with 8e795b195cefd9c15b612021992c31d44000a6bcad65231067ba34aa2216dfd4 not found: ID does not exist" containerID="8e795b195cefd9c15b612021992c31d44000a6bcad65231067ba34aa2216dfd4" Dec 08 09:18:51 crc kubenswrapper[4776]: I1208 09:18:51.527230 4776 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"8e795b195cefd9c15b612021992c31d44000a6bcad65231067ba34aa2216dfd4"} err="failed to get container status \"8e795b195cefd9c15b612021992c31d44000a6bcad65231067ba34aa2216dfd4\": rpc error: code = NotFound desc = could not find container \"8e795b195cefd9c15b612021992c31d44000a6bcad65231067ba34aa2216dfd4\": container with ID starting with 8e795b195cefd9c15b612021992c31d44000a6bcad65231067ba34aa2216dfd4 not found: ID does not exist" Dec 08 09:18:51 crc kubenswrapper[4776]: I1208 09:18:51.532613 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-w45br"] Dec 08 09:18:51 crc kubenswrapper[4776]: I1208 09:18:51.542634 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-w45br"] Dec 08 09:18:52 crc kubenswrapper[4776]: I1208 09:18:52.362756 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0ba1530-87bd-47d4-8fdf-6aafed54d72d" path="/var/lib/kubelet/pods/c0ba1530-87bd-47d4-8fdf-6aafed54d72d/volumes" Dec 08 09:18:53 crc kubenswrapper[4776]: I1208 09:18:53.519190 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ndvrw" event={"ID":"4bb0bbd1-4377-4f99-b0f3-e657e4c2a792","Type":"ContainerStarted","Data":"0df682d1c964edfec0081c456fdd736b39dd35b7844423436bb210e16f2c15f2"} Dec 08 09:18:53 crc kubenswrapper[4776]: I1208 09:18:53.538782 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-ndvrw" podStartSLOduration=7.481534796 podStartE2EDuration="7.538764396s" podCreationTimestamp="2025-12-08 09:18:46 +0000 UTC" firstStartedPulling="2025-12-08 09:18:50.602632823 +0000 UTC m=+1206.865857845" lastFinishedPulling="2025-12-08 09:18:50.659862413 +0000 UTC m=+1206.923087445" observedRunningTime="2025-12-08 09:18:53.537803571 +0000 UTC m=+1209.801028593" watchObservedRunningTime="2025-12-08 
09:18:53.538764396 +0000 UTC m=+1209.801989418" Dec 08 09:18:56 crc kubenswrapper[4776]: I1208 09:18:56.723381 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-ndvrw" Dec 08 09:18:56 crc kubenswrapper[4776]: I1208 09:18:56.724080 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-ndvrw" Dec 08 09:18:56 crc kubenswrapper[4776]: I1208 09:18:56.755961 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-ndvrw" Dec 08 09:18:57 crc kubenswrapper[4776]: I1208 09:18:57.568909 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-ndvrw" Dec 08 09:18:58 crc kubenswrapper[4776]: I1208 09:18:58.911638 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-ph98m" Dec 08 09:19:03 crc kubenswrapper[4776]: I1208 09:19:03.052347 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/16991a732252c99b57872cb3554920243a1d5011faf6df1b6d2249e50ajqd67"] Dec 08 09:19:03 crc kubenswrapper[4776]: E1208 09:19:03.053030 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0ba1530-87bd-47d4-8fdf-6aafed54d72d" containerName="registry-server" Dec 08 09:19:03 crc kubenswrapper[4776]: I1208 09:19:03.053042 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0ba1530-87bd-47d4-8fdf-6aafed54d72d" containerName="registry-server" Dec 08 09:19:03 crc kubenswrapper[4776]: I1208 09:19:03.053284 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0ba1530-87bd-47d4-8fdf-6aafed54d72d" containerName="registry-server" Dec 08 09:19:03 crc kubenswrapper[4776]: I1208 09:19:03.054482 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/16991a732252c99b57872cb3554920243a1d5011faf6df1b6d2249e50ajqd67" Dec 08 09:19:03 crc kubenswrapper[4776]: I1208 09:19:03.074343 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/16991a732252c99b57872cb3554920243a1d5011faf6df1b6d2249e50ajqd67"] Dec 08 09:19:03 crc kubenswrapper[4776]: I1208 09:19:03.079226 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-92x6m" Dec 08 09:19:03 crc kubenswrapper[4776]: I1208 09:19:03.182409 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d003bbac-1fa9-4696-aded-39e4b8d211ff-bundle\") pod \"16991a732252c99b57872cb3554920243a1d5011faf6df1b6d2249e50ajqd67\" (UID: \"d003bbac-1fa9-4696-aded-39e4b8d211ff\") " pod="openstack-operators/16991a732252c99b57872cb3554920243a1d5011faf6df1b6d2249e50ajqd67" Dec 08 09:19:03 crc kubenswrapper[4776]: I1208 09:19:03.182521 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d003bbac-1fa9-4696-aded-39e4b8d211ff-util\") pod \"16991a732252c99b57872cb3554920243a1d5011faf6df1b6d2249e50ajqd67\" (UID: \"d003bbac-1fa9-4696-aded-39e4b8d211ff\") " pod="openstack-operators/16991a732252c99b57872cb3554920243a1d5011faf6df1b6d2249e50ajqd67" Dec 08 09:19:03 crc kubenswrapper[4776]: I1208 09:19:03.182565 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwx4s\" (UniqueName: \"kubernetes.io/projected/d003bbac-1fa9-4696-aded-39e4b8d211ff-kube-api-access-cwx4s\") pod \"16991a732252c99b57872cb3554920243a1d5011faf6df1b6d2249e50ajqd67\" (UID: \"d003bbac-1fa9-4696-aded-39e4b8d211ff\") " pod="openstack-operators/16991a732252c99b57872cb3554920243a1d5011faf6df1b6d2249e50ajqd67" Dec 08 09:19:03 crc kubenswrapper[4776]: I1208 
09:19:03.284143 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwx4s\" (UniqueName: \"kubernetes.io/projected/d003bbac-1fa9-4696-aded-39e4b8d211ff-kube-api-access-cwx4s\") pod \"16991a732252c99b57872cb3554920243a1d5011faf6df1b6d2249e50ajqd67\" (UID: \"d003bbac-1fa9-4696-aded-39e4b8d211ff\") " pod="openstack-operators/16991a732252c99b57872cb3554920243a1d5011faf6df1b6d2249e50ajqd67" Dec 08 09:19:03 crc kubenswrapper[4776]: I1208 09:19:03.284227 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d003bbac-1fa9-4696-aded-39e4b8d211ff-bundle\") pod \"16991a732252c99b57872cb3554920243a1d5011faf6df1b6d2249e50ajqd67\" (UID: \"d003bbac-1fa9-4696-aded-39e4b8d211ff\") " pod="openstack-operators/16991a732252c99b57872cb3554920243a1d5011faf6df1b6d2249e50ajqd67" Dec 08 09:19:03 crc kubenswrapper[4776]: I1208 09:19:03.284299 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d003bbac-1fa9-4696-aded-39e4b8d211ff-util\") pod \"16991a732252c99b57872cb3554920243a1d5011faf6df1b6d2249e50ajqd67\" (UID: \"d003bbac-1fa9-4696-aded-39e4b8d211ff\") " pod="openstack-operators/16991a732252c99b57872cb3554920243a1d5011faf6df1b6d2249e50ajqd67" Dec 08 09:19:03 crc kubenswrapper[4776]: I1208 09:19:03.284794 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d003bbac-1fa9-4696-aded-39e4b8d211ff-bundle\") pod \"16991a732252c99b57872cb3554920243a1d5011faf6df1b6d2249e50ajqd67\" (UID: \"d003bbac-1fa9-4696-aded-39e4b8d211ff\") " pod="openstack-operators/16991a732252c99b57872cb3554920243a1d5011faf6df1b6d2249e50ajqd67" Dec 08 09:19:03 crc kubenswrapper[4776]: I1208 09:19:03.285614 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/d003bbac-1fa9-4696-aded-39e4b8d211ff-util\") pod \"16991a732252c99b57872cb3554920243a1d5011faf6df1b6d2249e50ajqd67\" (UID: \"d003bbac-1fa9-4696-aded-39e4b8d211ff\") " pod="openstack-operators/16991a732252c99b57872cb3554920243a1d5011faf6df1b6d2249e50ajqd67" Dec 08 09:19:03 crc kubenswrapper[4776]: I1208 09:19:03.307442 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwx4s\" (UniqueName: \"kubernetes.io/projected/d003bbac-1fa9-4696-aded-39e4b8d211ff-kube-api-access-cwx4s\") pod \"16991a732252c99b57872cb3554920243a1d5011faf6df1b6d2249e50ajqd67\" (UID: \"d003bbac-1fa9-4696-aded-39e4b8d211ff\") " pod="openstack-operators/16991a732252c99b57872cb3554920243a1d5011faf6df1b6d2249e50ajqd67" Dec 08 09:19:03 crc kubenswrapper[4776]: I1208 09:19:03.383838 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/16991a732252c99b57872cb3554920243a1d5011faf6df1b6d2249e50ajqd67" Dec 08 09:19:03 crc kubenswrapper[4776]: I1208 09:19:03.825814 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/16991a732252c99b57872cb3554920243a1d5011faf6df1b6d2249e50ajqd67"] Dec 08 09:19:03 crc kubenswrapper[4776]: W1208 09:19:03.832162 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd003bbac_1fa9_4696_aded_39e4b8d211ff.slice/crio-85177c464d26d4286f2ecfc3d6cd774a9aa3c2eef86e281cc320401701bf62bd WatchSource:0}: Error finding container 85177c464d26d4286f2ecfc3d6cd774a9aa3c2eef86e281cc320401701bf62bd: Status 404 returned error can't find the container with id 85177c464d26d4286f2ecfc3d6cd774a9aa3c2eef86e281cc320401701bf62bd Dec 08 09:19:04 crc kubenswrapper[4776]: I1208 09:19:04.598807 4776 generic.go:334] "Generic (PLEG): container finished" podID="d003bbac-1fa9-4696-aded-39e4b8d211ff" containerID="6f94c203eadf7eaec4c83b1b6c27833d8ba6c0103184666c7642f6daa0f0a3df" exitCode=0 Dec 08 
09:19:04 crc kubenswrapper[4776]: I1208 09:19:04.598864 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/16991a732252c99b57872cb3554920243a1d5011faf6df1b6d2249e50ajqd67" event={"ID":"d003bbac-1fa9-4696-aded-39e4b8d211ff","Type":"ContainerDied","Data":"6f94c203eadf7eaec4c83b1b6c27833d8ba6c0103184666c7642f6daa0f0a3df"} Dec 08 09:19:04 crc kubenswrapper[4776]: I1208 09:19:04.598904 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/16991a732252c99b57872cb3554920243a1d5011faf6df1b6d2249e50ajqd67" event={"ID":"d003bbac-1fa9-4696-aded-39e4b8d211ff","Type":"ContainerStarted","Data":"85177c464d26d4286f2ecfc3d6cd774a9aa3c2eef86e281cc320401701bf62bd"} Dec 08 09:19:06 crc kubenswrapper[4776]: I1208 09:19:06.621057 4776 generic.go:334] "Generic (PLEG): container finished" podID="d003bbac-1fa9-4696-aded-39e4b8d211ff" containerID="82594b619541cf475e6d3e4d089383d9512ef804918011ebd5196463c0a4a789" exitCode=0 Dec 08 09:19:06 crc kubenswrapper[4776]: I1208 09:19:06.621116 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/16991a732252c99b57872cb3554920243a1d5011faf6df1b6d2249e50ajqd67" event={"ID":"d003bbac-1fa9-4696-aded-39e4b8d211ff","Type":"ContainerDied","Data":"82594b619541cf475e6d3e4d089383d9512ef804918011ebd5196463c0a4a789"} Dec 08 09:19:07 crc kubenswrapper[4776]: I1208 09:19:07.631185 4776 generic.go:334] "Generic (PLEG): container finished" podID="d003bbac-1fa9-4696-aded-39e4b8d211ff" containerID="781bf8dccd4e0cdca42aa5cedc4f6b321a06ac8b94d8a22ef551baeb84277345" exitCode=0 Dec 08 09:19:07 crc kubenswrapper[4776]: I1208 09:19:07.631233 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/16991a732252c99b57872cb3554920243a1d5011faf6df1b6d2249e50ajqd67" event={"ID":"d003bbac-1fa9-4696-aded-39e4b8d211ff","Type":"ContainerDied","Data":"781bf8dccd4e0cdca42aa5cedc4f6b321a06ac8b94d8a22ef551baeb84277345"} Dec 08 09:19:08 crc kubenswrapper[4776]: I1208 09:19:08.971275 
4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/16991a732252c99b57872cb3554920243a1d5011faf6df1b6d2249e50ajqd67" Dec 08 09:19:09 crc kubenswrapper[4776]: I1208 09:19:09.084051 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d003bbac-1fa9-4696-aded-39e4b8d211ff-bundle\") pod \"d003bbac-1fa9-4696-aded-39e4b8d211ff\" (UID: \"d003bbac-1fa9-4696-aded-39e4b8d211ff\") " Dec 08 09:19:09 crc kubenswrapper[4776]: I1208 09:19:09.084118 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d003bbac-1fa9-4696-aded-39e4b8d211ff-util\") pod \"d003bbac-1fa9-4696-aded-39e4b8d211ff\" (UID: \"d003bbac-1fa9-4696-aded-39e4b8d211ff\") " Dec 08 09:19:09 crc kubenswrapper[4776]: I1208 09:19:09.084165 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwx4s\" (UniqueName: \"kubernetes.io/projected/d003bbac-1fa9-4696-aded-39e4b8d211ff-kube-api-access-cwx4s\") pod \"d003bbac-1fa9-4696-aded-39e4b8d211ff\" (UID: \"d003bbac-1fa9-4696-aded-39e4b8d211ff\") " Dec 08 09:19:09 crc kubenswrapper[4776]: I1208 09:19:09.085666 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d003bbac-1fa9-4696-aded-39e4b8d211ff-bundle" (OuterVolumeSpecName: "bundle") pod "d003bbac-1fa9-4696-aded-39e4b8d211ff" (UID: "d003bbac-1fa9-4696-aded-39e4b8d211ff"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:19:09 crc kubenswrapper[4776]: I1208 09:19:09.091015 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d003bbac-1fa9-4696-aded-39e4b8d211ff-kube-api-access-cwx4s" (OuterVolumeSpecName: "kube-api-access-cwx4s") pod "d003bbac-1fa9-4696-aded-39e4b8d211ff" (UID: "d003bbac-1fa9-4696-aded-39e4b8d211ff"). 
InnerVolumeSpecName "kube-api-access-cwx4s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:19:09 crc kubenswrapper[4776]: I1208 09:19:09.108633 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d003bbac-1fa9-4696-aded-39e4b8d211ff-util" (OuterVolumeSpecName: "util") pod "d003bbac-1fa9-4696-aded-39e4b8d211ff" (UID: "d003bbac-1fa9-4696-aded-39e4b8d211ff"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:19:09 crc kubenswrapper[4776]: I1208 09:19:09.186784 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwx4s\" (UniqueName: \"kubernetes.io/projected/d003bbac-1fa9-4696-aded-39e4b8d211ff-kube-api-access-cwx4s\") on node \"crc\" DevicePath \"\"" Dec 08 09:19:09 crc kubenswrapper[4776]: I1208 09:19:09.186855 4776 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d003bbac-1fa9-4696-aded-39e4b8d211ff-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:19:09 crc kubenswrapper[4776]: I1208 09:19:09.186865 4776 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d003bbac-1fa9-4696-aded-39e4b8d211ff-util\") on node \"crc\" DevicePath \"\"" Dec 08 09:19:09 crc kubenswrapper[4776]: I1208 09:19:09.663821 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/16991a732252c99b57872cb3554920243a1d5011faf6df1b6d2249e50ajqd67" event={"ID":"d003bbac-1fa9-4696-aded-39e4b8d211ff","Type":"ContainerDied","Data":"85177c464d26d4286f2ecfc3d6cd774a9aa3c2eef86e281cc320401701bf62bd"} Dec 08 09:19:09 crc kubenswrapper[4776]: I1208 09:19:09.663903 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85177c464d26d4286f2ecfc3d6cd774a9aa3c2eef86e281cc320401701bf62bd" Dec 08 09:19:09 crc kubenswrapper[4776]: I1208 09:19:09.664057 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/16991a732252c99b57872cb3554920243a1d5011faf6df1b6d2249e50ajqd67" Dec 08 09:19:12 crc kubenswrapper[4776]: I1208 09:19:12.089431 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5546b8686f-m7kf9"] Dec 08 09:19:12 crc kubenswrapper[4776]: E1208 09:19:12.089709 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d003bbac-1fa9-4696-aded-39e4b8d211ff" containerName="extract" Dec 08 09:19:12 crc kubenswrapper[4776]: I1208 09:19:12.089720 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="d003bbac-1fa9-4696-aded-39e4b8d211ff" containerName="extract" Dec 08 09:19:12 crc kubenswrapper[4776]: E1208 09:19:12.089740 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d003bbac-1fa9-4696-aded-39e4b8d211ff" containerName="pull" Dec 08 09:19:12 crc kubenswrapper[4776]: I1208 09:19:12.089747 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="d003bbac-1fa9-4696-aded-39e4b8d211ff" containerName="pull" Dec 08 09:19:12 crc kubenswrapper[4776]: E1208 09:19:12.089759 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d003bbac-1fa9-4696-aded-39e4b8d211ff" containerName="util" Dec 08 09:19:12 crc kubenswrapper[4776]: I1208 09:19:12.089765 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="d003bbac-1fa9-4696-aded-39e4b8d211ff" containerName="util" Dec 08 09:19:12 crc kubenswrapper[4776]: I1208 09:19:12.089907 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="d003bbac-1fa9-4696-aded-39e4b8d211ff" containerName="extract" Dec 08 09:19:12 crc kubenswrapper[4776]: I1208 09:19:12.090456 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-5546b8686f-m7kf9" Dec 08 09:19:12 crc kubenswrapper[4776]: I1208 09:19:12.092726 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-59zm9" Dec 08 09:19:12 crc kubenswrapper[4776]: I1208 09:19:12.172294 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5546b8686f-m7kf9"] Dec 08 09:19:12 crc kubenswrapper[4776]: I1208 09:19:12.246874 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6txgq\" (UniqueName: \"kubernetes.io/projected/90449ceb-bf22-41c3-a66a-3f01c6e46edc-kube-api-access-6txgq\") pod \"openstack-operator-controller-operator-5546b8686f-m7kf9\" (UID: \"90449ceb-bf22-41c3-a66a-3f01c6e46edc\") " pod="openstack-operators/openstack-operator-controller-operator-5546b8686f-m7kf9" Dec 08 09:19:12 crc kubenswrapper[4776]: I1208 09:19:12.347761 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6txgq\" (UniqueName: \"kubernetes.io/projected/90449ceb-bf22-41c3-a66a-3f01c6e46edc-kube-api-access-6txgq\") pod \"openstack-operator-controller-operator-5546b8686f-m7kf9\" (UID: \"90449ceb-bf22-41c3-a66a-3f01c6e46edc\") " pod="openstack-operators/openstack-operator-controller-operator-5546b8686f-m7kf9" Dec 08 09:19:12 crc kubenswrapper[4776]: I1208 09:19:12.368589 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6txgq\" (UniqueName: \"kubernetes.io/projected/90449ceb-bf22-41c3-a66a-3f01c6e46edc-kube-api-access-6txgq\") pod \"openstack-operator-controller-operator-5546b8686f-m7kf9\" (UID: \"90449ceb-bf22-41c3-a66a-3f01c6e46edc\") " pod="openstack-operators/openstack-operator-controller-operator-5546b8686f-m7kf9" Dec 08 09:19:12 crc kubenswrapper[4776]: I1208 09:19:12.412549 4776 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-5546b8686f-m7kf9" Dec 08 09:19:12 crc kubenswrapper[4776]: I1208 09:19:12.926214 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5546b8686f-m7kf9"] Dec 08 09:19:13 crc kubenswrapper[4776]: I1208 09:19:13.701184 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5546b8686f-m7kf9" event={"ID":"90449ceb-bf22-41c3-a66a-3f01c6e46edc","Type":"ContainerStarted","Data":"f50ed6f3a5afe837fb3f4a835eb908d1cb7bb77c155e5a1e33a912d1d4e22cf0"} Dec 08 09:19:17 crc kubenswrapper[4776]: I1208 09:19:17.735038 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5546b8686f-m7kf9" event={"ID":"90449ceb-bf22-41c3-a66a-3f01c6e46edc","Type":"ContainerStarted","Data":"6666f5259c9f4627dbaa47609d0fcc7f2b816ecc1c560bb43f46bdd81d06c67f"} Dec 08 09:19:17 crc kubenswrapper[4776]: I1208 09:19:17.735693 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-5546b8686f-m7kf9" Dec 08 09:19:17 crc kubenswrapper[4776]: I1208 09:19:17.763975 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-5546b8686f-m7kf9" podStartSLOduration=1.473061801 podStartE2EDuration="5.763958637s" podCreationTimestamp="2025-12-08 09:19:12 +0000 UTC" firstStartedPulling="2025-12-08 09:19:12.939230829 +0000 UTC m=+1229.202455851" lastFinishedPulling="2025-12-08 09:19:17.230127665 +0000 UTC m=+1233.493352687" observedRunningTime="2025-12-08 09:19:17.757668167 +0000 UTC m=+1234.020893189" watchObservedRunningTime="2025-12-08 09:19:17.763958637 +0000 UTC m=+1234.027183659" Dec 08 09:19:22 crc kubenswrapper[4776]: I1208 09:19:22.417092 
4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-5546b8686f-m7kf9" Dec 08 09:19:42 crc kubenswrapper[4776]: I1208 09:19:42.729451 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-rgfzz"] Dec 08 09:19:42 crc kubenswrapper[4776]: I1208 09:19:42.731689 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rgfzz" Dec 08 09:19:42 crc kubenswrapper[4776]: I1208 09:19:42.733812 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-vlcjl" Dec 08 09:19:42 crc kubenswrapper[4776]: I1208 09:19:42.739712 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6c677c69b-2g4ph"] Dec 08 09:19:42 crc kubenswrapper[4776]: I1208 09:19:42.741294 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-2g4ph" Dec 08 09:19:42 crc kubenswrapper[4776]: I1208 09:19:42.744542 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-9ph67" Dec 08 09:19:42 crc kubenswrapper[4776]: I1208 09:19:42.745441 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-rgfzz"] Dec 08 09:19:42 crc kubenswrapper[4776]: I1208 09:19:42.753142 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-697fb699cf-ftb4x"] Dec 08 09:19:42 crc kubenswrapper[4776]: I1208 09:19:42.756274 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-ftb4x" Dec 08 09:19:42 crc kubenswrapper[4776]: I1208 09:19:42.758366 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-ng7r6" Dec 08 09:19:42 crc kubenswrapper[4776]: I1208 09:19:42.769301 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6c677c69b-2g4ph"] Dec 08 09:19:42 crc kubenswrapper[4776]: I1208 09:19:42.787166 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-697fb699cf-ftb4x"] Dec 08 09:19:42 crc kubenswrapper[4776]: I1208 09:19:42.799473 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz9hh\" (UniqueName: \"kubernetes.io/projected/b39e8644-6fb7-4d7c-a623-c0eadac0e896-kube-api-access-xz9hh\") pod \"barbican-operator-controller-manager-7d9dfd778-rgfzz\" (UID: \"b39e8644-6fb7-4d7c-a623-c0eadac0e896\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rgfzz" Dec 08 09:19:42 crc kubenswrapper[4776]: I1208 09:19:42.799647 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8cpg\" (UniqueName: \"kubernetes.io/projected/ad1d3b70-6eea-46a4-bdc1-82144fe12f4a-kube-api-access-b8cpg\") pod \"cinder-operator-controller-manager-6c677c69b-2g4ph\" (UID: \"ad1d3b70-6eea-46a4-bdc1-82144fe12f4a\") " pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-2g4ph" Dec 08 09:19:42 crc kubenswrapper[4776]: I1208 09:19:42.799803 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jp28j\" (UniqueName: \"kubernetes.io/projected/316c9728-ccef-4981-9903-895ab86e6616-kube-api-access-jp28j\") pod 
\"designate-operator-controller-manager-697fb699cf-ftb4x\" (UID: \"316c9728-ccef-4981-9903-895ab86e6616\") " pod="openstack-operators/designate-operator-controller-manager-697fb699cf-ftb4x"
Dec 08 09:19:42 crc kubenswrapper[4776]: I1208 09:19:42.803776 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5697bb5779-897nd"]
Dec 08 09:19:42 crc kubenswrapper[4776]: I1208 09:19:42.805500 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-897nd"
Dec 08 09:19:42 crc kubenswrapper[4776]: I1208 09:19:42.807653 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-6gddd"
Dec 08 09:19:42 crc kubenswrapper[4776]: I1208 09:19:42.828148 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5697bb5779-897nd"]
Dec 08 09:19:42 crc kubenswrapper[4776]: I1208 09:19:42.853233 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-f2bnk"]
Dec 08 09:19:42 crc kubenswrapper[4776]: I1208 09:19:42.854657 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-f2bnk"
Dec 08 09:19:42 crc kubenswrapper[4776]: I1208 09:19:42.859518 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-mg97g"
Dec 08 09:19:42 crc kubenswrapper[4776]: I1208 09:19:42.859647 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-4k8qf"]
Dec 08 09:19:42 crc kubenswrapper[4776]: I1208 09:19:42.861564 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-4k8qf"
Dec 08 09:19:42 crc kubenswrapper[4776]: I1208 09:19:42.868067 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-g5fmn"
Dec 08 09:19:42 crc kubenswrapper[4776]: I1208 09:19:42.892261 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-4k8qf"]
Dec 08 09:19:42 crc kubenswrapper[4776]: I1208 09:19:42.901996 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz9hh\" (UniqueName: \"kubernetes.io/projected/b39e8644-6fb7-4d7c-a623-c0eadac0e896-kube-api-access-xz9hh\") pod \"barbican-operator-controller-manager-7d9dfd778-rgfzz\" (UID: \"b39e8644-6fb7-4d7c-a623-c0eadac0e896\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rgfzz"
Dec 08 09:19:42 crc kubenswrapper[4776]: I1208 09:19:42.902252 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8cpg\" (UniqueName: \"kubernetes.io/projected/ad1d3b70-6eea-46a4-bdc1-82144fe12f4a-kube-api-access-b8cpg\") pod \"cinder-operator-controller-manager-6c677c69b-2g4ph\" (UID: \"ad1d3b70-6eea-46a4-bdc1-82144fe12f4a\") " pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-2g4ph"
Dec 08 09:19:42 crc kubenswrapper[4776]: I1208 09:19:42.902408 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jp28j\" (UniqueName: \"kubernetes.io/projected/316c9728-ccef-4981-9903-895ab86e6616-kube-api-access-jp28j\") pod \"designate-operator-controller-manager-697fb699cf-ftb4x\" (UID: \"316c9728-ccef-4981-9903-895ab86e6616\") " pod="openstack-operators/designate-operator-controller-manager-697fb699cf-ftb4x"
Dec 08 09:19:42 crc kubenswrapper[4776]: I1208 09:19:42.902594 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt6ld\" (UniqueName: \"kubernetes.io/projected/bb123983-a71d-4eca-84e8-6c116cc9b3b6-kube-api-access-zt6ld\") pod \"glance-operator-controller-manager-5697bb5779-897nd\" (UID: \"bb123983-a71d-4eca-84e8-6c116cc9b3b6\") " pod="openstack-operators/glance-operator-controller-manager-5697bb5779-897nd"
Dec 08 09:19:42 crc kubenswrapper[4776]: I1208 09:19:42.902752 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffgsq\" (UniqueName: \"kubernetes.io/projected/beadb3ee-3cd9-4c83-ba1f-9f599cd24940-kube-api-access-ffgsq\") pod \"horizon-operator-controller-manager-68c6d99b8f-4k8qf\" (UID: \"beadb3ee-3cd9-4c83-ba1f-9f599cd24940\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-4k8qf"
Dec 08 09:19:42 crc kubenswrapper[4776]: I1208 09:19:42.902918 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52k74\" (UniqueName: \"kubernetes.io/projected/f85d592d-d82d-4c08-aafb-e9a7e68ef386-kube-api-access-52k74\") pod \"heat-operator-controller-manager-5f64f6f8bb-f2bnk\" (UID: \"f85d592d-d82d-4c08-aafb-e9a7e68ef386\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-f2bnk"
Dec 08 09:19:42 crc kubenswrapper[4776]: I1208 09:19:42.908298 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-78d48bff9d-87pfw"]
Dec 08 09:19:42 crc kubenswrapper[4776]: I1208 09:19:42.968056 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jp28j\" (UniqueName: \"kubernetes.io/projected/316c9728-ccef-4981-9903-895ab86e6616-kube-api-access-jp28j\") pod \"designate-operator-controller-manager-697fb699cf-ftb4x\" (UID: \"316c9728-ccef-4981-9903-895ab86e6616\") " pod="openstack-operators/designate-operator-controller-manager-697fb699cf-ftb4x"
Dec 08 09:19:42 crc kubenswrapper[4776]: I1208 09:19:42.983056 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz9hh\" (UniqueName: \"kubernetes.io/projected/b39e8644-6fb7-4d7c-a623-c0eadac0e896-kube-api-access-xz9hh\") pod \"barbican-operator-controller-manager-7d9dfd778-rgfzz\" (UID: \"b39e8644-6fb7-4d7c-a623-c0eadac0e896\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rgfzz"
Dec 08 09:19:42 crc kubenswrapper[4776]: I1208 09:19:42.988726 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8cpg\" (UniqueName: \"kubernetes.io/projected/ad1d3b70-6eea-46a4-bdc1-82144fe12f4a-kube-api-access-b8cpg\") pod \"cinder-operator-controller-manager-6c677c69b-2g4ph\" (UID: \"ad1d3b70-6eea-46a4-bdc1-82144fe12f4a\") " pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-2g4ph"
Dec 08 09:19:42 crc kubenswrapper[4776]: I1208 09:19:42.996522 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-f2bnk"]
Dec 08 09:19:42 crc kubenswrapper[4776]: I1208 09:19:42.996644 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-87pfw"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.002632 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.005244 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52k74\" (UniqueName: \"kubernetes.io/projected/f85d592d-d82d-4c08-aafb-e9a7e68ef386-kube-api-access-52k74\") pod \"heat-operator-controller-manager-5f64f6f8bb-f2bnk\" (UID: \"f85d592d-d82d-4c08-aafb-e9a7e68ef386\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-f2bnk"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.005326 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lz88\" (UniqueName: \"kubernetes.io/projected/dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0-kube-api-access-5lz88\") pod \"infra-operator-controller-manager-78d48bff9d-87pfw\" (UID: \"dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-87pfw"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.005389 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zt6ld\" (UniqueName: \"kubernetes.io/projected/bb123983-a71d-4eca-84e8-6c116cc9b3b6-kube-api-access-zt6ld\") pod \"glance-operator-controller-manager-5697bb5779-897nd\" (UID: \"bb123983-a71d-4eca-84e8-6c116cc9b3b6\") " pod="openstack-operators/glance-operator-controller-manager-5697bb5779-897nd"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.005434 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0-cert\") pod \"infra-operator-controller-manager-78d48bff9d-87pfw\" (UID: \"dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-87pfw"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.005260 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-5tn42"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.005586 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffgsq\" (UniqueName: \"kubernetes.io/projected/beadb3ee-3cd9-4c83-ba1f-9f599cd24940-kube-api-access-ffgsq\") pod \"horizon-operator-controller-manager-68c6d99b8f-4k8qf\" (UID: \"beadb3ee-3cd9-4c83-ba1f-9f599cd24940\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-4k8qf"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.051821 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-967d97867-4dj2x"]
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.054227 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-967d97867-4dj2x"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.058280 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffgsq\" (UniqueName: \"kubernetes.io/projected/beadb3ee-3cd9-4c83-ba1f-9f599cd24940-kube-api-access-ffgsq\") pod \"horizon-operator-controller-manager-68c6d99b8f-4k8qf\" (UID: \"beadb3ee-3cd9-4c83-ba1f-9f599cd24940\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-4k8qf"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.058774 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt6ld\" (UniqueName: \"kubernetes.io/projected/bb123983-a71d-4eca-84e8-6c116cc9b3b6-kube-api-access-zt6ld\") pod \"glance-operator-controller-manager-5697bb5779-897nd\" (UID: \"bb123983-a71d-4eca-84e8-6c116cc9b3b6\") " pod="openstack-operators/glance-operator-controller-manager-5697bb5779-897nd"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.058782 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-nhscq"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.059521 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rgfzz"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.069129 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-78d48bff9d-87pfw"]
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.075520 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52k74\" (UniqueName: \"kubernetes.io/projected/f85d592d-d82d-4c08-aafb-e9a7e68ef386-kube-api-access-52k74\") pod \"heat-operator-controller-manager-5f64f6f8bb-f2bnk\" (UID: \"f85d592d-d82d-4c08-aafb-e9a7e68ef386\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-f2bnk"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.075552 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-2g4ph"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.089967 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-ftb4x"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.112099 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lz88\" (UniqueName: \"kubernetes.io/projected/dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0-kube-api-access-5lz88\") pod \"infra-operator-controller-manager-78d48bff9d-87pfw\" (UID: \"dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-87pfw"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.112154 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvcjs\" (UniqueName: \"kubernetes.io/projected/422088d1-15c7-4791-b0c9-a12a2c5e2880-kube-api-access-zvcjs\") pod \"ironic-operator-controller-manager-967d97867-4dj2x\" (UID: \"422088d1-15c7-4791-b0c9-a12a2c5e2880\") " pod="openstack-operators/ironic-operator-controller-manager-967d97867-4dj2x"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.112254 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0-cert\") pod \"infra-operator-controller-manager-78d48bff9d-87pfw\" (UID: \"dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-87pfw"
Dec 08 09:19:43 crc kubenswrapper[4776]: E1208 09:19:43.112397 4776 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Dec 08 09:19:43 crc kubenswrapper[4776]: E1208 09:19:43.112446 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0-cert podName:dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0 nodeName:}" failed. No retries permitted until 2025-12-08 09:19:43.612429249 +0000 UTC m=+1259.875654271 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0-cert") pod "infra-operator-controller-manager-78d48bff9d-87pfw" (UID: "dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0") : secret "infra-operator-webhook-server-cert" not found
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.132361 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-897nd"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.134773 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lz88\" (UniqueName: \"kubernetes.io/projected/dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0-kube-api-access-5lz88\") pod \"infra-operator-controller-manager-78d48bff9d-87pfw\" (UID: \"dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-87pfw"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.146361 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-967d97867-4dj2x"]
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.161003 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-7smkr"]
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.162905 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-7smkr"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.165235 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-bh6cj"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.169397 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-5b5fd79c9c-g66m2"]
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.172572 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-g66m2"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.174560 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-8qnpv"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.181116 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-f2bnk"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.182460 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-7smkr"]
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.194543 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-79c8c4686c-jgmdb"]
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.195842 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-jgmdb"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.197855 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-4k8qf"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.200520 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-b6qbm"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.203224 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5b5fd79c9c-g66m2"]
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.222014 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7wc2\" (UniqueName: \"kubernetes.io/projected/2a4ffe83-5f4d-4a7a-a2b6-64d12bd8f3f9-kube-api-access-r7wc2\") pod \"mariadb-operator-controller-manager-79c8c4686c-jgmdb\" (UID: \"2a4ffe83-5f4d-4a7a-a2b6-64d12bd8f3f9\") " pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-jgmdb"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.222311 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slvdw\" (UniqueName: \"kubernetes.io/projected/0f590af7-17bd-46c4-8a25-ba3a368c6382-kube-api-access-slvdw\") pod \"keystone-operator-controller-manager-7765d96ddf-7smkr\" (UID: \"0f590af7-17bd-46c4-8a25-ba3a368c6382\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-7smkr"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.222367 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9wpw\" (UniqueName: \"kubernetes.io/projected/ff110975-7e1d-4d6d-bd10-b666cd8fe98b-kube-api-access-l9wpw\") pod \"manila-operator-controller-manager-5b5fd79c9c-g66m2\" (UID: \"ff110975-7e1d-4d6d-bd10-b666cd8fe98b\") " pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-g66m2"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.222504 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvcjs\" (UniqueName: \"kubernetes.io/projected/422088d1-15c7-4791-b0c9-a12a2c5e2880-kube-api-access-zvcjs\") pod \"ironic-operator-controller-manager-967d97867-4dj2x\" (UID: \"422088d1-15c7-4791-b0c9-a12a2c5e2880\") " pod="openstack-operators/ironic-operator-controller-manager-967d97867-4dj2x"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.222918 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dqgnv"]
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.224246 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dqgnv"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.226342 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-vzxpd"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.248579 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-k928c"]
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.252692 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-k928c"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.255027 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-cjd52"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.261120 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvcjs\" (UniqueName: \"kubernetes.io/projected/422088d1-15c7-4791-b0c9-a12a2c5e2880-kube-api-access-zvcjs\") pod \"ironic-operator-controller-manager-967d97867-4dj2x\" (UID: \"422088d1-15c7-4791-b0c9-a12a2c5e2880\") " pod="openstack-operators/ironic-operator-controller-manager-967d97867-4dj2x"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.267105 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-79c8c4686c-jgmdb"]
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.277421 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-l979f"]
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.279506 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-l979f"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.284420 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-84pll"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.294458 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dqgnv"]
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.310591 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-k928c"]
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.318217 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f7smn5"]
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.324650 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f7smn5"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.337810 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.337982 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-fbxmv"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.346444 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs8rv\" (UniqueName: \"kubernetes.io/projected/288a9127-92ed-4b19-8cc5-34b1f9b51201-kube-api-access-zs8rv\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-dqgnv\" (UID: \"288a9127-92ed-4b19-8cc5-34b1f9b51201\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dqgnv"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.346541 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdwrg\" (UniqueName: \"kubernetes.io/projected/545c7a23-3539-4923-bd9e-8d64700070b5-kube-api-access-vdwrg\") pod \"nova-operator-controller-manager-697bc559fc-k928c\" (UID: \"545c7a23-3539-4923-bd9e-8d64700070b5\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-k928c"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.346688 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9wpw\" (UniqueName: \"kubernetes.io/projected/ff110975-7e1d-4d6d-bd10-b666cd8fe98b-kube-api-access-l9wpw\") pod \"manila-operator-controller-manager-5b5fd79c9c-g66m2\" (UID: \"ff110975-7e1d-4d6d-bd10-b666cd8fe98b\") " pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-g66m2"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.346970 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7wc2\" (UniqueName: \"kubernetes.io/projected/2a4ffe83-5f4d-4a7a-a2b6-64d12bd8f3f9-kube-api-access-r7wc2\") pod \"mariadb-operator-controller-manager-79c8c4686c-jgmdb\" (UID: \"2a4ffe83-5f4d-4a7a-a2b6-64d12bd8f3f9\") " pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-jgmdb"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.347066 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slvdw\" (UniqueName: \"kubernetes.io/projected/0f590af7-17bd-46c4-8a25-ba3a368c6382-kube-api-access-slvdw\") pod \"keystone-operator-controller-manager-7765d96ddf-7smkr\" (UID: \"0f590af7-17bd-46c4-8a25-ba3a368c6382\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-7smkr"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.347113 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnwqj\" (UniqueName: \"kubernetes.io/projected/d07c95ca-1871-4ba3-81e5-c7b4d86bb0f4-kube-api-access-xnwqj\") pod \"octavia-operator-controller-manager-998648c74-l979f\" (UID: \"d07c95ca-1871-4ba3-81e5-c7b4d86bb0f4\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-l979f"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.353491 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-gk9xw"]
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.361795 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-gk9xw"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.368757 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-pqjsl"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.379132 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9wpw\" (UniqueName: \"kubernetes.io/projected/ff110975-7e1d-4d6d-bd10-b666cd8fe98b-kube-api-access-l9wpw\") pod \"manila-operator-controller-manager-5b5fd79c9c-g66m2\" (UID: \"ff110975-7e1d-4d6d-bd10-b666cd8fe98b\") " pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-g66m2"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.381014 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-l979f"]
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.383917 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7wc2\" (UniqueName: \"kubernetes.io/projected/2a4ffe83-5f4d-4a7a-a2b6-64d12bd8f3f9-kube-api-access-r7wc2\") pod \"mariadb-operator-controller-manager-79c8c4686c-jgmdb\" (UID: \"2a4ffe83-5f4d-4a7a-a2b6-64d12bd8f3f9\") " pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-jgmdb"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.390047 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slvdw\" (UniqueName: \"kubernetes.io/projected/0f590af7-17bd-46c4-8a25-ba3a368c6382-kube-api-access-slvdw\") pod \"keystone-operator-controller-manager-7765d96ddf-7smkr\" (UID: \"0f590af7-17bd-46c4-8a25-ba3a368c6382\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-7smkr"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.403885 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f7smn5"]
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.411642 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-gk9xw"]
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.419095 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-mdm5f"]
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.420877 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-mdm5f"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.430636 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-5k6tx"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.436851 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-mdm5f"]
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.446434 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-9d58d64bc-ncfrf"]
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.448399 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-ncfrf"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.449681 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnwqj\" (UniqueName: \"kubernetes.io/projected/d07c95ca-1871-4ba3-81e5-c7b4d86bb0f4-kube-api-access-xnwqj\") pod \"octavia-operator-controller-manager-998648c74-l979f\" (UID: \"d07c95ca-1871-4ba3-81e5-c7b4d86bb0f4\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-l979f"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.449719 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs8rv\" (UniqueName: \"kubernetes.io/projected/288a9127-92ed-4b19-8cc5-34b1f9b51201-kube-api-access-zs8rv\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-dqgnv\" (UID: \"288a9127-92ed-4b19-8cc5-34b1f9b51201\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dqgnv"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.449748 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdwrg\" (UniqueName: \"kubernetes.io/projected/545c7a23-3539-4923-bd9e-8d64700070b5-kube-api-access-vdwrg\") pod \"nova-operator-controller-manager-697bc559fc-k928c\" (UID: \"545c7a23-3539-4923-bd9e-8d64700070b5\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-k928c"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.449772 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0cb0505b-eb0f-4801-841d-8a96fe29e608-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879f7smn5\" (UID: \"0cb0505b-eb0f-4801-841d-8a96fe29e608\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f7smn5"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.449867 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdn87\" (UniqueName: \"kubernetes.io/projected/8cd2dc5d-1433-4660-9d65-bf49d398415f-kube-api-access-vdn87\") pod \"ovn-operator-controller-manager-b6456fdb6-gk9xw\" (UID: \"8cd2dc5d-1433-4660-9d65-bf49d398415f\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-gk9xw"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.449916 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5hxx\" (UniqueName: \"kubernetes.io/projected/0cb0505b-eb0f-4801-841d-8a96fe29e608-kube-api-access-z5hxx\") pod \"openstack-baremetal-operator-controller-manager-84b575879f7smn5\" (UID: \"0cb0505b-eb0f-4801-841d-8a96fe29e608\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f7smn5"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.452050 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-btc7l"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.455538 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-967d97867-4dj2x"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.461274 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9d58d64bc-ncfrf"]
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.472519 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdwrg\" (UniqueName: \"kubernetes.io/projected/545c7a23-3539-4923-bd9e-8d64700070b5-kube-api-access-vdwrg\") pod \"nova-operator-controller-manager-697bc559fc-k928c\" (UID: \"545c7a23-3539-4923-bd9e-8d64700070b5\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-k928c"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.479941 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-68f9cdc5f7-scgrq"]
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.481785 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-68f9cdc5f7-scgrq"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.488297 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-9c95p"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.492064 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-7smkr"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.495787 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnwqj\" (UniqueName: \"kubernetes.io/projected/d07c95ca-1871-4ba3-81e5-c7b4d86bb0f4-kube-api-access-xnwqj\") pod \"octavia-operator-controller-manager-998648c74-l979f\" (UID: \"d07c95ca-1871-4ba3-81e5-c7b4d86bb0f4\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-l979f"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.495927 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-68f9cdc5f7-scgrq"]
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.497809 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs8rv\" (UniqueName: \"kubernetes.io/projected/288a9127-92ed-4b19-8cc5-34b1f9b51201-kube-api-access-zs8rv\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-dqgnv\" (UID: \"288a9127-92ed-4b19-8cc5-34b1f9b51201\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dqgnv"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.531190 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-g66m2"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.557385 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krpk2\" (UniqueName: \"kubernetes.io/projected/7134ec23-7ec3-454d-b837-29fbe7094067-kube-api-access-krpk2\") pod \"telemetry-operator-controller-manager-68f9cdc5f7-scgrq\" (UID: \"7134ec23-7ec3-454d-b837-29fbe7094067\") " pod="openstack-operators/telemetry-operator-controller-manager-68f9cdc5f7-scgrq"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.557572 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdn87\" (UniqueName: \"kubernetes.io/projected/8cd2dc5d-1433-4660-9d65-bf49d398415f-kube-api-access-vdn87\") pod \"ovn-operator-controller-manager-b6456fdb6-gk9xw\" (UID: \"8cd2dc5d-1433-4660-9d65-bf49d398415f\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-gk9xw"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.557630 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqnhr\" (UniqueName: \"kubernetes.io/projected/482e5641-8a00-4fc3-b7d3-6eb88dbee1e4-kube-api-access-tqnhr\") pod \"placement-operator-controller-manager-78f8948974-mdm5f\" (UID: \"482e5641-8a00-4fc3-b7d3-6eb88dbee1e4\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-mdm5f"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.557681 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5hxx\" (UniqueName: \"kubernetes.io/projected/0cb0505b-eb0f-4801-841d-8a96fe29e608-kube-api-access-z5hxx\") pod \"openstack-baremetal-operator-controller-manager-84b575879f7smn5\" (UID: \"0cb0505b-eb0f-4801-841d-8a96fe29e608\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f7smn5"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.557788 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0cb0505b-eb0f-4801-841d-8a96fe29e608-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879f7smn5\" (UID: \"0cb0505b-eb0f-4801-841d-8a96fe29e608\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f7smn5"
Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.557852 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl7lr\" (UniqueName: \"kubernetes.io/projected/6ea3ffdd-a922-487e-a738-da3091a1656e-kube-api-access-nl7lr\") pod \"swift-operator-controller-manager-9d58d64bc-ncfrf\" (UID: \"6ea3ffdd-a922-487e-a738-da3091a1656e\") " pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-ncfrf"
Dec 08 09:19:43 crc kubenswrapper[4776]: E1208 09:19:43.559088 4776 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 08 09:19:43 crc kubenswrapper[4776]: E1208 09:19:43.559166 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0cb0505b-eb0f-4801-841d-8a96fe29e608-cert podName:0cb0505b-eb0f-4801-841d-8a96fe29e608 nodeName:}" failed. No retries permitted until 2025-12-08 09:19:44.059145049 +0000 UTC m=+1260.322370071 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0cb0505b-eb0f-4801-841d-8a96fe29e608-cert") pod "openstack-baremetal-operator-controller-manager-84b575879f7smn5" (UID: "0cb0505b-eb0f-4801-841d-8a96fe29e608") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.574483 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-jgmdb" Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.582910 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dqgnv" Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.585841 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5hxx\" (UniqueName: \"kubernetes.io/projected/0cb0505b-eb0f-4801-841d-8a96fe29e608-kube-api-access-z5hxx\") pod \"openstack-baremetal-operator-controller-manager-84b575879f7smn5\" (UID: \"0cb0505b-eb0f-4801-841d-8a96fe29e608\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f7smn5" Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.600031 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdn87\" (UniqueName: \"kubernetes.io/projected/8cd2dc5d-1433-4660-9d65-bf49d398415f-kube-api-access-vdn87\") pod \"ovn-operator-controller-manager-b6456fdb6-gk9xw\" (UID: \"8cd2dc5d-1433-4660-9d65-bf49d398415f\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-gk9xw" Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.608626 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-k928c" Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.617344 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-xtf2f"] Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.618934 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-xtf2f" Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.621768 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-g6h8f" Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.627910 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-xtf2f"] Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.653294 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-l979f" Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.659115 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99tqm\" (UniqueName: \"kubernetes.io/projected/c8f3f832-68f1-47a2-bb3d-5d67f54655ce-kube-api-access-99tqm\") pod \"test-operator-controller-manager-5854674fcc-xtf2f\" (UID: \"c8f3f832-68f1-47a2-bb3d-5d67f54655ce\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-xtf2f" Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.659263 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqnhr\" (UniqueName: \"kubernetes.io/projected/482e5641-8a00-4fc3-b7d3-6eb88dbee1e4-kube-api-access-tqnhr\") pod \"placement-operator-controller-manager-78f8948974-mdm5f\" (UID: \"482e5641-8a00-4fc3-b7d3-6eb88dbee1e4\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-mdm5f" Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.659311 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0-cert\") pod \"infra-operator-controller-manager-78d48bff9d-87pfw\" (UID: \"dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-87pfw" Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.659402 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nl7lr\" (UniqueName: \"kubernetes.io/projected/6ea3ffdd-a922-487e-a738-da3091a1656e-kube-api-access-nl7lr\") pod \"swift-operator-controller-manager-9d58d64bc-ncfrf\" (UID: \"6ea3ffdd-a922-487e-a738-da3091a1656e\") " pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-ncfrf" Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.659441 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krpk2\" (UniqueName: \"kubernetes.io/projected/7134ec23-7ec3-454d-b837-29fbe7094067-kube-api-access-krpk2\") pod \"telemetry-operator-controller-manager-68f9cdc5f7-scgrq\" (UID: \"7134ec23-7ec3-454d-b837-29fbe7094067\") " pod="openstack-operators/telemetry-operator-controller-manager-68f9cdc5f7-scgrq" Dec 08 09:19:43 crc kubenswrapper[4776]: E1208 09:19:43.659706 4776 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 08 09:19:43 crc kubenswrapper[4776]: E1208 09:19:43.659754 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0-cert podName:dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0 nodeName:}" failed. No retries permitted until 2025-12-08 09:19:44.659737876 +0000 UTC m=+1260.922962898 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0-cert") pod "infra-operator-controller-manager-78d48bff9d-87pfw" (UID: "dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0") : secret "infra-operator-webhook-server-cert" not found Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.691768 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl7lr\" (UniqueName: \"kubernetes.io/projected/6ea3ffdd-a922-487e-a738-da3091a1656e-kube-api-access-nl7lr\") pod \"swift-operator-controller-manager-9d58d64bc-ncfrf\" (UID: \"6ea3ffdd-a922-487e-a738-da3091a1656e\") " pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-ncfrf" Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.693429 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqnhr\" (UniqueName: \"kubernetes.io/projected/482e5641-8a00-4fc3-b7d3-6eb88dbee1e4-kube-api-access-tqnhr\") pod 
\"placement-operator-controller-manager-78f8948974-mdm5f\" (UID: \"482e5641-8a00-4fc3-b7d3-6eb88dbee1e4\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-mdm5f" Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.693649 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krpk2\" (UniqueName: \"kubernetes.io/projected/7134ec23-7ec3-454d-b837-29fbe7094067-kube-api-access-krpk2\") pod \"telemetry-operator-controller-manager-68f9cdc5f7-scgrq\" (UID: \"7134ec23-7ec3-454d-b837-29fbe7094067\") " pod="openstack-operators/telemetry-operator-controller-manager-68f9cdc5f7-scgrq" Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.724954 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-667bd8d554-kfz2m"] Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.726709 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-kfz2m" Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.729005 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-hnbbb" Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.736868 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-667bd8d554-kfz2m"] Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.761941 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99tqm\" (UniqueName: \"kubernetes.io/projected/c8f3f832-68f1-47a2-bb3d-5d67f54655ce-kube-api-access-99tqm\") pod \"test-operator-controller-manager-5854674fcc-xtf2f\" (UID: \"c8f3f832-68f1-47a2-bb3d-5d67f54655ce\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-xtf2f" Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.762111 4776 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvlbb\" (UniqueName: \"kubernetes.io/projected/61424c2d-bdc7-431a-8f12-535e1e97ce4b-kube-api-access-kvlbb\") pod \"watcher-operator-controller-manager-667bd8d554-kfz2m\" (UID: \"61424c2d-bdc7-431a-8f12-535e1e97ce4b\") " pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-kfz2m" Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.779733 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-57686cd5df-zt7pj"] Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.781232 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-57686cd5df-zt7pj" Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.783638 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.783643 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.783867 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-2jw97" Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.788046 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99tqm\" (UniqueName: \"kubernetes.io/projected/c8f3f832-68f1-47a2-bb3d-5d67f54655ce-kube-api-access-99tqm\") pod \"test-operator-controller-manager-5854674fcc-xtf2f\" (UID: \"c8f3f832-68f1-47a2-bb3d-5d67f54655ce\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-xtf2f" Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.796297 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/openstack-operator-controller-manager-57686cd5df-zt7pj"] Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.796920 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-gk9xw" Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.821414 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xxv7g"] Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.823959 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xxv7g" Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.830530 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-qzqqh" Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.836850 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-mdm5f" Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.839730 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xxv7g"] Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.864081 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmb2k\" (UniqueName: \"kubernetes.io/projected/d8a1143b-5dc6-4a99-a6e4-f155585ebbcb-kube-api-access-dmb2k\") pod \"rabbitmq-cluster-operator-manager-668c99d594-xxv7g\" (UID: \"d8a1143b-5dc6-4a99-a6e4-f155585ebbcb\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xxv7g" Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.864147 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvlbb\" (UniqueName: \"kubernetes.io/projected/61424c2d-bdc7-431a-8f12-535e1e97ce4b-kube-api-access-kvlbb\") pod \"watcher-operator-controller-manager-667bd8d554-kfz2m\" (UID: \"61424c2d-bdc7-431a-8f12-535e1e97ce4b\") " pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-kfz2m" Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.864337 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23-metrics-certs\") pod \"openstack-operator-controller-manager-57686cd5df-zt7pj\" (UID: \"e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23\") " pod="openstack-operators/openstack-operator-controller-manager-57686cd5df-zt7pj" Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.864655 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2dw5\" (UniqueName: \"kubernetes.io/projected/e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23-kube-api-access-j2dw5\") pod 
\"openstack-operator-controller-manager-57686cd5df-zt7pj\" (UID: \"e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23\") " pod="openstack-operators/openstack-operator-controller-manager-57686cd5df-zt7pj" Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.864863 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23-webhook-certs\") pod \"openstack-operator-controller-manager-57686cd5df-zt7pj\" (UID: \"e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23\") " pod="openstack-operators/openstack-operator-controller-manager-57686cd5df-zt7pj" Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.875624 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-ncfrf" Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.889533 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-68f9cdc5f7-scgrq" Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.895712 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-rgfzz"] Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.900513 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvlbb\" (UniqueName: \"kubernetes.io/projected/61424c2d-bdc7-431a-8f12-535e1e97ce4b-kube-api-access-kvlbb\") pod \"watcher-operator-controller-manager-667bd8d554-kfz2m\" (UID: \"61424c2d-bdc7-431a-8f12-535e1e97ce4b\") " pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-kfz2m" Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.969198 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-4k8qf"] Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 
09:19:43.970961 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-g6h8f" Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.977268 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2dw5\" (UniqueName: \"kubernetes.io/projected/e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23-kube-api-access-j2dw5\") pod \"openstack-operator-controller-manager-57686cd5df-zt7pj\" (UID: \"e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23\") " pod="openstack-operators/openstack-operator-controller-manager-57686cd5df-zt7pj" Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.977393 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23-webhook-certs\") pod \"openstack-operator-controller-manager-57686cd5df-zt7pj\" (UID: \"e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23\") " pod="openstack-operators/openstack-operator-controller-manager-57686cd5df-zt7pj" Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.977421 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmb2k\" (UniqueName: \"kubernetes.io/projected/d8a1143b-5dc6-4a99-a6e4-f155585ebbcb-kube-api-access-dmb2k\") pod \"rabbitmq-cluster-operator-manager-668c99d594-xxv7g\" (UID: \"d8a1143b-5dc6-4a99-a6e4-f155585ebbcb\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xxv7g" Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.977489 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23-metrics-certs\") pod \"openstack-operator-controller-manager-57686cd5df-zt7pj\" (UID: \"e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23\") " pod="openstack-operators/openstack-operator-controller-manager-57686cd5df-zt7pj" Dec 08 09:19:43 crc 
kubenswrapper[4776]: I1208 09:19:43.980746 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.980966 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.981581 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-xtf2f" Dec 08 09:19:43 crc kubenswrapper[4776]: W1208 09:19:43.986334 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbeadb3ee_3cd9_4c83_ba1f_9f599cd24940.slice/crio-ac8b1ff93a84691f16204d867573c129271f432f9f4fa5004568a6f3758c398b WatchSource:0}: Error finding container ac8b1ff93a84691f16204d867573c129271f432f9f4fa5004568a6f3758c398b: Status 404 returned error can't find the container with id ac8b1ff93a84691f16204d867573c129271f432f9f4fa5004568a6f3758c398b Dec 08 09:19:43 crc kubenswrapper[4776]: E1208 09:19:43.988388 4776 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 08 09:19:43 crc kubenswrapper[4776]: E1208 09:19:43.988544 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23-webhook-certs podName:e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23 nodeName:}" failed. No retries permitted until 2025-12-08 09:19:44.48847803 +0000 UTC m=+1260.751703052 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23-webhook-certs") pod "openstack-operator-controller-manager-57686cd5df-zt7pj" (UID: "e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23") : secret "webhook-server-cert" not found Dec 08 09:19:43 crc kubenswrapper[4776]: E1208 09:19:43.988644 4776 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.988718 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-697fb699cf-ftb4x"] Dec 08 09:19:43 crc kubenswrapper[4776]: E1208 09:19:43.988745 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23-metrics-certs podName:e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23 nodeName:}" failed. No retries permitted until 2025-12-08 09:19:44.488718586 +0000 UTC m=+1260.751943708 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23-metrics-certs") pod "openstack-operator-controller-manager-57686cd5df-zt7pj" (UID: "e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23") : secret "metrics-server-cert" not found Dec 08 09:19:43 crc kubenswrapper[4776]: I1208 09:19:43.996551 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2dw5\" (UniqueName: \"kubernetes.io/projected/e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23-kube-api-access-j2dw5\") pod \"openstack-operator-controller-manager-57686cd5df-zt7pj\" (UID: \"e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23\") " pod="openstack-operators/openstack-operator-controller-manager-57686cd5df-zt7pj" Dec 08 09:19:44 crc kubenswrapper[4776]: I1208 09:19:44.004281 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6c677c69b-2g4ph"] Dec 08 09:19:44 crc kubenswrapper[4776]: I1208 09:19:44.014551 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmb2k\" (UniqueName: \"kubernetes.io/projected/d8a1143b-5dc6-4a99-a6e4-f155585ebbcb-kube-api-access-dmb2k\") pod \"rabbitmq-cluster-operator-manager-668c99d594-xxv7g\" (UID: \"d8a1143b-5dc6-4a99-a6e4-f155585ebbcb\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xxv7g" Dec 08 09:19:44 crc kubenswrapper[4776]: I1208 09:19:44.029203 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-4k8qf" event={"ID":"beadb3ee-3cd9-4c83-ba1f-9f599cd24940","Type":"ContainerStarted","Data":"ac8b1ff93a84691f16204d867573c129271f432f9f4fa5004568a6f3758c398b"} Dec 08 09:19:44 crc kubenswrapper[4776]: I1208 09:19:44.030991 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rgfzz" 
event={"ID":"b39e8644-6fb7-4d7c-a623-c0eadac0e896","Type":"ContainerStarted","Data":"057a691b273f19abd4f5b5dbd3bdaf41ed702bf4c9e16917e883a82840cc17d0"} Dec 08 09:19:44 crc kubenswrapper[4776]: I1208 09:19:44.050865 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-hnbbb" Dec 08 09:19:44 crc kubenswrapper[4776]: I1208 09:19:44.059083 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-kfz2m" Dec 08 09:19:44 crc kubenswrapper[4776]: I1208 09:19:44.079490 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0cb0505b-eb0f-4801-841d-8a96fe29e608-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879f7smn5\" (UID: \"0cb0505b-eb0f-4801-841d-8a96fe29e608\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f7smn5" Dec 08 09:19:44 crc kubenswrapper[4776]: E1208 09:19:44.079968 4776 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 08 09:19:44 crc kubenswrapper[4776]: E1208 09:19:44.080022 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0cb0505b-eb0f-4801-841d-8a96fe29e608-cert podName:0cb0505b-eb0f-4801-841d-8a96fe29e608 nodeName:}" failed. No retries permitted until 2025-12-08 09:19:45.080007213 +0000 UTC m=+1261.343232235 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0cb0505b-eb0f-4801-841d-8a96fe29e608-cert") pod "openstack-baremetal-operator-controller-manager-84b575879f7smn5" (UID: "0cb0505b-eb0f-4801-841d-8a96fe29e608") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 08 09:19:44 crc kubenswrapper[4776]: I1208 09:19:44.186372 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-qzqqh" Dec 08 09:19:44 crc kubenswrapper[4776]: I1208 09:19:44.194780 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xxv7g" Dec 08 09:19:44 crc kubenswrapper[4776]: I1208 09:19:44.416206 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5697bb5779-897nd"] Dec 08 09:19:44 crc kubenswrapper[4776]: I1208 09:19:44.454909 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-f2bnk"] Dec 08 09:19:44 crc kubenswrapper[4776]: I1208 09:19:44.494340 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23-webhook-certs\") pod \"openstack-operator-controller-manager-57686cd5df-zt7pj\" (UID: \"e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23\") " pod="openstack-operators/openstack-operator-controller-manager-57686cd5df-zt7pj" Dec 08 09:19:44 crc kubenswrapper[4776]: I1208 09:19:44.494432 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23-metrics-certs\") pod \"openstack-operator-controller-manager-57686cd5df-zt7pj\" (UID: \"e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23\") " 
pod="openstack-operators/openstack-operator-controller-manager-57686cd5df-zt7pj" Dec 08 09:19:44 crc kubenswrapper[4776]: E1208 09:19:44.495034 4776 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 08 09:19:44 crc kubenswrapper[4776]: E1208 09:19:44.495058 4776 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 08 09:19:44 crc kubenswrapper[4776]: E1208 09:19:44.495109 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23-webhook-certs podName:e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23 nodeName:}" failed. No retries permitted until 2025-12-08 09:19:45.495097661 +0000 UTC m=+1261.758322683 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23-webhook-certs") pod "openstack-operator-controller-manager-57686cd5df-zt7pj" (UID: "e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23") : secret "webhook-server-cert" not found Dec 08 09:19:44 crc kubenswrapper[4776]: E1208 09:19:44.495123 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23-metrics-certs podName:e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23 nodeName:}" failed. No retries permitted until 2025-12-08 09:19:45.495117081 +0000 UTC m=+1261.758342103 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23-metrics-certs") pod "openstack-operator-controller-manager-57686cd5df-zt7pj" (UID: "e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23") : secret "metrics-server-cert" not found Dec 08 09:19:44 crc kubenswrapper[4776]: I1208 09:19:44.697234 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0-cert\") pod \"infra-operator-controller-manager-78d48bff9d-87pfw\" (UID: \"dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-87pfw" Dec 08 09:19:44 crc kubenswrapper[4776]: E1208 09:19:44.697441 4776 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 08 09:19:44 crc kubenswrapper[4776]: E1208 09:19:44.698405 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0-cert podName:dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0 nodeName:}" failed. No retries permitted until 2025-12-08 09:19:46.69838718 +0000 UTC m=+1262.961612202 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0-cert") pod "infra-operator-controller-manager-78d48bff9d-87pfw" (UID: "dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0") : secret "infra-operator-webhook-server-cert" not found Dec 08 09:19:44 crc kubenswrapper[4776]: I1208 09:19:44.753370 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5b5fd79c9c-g66m2"] Dec 08 09:19:44 crc kubenswrapper[4776]: I1208 09:19:44.774239 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-967d97867-4dj2x"] Dec 08 09:19:44 crc kubenswrapper[4776]: I1208 09:19:44.785245 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-7smkr"] Dec 08 09:19:44 crc kubenswrapper[4776]: W1208 09:19:44.791386 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod422088d1_15c7_4791_b0c9_a12a2c5e2880.slice/crio-6082aaa09d0cea72b4e1bc103dcb867b87cc98d8a2f13f4c855e59571fc41c81 WatchSource:0}: Error finding container 6082aaa09d0cea72b4e1bc103dcb867b87cc98d8a2f13f4c855e59571fc41c81: Status 404 returned error can't find the container with id 6082aaa09d0cea72b4e1bc103dcb867b87cc98d8a2f13f4c855e59571fc41c81 Dec 08 09:19:44 crc kubenswrapper[4776]: I1208 09:19:44.797719 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dqgnv"] Dec 08 09:19:44 crc kubenswrapper[4776]: I1208 09:19:44.806331 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-79c8c4686c-jgmdb"] Dec 08 09:19:44 crc kubenswrapper[4776]: I1208 09:19:44.812463 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-k928c"] Dec 08 09:19:44 crc kubenswrapper[4776]: W1208 09:19:44.826623 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a4ffe83_5f4d_4a7a_a2b6_64d12bd8f3f9.slice/crio-ff4dcdd534f091ef790a0c77f450b017f2d173ab5433b2d94c452553e5ccf227 WatchSource:0}: Error finding container ff4dcdd534f091ef790a0c77f450b017f2d173ab5433b2d94c452553e5ccf227: Status 404 returned error can't find the container with id ff4dcdd534f091ef790a0c77f450b017f2d173ab5433b2d94c452553e5ccf227 Dec 08 09:19:45 crc kubenswrapper[4776]: I1208 09:19:45.049362 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-7smkr" event={"ID":"0f590af7-17bd-46c4-8a25-ba3a368c6382","Type":"ContainerStarted","Data":"96f50443508fbcbe1557799eca3a5eda1509b37368fd3d3535891dec0092137a"} Dec 08 09:19:45 crc kubenswrapper[4776]: I1208 09:19:45.053986 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-2g4ph" event={"ID":"ad1d3b70-6eea-46a4-bdc1-82144fe12f4a","Type":"ContainerStarted","Data":"447623c47828e67159952cfd914dbb863221b5b97e850ea8584bab701a3cbbda"} Dec 08 09:19:45 crc kubenswrapper[4776]: I1208 09:19:45.063045 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dqgnv" event={"ID":"288a9127-92ed-4b19-8cc5-34b1f9b51201","Type":"ContainerStarted","Data":"c125af3c6fa7fac68b24928697df00dfc4cf83113cf48595bebb6d921cb54f9a"} Dec 08 09:19:45 crc kubenswrapper[4776]: I1208 09:19:45.066079 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-f2bnk" 
event={"ID":"f85d592d-d82d-4c08-aafb-e9a7e68ef386","Type":"ContainerStarted","Data":"f7795398406d924625528a56c7fc21b8dbeeb7aa878d9d72e6b7f70fe4dc3b08"} Dec 08 09:19:45 crc kubenswrapper[4776]: I1208 09:19:45.068973 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-ftb4x" event={"ID":"316c9728-ccef-4981-9903-895ab86e6616","Type":"ContainerStarted","Data":"20949ac02c1e758cebe89e6ad422160e707755f72f4de917fb25be99eeaf73f6"} Dec 08 09:19:45 crc kubenswrapper[4776]: I1208 09:19:45.071397 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-k928c" event={"ID":"545c7a23-3539-4923-bd9e-8d64700070b5","Type":"ContainerStarted","Data":"be0a5eee3826dbecc7298552a1e9aa6474b94f2fb4529949717dc2528e1637b2"} Dec 08 09:19:45 crc kubenswrapper[4776]: I1208 09:19:45.074703 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-jgmdb" event={"ID":"2a4ffe83-5f4d-4a7a-a2b6-64d12bd8f3f9","Type":"ContainerStarted","Data":"ff4dcdd534f091ef790a0c77f450b017f2d173ab5433b2d94c452553e5ccf227"} Dec 08 09:19:45 crc kubenswrapper[4776]: I1208 09:19:45.077773 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-897nd" event={"ID":"bb123983-a71d-4eca-84e8-6c116cc9b3b6","Type":"ContainerStarted","Data":"77f3ef916eafc3baba43cddb16f53c31b028c2e5e8edb9363e33fe18e05193ad"} Dec 08 09:19:45 crc kubenswrapper[4776]: I1208 09:19:45.079222 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-g66m2" event={"ID":"ff110975-7e1d-4d6d-bd10-b666cd8fe98b","Type":"ContainerStarted","Data":"d7239d53ada02420f7fbbb5ff9e4a034c40f3bfa142d0b56c70f81a6e5e96d58"} Dec 08 09:19:45 crc kubenswrapper[4776]: I1208 09:19:45.081074 4776 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/ironic-operator-controller-manager-967d97867-4dj2x" event={"ID":"422088d1-15c7-4791-b0c9-a12a2c5e2880","Type":"ContainerStarted","Data":"6082aaa09d0cea72b4e1bc103dcb867b87cc98d8a2f13f4c855e59571fc41c81"} Dec 08 09:19:45 crc kubenswrapper[4776]: I1208 09:19:45.107442 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0cb0505b-eb0f-4801-841d-8a96fe29e608-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879f7smn5\" (UID: \"0cb0505b-eb0f-4801-841d-8a96fe29e608\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f7smn5" Dec 08 09:19:45 crc kubenswrapper[4776]: E1208 09:19:45.107618 4776 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 08 09:19:45 crc kubenswrapper[4776]: E1208 09:19:45.107660 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0cb0505b-eb0f-4801-841d-8a96fe29e608-cert podName:0cb0505b-eb0f-4801-841d-8a96fe29e608 nodeName:}" failed. No retries permitted until 2025-12-08 09:19:47.107647151 +0000 UTC m=+1263.370872173 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0cb0505b-eb0f-4801-841d-8a96fe29e608-cert") pod "openstack-baremetal-operator-controller-manager-84b575879f7smn5" (UID: "0cb0505b-eb0f-4801-841d-8a96fe29e608") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 08 09:19:45 crc kubenswrapper[4776]: I1208 09:19:45.148308 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-gk9xw"] Dec 08 09:19:45 crc kubenswrapper[4776]: I1208 09:19:45.168096 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-xtf2f"] Dec 08 09:19:45 crc kubenswrapper[4776]: I1208 09:19:45.177087 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-68f9cdc5f7-scgrq"] Dec 08 09:19:45 crc kubenswrapper[4776]: W1208 09:19:45.182568 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cd2dc5d_1433_4660_9d65_bf49d398415f.slice/crio-4912da68155e59a8cc68bbec740c4cb790bacf4dc92f74e2b0c529a01f58aa51 WatchSource:0}: Error finding container 4912da68155e59a8cc68bbec740c4cb790bacf4dc92f74e2b0c529a01f58aa51: Status 404 returned error can't find the container with id 4912da68155e59a8cc68bbec740c4cb790bacf4dc92f74e2b0c529a01f58aa51 Dec 08 09:19:45 crc kubenswrapper[4776]: I1208 09:19:45.190561 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-mdm5f"] Dec 08 09:19:45 crc kubenswrapper[4776]: I1208 09:19:45.207408 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-l979f"] Dec 08 09:19:45 crc kubenswrapper[4776]: W1208 09:19:45.255670 4776 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8f3f832_68f1_47a2_bb3d_5d67f54655ce.slice/crio-d3b9896156f9699c5265bc470ad3ee099ae74fe466bc8a5a6d5c87051612baf6 WatchSource:0}: Error finding container d3b9896156f9699c5265bc470ad3ee099ae74fe466bc8a5a6d5c87051612baf6: Status 404 returned error can't find the container with id d3b9896156f9699c5265bc470ad3ee099ae74fe466bc8a5a6d5c87051612baf6 Dec 08 09:19:45 crc kubenswrapper[4776]: E1208 09:19:45.261912 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-99tqm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-xtf2f_openstack-operators(c8f3f832-68f1-47a2-bb3d-5d67f54655ce): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 08 09:19:45 crc kubenswrapper[4776]: E1208 09:19:45.271551 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-99tqm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-xtf2f_openstack-operators(c8f3f832-68f1-47a2-bb3d-5d67f54655ce): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 08 09:19:45 crc kubenswrapper[4776]: E1208 09:19:45.273508 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-xtf2f" podUID="c8f3f832-68f1-47a2-bb3d-5d67f54655ce" Dec 08 09:19:45 crc kubenswrapper[4776]: I1208 09:19:45.483263 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-667bd8d554-kfz2m"] Dec 08 09:19:45 crc kubenswrapper[4776]: I1208 09:19:45.489007 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xxv7g"] Dec 08 09:19:45 crc kubenswrapper[4776]: I1208 09:19:45.497942 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/swift-operator-controller-manager-9d58d64bc-ncfrf"] Dec 08 09:19:45 crc kubenswrapper[4776]: I1208 09:19:45.522012 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23-webhook-certs\") pod \"openstack-operator-controller-manager-57686cd5df-zt7pj\" (UID: \"e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23\") " pod="openstack-operators/openstack-operator-controller-manager-57686cd5df-zt7pj" Dec 08 09:19:45 crc kubenswrapper[4776]: I1208 09:19:45.522120 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23-metrics-certs\") pod \"openstack-operator-controller-manager-57686cd5df-zt7pj\" (UID: \"e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23\") " pod="openstack-operators/openstack-operator-controller-manager-57686cd5df-zt7pj" Dec 08 09:19:45 crc kubenswrapper[4776]: E1208 09:19:45.522273 4776 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 08 09:19:45 crc kubenswrapper[4776]: E1208 09:19:45.522322 4776 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 08 09:19:45 crc kubenswrapper[4776]: E1208 09:19:45.522365 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23-webhook-certs podName:e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23 nodeName:}" failed. No retries permitted until 2025-12-08 09:19:47.522334148 +0000 UTC m=+1263.785559170 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23-webhook-certs") pod "openstack-operator-controller-manager-57686cd5df-zt7pj" (UID: "e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23") : secret "webhook-server-cert" not found Dec 08 09:19:45 crc kubenswrapper[4776]: E1208 09:19:45.522389 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23-metrics-certs podName:e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23 nodeName:}" failed. No retries permitted until 2025-12-08 09:19:47.522380389 +0000 UTC m=+1263.785605411 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23-metrics-certs") pod "openstack-operator-controller-manager-57686cd5df-zt7pj" (UID: "e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23") : secret "metrics-server-cert" not found Dec 08 09:19:45 crc kubenswrapper[4776]: E1208 09:19:45.560691 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nl7lr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-9d58d64bc-ncfrf_openstack-operators(6ea3ffdd-a922-487e-a738-da3091a1656e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 08 09:19:45 crc kubenswrapper[4776]: E1208 09:19:45.563083 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true 
--v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nl7lr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-9d58d64bc-ncfrf_openstack-operators(6ea3ffdd-a922-487e-a738-da3091a1656e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 08 09:19:45 crc kubenswrapper[4776]: E1208 09:19:45.564401 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-ncfrf" podUID="6ea3ffdd-a922-487e-a738-da3091a1656e" Dec 08 09:19:46 crc kubenswrapper[4776]: I1208 09:19:46.114388 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-kfz2m" event={"ID":"61424c2d-bdc7-431a-8f12-535e1e97ce4b","Type":"ContainerStarted","Data":"af1bf70eb9c7e21540f7018b8db65c847a39a6c6c8a24d5b139b6f4ffa5e3fb6"} Dec 08 09:19:46 crc kubenswrapper[4776]: I1208 09:19:46.116858 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-ncfrf" event={"ID":"6ea3ffdd-a922-487e-a738-da3091a1656e","Type":"ContainerStarted","Data":"ec3086fd330d9086407825eabc05ceef461017891ff853507ef07f811d966c0f"} Dec 08 09:19:46 crc kubenswrapper[4776]: I1208 09:19:46.129354 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-xtf2f" event={"ID":"c8f3f832-68f1-47a2-bb3d-5d67f54655ce","Type":"ContainerStarted","Data":"d3b9896156f9699c5265bc470ad3ee099ae74fe466bc8a5a6d5c87051612baf6"} Dec 08 09:19:46 crc kubenswrapper[4776]: E1208 09:19:46.149015 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-ncfrf" podUID="6ea3ffdd-a922-487e-a738-da3091a1656e" Dec 08 09:19:46 crc kubenswrapper[4776]: I1208 09:19:46.149018 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-l979f" event={"ID":"d07c95ca-1871-4ba3-81e5-c7b4d86bb0f4","Type":"ContainerStarted","Data":"11d94a0603abae82e903411174bff9473b5ebfc32b430a673ad74f7c1416ef76"} Dec 08 09:19:46 crc kubenswrapper[4776]: E1208 09:19:46.149137 4776 pod_workers.go:1301] 
"Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-xtf2f" podUID="c8f3f832-68f1-47a2-bb3d-5d67f54655ce" Dec 08 09:19:46 crc kubenswrapper[4776]: I1208 09:19:46.164571 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-68f9cdc5f7-scgrq" event={"ID":"7134ec23-7ec3-454d-b837-29fbe7094067","Type":"ContainerStarted","Data":"41e119f9c2d351b106203c3e697f08bd80f51f9177883e4b457afa0c9ee56101"} Dec 08 09:19:46 crc kubenswrapper[4776]: I1208 09:19:46.190475 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-mdm5f" event={"ID":"482e5641-8a00-4fc3-b7d3-6eb88dbee1e4","Type":"ContainerStarted","Data":"aaa1d3592dfad300b89e7805a4551bbeacaff730f6efb8178c982b4683c14a4f"} Dec 08 09:19:46 crc kubenswrapper[4776]: I1208 09:19:46.213501 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xxv7g" event={"ID":"d8a1143b-5dc6-4a99-a6e4-f155585ebbcb","Type":"ContainerStarted","Data":"2252184e487c2d2ab18cd97b5e42a286ff82391c4bdf2ae4af75502dee8c1eb2"} Dec 08 09:19:46 crc kubenswrapper[4776]: I1208 09:19:46.252358 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-gk9xw" event={"ID":"8cd2dc5d-1433-4660-9d65-bf49d398415f","Type":"ContainerStarted","Data":"4912da68155e59a8cc68bbec740c4cb790bacf4dc92f74e2b0c529a01f58aa51"} Dec 08 09:19:46 crc kubenswrapper[4776]: I1208 
09:19:46.748131 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0-cert\") pod \"infra-operator-controller-manager-78d48bff9d-87pfw\" (UID: \"dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-87pfw" Dec 08 09:19:46 crc kubenswrapper[4776]: E1208 09:19:46.748376 4776 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 08 09:19:46 crc kubenswrapper[4776]: E1208 09:19:46.748426 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0-cert podName:dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0 nodeName:}" failed. No retries permitted until 2025-12-08 09:19:50.748408955 +0000 UTC m=+1267.011633977 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0-cert") pod "infra-operator-controller-manager-78d48bff9d-87pfw" (UID: "dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0") : secret "infra-operator-webhook-server-cert" not found Dec 08 09:19:47 crc kubenswrapper[4776]: I1208 09:19:47.157309 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0cb0505b-eb0f-4801-841d-8a96fe29e608-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879f7smn5\" (UID: \"0cb0505b-eb0f-4801-841d-8a96fe29e608\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f7smn5" Dec 08 09:19:47 crc kubenswrapper[4776]: E1208 09:19:47.157533 4776 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 08 09:19:47 crc kubenswrapper[4776]: E1208 09:19:47.157783 4776 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0cb0505b-eb0f-4801-841d-8a96fe29e608-cert podName:0cb0505b-eb0f-4801-841d-8a96fe29e608 nodeName:}" failed. No retries permitted until 2025-12-08 09:19:51.157759819 +0000 UTC m=+1267.420984841 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0cb0505b-eb0f-4801-841d-8a96fe29e608-cert") pod "openstack-baremetal-operator-controller-manager-84b575879f7smn5" (UID: "0cb0505b-eb0f-4801-841d-8a96fe29e608") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 08 09:19:47 crc kubenswrapper[4776]: E1208 09:19:47.281954 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-ncfrf" podUID="6ea3ffdd-a922-487e-a738-da3091a1656e" Dec 08 09:19:47 crc kubenswrapper[4776]: E1208 09:19:47.282023 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-xtf2f" podUID="c8f3f832-68f1-47a2-bb3d-5d67f54655ce" Dec 08 09:19:47 crc kubenswrapper[4776]: I1208 09:19:47.563375 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23-webhook-certs\") pod \"openstack-operator-controller-manager-57686cd5df-zt7pj\" (UID: \"e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23\") " pod="openstack-operators/openstack-operator-controller-manager-57686cd5df-zt7pj" Dec 08 09:19:47 crc kubenswrapper[4776]: E1208 09:19:47.563523 4776 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 08 09:19:47 crc kubenswrapper[4776]: I1208 09:19:47.563626 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23-metrics-certs\") pod \"openstack-operator-controller-manager-57686cd5df-zt7pj\" (UID: \"e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23\") " pod="openstack-operators/openstack-operator-controller-manager-57686cd5df-zt7pj" Dec 08 09:19:47 crc kubenswrapper[4776]: E1208 09:19:47.563776 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23-webhook-certs podName:e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23 nodeName:}" failed. No retries permitted until 2025-12-08 09:19:51.563732871 +0000 UTC m=+1267.826957893 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23-webhook-certs") pod "openstack-operator-controller-manager-57686cd5df-zt7pj" (UID: "e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23") : secret "webhook-server-cert" not found Dec 08 09:19:47 crc kubenswrapper[4776]: E1208 09:19:47.563847 4776 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 08 09:19:47 crc kubenswrapper[4776]: E1208 09:19:47.564026 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23-metrics-certs podName:e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23 nodeName:}" failed. No retries permitted until 2025-12-08 09:19:51.564006879 +0000 UTC m=+1267.827231901 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23-metrics-certs") pod "openstack-operator-controller-manager-57686cd5df-zt7pj" (UID: "e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23") : secret "metrics-server-cert" not found Dec 08 09:19:50 crc kubenswrapper[4776]: I1208 09:19:50.841089 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0-cert\") pod \"infra-operator-controller-manager-78d48bff9d-87pfw\" (UID: \"dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-87pfw" Dec 08 09:19:50 crc kubenswrapper[4776]: E1208 09:19:50.841270 4776 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 08 09:19:50 crc kubenswrapper[4776]: E1208 09:19:50.841812 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0-cert 
podName:dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0 nodeName:}" failed. No retries permitted until 2025-12-08 09:19:58.841793078 +0000 UTC m=+1275.105018100 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0-cert") pod "infra-operator-controller-manager-78d48bff9d-87pfw" (UID: "dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0") : secret "infra-operator-webhook-server-cert" not found Dec 08 09:19:51 crc kubenswrapper[4776]: I1208 09:19:51.252009 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0cb0505b-eb0f-4801-841d-8a96fe29e608-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879f7smn5\" (UID: \"0cb0505b-eb0f-4801-841d-8a96fe29e608\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f7smn5" Dec 08 09:19:51 crc kubenswrapper[4776]: E1208 09:19:51.252206 4776 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 08 09:19:51 crc kubenswrapper[4776]: E1208 09:19:51.252260 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0cb0505b-eb0f-4801-841d-8a96fe29e608-cert podName:0cb0505b-eb0f-4801-841d-8a96fe29e608 nodeName:}" failed. No retries permitted until 2025-12-08 09:19:59.252243731 +0000 UTC m=+1275.515468753 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0cb0505b-eb0f-4801-841d-8a96fe29e608-cert") pod "openstack-baremetal-operator-controller-manager-84b575879f7smn5" (UID: "0cb0505b-eb0f-4801-841d-8a96fe29e608") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 08 09:19:51 crc kubenswrapper[4776]: I1208 09:19:51.660092 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23-metrics-certs\") pod \"openstack-operator-controller-manager-57686cd5df-zt7pj\" (UID: \"e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23\") " pod="openstack-operators/openstack-operator-controller-manager-57686cd5df-zt7pj" Dec 08 09:19:51 crc kubenswrapper[4776]: I1208 09:19:51.660275 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23-webhook-certs\") pod \"openstack-operator-controller-manager-57686cd5df-zt7pj\" (UID: \"e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23\") " pod="openstack-operators/openstack-operator-controller-manager-57686cd5df-zt7pj" Dec 08 09:19:51 crc kubenswrapper[4776]: E1208 09:19:51.660359 4776 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 08 09:19:51 crc kubenswrapper[4776]: E1208 09:19:51.660420 4776 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 08 09:19:51 crc kubenswrapper[4776]: E1208 09:19:51.660460 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23-metrics-certs podName:e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23 nodeName:}" failed. No retries permitted until 2025-12-08 09:19:59.660434984 +0000 UTC m=+1275.923660026 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23-metrics-certs") pod "openstack-operator-controller-manager-57686cd5df-zt7pj" (UID: "e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23") : secret "metrics-server-cert" not found Dec 08 09:19:51 crc kubenswrapper[4776]: E1208 09:19:51.660488 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23-webhook-certs podName:e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23 nodeName:}" failed. No retries permitted until 2025-12-08 09:19:59.660471775 +0000 UTC m=+1275.923696797 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23-webhook-certs") pod "openstack-operator-controller-manager-57686cd5df-zt7pj" (UID: "e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23") : secret "webhook-server-cert" not found Dec 08 09:19:58 crc kubenswrapper[4776]: I1208 09:19:58.894980 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0-cert\") pod \"infra-operator-controller-manager-78d48bff9d-87pfw\" (UID: \"dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-87pfw" Dec 08 09:19:58 crc kubenswrapper[4776]: E1208 09:19:58.895218 4776 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 08 09:19:58 crc kubenswrapper[4776]: E1208 09:19:58.895736 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0-cert podName:dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0 nodeName:}" failed. No retries permitted until 2025-12-08 09:20:14.895689271 +0000 UTC m=+1291.158914293 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0-cert") pod "infra-operator-controller-manager-78d48bff9d-87pfw" (UID: "dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0") : secret "infra-operator-webhook-server-cert" not found Dec 08 09:19:59 crc kubenswrapper[4776]: I1208 09:19:59.304186 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0cb0505b-eb0f-4801-841d-8a96fe29e608-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879f7smn5\" (UID: \"0cb0505b-eb0f-4801-841d-8a96fe29e608\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f7smn5" Dec 08 09:19:59 crc kubenswrapper[4776]: E1208 09:19:59.305083 4776 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 08 09:19:59 crc kubenswrapper[4776]: E1208 09:19:59.305204 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0cb0505b-eb0f-4801-841d-8a96fe29e608-cert podName:0cb0505b-eb0f-4801-841d-8a96fe29e608 nodeName:}" failed. No retries permitted until 2025-12-08 09:20:15.305156557 +0000 UTC m=+1291.568381579 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0cb0505b-eb0f-4801-841d-8a96fe29e608-cert") pod "openstack-baremetal-operator-controller-manager-84b575879f7smn5" (UID: "0cb0505b-eb0f-4801-841d-8a96fe29e608") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 08 09:19:59 crc kubenswrapper[4776]: I1208 09:19:59.712329 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23-webhook-certs\") pod \"openstack-operator-controller-manager-57686cd5df-zt7pj\" (UID: \"e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23\") " pod="openstack-operators/openstack-operator-controller-manager-57686cd5df-zt7pj" Dec 08 09:19:59 crc kubenswrapper[4776]: I1208 09:19:59.712425 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23-metrics-certs\") pod \"openstack-operator-controller-manager-57686cd5df-zt7pj\" (UID: \"e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23\") " pod="openstack-operators/openstack-operator-controller-manager-57686cd5df-zt7pj" Dec 08 09:19:59 crc kubenswrapper[4776]: E1208 09:19:59.712493 4776 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 08 09:19:59 crc kubenswrapper[4776]: E1208 09:19:59.712544 4776 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 08 09:19:59 crc kubenswrapper[4776]: E1208 09:19:59.712565 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23-webhook-certs podName:e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23 nodeName:}" failed. No retries permitted until 2025-12-08 09:20:15.712549939 +0000 UTC m=+1291.975774961 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23-webhook-certs") pod "openstack-operator-controller-manager-57686cd5df-zt7pj" (UID: "e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23") : secret "webhook-server-cert" not found Dec 08 09:19:59 crc kubenswrapper[4776]: E1208 09:19:59.712579 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23-metrics-certs podName:e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23 nodeName:}" failed. No retries permitted until 2025-12-08 09:20:15.712573799 +0000 UTC m=+1291.975798811 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23-metrics-certs") pod "openstack-operator-controller-manager-57686cd5df-zt7pj" (UID: "e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23") : secret "metrics-server-cert" not found Dec 08 09:20:04 crc kubenswrapper[4776]: E1208 09:20:04.626162 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:5370dc4a8e776923eec00bb50cbdb2e390e9dde50be26bdc04a216bd2d6b5027" Dec 08 09:20:04 crc kubenswrapper[4776]: E1208 09:20:04.626802 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:5370dc4a8e776923eec00bb50cbdb2e390e9dde50be26bdc04a216bd2d6b5027,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zt6ld,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-5697bb5779-897nd_openstack-operators(bb123983-a71d-4eca-84e8-6c116cc9b3b6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 08 09:20:05 crc kubenswrapper[4776]: E1208 09:20:05.876108 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:900050d3501c0785b227db34b89883efe68247816e5c7427cacb74f8aa10605a" Dec 08 09:20:05 crc kubenswrapper[4776]: E1208 09:20:05.876329 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:900050d3501c0785b227db34b89883efe68247816e5c7427cacb74f8aa10605a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jp28j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-697fb699cf-ftb4x_openstack-operators(316c9728-ccef-4981-9903-895ab86e6616): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 08 09:20:07 crc kubenswrapper[4776]: E1208 09:20:07.615655 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5" Dec 08 09:20:07 crc kubenswrapper[4776]: E1208 09:20:07.615863 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ffgsq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-68c6d99b8f-4k8qf_openstack-operators(beadb3ee-3cd9-4c83-ba1f-9f599cd24940): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 08 09:20:08 crc kubenswrapper[4776]: E1208 09:20:08.933829 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:5bdb3685be3ddc1efd62e16aaf2fa96ead64315e26d52b1b2a7d8ac01baa1e87" Dec 08 09:20:08 crc kubenswrapper[4776]: E1208 09:20:08.934815 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:5bdb3685be3ddc1efd62e16aaf2fa96ead64315e26d52b1b2a7d8ac01baa1e87,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zvcjs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-967d97867-4dj2x_openstack-operators(422088d1-15c7-4791-b0c9-a12a2c5e2880): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 08 09:20:10 crc kubenswrapper[4776]: E1208 09:20:10.287589 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:6b3e0302608a2e70f9b5ae9167f6fbf59264f226d9db99d48f70466ab2f216b8" Dec 08 09:20:10 crc kubenswrapper[4776]: E1208 09:20:10.288044 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:6b3e0302608a2e70f9b5ae9167f6fbf59264f226d9db99d48f70466ab2f216b8,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kvlbb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-667bd8d554-kfz2m_openstack-operators(61424c2d-bdc7-431a-8f12-535e1e97ce4b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 08 09:20:10 crc kubenswrapper[4776]: E1208 09:20:10.791022 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f" Dec 08 09:20:10 crc kubenswrapper[4776]: E1208 09:20:10.791238 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tqnhr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-mdm5f_openstack-operators(482e5641-8a00-4fc3-b7d3-6eb88dbee1e4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 08 09:20:12 crc kubenswrapper[4776]: E1208 09:20:12.721127 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557" Dec 08 09:20:12 crc kubenswrapper[4776]: E1208 09:20:12.721590 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zs8rv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-5fdfd5b6b5-dqgnv_openstack-operators(288a9127-92ed-4b19-8cc5-34b1f9b51201): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 08 09:20:13 crc kubenswrapper[4776]: E1208 09:20:13.155192 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59" Dec 08 09:20:13 crc kubenswrapper[4776]: E1208 09:20:13.155448 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vdn87,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-gk9xw_openstack-operators(8cd2dc5d-1433-4660-9d65-bf49d398415f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 08 09:20:13 crc kubenswrapper[4776]: E1208 09:20:13.643859 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168" Dec 08 09:20:13 crc kubenswrapper[4776]: E1208 09:20:13.644038 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xnwqj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-l979f_openstack-operators(d07c95ca-1871-4ba3-81e5-c7b4d86bb0f4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 08 09:20:14 crc kubenswrapper[4776]: I1208 09:20:14.930918 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0-cert\") pod \"infra-operator-controller-manager-78d48bff9d-87pfw\" (UID: \"dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-87pfw" Dec 08 09:20:14 crc kubenswrapper[4776]: I1208 09:20:14.939880 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0-cert\") pod \"infra-operator-controller-manager-78d48bff9d-87pfw\" (UID: \"dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-87pfw" Dec 08 09:20:15 crc kubenswrapper[4776]: I1208 09:20:15.147618 4776 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-5tn42" Dec 08 09:20:15 crc kubenswrapper[4776]: I1208 09:20:15.155470 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-87pfw" Dec 08 09:20:15 crc kubenswrapper[4776]: I1208 09:20:15.337282 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0cb0505b-eb0f-4801-841d-8a96fe29e608-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879f7smn5\" (UID: \"0cb0505b-eb0f-4801-841d-8a96fe29e608\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f7smn5" Dec 08 09:20:15 crc kubenswrapper[4776]: I1208 09:20:15.348948 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0cb0505b-eb0f-4801-841d-8a96fe29e608-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879f7smn5\" (UID: \"0cb0505b-eb0f-4801-841d-8a96fe29e608\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f7smn5" Dec 08 09:20:15 crc kubenswrapper[4776]: I1208 09:20:15.489607 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-fbxmv" Dec 08 09:20:15 crc kubenswrapper[4776]: I1208 09:20:15.495956 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f7smn5" Dec 08 09:20:15 crc kubenswrapper[4776]: I1208 09:20:15.744577 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23-metrics-certs\") pod \"openstack-operator-controller-manager-57686cd5df-zt7pj\" (UID: \"e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23\") " pod="openstack-operators/openstack-operator-controller-manager-57686cd5df-zt7pj" Dec 08 09:20:15 crc kubenswrapper[4776]: I1208 09:20:15.744719 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23-webhook-certs\") pod \"openstack-operator-controller-manager-57686cd5df-zt7pj\" (UID: \"e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23\") " pod="openstack-operators/openstack-operator-controller-manager-57686cd5df-zt7pj" Dec 08 09:20:15 crc kubenswrapper[4776]: I1208 09:20:15.749239 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23-webhook-certs\") pod \"openstack-operator-controller-manager-57686cd5df-zt7pj\" (UID: \"e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23\") " pod="openstack-operators/openstack-operator-controller-manager-57686cd5df-zt7pj" Dec 08 09:20:15 crc kubenswrapper[4776]: I1208 09:20:15.750826 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23-metrics-certs\") pod \"openstack-operator-controller-manager-57686cd5df-zt7pj\" (UID: \"e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23\") " pod="openstack-operators/openstack-operator-controller-manager-57686cd5df-zt7pj" Dec 08 09:20:15 crc kubenswrapper[4776]: I1208 09:20:15.905465 4776 reflector.go:368] Caches populated for 
*v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-2jw97" Dec 08 09:20:15 crc kubenswrapper[4776]: I1208 09:20:15.913297 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-57686cd5df-zt7pj" Dec 08 09:20:17 crc kubenswrapper[4776]: E1208 09:20:17.025161 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429" Dec 08 09:20:17 crc kubenswrapper[4776]: E1208 09:20:17.025430 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-52k74,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-5f64f6f8bb-f2bnk_openstack-operators(f85d592d-d82d-4c08-aafb-e9a7e68ef386): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 08 09:20:17 crc kubenswrapper[4776]: E1208 09:20:17.194779 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.251:5001/openstack-k8s-operators/telemetry-operator:d3ea47b1122f22fdda4bc30dd95b8db90981973f" Dec 08 09:20:17 crc kubenswrapper[4776]: E1208 09:20:17.195424 4776 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.251:5001/openstack-k8s-operators/telemetry-operator:d3ea47b1122f22fdda4bc30dd95b8db90981973f" Dec 08 09:20:17 crc kubenswrapper[4776]: E1208 09:20:17.195578 
4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.251:5001/openstack-k8s-operators/telemetry-operator:d3ea47b1122f22fdda4bc30dd95b8db90981973f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-krpk2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-68f9cdc5f7-scgrq_openstack-operators(7134ec23-7ec3-454d-b837-29fbe7094067): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 08 09:20:18 crc kubenswrapper[4776]: E1208 09:20:18.745100 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7" Dec 08 09:20:18 crc kubenswrapper[4776]: E1208 09:20:18.745351 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-slvdw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7765d96ddf-7smkr_openstack-operators(0f590af7-17bd-46c4-8a25-ba3a368c6382): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 08 09:20:23 crc kubenswrapper[4776]: E1208 09:20:23.511628 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Dec 08 09:20:23 crc kubenswrapper[4776]: E1208 09:20:23.512372 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dmb2k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-xxv7g_openstack-operators(d8a1143b-5dc6-4a99-a6e4-f155585ebbcb): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 08 09:20:23 crc kubenswrapper[4776]: E1208 09:20:23.513570 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xxv7g" podUID="d8a1143b-5dc6-4a99-a6e4-f155585ebbcb" Dec 08 09:20:23 crc kubenswrapper[4776]: E1208 09:20:23.576611 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xxv7g" podUID="d8a1143b-5dc6-4a99-a6e4-f155585ebbcb" Dec 08 09:20:24 crc kubenswrapper[4776]: I1208 09:20:24.362047 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-78d48bff9d-87pfw"] Dec 08 09:20:24 crc kubenswrapper[4776]: I1208 09:20:24.468255 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-57686cd5df-zt7pj"] Dec 08 09:20:24 crc kubenswrapper[4776]: I1208 09:20:24.484480 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f7smn5"] Dec 08 09:20:24 crc kubenswrapper[4776]: W1208 09:20:24.572265 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7f1ff45_22cc_41bd_a0a3_0b5ed3d66a23.slice/crio-a0e542e8a7a5156a7bed0d590b312f086753500872f48890e357c16f250e3d75 WatchSource:0}: Error finding container a0e542e8a7a5156a7bed0d590b312f086753500872f48890e357c16f250e3d75: Status 
404 returned error can't find the container with id a0e542e8a7a5156a7bed0d590b312f086753500872f48890e357c16f250e3d75 Dec 08 09:20:24 crc kubenswrapper[4776]: I1208 09:20:24.609409 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-jgmdb" event={"ID":"2a4ffe83-5f4d-4a7a-a2b6-64d12bd8f3f9","Type":"ContainerStarted","Data":"18efbad7a2968ced5ffa12a9d2f3805b3d487c19138caeab75e34109a6ace485"} Dec 08 09:20:24 crc kubenswrapper[4776]: I1208 09:20:24.610358 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f7smn5" event={"ID":"0cb0505b-eb0f-4801-841d-8a96fe29e608","Type":"ContainerStarted","Data":"93424fae4e5b9f658c23922ac3df5738dca0adf25349621cf65dc01c7b9a7a0f"} Dec 08 09:20:24 crc kubenswrapper[4776]: I1208 09:20:24.613586 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-k928c" event={"ID":"545c7a23-3539-4923-bd9e-8d64700070b5","Type":"ContainerStarted","Data":"5cf18dcc477881831667fccaaf997e196cd88a10886a84d65a5cda956bc69745"} Dec 08 09:20:24 crc kubenswrapper[4776]: I1208 09:20:24.615321 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-57686cd5df-zt7pj" event={"ID":"e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23","Type":"ContainerStarted","Data":"a0e542e8a7a5156a7bed0d590b312f086753500872f48890e357c16f250e3d75"} Dec 08 09:20:24 crc kubenswrapper[4776]: I1208 09:20:24.617347 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-87pfw" event={"ID":"dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0","Type":"ContainerStarted","Data":"949f88cf0039cdac77948235bf15f7b1c8f8f7374b32461243148ef874a4e600"} Dec 08 09:20:25 crc kubenswrapper[4776]: I1208 09:20:25.670352 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-2g4ph" event={"ID":"ad1d3b70-6eea-46a4-bdc1-82144fe12f4a","Type":"ContainerStarted","Data":"02096ca78ac9928768c9ac0be7903f0ccaa62204bfa22c637a02a7030b7bb7ed"} Dec 08 09:20:25 crc kubenswrapper[4776]: I1208 09:20:25.697241 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-ncfrf" event={"ID":"6ea3ffdd-a922-487e-a738-da3091a1656e","Type":"ContainerStarted","Data":"7d3182e569b1965511ee0606fde54b374ace3b092399a96ef7f48aa9a204b438"} Dec 08 09:20:25 crc kubenswrapper[4776]: I1208 09:20:25.713760 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rgfzz" event={"ID":"b39e8644-6fb7-4d7c-a623-c0eadac0e896","Type":"ContainerStarted","Data":"83d4e2e0eb873f2352bb7aec1dfabfe21a04e6904c3f79ebf48036fd7eea2c92"} Dec 08 09:20:25 crc kubenswrapper[4776]: I1208 09:20:25.732896 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-g66m2" event={"ID":"ff110975-7e1d-4d6d-bd10-b666cd8fe98b","Type":"ContainerStarted","Data":"a9653fca2c71b56a15ad3868d2405daa65b9f2fbf2cdadf6a7edc0f9eee7aae4"} Dec 08 09:20:27 crc kubenswrapper[4776]: I1208 09:20:27.750375 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-xtf2f" event={"ID":"c8f3f832-68f1-47a2-bb3d-5d67f54655ce","Type":"ContainerStarted","Data":"85d74ccdd67daa84500eb936137f0270ef71fd8730dda16cc11af5cb5536eef5"} Dec 08 09:20:29 crc kubenswrapper[4776]: I1208 09:20:29.797829 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-57686cd5df-zt7pj" event={"ID":"e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23","Type":"ContainerStarted","Data":"000ce0b097a1532f89a525ac9e3b8addf114953fdea6e6253eb30884ba87b82e"} Dec 08 09:20:29 crc 
kubenswrapper[4776]: I1208 09:20:29.799343 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-57686cd5df-zt7pj" Dec 08 09:20:29 crc kubenswrapper[4776]: I1208 09:20:29.851946 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-57686cd5df-zt7pj" podStartSLOduration=46.851924837 podStartE2EDuration="46.851924837s" podCreationTimestamp="2025-12-08 09:19:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:20:29.842820812 +0000 UTC m=+1306.106045834" watchObservedRunningTime="2025-12-08 09:20:29.851924837 +0000 UTC m=+1306.115149859" Dec 08 09:20:30 crc kubenswrapper[4776]: E1208 09:20:30.013806 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-78f8948974-mdm5f" podUID="482e5641-8a00-4fc3-b7d3-6eb88dbee1e4" Dec 08 09:20:30 crc kubenswrapper[4776]: E1208 09:20:30.039340 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-kfz2m" podUID="61424c2d-bdc7-431a-8f12-535e1e97ce4b" Dec 08 09:20:30 crc kubenswrapper[4776]: E1208 09:20:30.183606 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-967d97867-4dj2x" podUID="422088d1-15c7-4791-b0c9-a12a2c5e2880" Dec 08 09:20:30 crc kubenswrapper[4776]: 
E1208 09:20:30.276750 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dqgnv" podUID="288a9127-92ed-4b19-8cc5-34b1f9b51201" Dec 08 09:20:30 crc kubenswrapper[4776]: E1208 09:20:30.384632 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-897nd" podUID="bb123983-a71d-4eca-84e8-6c116cc9b3b6" Dec 08 09:20:30 crc kubenswrapper[4776]: E1208 09:20:30.545043 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-68f9cdc5f7-scgrq" podUID="7134ec23-7ec3-454d-b837-29fbe7094067" Dec 08 09:20:30 crc kubenswrapper[4776]: E1208 09:20:30.555460 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-gk9xw" podUID="8cd2dc5d-1433-4660-9d65-bf49d398415f" Dec 08 09:20:30 crc kubenswrapper[4776]: E1208 09:20:30.585597 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-7smkr" podUID="0f590af7-17bd-46c4-8a25-ba3a368c6382" Dec 08 09:20:30 crc kubenswrapper[4776]: E1208 09:20:30.699925 4776 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-f2bnk" podUID="f85d592d-d82d-4c08-aafb-e9a7e68ef386" Dec 08 09:20:30 crc kubenswrapper[4776]: I1208 09:20:30.862367 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rgfzz" event={"ID":"b39e8644-6fb7-4d7c-a623-c0eadac0e896","Type":"ContainerStarted","Data":"f28846c22b60bc696c2517e936b648bd5a99603aa38d0841881e734f2e36ab63"} Dec 08 09:20:30 crc kubenswrapper[4776]: I1208 09:20:30.865224 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rgfzz" Dec 08 09:20:30 crc kubenswrapper[4776]: I1208 09:20:30.871561 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rgfzz" Dec 08 09:20:30 crc kubenswrapper[4776]: I1208 09:20:30.890423 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f7smn5" event={"ID":"0cb0505b-eb0f-4801-841d-8a96fe29e608","Type":"ContainerStarted","Data":"e8eaf98df3f156aa52077447864f77af3691337a30ae11812038779d527d9209"} Dec 08 09:20:30 crc kubenswrapper[4776]: I1208 09:20:30.930896 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rgfzz" podStartSLOduration=3.479594178 podStartE2EDuration="48.930881587s" podCreationTimestamp="2025-12-08 09:19:42 +0000 UTC" firstStartedPulling="2025-12-08 09:19:43.930445498 +0000 UTC m=+1260.193670520" lastFinishedPulling="2025-12-08 09:20:29.381732907 +0000 UTC m=+1305.644957929" observedRunningTime="2025-12-08 09:20:30.898496845 +0000 UTC m=+1307.161721867" 
watchObservedRunningTime="2025-12-08 09:20:30.930881587 +0000 UTC m=+1307.194106599" Dec 08 09:20:30 crc kubenswrapper[4776]: I1208 09:20:30.931459 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-7smkr" event={"ID":"0f590af7-17bd-46c4-8a25-ba3a368c6382","Type":"ContainerStarted","Data":"4f7d7e92f318faef91e6e01a9f5cf2a92ae467a6faf5462d95e1b85698b9c558"} Dec 08 09:20:30 crc kubenswrapper[4776]: I1208 09:20:30.974523 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-kfz2m" event={"ID":"61424c2d-bdc7-431a-8f12-535e1e97ce4b","Type":"ContainerStarted","Data":"38f1ebfadb1fad9a1c8a1154f275bbbf16b96e314e763ab808d2c5eb4c7c4784"} Dec 08 09:20:31 crc kubenswrapper[4776]: I1208 09:20:31.010268 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-ncfrf" event={"ID":"6ea3ffdd-a922-487e-a738-da3091a1656e","Type":"ContainerStarted","Data":"9a874bc43db66b74f923962fab296d8bd27257b72e4483104fa6229196076129"} Dec 08 09:20:31 crc kubenswrapper[4776]: I1208 09:20:31.010742 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-ncfrf" Dec 08 09:20:31 crc kubenswrapper[4776]: I1208 09:20:31.020981 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-2g4ph" event={"ID":"ad1d3b70-6eea-46a4-bdc1-82144fe12f4a","Type":"ContainerStarted","Data":"f00923c8c2a52174e26afe88a8ebe62909ee00f9502ecf521fe3696f3a697b44"} Dec 08 09:20:31 crc kubenswrapper[4776]: I1208 09:20:31.021228 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-2g4ph" Dec 08 09:20:31 crc kubenswrapper[4776]: I1208 09:20:31.024289 4776 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-ncfrf" Dec 08 09:20:31 crc kubenswrapper[4776]: I1208 09:20:31.024506 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-2g4ph" Dec 08 09:20:31 crc kubenswrapper[4776]: I1208 09:20:31.029202 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dqgnv" event={"ID":"288a9127-92ed-4b19-8cc5-34b1f9b51201","Type":"ContainerStarted","Data":"2c3e1352fb2c86947b77b600b667e8b156577346a385b00d240e5ca513ba2376"} Dec 08 09:20:31 crc kubenswrapper[4776]: I1208 09:20:31.047514 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-gk9xw" event={"ID":"8cd2dc5d-1433-4660-9d65-bf49d398415f","Type":"ContainerStarted","Data":"6183dcbe440f2c8859b53b216ff0b83e8cdbe49fc49609c0a9ff74efca18bb5d"} Dec 08 09:20:31 crc kubenswrapper[4776]: I1208 09:20:31.056701 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-ncfrf" podStartSLOduration=4.204477857 podStartE2EDuration="48.056684172s" podCreationTimestamp="2025-12-08 09:19:43 +0000 UTC" firstStartedPulling="2025-12-08 09:19:45.560491845 +0000 UTC m=+1261.823716867" lastFinishedPulling="2025-12-08 09:20:29.41269815 +0000 UTC m=+1305.675923182" observedRunningTime="2025-12-08 09:20:31.043200139 +0000 UTC m=+1307.306425161" watchObservedRunningTime="2025-12-08 09:20:31.056684172 +0000 UTC m=+1307.319909194" Dec 08 09:20:31 crc kubenswrapper[4776]: I1208 09:20:31.070542 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-68f9cdc5f7-scgrq" 
event={"ID":"7134ec23-7ec3-454d-b837-29fbe7094067","Type":"ContainerStarted","Data":"b950bb217f3499d17388b99168d6bf642d67bf749f3e9066bfcc1f1d020d904b"} Dec 08 09:20:31 crc kubenswrapper[4776]: I1208 09:20:31.099947 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-967d97867-4dj2x" event={"ID":"422088d1-15c7-4791-b0c9-a12a2c5e2880","Type":"ContainerStarted","Data":"541bd31ea540eb46b5a0c7998425dd6eec70d7242ccecf2d0b3a192e9095c229"} Dec 08 09:20:31 crc kubenswrapper[4776]: I1208 09:20:31.113422 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-mdm5f" event={"ID":"482e5641-8a00-4fc3-b7d3-6eb88dbee1e4","Type":"ContainerStarted","Data":"30f22cb699cf1f1335c2d1e85919168b5ff8e6ed9bb815709670312f7d506f8d"} Dec 08 09:20:31 crc kubenswrapper[4776]: I1208 09:20:31.122605 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-2g4ph" podStartSLOduration=3.741598068 podStartE2EDuration="49.122584315s" podCreationTimestamp="2025-12-08 09:19:42 +0000 UTC" firstStartedPulling="2025-12-08 09:19:44.023693017 +0000 UTC m=+1260.286918039" lastFinishedPulling="2025-12-08 09:20:29.404679264 +0000 UTC m=+1305.667904286" observedRunningTime="2025-12-08 09:20:31.117186689 +0000 UTC m=+1307.380411711" watchObservedRunningTime="2025-12-08 09:20:31.122584315 +0000 UTC m=+1307.385809337" Dec 08 09:20:31 crc kubenswrapper[4776]: I1208 09:20:31.124362 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-f2bnk" event={"ID":"f85d592d-d82d-4c08-aafb-e9a7e68ef386","Type":"ContainerStarted","Data":"f0be8722d6f7678c9e87d7141dc105d94a3c33cb86646c2e4bb7de2cde1e6cdc"} Dec 08 09:20:31 crc kubenswrapper[4776]: I1208 09:20:31.143576 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-87pfw" event={"ID":"dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0","Type":"ContainerStarted","Data":"8022a852216cbfc46c99917d8c7eb3b3a06a34d7a3b515f15db03de1c9b48e8d"} Dec 08 09:20:31 crc kubenswrapper[4776]: I1208 09:20:31.175414 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-897nd" event={"ID":"bb123983-a71d-4eca-84e8-6c116cc9b3b6","Type":"ContainerStarted","Data":"cc10275e38134e5b35b41850b1b419835ca8af95a3465c81d6d7ea064d18558c"} Dec 08 09:20:31 crc kubenswrapper[4776]: I1208 09:20:31.213396 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-xtf2f" event={"ID":"c8f3f832-68f1-47a2-bb3d-5d67f54655ce","Type":"ContainerStarted","Data":"78651e9911d06ffbbba3dfadcd9ff884d728aeb68e911b8180dc677f257be137"} Dec 08 09:20:31 crc kubenswrapper[4776]: I1208 09:20:31.214444 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-xtf2f" Dec 08 09:20:31 crc kubenswrapper[4776]: I1208 09:20:31.228230 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-xtf2f" Dec 08 09:20:31 crc kubenswrapper[4776]: I1208 09:20:31.286725 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-k928c" event={"ID":"545c7a23-3539-4923-bd9e-8d64700070b5","Type":"ContainerStarted","Data":"2a54c98333b7890a795ec7185a6004eaea39de90461d941e04efdb9c88e6c11a"} Dec 08 09:20:31 crc kubenswrapper[4776]: I1208 09:20:31.287116 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-k928c" Dec 08 09:20:31 crc kubenswrapper[4776]: I1208 09:20:31.304803 4776 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-k928c" Dec 08 09:20:31 crc kubenswrapper[4776]: E1208 09:20:31.353085 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-4k8qf" podUID="beadb3ee-3cd9-4c83-ba1f-9f599cd24940" Dec 08 09:20:31 crc kubenswrapper[4776]: E1208 09:20:31.357042 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-998648c74-l979f" podUID="d07c95ca-1871-4ba3-81e5-c7b4d86bb0f4" Dec 08 09:20:31 crc kubenswrapper[4776]: E1208 09:20:31.400770 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-ftb4x" podUID="316c9728-ccef-4981-9903-895ab86e6616" Dec 08 09:20:31 crc kubenswrapper[4776]: I1208 09:20:31.403744 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-xtf2f" podStartSLOduration=10.121066605 podStartE2EDuration="48.403709999s" podCreationTimestamp="2025-12-08 09:19:43 +0000 UTC" firstStartedPulling="2025-12-08 09:19:45.261738206 +0000 UTC m=+1261.524963228" lastFinishedPulling="2025-12-08 09:20:23.5443816 +0000 UTC m=+1299.807606622" observedRunningTime="2025-12-08 09:20:31.398589381 +0000 UTC m=+1307.661814413" watchObservedRunningTime="2025-12-08 09:20:31.403709999 +0000 UTC m=+1307.666935021" Dec 08 09:20:31 crc kubenswrapper[4776]: I1208 09:20:31.433429 4776 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-k928c" podStartSLOduration=4.9274437639999995 podStartE2EDuration="49.433398427s" podCreationTimestamp="2025-12-08 09:19:42 +0000 UTC" firstStartedPulling="2025-12-08 09:19:44.952652511 +0000 UTC m=+1261.215877533" lastFinishedPulling="2025-12-08 09:20:29.458607174 +0000 UTC m=+1305.721832196" observedRunningTime="2025-12-08 09:20:31.43164399 +0000 UTC m=+1307.694869012" watchObservedRunningTime="2025-12-08 09:20:31.433398427 +0000 UTC m=+1307.696623449" Dec 08 09:20:32 crc kubenswrapper[4776]: I1208 09:20:32.321815 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-l979f" event={"ID":"d07c95ca-1871-4ba3-81e5-c7b4d86bb0f4","Type":"ContainerStarted","Data":"48c37391b2fc026cffaedbb83ea11618ab0a73e7a6f649aa60a26c4fd84fbbea"} Dec 08 09:20:32 crc kubenswrapper[4776]: I1208 09:20:32.327218 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-jgmdb" event={"ID":"2a4ffe83-5f4d-4a7a-a2b6-64d12bd8f3f9","Type":"ContainerStarted","Data":"59ec1eeefa82333f0942fe850b69f2d196de034c8bbafc1bf8db979940520338"} Dec 08 09:20:32 crc kubenswrapper[4776]: I1208 09:20:32.329378 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-jgmdb" Dec 08 09:20:32 crc kubenswrapper[4776]: I1208 09:20:32.331730 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-jgmdb" Dec 08 09:20:32 crc kubenswrapper[4776]: I1208 09:20:32.334061 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-68f9cdc5f7-scgrq" 
event={"ID":"7134ec23-7ec3-454d-b837-29fbe7094067","Type":"ContainerStarted","Data":"01bf3731d1fcf0b11535f40c4dd80abb20abe0afc09e1fc605a06f5de74d983d"} Dec 08 09:20:32 crc kubenswrapper[4776]: I1208 09:20:32.334358 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-68f9cdc5f7-scgrq" Dec 08 09:20:32 crc kubenswrapper[4776]: I1208 09:20:32.359615 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-g66m2" event={"ID":"ff110975-7e1d-4d6d-bd10-b666cd8fe98b","Type":"ContainerStarted","Data":"4c98700398dcb6c58fdc51061d5e386949bdfa98cd0fd3e084f5161254efc9bd"} Dec 08 09:20:32 crc kubenswrapper[4776]: I1208 09:20:32.360045 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-g66m2" Dec 08 09:20:32 crc kubenswrapper[4776]: I1208 09:20:32.360975 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-g66m2" Dec 08 09:20:32 crc kubenswrapper[4776]: I1208 09:20:32.361121 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f7smn5" event={"ID":"0cb0505b-eb0f-4801-841d-8a96fe29e608","Type":"ContainerStarted","Data":"9c4f933a61e8f157deb6c5bde1ac36798af78c33307c311950a84ef29cce0089"} Dec 08 09:20:32 crc kubenswrapper[4776]: I1208 09:20:32.362043 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f7smn5" Dec 08 09:20:32 crc kubenswrapper[4776]: I1208 09:20:32.371877 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-68f9cdc5f7-scgrq" podStartSLOduration=3.10629972 
podStartE2EDuration="49.371855217s" podCreationTimestamp="2025-12-08 09:19:43 +0000 UTC" firstStartedPulling="2025-12-08 09:19:45.20983069 +0000 UTC m=+1261.473055712" lastFinishedPulling="2025-12-08 09:20:31.475386187 +0000 UTC m=+1307.738611209" observedRunningTime="2025-12-08 09:20:32.361150108 +0000 UTC m=+1308.624375120" watchObservedRunningTime="2025-12-08 09:20:32.371855217 +0000 UTC m=+1308.635080239" Dec 08 09:20:32 crc kubenswrapper[4776]: I1208 09:20:32.378609 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-4k8qf" event={"ID":"beadb3ee-3cd9-4c83-ba1f-9f599cd24940","Type":"ContainerStarted","Data":"8678e2668bf6888336d3b0c5aaa70a725b41c7b6e5218707a64b80bf263e1d0f"} Dec 08 09:20:32 crc kubenswrapper[4776]: I1208 09:20:32.383720 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-jgmdb" podStartSLOduration=5.8413918030000005 podStartE2EDuration="50.383703255s" podCreationTimestamp="2025-12-08 09:19:42 +0000 UTC" firstStartedPulling="2025-12-08 09:19:44.916816756 +0000 UTC m=+1261.180041778" lastFinishedPulling="2025-12-08 09:20:29.459128208 +0000 UTC m=+1305.722353230" observedRunningTime="2025-12-08 09:20:32.37863358 +0000 UTC m=+1308.641858602" watchObservedRunningTime="2025-12-08 09:20:32.383703255 +0000 UTC m=+1308.646928277" Dec 08 09:20:32 crc kubenswrapper[4776]: I1208 09:20:32.397211 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-ftb4x" event={"ID":"316c9728-ccef-4981-9903-895ab86e6616","Type":"ContainerStarted","Data":"a353bf519edb7e81ffb73a74142232ff26ecd76f29d24b78e22db18eae46552b"} Dec 08 09:20:32 crc kubenswrapper[4776]: I1208 09:20:32.415426 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-87pfw" 
event={"ID":"dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0","Type":"ContainerStarted","Data":"677ec6ceb4991a0835bea48f17c47c3d91a3aa0432aa8bfc0fdf0278955263e6"} Dec 08 09:20:32 crc kubenswrapper[4776]: I1208 09:20:32.428188 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f7smn5" podStartSLOduration=45.696353721 podStartE2EDuration="50.428148131s" podCreationTimestamp="2025-12-08 09:19:42 +0000 UTC" firstStartedPulling="2025-12-08 09:20:24.5932276 +0000 UTC m=+1300.856452622" lastFinishedPulling="2025-12-08 09:20:29.32502201 +0000 UTC m=+1305.588247032" observedRunningTime="2025-12-08 09:20:32.422975312 +0000 UTC m=+1308.686200334" watchObservedRunningTime="2025-12-08 09:20:32.428148131 +0000 UTC m=+1308.691373153" Dec 08 09:20:32 crc kubenswrapper[4776]: I1208 09:20:32.449772 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-g66m2" podStartSLOduration=5.9248801 podStartE2EDuration="50.449753503s" podCreationTimestamp="2025-12-08 09:19:42 +0000 UTC" firstStartedPulling="2025-12-08 09:19:44.916484448 +0000 UTC m=+1261.179709470" lastFinishedPulling="2025-12-08 09:20:29.441357861 +0000 UTC m=+1305.704582873" observedRunningTime="2025-12-08 09:20:32.442805425 +0000 UTC m=+1308.706030447" watchObservedRunningTime="2025-12-08 09:20:32.449753503 +0000 UTC m=+1308.712978525" Dec 08 09:20:32 crc kubenswrapper[4776]: I1208 09:20:32.529648 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-87pfw" podStartSLOduration=45.733032237 podStartE2EDuration="50.529630071s" podCreationTimestamp="2025-12-08 09:19:42 +0000 UTC" firstStartedPulling="2025-12-08 09:20:24.432938917 +0000 UTC m=+1300.696163939" lastFinishedPulling="2025-12-08 09:20:29.229524811 +0000 UTC m=+1305.492761773" 
observedRunningTime="2025-12-08 09:20:32.469898644 +0000 UTC m=+1308.733123666" watchObservedRunningTime="2025-12-08 09:20:32.529630071 +0000 UTC m=+1308.792855093" Dec 08 09:20:33 crc kubenswrapper[4776]: I1208 09:20:33.422529 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-ftb4x" event={"ID":"316c9728-ccef-4981-9903-895ab86e6616","Type":"ContainerStarted","Data":"cb431d34ebfc2ae7be984974d8e0951c2736531ed487eec29a8e687ad1631f8a"} Dec 08 09:20:33 crc kubenswrapper[4776]: I1208 09:20:33.423317 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-ftb4x" Dec 08 09:20:33 crc kubenswrapper[4776]: I1208 09:20:33.424899 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-gk9xw" event={"ID":"8cd2dc5d-1433-4660-9d65-bf49d398415f","Type":"ContainerStarted","Data":"9532e683e47246eae9f15806f3d4cb6a8f831df06ae75ffeae3d3efeab3cba0d"} Dec 08 09:20:33 crc kubenswrapper[4776]: I1208 09:20:33.425421 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-gk9xw" Dec 08 09:20:33 crc kubenswrapper[4776]: I1208 09:20:33.427133 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-l979f" event={"ID":"d07c95ca-1871-4ba3-81e5-c7b4d86bb0f4","Type":"ContainerStarted","Data":"e43ce321b91d9e2be2dc7e42bd3bde1b734a2058b0ace92a2eb980cffd989ba5"} Dec 08 09:20:33 crc kubenswrapper[4776]: I1208 09:20:33.427303 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-l979f" Dec 08 09:20:33 crc kubenswrapper[4776]: I1208 09:20:33.428813 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ironic-operator-controller-manager-967d97867-4dj2x" event={"ID":"422088d1-15c7-4791-b0c9-a12a2c5e2880","Type":"ContainerStarted","Data":"4d74798718efb7baee3450488c5ba433c467bfe465a5a3066bbf6e6fda2759f6"} Dec 08 09:20:33 crc kubenswrapper[4776]: I1208 09:20:33.429245 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-967d97867-4dj2x" Dec 08 09:20:33 crc kubenswrapper[4776]: I1208 09:20:33.431465 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-mdm5f" event={"ID":"482e5641-8a00-4fc3-b7d3-6eb88dbee1e4","Type":"ContainerStarted","Data":"685d2a2d2c7246f25b53b37dfd9dcb464c0483fd3986561ef1e95ddd192fe568"} Dec 08 09:20:33 crc kubenswrapper[4776]: I1208 09:20:33.431914 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-mdm5f" Dec 08 09:20:33 crc kubenswrapper[4776]: I1208 09:20:33.438071 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-f2bnk" event={"ID":"f85d592d-d82d-4c08-aafb-e9a7e68ef386","Type":"ContainerStarted","Data":"7436a1ce419772258f3da3aabc697ccfd5460577233ba556625e30a6503061fc"} Dec 08 09:20:33 crc kubenswrapper[4776]: I1208 09:20:33.438110 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-f2bnk" Dec 08 09:20:33 crc kubenswrapper[4776]: I1208 09:20:33.441234 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-kfz2m" event={"ID":"61424c2d-bdc7-431a-8f12-535e1e97ce4b","Type":"ContainerStarted","Data":"1adba812599aa099ec1ab7c71d36c24235377becb41b0212d62f69629f8b308a"} Dec 08 09:20:33 crc kubenswrapper[4776]: I1208 09:20:33.441722 4776 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-kfz2m" Dec 08 09:20:33 crc kubenswrapper[4776]: I1208 09:20:33.445791 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-4k8qf" event={"ID":"beadb3ee-3cd9-4c83-ba1f-9f599cd24940","Type":"ContainerStarted","Data":"d79e5962bc6c34f44cb72223fb8525a2620b43893aca8e22c1468db0fbec7f19"} Dec 08 09:20:33 crc kubenswrapper[4776]: I1208 09:20:33.446354 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-4k8qf" Dec 08 09:20:33 crc kubenswrapper[4776]: I1208 09:20:33.448068 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dqgnv" event={"ID":"288a9127-92ed-4b19-8cc5-34b1f9b51201","Type":"ContainerStarted","Data":"8fdb9c7de402b223892fc403b22341beea408c747ce79dbf785a2e9d498b5634"} Dec 08 09:20:33 crc kubenswrapper[4776]: I1208 09:20:33.448573 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dqgnv" Dec 08 09:20:33 crc kubenswrapper[4776]: I1208 09:20:33.469551 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-897nd" event={"ID":"bb123983-a71d-4eca-84e8-6c116cc9b3b6","Type":"ContainerStarted","Data":"b0e8d80d7812ed8ac4316ea7066049bf202c8f1994f5a80737c4374548fff2f0"} Dec 08 09:20:33 crc kubenswrapper[4776]: I1208 09:20:33.469781 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-897nd" Dec 08 09:20:33 crc kubenswrapper[4776]: I1208 09:20:33.482652 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-7smkr" 
event={"ID":"0f590af7-17bd-46c4-8a25-ba3a368c6382","Type":"ContainerStarted","Data":"da0e1a965db0f990ccf7e98e187dbf22267d9885340edee82febac724d766898"} Dec 08 09:20:33 crc kubenswrapper[4776]: I1208 09:20:33.483188 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-87pfw" Dec 08 09:20:33 crc kubenswrapper[4776]: I1208 09:20:33.492382 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-7smkr" Dec 08 09:20:33 crc kubenswrapper[4776]: I1208 09:20:33.498212 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-ftb4x" podStartSLOduration=2.708608755 podStartE2EDuration="51.498194541s" podCreationTimestamp="2025-12-08 09:19:42 +0000 UTC" firstStartedPulling="2025-12-08 09:19:44.087135264 +0000 UTC m=+1260.350360286" lastFinishedPulling="2025-12-08 09:20:32.87672105 +0000 UTC m=+1309.139946072" observedRunningTime="2025-12-08 09:20:33.466226951 +0000 UTC m=+1309.729451983" watchObservedRunningTime="2025-12-08 09:20:33.498194541 +0000 UTC m=+1309.761419563" Dec 08 09:20:33 crc kubenswrapper[4776]: I1208 09:20:33.530821 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dqgnv" podStartSLOduration=4.536323541 podStartE2EDuration="51.530801219s" podCreationTimestamp="2025-12-08 09:19:42 +0000 UTC" firstStartedPulling="2025-12-08 09:19:44.904874825 +0000 UTC m=+1261.168099847" lastFinishedPulling="2025-12-08 09:20:31.899352503 +0000 UTC m=+1308.162577525" observedRunningTime="2025-12-08 09:20:33.503436772 +0000 UTC m=+1309.766661794" watchObservedRunningTime="2025-12-08 09:20:33.530801219 +0000 UTC m=+1309.794026241" Dec 08 09:20:33 crc kubenswrapper[4776]: I1208 09:20:33.547242 4776 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-mdm5f" podStartSLOduration=3.831317907 podStartE2EDuration="50.547226311s" podCreationTimestamp="2025-12-08 09:19:43 +0000 UTC" firstStartedPulling="2025-12-08 09:19:45.182552796 +0000 UTC m=+1261.445777818" lastFinishedPulling="2025-12-08 09:20:31.8984612 +0000 UTC m=+1308.161686222" observedRunningTime="2025-12-08 09:20:33.546294185 +0000 UTC m=+1309.809519207" watchObservedRunningTime="2025-12-08 09:20:33.547226311 +0000 UTC m=+1309.810451333" Dec 08 09:20:33 crc kubenswrapper[4776]: I1208 09:20:33.550265 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-l979f" podStartSLOduration=4.040638732 podStartE2EDuration="51.550255672s" podCreationTimestamp="2025-12-08 09:19:42 +0000 UTC" firstStartedPulling="2025-12-08 09:19:45.216759846 +0000 UTC m=+1261.479984868" lastFinishedPulling="2025-12-08 09:20:32.726376786 +0000 UTC m=+1308.989601808" observedRunningTime="2025-12-08 09:20:33.528390164 +0000 UTC m=+1309.791615186" watchObservedRunningTime="2025-12-08 09:20:33.550255672 +0000 UTC m=+1309.813480694" Dec 08 09:20:33 crc kubenswrapper[4776]: I1208 09:20:33.619222 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-4k8qf" podStartSLOduration=2.746953026 podStartE2EDuration="51.619207387s" podCreationTimestamp="2025-12-08 09:19:42 +0000 UTC" firstStartedPulling="2025-12-08 09:19:44.011628932 +0000 UTC m=+1260.274853954" lastFinishedPulling="2025-12-08 09:20:32.883883293 +0000 UTC m=+1309.147108315" observedRunningTime="2025-12-08 09:20:33.617962064 +0000 UTC m=+1309.881187086" watchObservedRunningTime="2025-12-08 09:20:33.619207387 +0000 UTC m=+1309.882432399" Dec 08 09:20:33 crc kubenswrapper[4776]: I1208 09:20:33.621585 4776 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-gk9xw" podStartSLOduration=3.956776163 podStartE2EDuration="50.621579701s" podCreationTimestamp="2025-12-08 09:19:43 +0000 UTC" firstStartedPulling="2025-12-08 09:19:45.206528252 +0000 UTC m=+1261.469753274" lastFinishedPulling="2025-12-08 09:20:31.87133179 +0000 UTC m=+1308.134556812" observedRunningTime="2025-12-08 09:20:33.591949964 +0000 UTC m=+1309.855174986" watchObservedRunningTime="2025-12-08 09:20:33.621579701 +0000 UTC m=+1309.884804723" Dec 08 09:20:33 crc kubenswrapper[4776]: I1208 09:20:33.645742 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-967d97867-4dj2x" podStartSLOduration=4.575612587 podStartE2EDuration="51.645723741s" podCreationTimestamp="2025-12-08 09:19:42 +0000 UTC" firstStartedPulling="2025-12-08 09:19:44.802598213 +0000 UTC m=+1261.065823235" lastFinishedPulling="2025-12-08 09:20:31.872709367 +0000 UTC m=+1308.135934389" observedRunningTime="2025-12-08 09:20:33.638852085 +0000 UTC m=+1309.902077107" watchObservedRunningTime="2025-12-08 09:20:33.645723741 +0000 UTC m=+1309.908948763" Dec 08 09:20:33 crc kubenswrapper[4776]: I1208 09:20:33.682578 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-f2bnk" podStartSLOduration=4.222902038 podStartE2EDuration="51.682560422s" podCreationTimestamp="2025-12-08 09:19:42 +0000 UTC" firstStartedPulling="2025-12-08 09:19:44.439672059 +0000 UTC m=+1260.702897081" lastFinishedPulling="2025-12-08 09:20:31.899330443 +0000 UTC m=+1308.162555465" observedRunningTime="2025-12-08 09:20:33.67913205 +0000 UTC m=+1309.942357072" watchObservedRunningTime="2025-12-08 09:20:33.682560422 +0000 UTC m=+1309.945785444" Dec 08 09:20:33 crc kubenswrapper[4776]: I1208 09:20:33.709656 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-kfz2m" podStartSLOduration=4.318933147 podStartE2EDuration="50.709639911s" podCreationTimestamp="2025-12-08 09:19:43 +0000 UTC" firstStartedPulling="2025-12-08 09:19:45.507632533 +0000 UTC m=+1261.770857555" lastFinishedPulling="2025-12-08 09:20:31.898339297 +0000 UTC m=+1308.161564319" observedRunningTime="2025-12-08 09:20:33.703297299 +0000 UTC m=+1309.966522321" watchObservedRunningTime="2025-12-08 09:20:33.709639911 +0000 UTC m=+1309.972864923" Dec 08 09:20:33 crc kubenswrapper[4776]: I1208 09:20:33.737303 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-7smkr" podStartSLOduration=4.830375893 podStartE2EDuration="51.737288615s" podCreationTimestamp="2025-12-08 09:19:42 +0000 UTC" firstStartedPulling="2025-12-08 09:19:44.966441262 +0000 UTC m=+1261.229666284" lastFinishedPulling="2025-12-08 09:20:31.873353994 +0000 UTC m=+1308.136579006" observedRunningTime="2025-12-08 09:20:33.735335002 +0000 UTC m=+1309.998560034" watchObservedRunningTime="2025-12-08 09:20:33.737288615 +0000 UTC m=+1310.000513637" Dec 08 09:20:33 crc kubenswrapper[4776]: I1208 09:20:33.791247 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-897nd" podStartSLOduration=4.325308905 podStartE2EDuration="51.791227406s" podCreationTimestamp="2025-12-08 09:19:42 +0000 UTC" firstStartedPulling="2025-12-08 09:19:44.430453382 +0000 UTC m=+1260.693678404" lastFinishedPulling="2025-12-08 09:20:31.896371883 +0000 UTC m=+1308.159596905" observedRunningTime="2025-12-08 09:20:33.762887773 +0000 UTC m=+1310.026112805" watchObservedRunningTime="2025-12-08 09:20:33.791227406 +0000 UTC m=+1310.054452428" Dec 08 09:20:35 crc kubenswrapper[4776]: I1208 09:20:35.165831 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-87pfw" Dec 08 09:20:35 crc kubenswrapper[4776]: I1208 09:20:35.506773 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f7smn5" Dec 08 09:20:35 crc kubenswrapper[4776]: I1208 09:20:35.922965 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-57686cd5df-zt7pj" Dec 08 09:20:37 crc kubenswrapper[4776]: I1208 09:20:37.533923 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xxv7g" event={"ID":"d8a1143b-5dc6-4a99-a6e4-f155585ebbcb","Type":"ContainerStarted","Data":"f152d712b40654c6d5def168d3b037f1879ebd43cdda665756c4e7a6e8163945"} Dec 08 09:20:37 crc kubenswrapper[4776]: I1208 09:20:37.556940 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xxv7g" podStartSLOduration=3.338446477 podStartE2EDuration="54.556919543s" podCreationTimestamp="2025-12-08 09:19:43 +0000 UTC" firstStartedPulling="2025-12-08 09:19:45.542965704 +0000 UTC m=+1261.806190726" lastFinishedPulling="2025-12-08 09:20:36.76143877 +0000 UTC m=+1313.024663792" observedRunningTime="2025-12-08 09:20:37.551817225 +0000 UTC m=+1313.815042237" watchObservedRunningTime="2025-12-08 09:20:37.556919543 +0000 UTC m=+1313.820144565" Dec 08 09:20:41 crc kubenswrapper[4776]: I1208 09:20:41.398667 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 09:20:41 crc kubenswrapper[4776]: I1208 09:20:41.400003 4776 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 09:20:43 crc kubenswrapper[4776]: I1208 09:20:43.092586 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-ftb4x" Dec 08 09:20:43 crc kubenswrapper[4776]: I1208 09:20:43.137359 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-897nd" Dec 08 09:20:43 crc kubenswrapper[4776]: I1208 09:20:43.184998 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-f2bnk" Dec 08 09:20:43 crc kubenswrapper[4776]: I1208 09:20:43.204378 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-4k8qf" Dec 08 09:20:43 crc kubenswrapper[4776]: I1208 09:20:43.458704 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-967d97867-4dj2x" Dec 08 09:20:43 crc kubenswrapper[4776]: I1208 09:20:43.494366 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-7smkr" Dec 08 09:20:43 crc kubenswrapper[4776]: I1208 09:20:43.585872 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dqgnv" Dec 08 09:20:43 crc kubenswrapper[4776]: I1208 09:20:43.656108 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-l979f" 
Dec 08 09:20:43 crc kubenswrapper[4776]: I1208 09:20:43.800802 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-gk9xw" Dec 08 09:20:43 crc kubenswrapper[4776]: I1208 09:20:43.839362 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-mdm5f" Dec 08 09:20:43 crc kubenswrapper[4776]: I1208 09:20:43.892731 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-68f9cdc5f7-scgrq" Dec 08 09:20:44 crc kubenswrapper[4776]: I1208 09:20:44.084860 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-kfz2m" Dec 08 09:20:59 crc kubenswrapper[4776]: I1208 09:20:59.966459 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-dpq7z"] Dec 08 09:20:59 crc kubenswrapper[4776]: I1208 09:20:59.969612 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-dpq7z" Dec 08 09:20:59 crc kubenswrapper[4776]: I1208 09:20:59.971981 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 08 09:20:59 crc kubenswrapper[4776]: I1208 09:20:59.972280 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 08 09:20:59 crc kubenswrapper[4776]: I1208 09:20:59.976436 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 08 09:20:59 crc kubenswrapper[4776]: I1208 09:20:59.977154 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-shx86" Dec 08 09:20:59 crc kubenswrapper[4776]: I1208 09:20:59.984540 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-dpq7z"] Dec 08 09:21:00 crc kubenswrapper[4776]: I1208 09:21:00.023608 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-vwtnw"] Dec 08 09:21:00 crc kubenswrapper[4776]: I1208 09:21:00.025123 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-vwtnw" Dec 08 09:21:00 crc kubenswrapper[4776]: I1208 09:21:00.032416 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 08 09:21:00 crc kubenswrapper[4776]: I1208 09:21:00.036931 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-vwtnw"] Dec 08 09:21:00 crc kubenswrapper[4776]: I1208 09:21:00.099891 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8k97\" (UniqueName: \"kubernetes.io/projected/d119edfa-fe1a-4db4-bcd9-370adc8e9d0a-kube-api-access-t8k97\") pod \"dnsmasq-dns-675f4bcbfc-dpq7z\" (UID: \"d119edfa-fe1a-4db4-bcd9-370adc8e9d0a\") " pod="openstack/dnsmasq-dns-675f4bcbfc-dpq7z" Dec 08 09:21:00 crc kubenswrapper[4776]: I1208 09:21:00.099935 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktb54\" (UniqueName: \"kubernetes.io/projected/998af490-d762-4ddf-a8ae-be4b9a3da989-kube-api-access-ktb54\") pod \"dnsmasq-dns-78dd6ddcc-vwtnw\" (UID: \"998af490-d762-4ddf-a8ae-be4b9a3da989\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vwtnw" Dec 08 09:21:00 crc kubenswrapper[4776]: I1208 09:21:00.099978 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d119edfa-fe1a-4db4-bcd9-370adc8e9d0a-config\") pod \"dnsmasq-dns-675f4bcbfc-dpq7z\" (UID: \"d119edfa-fe1a-4db4-bcd9-370adc8e9d0a\") " pod="openstack/dnsmasq-dns-675f4bcbfc-dpq7z" Dec 08 09:21:00 crc kubenswrapper[4776]: I1208 09:21:00.100003 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/998af490-d762-4ddf-a8ae-be4b9a3da989-config\") pod \"dnsmasq-dns-78dd6ddcc-vwtnw\" (UID: \"998af490-d762-4ddf-a8ae-be4b9a3da989\") " 
pod="openstack/dnsmasq-dns-78dd6ddcc-vwtnw" Dec 08 09:21:00 crc kubenswrapper[4776]: I1208 09:21:00.100127 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/998af490-d762-4ddf-a8ae-be4b9a3da989-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-vwtnw\" (UID: \"998af490-d762-4ddf-a8ae-be4b9a3da989\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vwtnw" Dec 08 09:21:00 crc kubenswrapper[4776]: I1208 09:21:00.201652 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8k97\" (UniqueName: \"kubernetes.io/projected/d119edfa-fe1a-4db4-bcd9-370adc8e9d0a-kube-api-access-t8k97\") pod \"dnsmasq-dns-675f4bcbfc-dpq7z\" (UID: \"d119edfa-fe1a-4db4-bcd9-370adc8e9d0a\") " pod="openstack/dnsmasq-dns-675f4bcbfc-dpq7z" Dec 08 09:21:00 crc kubenswrapper[4776]: I1208 09:21:00.201694 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktb54\" (UniqueName: \"kubernetes.io/projected/998af490-d762-4ddf-a8ae-be4b9a3da989-kube-api-access-ktb54\") pod \"dnsmasq-dns-78dd6ddcc-vwtnw\" (UID: \"998af490-d762-4ddf-a8ae-be4b9a3da989\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vwtnw" Dec 08 09:21:00 crc kubenswrapper[4776]: I1208 09:21:00.201731 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d119edfa-fe1a-4db4-bcd9-370adc8e9d0a-config\") pod \"dnsmasq-dns-675f4bcbfc-dpq7z\" (UID: \"d119edfa-fe1a-4db4-bcd9-370adc8e9d0a\") " pod="openstack/dnsmasq-dns-675f4bcbfc-dpq7z" Dec 08 09:21:00 crc kubenswrapper[4776]: I1208 09:21:00.201750 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/998af490-d762-4ddf-a8ae-be4b9a3da989-config\") pod \"dnsmasq-dns-78dd6ddcc-vwtnw\" (UID: \"998af490-d762-4ddf-a8ae-be4b9a3da989\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vwtnw" Dec 08 09:21:00 
crc kubenswrapper[4776]: I1208 09:21:00.201794 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/998af490-d762-4ddf-a8ae-be4b9a3da989-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-vwtnw\" (UID: \"998af490-d762-4ddf-a8ae-be4b9a3da989\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vwtnw" Dec 08 09:21:00 crc kubenswrapper[4776]: I1208 09:21:00.202736 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/998af490-d762-4ddf-a8ae-be4b9a3da989-config\") pod \"dnsmasq-dns-78dd6ddcc-vwtnw\" (UID: \"998af490-d762-4ddf-a8ae-be4b9a3da989\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vwtnw" Dec 08 09:21:00 crc kubenswrapper[4776]: I1208 09:21:00.202747 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/998af490-d762-4ddf-a8ae-be4b9a3da989-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-vwtnw\" (UID: \"998af490-d762-4ddf-a8ae-be4b9a3da989\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vwtnw" Dec 08 09:21:00 crc kubenswrapper[4776]: I1208 09:21:00.202801 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d119edfa-fe1a-4db4-bcd9-370adc8e9d0a-config\") pod \"dnsmasq-dns-675f4bcbfc-dpq7z\" (UID: \"d119edfa-fe1a-4db4-bcd9-370adc8e9d0a\") " pod="openstack/dnsmasq-dns-675f4bcbfc-dpq7z" Dec 08 09:21:00 crc kubenswrapper[4776]: I1208 09:21:00.220911 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8k97\" (UniqueName: \"kubernetes.io/projected/d119edfa-fe1a-4db4-bcd9-370adc8e9d0a-kube-api-access-t8k97\") pod \"dnsmasq-dns-675f4bcbfc-dpq7z\" (UID: \"d119edfa-fe1a-4db4-bcd9-370adc8e9d0a\") " pod="openstack/dnsmasq-dns-675f4bcbfc-dpq7z" Dec 08 09:21:00 crc kubenswrapper[4776]: I1208 09:21:00.221022 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-ktb54\" (UniqueName: \"kubernetes.io/projected/998af490-d762-4ddf-a8ae-be4b9a3da989-kube-api-access-ktb54\") pod \"dnsmasq-dns-78dd6ddcc-vwtnw\" (UID: \"998af490-d762-4ddf-a8ae-be4b9a3da989\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vwtnw" Dec 08 09:21:00 crc kubenswrapper[4776]: I1208 09:21:00.289145 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-dpq7z" Dec 08 09:21:00 crc kubenswrapper[4776]: I1208 09:21:00.347260 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-vwtnw" Dec 08 09:21:00 crc kubenswrapper[4776]: I1208 09:21:00.750834 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-dpq7z"] Dec 08 09:21:00 crc kubenswrapper[4776]: I1208 09:21:00.791425 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-dpq7z" event={"ID":"d119edfa-fe1a-4db4-bcd9-370adc8e9d0a","Type":"ContainerStarted","Data":"70273d65493b8fe92bec6aea20839d73faaef35ffa87fe167c4cda69185f8527"} Dec 08 09:21:00 crc kubenswrapper[4776]: I1208 09:21:00.863140 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-vwtnw"] Dec 08 09:21:00 crc kubenswrapper[4776]: W1208 09:21:00.866545 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod998af490_d762_4ddf_a8ae_be4b9a3da989.slice/crio-ed276c9e9dd4877b4a99ec95924cb8c795201be0db44717568e2e5d11c5d9ab7 WatchSource:0}: Error finding container ed276c9e9dd4877b4a99ec95924cb8c795201be0db44717568e2e5d11c5d9ab7: Status 404 returned error can't find the container with id ed276c9e9dd4877b4a99ec95924cb8c795201be0db44717568e2e5d11c5d9ab7 Dec 08 09:21:01 crc kubenswrapper[4776]: I1208 09:21:01.803956 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-vwtnw" 
event={"ID":"998af490-d762-4ddf-a8ae-be4b9a3da989","Type":"ContainerStarted","Data":"ed276c9e9dd4877b4a99ec95924cb8c795201be0db44717568e2e5d11c5d9ab7"} Dec 08 09:21:03 crc kubenswrapper[4776]: I1208 09:21:03.057419 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-dpq7z"] Dec 08 09:21:03 crc kubenswrapper[4776]: I1208 09:21:03.071706 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-n4jfg"] Dec 08 09:21:03 crc kubenswrapper[4776]: I1208 09:21:03.073081 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-n4jfg" Dec 08 09:21:03 crc kubenswrapper[4776]: I1208 09:21:03.107517 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-n4jfg"] Dec 08 09:21:03 crc kubenswrapper[4776]: I1208 09:21:03.166640 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/408d75b6-d5bf-4dfe-8a22-7ba887229cac-config\") pod \"dnsmasq-dns-666b6646f7-n4jfg\" (UID: \"408d75b6-d5bf-4dfe-8a22-7ba887229cac\") " pod="openstack/dnsmasq-dns-666b6646f7-n4jfg" Dec 08 09:21:03 crc kubenswrapper[4776]: I1208 09:21:03.166762 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnxbm\" (UniqueName: \"kubernetes.io/projected/408d75b6-d5bf-4dfe-8a22-7ba887229cac-kube-api-access-nnxbm\") pod \"dnsmasq-dns-666b6646f7-n4jfg\" (UID: \"408d75b6-d5bf-4dfe-8a22-7ba887229cac\") " pod="openstack/dnsmasq-dns-666b6646f7-n4jfg" Dec 08 09:21:03 crc kubenswrapper[4776]: I1208 09:21:03.166824 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/408d75b6-d5bf-4dfe-8a22-7ba887229cac-dns-svc\") pod \"dnsmasq-dns-666b6646f7-n4jfg\" (UID: \"408d75b6-d5bf-4dfe-8a22-7ba887229cac\") " 
pod="openstack/dnsmasq-dns-666b6646f7-n4jfg" Dec 08 09:21:03 crc kubenswrapper[4776]: I1208 09:21:03.271130 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnxbm\" (UniqueName: \"kubernetes.io/projected/408d75b6-d5bf-4dfe-8a22-7ba887229cac-kube-api-access-nnxbm\") pod \"dnsmasq-dns-666b6646f7-n4jfg\" (UID: \"408d75b6-d5bf-4dfe-8a22-7ba887229cac\") " pod="openstack/dnsmasq-dns-666b6646f7-n4jfg" Dec 08 09:21:03 crc kubenswrapper[4776]: I1208 09:21:03.271235 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/408d75b6-d5bf-4dfe-8a22-7ba887229cac-dns-svc\") pod \"dnsmasq-dns-666b6646f7-n4jfg\" (UID: \"408d75b6-d5bf-4dfe-8a22-7ba887229cac\") " pod="openstack/dnsmasq-dns-666b6646f7-n4jfg" Dec 08 09:21:03 crc kubenswrapper[4776]: I1208 09:21:03.274795 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/408d75b6-d5bf-4dfe-8a22-7ba887229cac-dns-svc\") pod \"dnsmasq-dns-666b6646f7-n4jfg\" (UID: \"408d75b6-d5bf-4dfe-8a22-7ba887229cac\") " pod="openstack/dnsmasq-dns-666b6646f7-n4jfg" Dec 08 09:21:03 crc kubenswrapper[4776]: I1208 09:21:03.276554 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/408d75b6-d5bf-4dfe-8a22-7ba887229cac-config\") pod \"dnsmasq-dns-666b6646f7-n4jfg\" (UID: \"408d75b6-d5bf-4dfe-8a22-7ba887229cac\") " pod="openstack/dnsmasq-dns-666b6646f7-n4jfg" Dec 08 09:21:03 crc kubenswrapper[4776]: I1208 09:21:03.277419 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/408d75b6-d5bf-4dfe-8a22-7ba887229cac-config\") pod \"dnsmasq-dns-666b6646f7-n4jfg\" (UID: \"408d75b6-d5bf-4dfe-8a22-7ba887229cac\") " pod="openstack/dnsmasq-dns-666b6646f7-n4jfg" Dec 08 09:21:03 crc kubenswrapper[4776]: I1208 09:21:03.301242 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnxbm\" (UniqueName: \"kubernetes.io/projected/408d75b6-d5bf-4dfe-8a22-7ba887229cac-kube-api-access-nnxbm\") pod \"dnsmasq-dns-666b6646f7-n4jfg\" (UID: \"408d75b6-d5bf-4dfe-8a22-7ba887229cac\") " pod="openstack/dnsmasq-dns-666b6646f7-n4jfg" Dec 08 09:21:03 crc kubenswrapper[4776]: I1208 09:21:03.348844 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-vwtnw"] Dec 08 09:21:03 crc kubenswrapper[4776]: I1208 09:21:03.396501 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-9d9dp"] Dec 08 09:21:03 crc kubenswrapper[4776]: I1208 09:21:03.398117 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-9d9dp" Dec 08 09:21:03 crc kubenswrapper[4776]: I1208 09:21:03.401253 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-9d9dp"] Dec 08 09:21:03 crc kubenswrapper[4776]: I1208 09:21:03.428490 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-n4jfg" Dec 08 09:21:03 crc kubenswrapper[4776]: I1208 09:21:03.480011 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/639910a7-1d35-4535-b629-18fe52dacac3-config\") pod \"dnsmasq-dns-57d769cc4f-9d9dp\" (UID: \"639910a7-1d35-4535-b629-18fe52dacac3\") " pod="openstack/dnsmasq-dns-57d769cc4f-9d9dp" Dec 08 09:21:03 crc kubenswrapper[4776]: I1208 09:21:03.480053 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/639910a7-1d35-4535-b629-18fe52dacac3-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-9d9dp\" (UID: \"639910a7-1d35-4535-b629-18fe52dacac3\") " pod="openstack/dnsmasq-dns-57d769cc4f-9d9dp" Dec 08 09:21:03 crc kubenswrapper[4776]: I1208 09:21:03.480089 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhpzr\" (UniqueName: \"kubernetes.io/projected/639910a7-1d35-4535-b629-18fe52dacac3-kube-api-access-jhpzr\") pod \"dnsmasq-dns-57d769cc4f-9d9dp\" (UID: \"639910a7-1d35-4535-b629-18fe52dacac3\") " pod="openstack/dnsmasq-dns-57d769cc4f-9d9dp" Dec 08 09:21:03 crc kubenswrapper[4776]: I1208 09:21:03.590573 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/639910a7-1d35-4535-b629-18fe52dacac3-config\") pod \"dnsmasq-dns-57d769cc4f-9d9dp\" (UID: \"639910a7-1d35-4535-b629-18fe52dacac3\") " pod="openstack/dnsmasq-dns-57d769cc4f-9d9dp" Dec 08 09:21:03 crc kubenswrapper[4776]: I1208 09:21:03.590630 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/639910a7-1d35-4535-b629-18fe52dacac3-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-9d9dp\" (UID: \"639910a7-1d35-4535-b629-18fe52dacac3\") " 
pod="openstack/dnsmasq-dns-57d769cc4f-9d9dp" Dec 08 09:21:03 crc kubenswrapper[4776]: I1208 09:21:03.590697 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhpzr\" (UniqueName: \"kubernetes.io/projected/639910a7-1d35-4535-b629-18fe52dacac3-kube-api-access-jhpzr\") pod \"dnsmasq-dns-57d769cc4f-9d9dp\" (UID: \"639910a7-1d35-4535-b629-18fe52dacac3\") " pod="openstack/dnsmasq-dns-57d769cc4f-9d9dp" Dec 08 09:21:03 crc kubenswrapper[4776]: I1208 09:21:03.595100 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/639910a7-1d35-4535-b629-18fe52dacac3-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-9d9dp\" (UID: \"639910a7-1d35-4535-b629-18fe52dacac3\") " pod="openstack/dnsmasq-dns-57d769cc4f-9d9dp" Dec 08 09:21:03 crc kubenswrapper[4776]: I1208 09:21:03.595898 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/639910a7-1d35-4535-b629-18fe52dacac3-config\") pod \"dnsmasq-dns-57d769cc4f-9d9dp\" (UID: \"639910a7-1d35-4535-b629-18fe52dacac3\") " pod="openstack/dnsmasq-dns-57d769cc4f-9d9dp" Dec 08 09:21:03 crc kubenswrapper[4776]: I1208 09:21:03.615154 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhpzr\" (UniqueName: \"kubernetes.io/projected/639910a7-1d35-4535-b629-18fe52dacac3-kube-api-access-jhpzr\") pod \"dnsmasq-dns-57d769cc4f-9d9dp\" (UID: \"639910a7-1d35-4535-b629-18fe52dacac3\") " pod="openstack/dnsmasq-dns-57d769cc4f-9d9dp" Dec 08 09:21:03 crc kubenswrapper[4776]: I1208 09:21:03.740605 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-9d9dp" Dec 08 09:21:03 crc kubenswrapper[4776]: I1208 09:21:03.956117 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-n4jfg"] Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.194269 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.195912 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.203834 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.203989 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.204186 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.204438 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.204469 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.204566 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.204684 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-cmx8g" Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.205229 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.308605 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-config-data\") pod \"rabbitmq-server-0\" (UID: \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\") " pod="openstack/rabbitmq-server-0" Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.309103 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\") " pod="openstack/rabbitmq-server-0" Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.309137 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\") " pod="openstack/rabbitmq-server-0" Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.309204 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\") " pod="openstack/rabbitmq-server-0" Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.309265 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\") " pod="openstack/rabbitmq-server-0" Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.309288 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-pod-info\") pod \"rabbitmq-server-0\" (UID: \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\") " pod="openstack/rabbitmq-server-0" Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.309316 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\") " pod="openstack/rabbitmq-server-0" Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.309346 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-server-conf\") pod \"rabbitmq-server-0\" (UID: \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\") " pod="openstack/rabbitmq-server-0" Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.309393 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\") " pod="openstack/rabbitmq-server-0" Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.309424 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t66m\" (UniqueName: \"kubernetes.io/projected/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-kube-api-access-7t66m\") pod \"rabbitmq-server-0\" (UID: \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\") " pod="openstack/rabbitmq-server-0" Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.309457 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\") " pod="openstack/rabbitmq-server-0" Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.326834 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-9d9dp"] Dec 08 09:21:04 crc kubenswrapper[4776]: W1208 09:21:04.338720 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod639910a7_1d35_4535_b629_18fe52dacac3.slice/crio-f5e04ad30b14b48118a23eb4f540730d7cae2136a699d7a556e1a5b3c9f61709 WatchSource:0}: Error finding container f5e04ad30b14b48118a23eb4f540730d7cae2136a699d7a556e1a5b3c9f61709: Status 404 returned error can't find the container with id f5e04ad30b14b48118a23eb4f540730d7cae2136a699d7a556e1a5b3c9f61709 Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.417660 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-config-data\") pod \"rabbitmq-server-0\" (UID: \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\") " pod="openstack/rabbitmq-server-0" Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.417733 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\") " pod="openstack/rabbitmq-server-0" Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.417777 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\") " 
pod="openstack/rabbitmq-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.417841 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\") " pod="openstack/rabbitmq-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.417895 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\") " pod="openstack/rabbitmq-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.417915 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-pod-info\") pod \"rabbitmq-server-0\" (UID: \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\") " pod="openstack/rabbitmq-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.417940 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\") " pod="openstack/rabbitmq-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.417967 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-server-conf\") pod \"rabbitmq-server-0\" (UID: \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\") " pod="openstack/rabbitmq-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.418018 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\") " pod="openstack/rabbitmq-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.418044 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t66m\" (UniqueName: \"kubernetes.io/projected/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-kube-api-access-7t66m\") pod \"rabbitmq-server-0\" (UID: \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\") " pod="openstack/rabbitmq-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.418073 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\") " pod="openstack/rabbitmq-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.418635 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-config-data\") pod \"rabbitmq-server-0\" (UID: \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\") " pod="openstack/rabbitmq-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.418680 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.419348 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\") " pod="openstack/rabbitmq-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.422623 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\") " pod="openstack/rabbitmq-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.422979 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\") " pod="openstack/rabbitmq-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.424096 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-server-conf\") pod \"rabbitmq-server-0\" (UID: \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\") " pod="openstack/rabbitmq-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.426337 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\") " pod="openstack/rabbitmq-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.429060 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\") " pod="openstack/rabbitmq-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.455900 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\") " pod="openstack/rabbitmq-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.458695 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t66m\" (UniqueName: \"kubernetes.io/projected/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-kube-api-access-7t66m\") pod \"rabbitmq-server-0\" (UID: \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\") " pod="openstack/rabbitmq-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.468808 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\") " pod="openstack/rabbitmq-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.497869 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.499531 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.506798 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.522199 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.522356 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.522677 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.522218 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-42dx6"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.522832 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.523029 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.523307 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.628003 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.628218 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.628269 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.628341 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.628387 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzbxl\" (UniqueName: \"kubernetes.io/projected/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-kube-api-access-wzbxl\") pod \"rabbitmq-cell1-server-0\" (UID: \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.628426 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.628459 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.628495 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.628523 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.628576 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.628646 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.730647 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.730717 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.730738 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.730767 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.730793 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzbxl\" (UniqueName: \"kubernetes.io/projected/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-kube-api-access-wzbxl\") pod \"rabbitmq-cell1-server-0\" (UID: \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.730822 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.730846 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.730871 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.730886 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.730923 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.730953 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.731523 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.731728 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.732263 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.733328 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.733788 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.733890 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.734778 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.734835 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.737482 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.741088 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.757545 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzbxl\" (UniqueName: \"kubernetes.io/projected/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-kube-api-access-wzbxl\") pod \"rabbitmq-cell1-server-0\" (UID: \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.775641 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.845709 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.847896 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-n4jfg" event={"ID":"408d75b6-d5bf-4dfe-8a22-7ba887229cac","Type":"ContainerStarted","Data":"7780156ed9f0082ec68a0f78b4e143e0933b9cd2871b2967918905066257d5ae"}
Dec 08 09:21:04 crc kubenswrapper[4776]: I1208 09:21:04.850199 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-9d9dp" event={"ID":"639910a7-1d35-4535-b629-18fe52dacac3","Type":"ContainerStarted","Data":"f5e04ad30b14b48118a23eb4f540730d7cae2136a699d7a556e1a5b3c9f61709"}
Dec 08 09:21:05 crc kubenswrapper[4776]: I1208 09:21:05.065574 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-pod-info\") pod \"rabbitmq-server-0\" (UID: \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\") " pod="openstack/rabbitmq-server-0"
Dec 08 09:21:05 crc kubenswrapper[4776]: I1208 09:21:05.125084 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Dec 08 09:21:05 crc kubenswrapper[4776]: I1208 09:21:05.673043 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 08 09:21:05 crc kubenswrapper[4776]: I1208 09:21:05.840939 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 08 09:21:06 crc kubenswrapper[4776]: I1208 09:21:06.152715 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"]
Dec 08 09:21:06 crc kubenswrapper[4776]: I1208 09:21:06.155629 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Dec 08 09:21:06 crc kubenswrapper[4776]: I1208 09:21:06.158644 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-hcbhs"
Dec 08 09:21:06 crc kubenswrapper[4776]: I1208 09:21:06.158644 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc"
Dec 08 09:21:06 crc kubenswrapper[4776]: I1208 09:21:06.159089 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data"
Dec 08 09:21:06 crc kubenswrapper[4776]: I1208 09:21:06.159854 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts"
Dec 08 09:21:06 crc kubenswrapper[4776]: I1208 09:21:06.168936 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle"
Dec 08 09:21:06 crc kubenswrapper[4776]: I1208 09:21:06.178004 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Dec 08 09:21:06 crc kubenswrapper[4776]: I1208 09:21:06.274525 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7df4120e-0e93-4000-8b6a-7823f3e89dac-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"7df4120e-0e93-4000-8b6a-7823f3e89dac\") " pod="openstack/openstack-galera-0"
Dec 08 09:21:06 crc kubenswrapper[4776]: I1208 09:21:06.274603 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6bbb\" (UniqueName: \"kubernetes.io/projected/7df4120e-0e93-4000-8b6a-7823f3e89dac-kube-api-access-s6bbb\") pod \"openstack-galera-0\" (UID: \"7df4120e-0e93-4000-8b6a-7823f3e89dac\") " pod="openstack/openstack-galera-0"
Dec 08 09:21:06 crc kubenswrapper[4776]: I1208 09:21:06.274634 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7df4120e-0e93-4000-8b6a-7823f3e89dac-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"7df4120e-0e93-4000-8b6a-7823f3e89dac\") " pod="openstack/openstack-galera-0"
Dec 08 09:21:06 crc kubenswrapper[4776]: I1208 09:21:06.274670 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7df4120e-0e93-4000-8b6a-7823f3e89dac-kolla-config\") pod \"openstack-galera-0\" (UID: \"7df4120e-0e93-4000-8b6a-7823f3e89dac\") " pod="openstack/openstack-galera-0"
Dec 08 09:21:06 crc kubenswrapper[4776]: I1208 09:21:06.274697 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"7df4120e-0e93-4000-8b6a-7823f3e89dac\") " pod="openstack/openstack-galera-0"
Dec 08 09:21:06 crc kubenswrapper[4776]: I1208 09:21:06.274721 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7df4120e-0e93-4000-8b6a-7823f3e89dac-config-data-default\") pod \"openstack-galera-0\" (UID: \"7df4120e-0e93-4000-8b6a-7823f3e89dac\") " pod="openstack/openstack-galera-0"
Dec 08 09:21:06 crc kubenswrapper[4776]: I1208 09:21:06.274799 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7df4120e-0e93-4000-8b6a-7823f3e89dac-config-data-generated\") pod \"openstack-galera-0\" (UID: \"7df4120e-0e93-4000-8b6a-7823f3e89dac\") " pod="openstack/openstack-galera-0"
Dec 08 09:21:06 crc kubenswrapper[4776]: I1208 09:21:06.274859 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7df4120e-0e93-4000-8b6a-7823f3e89dac-operator-scripts\") pod \"openstack-galera-0\" (UID: \"7df4120e-0e93-4000-8b6a-7823f3e89dac\") " pod="openstack/openstack-galera-0"
Dec 08 09:21:06 crc kubenswrapper[4776]: I1208 09:21:06.379521 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7df4120e-0e93-4000-8b6a-7823f3e89dac-operator-scripts\") pod \"openstack-galera-0\" (UID: \"7df4120e-0e93-4000-8b6a-7823f3e89dac\") " pod="openstack/openstack-galera-0"
Dec 08 09:21:06 crc kubenswrapper[4776]: I1208 09:21:06.382278 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7df4120e-0e93-4000-8b6a-7823f3e89dac-operator-scripts\") pod \"openstack-galera-0\" (UID: \"7df4120e-0e93-4000-8b6a-7823f3e89dac\") " pod="openstack/openstack-galera-0"
Dec 08 09:21:06 crc kubenswrapper[4776]: I1208 09:21:06.383054 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7df4120e-0e93-4000-8b6a-7823f3e89dac-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"7df4120e-0e93-4000-8b6a-7823f3e89dac\") " pod="openstack/openstack-galera-0"
Dec 08 09:21:06 crc kubenswrapper[4776]: I1208 09:21:06.383297 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6bbb\" (UniqueName: \"kubernetes.io/projected/7df4120e-0e93-4000-8b6a-7823f3e89dac-kube-api-access-s6bbb\") pod \"openstack-galera-0\" (UID: \"7df4120e-0e93-4000-8b6a-7823f3e89dac\") " pod="openstack/openstack-galera-0"
Dec 08 09:21:06 crc kubenswrapper[4776]: I1208 09:21:06.383448 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7df4120e-0e93-4000-8b6a-7823f3e89dac-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"7df4120e-0e93-4000-8b6a-7823f3e89dac\") " pod="openstack/openstack-galera-0"
Dec 08 09:21:06 crc kubenswrapper[4776]: I1208 09:21:06.383667 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7df4120e-0e93-4000-8b6a-7823f3e89dac-kolla-config\") pod \"openstack-galera-0\" (UID: \"7df4120e-0e93-4000-8b6a-7823f3e89dac\") " pod="openstack/openstack-galera-0"
Dec 08 09:21:06 crc kubenswrapper[4776]: I1208 09:21:06.383783 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"7df4120e-0e93-4000-8b6a-7823f3e89dac\") " pod="openstack/openstack-galera-0"
Dec 08 09:21:06 crc kubenswrapper[4776]: I1208 09:21:06.384035 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7df4120e-0e93-4000-8b6a-7823f3e89dac-config-data-default\") pod \"openstack-galera-0\" (UID: \"7df4120e-0e93-4000-8b6a-7823f3e89dac\") " pod="openstack/openstack-galera-0"
Dec 08 09:21:06 crc kubenswrapper[4776]: I1208 09:21:06.384316 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7df4120e-0e93-4000-8b6a-7823f3e89dac-config-data-generated\") pod \"openstack-galera-0\" (UID: \"7df4120e-0e93-4000-8b6a-7823f3e89dac\") " pod="openstack/openstack-galera-0"
Dec 08 09:21:06 crc kubenswrapper[4776]: I1208 09:21:06.384478 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"7df4120e-0e93-4000-8b6a-7823f3e89dac\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-galera-0"
Dec 08 09:21:06 crc kubenswrapper[4776]: I1208 09:21:06.387984 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7df4120e-0e93-4000-8b6a-7823f3e89dac-config-data-generated\") pod \"openstack-galera-0\" (UID: \"7df4120e-0e93-4000-8b6a-7823f3e89dac\") " pod="openstack/openstack-galera-0"
Dec 08 09:21:06 crc kubenswrapper[4776]: I1208 09:21:06.388753 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7df4120e-0e93-4000-8b6a-7823f3e89dac-kolla-config\") pod \"openstack-galera-0\" (UID: \"7df4120e-0e93-4000-8b6a-7823f3e89dac\") " pod="openstack/openstack-galera-0"
Dec 08 09:21:06 crc kubenswrapper[4776]: I1208 09:21:06.388974 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7df4120e-0e93-4000-8b6a-7823f3e89dac-config-data-default\") pod \"openstack-galera-0\" (UID: \"7df4120e-0e93-4000-8b6a-7823f3e89dac\") " pod="openstack/openstack-galera-0"
Dec 08 09:21:06 crc kubenswrapper[4776]: I1208 09:21:06.396282 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7df4120e-0e93-4000-8b6a-7823f3e89dac-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"7df4120e-0e93-4000-8b6a-7823f3e89dac\") " pod="openstack/openstack-galera-0"
Dec 08 09:21:06 crc kubenswrapper[4776]: I1208 09:21:06.397546 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7df4120e-0e93-4000-8b6a-7823f3e89dac-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"7df4120e-0e93-4000-8b6a-7823f3e89dac\") " pod="openstack/openstack-galera-0"
Dec 08 09:21:06 crc kubenswrapper[4776]: I1208 09:21:06.412125 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6bbb\" (UniqueName: \"kubernetes.io/projected/7df4120e-0e93-4000-8b6a-7823f3e89dac-kube-api-access-s6bbb\") pod \"openstack-galera-0\" (UID: \"7df4120e-0e93-4000-8b6a-7823f3e89dac\") " pod="openstack/openstack-galera-0"
Dec 08 09:21:06 crc kubenswrapper[4776]: I1208 09:21:06.412544 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"7df4120e-0e93-4000-8b6a-7823f3e89dac\") " pod="openstack/openstack-galera-0"
Dec 08 09:21:06 crc kubenswrapper[4776]: I1208 09:21:06.486319 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Dec 08 09:21:07 crc kubenswrapper[4776]: I1208 09:21:07.599649 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"]
Dec 08 09:21:07 crc kubenswrapper[4776]: I1208 09:21:07.601513 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Dec 08 09:21:07 crc kubenswrapper[4776]: I1208 09:21:07.614590 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts"
Dec 08 09:21:07 crc kubenswrapper[4776]: I1208 09:21:07.614800 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data"
Dec 08 09:21:07 crc kubenswrapper[4776]: I1208 09:21:07.614901 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-652tx"
Dec 08 09:21:07 crc kubenswrapper[4776]: I1208 09:21:07.614915 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc"
Dec 08 09:21:07 crc kubenswrapper[4776]: I1208 09:21:07.623745 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Dec 08 09:21:07 crc kubenswrapper[4776]: I1208 09:21:07.714844 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/425d947a-2a85-4a03-853f-a60f54515a57-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"425d947a-2a85-4a03-853f-a60f54515a57\") " pod="openstack/openstack-cell1-galera-0"
Dec 08 09:21:07 crc kubenswrapper[4776]: I1208 09:21:07.714896 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/425d947a-2a85-4a03-853f-a60f54515a57-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"425d947a-2a85-4a03-853f-a60f54515a57\") " pod="openstack/openstack-cell1-galera-0"
Dec 08 09:21:07 crc kubenswrapper[4776]: I1208 09:21:07.715015 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdqvx\" (UniqueName: \"kubernetes.io/projected/425d947a-2a85-4a03-853f-a60f54515a57-kube-api-access-hdqvx\") pod \"openstack-cell1-galera-0\" (UID: \"425d947a-2a85-4a03-853f-a60f54515a57\") " pod="openstack/openstack-cell1-galera-0"
Dec 08 09:21:07 crc kubenswrapper[4776]: I1208 09:21:07.715131 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/425d947a-2a85-4a03-853f-a60f54515a57-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"425d947a-2a85-4a03-853f-a60f54515a57\") " pod="openstack/openstack-cell1-galera-0"
Dec 08 09:21:07 crc kubenswrapper[4776]: I1208 09:21:07.715235 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425d947a-2a85-4a03-853f-a60f54515a57-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"425d947a-2a85-4a03-853f-a60f54515a57\") " pod="openstack/openstack-cell1-galera-0"
Dec 08 09:21:07 crc kubenswrapper[4776]: I1208 09:21:07.715339 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/425d947a-2a85-4a03-853f-a60f54515a57-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"425d947a-2a85-4a03-853f-a60f54515a57\") " pod="openstack/openstack-cell1-galera-0"
Dec 08 09:21:07 crc kubenswrapper[4776]: I1208 09:21:07.715388 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"425d947a-2a85-4a03-853f-a60f54515a57\") " pod="openstack/openstack-cell1-galera-0"
Dec 08 09:21:07 crc kubenswrapper[4776]: I1208 09:21:07.715411 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName:
\"kubernetes.io/empty-dir/425d947a-2a85-4a03-853f-a60f54515a57-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"425d947a-2a85-4a03-853f-a60f54515a57\") " pod="openstack/openstack-cell1-galera-0" Dec 08 09:21:07 crc kubenswrapper[4776]: I1208 09:21:07.816869 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/425d947a-2a85-4a03-853f-a60f54515a57-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"425d947a-2a85-4a03-853f-a60f54515a57\") " pod="openstack/openstack-cell1-galera-0" Dec 08 09:21:07 crc kubenswrapper[4776]: I1208 09:21:07.817132 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/425d947a-2a85-4a03-853f-a60f54515a57-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"425d947a-2a85-4a03-853f-a60f54515a57\") " pod="openstack/openstack-cell1-galera-0" Dec 08 09:21:07 crc kubenswrapper[4776]: I1208 09:21:07.817262 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdqvx\" (UniqueName: \"kubernetes.io/projected/425d947a-2a85-4a03-853f-a60f54515a57-kube-api-access-hdqvx\") pod \"openstack-cell1-galera-0\" (UID: \"425d947a-2a85-4a03-853f-a60f54515a57\") " pod="openstack/openstack-cell1-galera-0" Dec 08 09:21:07 crc kubenswrapper[4776]: I1208 09:21:07.817365 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/425d947a-2a85-4a03-853f-a60f54515a57-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"425d947a-2a85-4a03-853f-a60f54515a57\") " pod="openstack/openstack-cell1-galera-0" Dec 08 09:21:07 crc kubenswrapper[4776]: I1208 09:21:07.817436 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/425d947a-2a85-4a03-853f-a60f54515a57-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"425d947a-2a85-4a03-853f-a60f54515a57\") " pod="openstack/openstack-cell1-galera-0" Dec 08 09:21:07 crc kubenswrapper[4776]: I1208 09:21:07.817530 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/425d947a-2a85-4a03-853f-a60f54515a57-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"425d947a-2a85-4a03-853f-a60f54515a57\") " pod="openstack/openstack-cell1-galera-0" Dec 08 09:21:07 crc kubenswrapper[4776]: I1208 09:21:07.817636 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"425d947a-2a85-4a03-853f-a60f54515a57\") " pod="openstack/openstack-cell1-galera-0" Dec 08 09:21:07 crc kubenswrapper[4776]: I1208 09:21:07.817703 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/425d947a-2a85-4a03-853f-a60f54515a57-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"425d947a-2a85-4a03-853f-a60f54515a57\") " pod="openstack/openstack-cell1-galera-0" Dec 08 09:21:07 crc kubenswrapper[4776]: I1208 09:21:07.817751 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/425d947a-2a85-4a03-853f-a60f54515a57-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"425d947a-2a85-4a03-853f-a60f54515a57\") " pod="openstack/openstack-cell1-galera-0" Dec 08 09:21:07 crc kubenswrapper[4776]: I1208 09:21:07.817804 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/425d947a-2a85-4a03-853f-a60f54515a57-config-data-default\") pod 
\"openstack-cell1-galera-0\" (UID: \"425d947a-2a85-4a03-853f-a60f54515a57\") " pod="openstack/openstack-cell1-galera-0" Dec 08 09:21:07 crc kubenswrapper[4776]: I1208 09:21:07.818003 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"425d947a-2a85-4a03-853f-a60f54515a57\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/openstack-cell1-galera-0" Dec 08 09:21:07 crc kubenswrapper[4776]: I1208 09:21:07.818118 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/425d947a-2a85-4a03-853f-a60f54515a57-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"425d947a-2a85-4a03-853f-a60f54515a57\") " pod="openstack/openstack-cell1-galera-0" Dec 08 09:21:07 crc kubenswrapper[4776]: I1208 09:21:07.818960 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/425d947a-2a85-4a03-853f-a60f54515a57-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"425d947a-2a85-4a03-853f-a60f54515a57\") " pod="openstack/openstack-cell1-galera-0" Dec 08 09:21:07 crc kubenswrapper[4776]: I1208 09:21:07.827806 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/425d947a-2a85-4a03-853f-a60f54515a57-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"425d947a-2a85-4a03-853f-a60f54515a57\") " pod="openstack/openstack-cell1-galera-0" Dec 08 09:21:07 crc kubenswrapper[4776]: I1208 09:21:07.827943 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425d947a-2a85-4a03-853f-a60f54515a57-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"425d947a-2a85-4a03-853f-a60f54515a57\") " 
pod="openstack/openstack-cell1-galera-0" Dec 08 09:21:07 crc kubenswrapper[4776]: I1208 09:21:07.845515 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdqvx\" (UniqueName: \"kubernetes.io/projected/425d947a-2a85-4a03-853f-a60f54515a57-kube-api-access-hdqvx\") pod \"openstack-cell1-galera-0\" (UID: \"425d947a-2a85-4a03-853f-a60f54515a57\") " pod="openstack/openstack-cell1-galera-0" Dec 08 09:21:07 crc kubenswrapper[4776]: I1208 09:21:07.849661 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"425d947a-2a85-4a03-853f-a60f54515a57\") " pod="openstack/openstack-cell1-galera-0" Dec 08 09:21:07 crc kubenswrapper[4776]: I1208 09:21:07.933740 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 08 09:21:07 crc kubenswrapper[4776]: I1208 09:21:07.935133 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 08 09:21:07 crc kubenswrapper[4776]: I1208 09:21:07.936639 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 08 09:21:07 crc kubenswrapper[4776]: I1208 09:21:07.938186 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 08 09:21:07 crc kubenswrapper[4776]: I1208 09:21:07.938506 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 08 09:21:07 crc kubenswrapper[4776]: I1208 09:21:07.942345 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-7vvf5" Dec 08 09:21:07 crc kubenswrapper[4776]: I1208 09:21:07.944573 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 08 09:21:08 crc kubenswrapper[4776]: I1208 09:21:08.024330 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/981d14af-244f-4679-975d-58e11df95718-memcached-tls-certs\") pod \"memcached-0\" (UID: \"981d14af-244f-4679-975d-58e11df95718\") " pod="openstack/memcached-0" Dec 08 09:21:08 crc kubenswrapper[4776]: I1208 09:21:08.024386 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/981d14af-244f-4679-975d-58e11df95718-combined-ca-bundle\") pod \"memcached-0\" (UID: \"981d14af-244f-4679-975d-58e11df95718\") " pod="openstack/memcached-0" Dec 08 09:21:08 crc kubenswrapper[4776]: I1208 09:21:08.024416 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7tfb\" (UniqueName: \"kubernetes.io/projected/981d14af-244f-4679-975d-58e11df95718-kube-api-access-m7tfb\") pod \"memcached-0\" (UID: \"981d14af-244f-4679-975d-58e11df95718\") " pod="openstack/memcached-0" Dec 08 09:21:08 crc kubenswrapper[4776]: I1208 09:21:08.024444 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/981d14af-244f-4679-975d-58e11df95718-kolla-config\") pod \"memcached-0\" (UID: \"981d14af-244f-4679-975d-58e11df95718\") " pod="openstack/memcached-0" Dec 08 09:21:08 crc kubenswrapper[4776]: I1208 09:21:08.024550 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/981d14af-244f-4679-975d-58e11df95718-config-data\") pod \"memcached-0\" (UID: \"981d14af-244f-4679-975d-58e11df95718\") " pod="openstack/memcached-0" Dec 08 09:21:08 crc kubenswrapper[4776]: I1208 09:21:08.126456 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7tfb\" (UniqueName: \"kubernetes.io/projected/981d14af-244f-4679-975d-58e11df95718-kube-api-access-m7tfb\") pod \"memcached-0\" (UID: \"981d14af-244f-4679-975d-58e11df95718\") " pod="openstack/memcached-0" Dec 08 09:21:08 crc kubenswrapper[4776]: I1208 09:21:08.126518 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/981d14af-244f-4679-975d-58e11df95718-kolla-config\") pod \"memcached-0\" (UID: \"981d14af-244f-4679-975d-58e11df95718\") " pod="openstack/memcached-0" Dec 08 09:21:08 crc kubenswrapper[4776]: I1208 09:21:08.126603 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/981d14af-244f-4679-975d-58e11df95718-config-data\") pod \"memcached-0\" (UID: \"981d14af-244f-4679-975d-58e11df95718\") " pod="openstack/memcached-0" Dec 08 09:21:08 crc kubenswrapper[4776]: I1208 09:21:08.126681 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/981d14af-244f-4679-975d-58e11df95718-memcached-tls-certs\") pod \"memcached-0\" 
(UID: \"981d14af-244f-4679-975d-58e11df95718\") " pod="openstack/memcached-0" Dec 08 09:21:08 crc kubenswrapper[4776]: I1208 09:21:08.126702 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/981d14af-244f-4679-975d-58e11df95718-combined-ca-bundle\") pod \"memcached-0\" (UID: \"981d14af-244f-4679-975d-58e11df95718\") " pod="openstack/memcached-0" Dec 08 09:21:08 crc kubenswrapper[4776]: I1208 09:21:08.127552 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/981d14af-244f-4679-975d-58e11df95718-config-data\") pod \"memcached-0\" (UID: \"981d14af-244f-4679-975d-58e11df95718\") " pod="openstack/memcached-0" Dec 08 09:21:08 crc kubenswrapper[4776]: I1208 09:21:08.127727 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/981d14af-244f-4679-975d-58e11df95718-kolla-config\") pod \"memcached-0\" (UID: \"981d14af-244f-4679-975d-58e11df95718\") " pod="openstack/memcached-0" Dec 08 09:21:08 crc kubenswrapper[4776]: I1208 09:21:08.132728 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/981d14af-244f-4679-975d-58e11df95718-memcached-tls-certs\") pod \"memcached-0\" (UID: \"981d14af-244f-4679-975d-58e11df95718\") " pod="openstack/memcached-0" Dec 08 09:21:08 crc kubenswrapper[4776]: I1208 09:21:08.133398 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/981d14af-244f-4679-975d-58e11df95718-combined-ca-bundle\") pod \"memcached-0\" (UID: \"981d14af-244f-4679-975d-58e11df95718\") " pod="openstack/memcached-0" Dec 08 09:21:08 crc kubenswrapper[4776]: I1208 09:21:08.151604 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7tfb\" 
(UniqueName: \"kubernetes.io/projected/981d14af-244f-4679-975d-58e11df95718-kube-api-access-m7tfb\") pod \"memcached-0\" (UID: \"981d14af-244f-4679-975d-58e11df95718\") " pod="openstack/memcached-0" Dec 08 09:21:08 crc kubenswrapper[4776]: I1208 09:21:08.254752 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 08 09:21:09 crc kubenswrapper[4776]: I1208 09:21:09.666351 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 08 09:21:09 crc kubenswrapper[4776]: I1208 09:21:09.668001 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 08 09:21:09 crc kubenswrapper[4776]: I1208 09:21:09.675129 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-t26x4" Dec 08 09:21:09 crc kubenswrapper[4776]: I1208 09:21:09.696997 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 08 09:21:09 crc kubenswrapper[4776]: I1208 09:21:09.756074 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnwx7\" (UniqueName: \"kubernetes.io/projected/c26a16d6-aae4-4ce2-b1cf-2a26ab0bfced-kube-api-access-wnwx7\") pod \"kube-state-metrics-0\" (UID: \"c26a16d6-aae4-4ce2-b1cf-2a26ab0bfced\") " pod="openstack/kube-state-metrics-0" Dec 08 09:21:09 crc kubenswrapper[4776]: I1208 09:21:09.858316 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnwx7\" (UniqueName: \"kubernetes.io/projected/c26a16d6-aae4-4ce2-b1cf-2a26ab0bfced-kube-api-access-wnwx7\") pod \"kube-state-metrics-0\" (UID: \"c26a16d6-aae4-4ce2-b1cf-2a26ab0bfced\") " pod="openstack/kube-state-metrics-0" Dec 08 09:21:09 crc kubenswrapper[4776]: I1208 09:21:09.897138 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnwx7\" (UniqueName: 
\"kubernetes.io/projected/c26a16d6-aae4-4ce2-b1cf-2a26ab0bfced-kube-api-access-wnwx7\") pod \"kube-state-metrics-0\" (UID: \"c26a16d6-aae4-4ce2-b1cf-2a26ab0bfced\") " pod="openstack/kube-state-metrics-0" Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.024076 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.304809 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-ui-dashboards-7d5fb4cbfb-w69rl"] Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.306813 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-w69rl" Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.309965 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards" Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.310250 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards-sa-dockercfg-j896c" Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.321396 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-7d5fb4cbfb-w69rl"] Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.367470 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfgvr\" (UniqueName: \"kubernetes.io/projected/251557fb-f870-4b8c-8725-648a8cd97fca-kube-api-access-kfgvr\") pod \"observability-ui-dashboards-7d5fb4cbfb-w69rl\" (UID: \"251557fb-f870-4b8c-8725-648a8cd97fca\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-w69rl" Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.367759 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/251557fb-f870-4b8c-8725-648a8cd97fca-serving-cert\") pod \"observability-ui-dashboards-7d5fb4cbfb-w69rl\" (UID: \"251557fb-f870-4b8c-8725-648a8cd97fca\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-w69rl" Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.469793 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfgvr\" (UniqueName: \"kubernetes.io/projected/251557fb-f870-4b8c-8725-648a8cd97fca-kube-api-access-kfgvr\") pod \"observability-ui-dashboards-7d5fb4cbfb-w69rl\" (UID: \"251557fb-f870-4b8c-8725-648a8cd97fca\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-w69rl" Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.469878 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/251557fb-f870-4b8c-8725-648a8cd97fca-serving-cert\") pod \"observability-ui-dashboards-7d5fb4cbfb-w69rl\" (UID: \"251557fb-f870-4b8c-8725-648a8cd97fca\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-w69rl" Dec 08 09:21:10 crc kubenswrapper[4776]: E1208 09:21:10.470115 4776 secret.go:188] Couldn't get secret openshift-operators/observability-ui-dashboards: secret "observability-ui-dashboards" not found Dec 08 09:21:10 crc kubenswrapper[4776]: E1208 09:21:10.470231 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/251557fb-f870-4b8c-8725-648a8cd97fca-serving-cert podName:251557fb-f870-4b8c-8725-648a8cd97fca nodeName:}" failed. No retries permitted until 2025-12-08 09:21:10.970208595 +0000 UTC m=+1347.233433617 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/251557fb-f870-4b8c-8725-648a8cd97fca-serving-cert") pod "observability-ui-dashboards-7d5fb4cbfb-w69rl" (UID: "251557fb-f870-4b8c-8725-648a8cd97fca") : secret "observability-ui-dashboards" not found Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.496342 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfgvr\" (UniqueName: \"kubernetes.io/projected/251557fb-f870-4b8c-8725-648a8cd97fca-kube-api-access-kfgvr\") pod \"observability-ui-dashboards-7d5fb4cbfb-w69rl\" (UID: \"251557fb-f870-4b8c-8725-648a8cd97fca\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-w69rl" Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.636035 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7f7b6d8f8f-9sr59"] Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.637858 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7f7b6d8f8f-9sr59" Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.672809 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hn7d\" (UniqueName: \"kubernetes.io/projected/89015ec5-de1a-4514-afe6-d60fd61e9f79-kube-api-access-8hn7d\") pod \"console-7f7b6d8f8f-9sr59\" (UID: \"89015ec5-de1a-4514-afe6-d60fd61e9f79\") " pod="openshift-console/console-7f7b6d8f8f-9sr59" Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.673811 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/89015ec5-de1a-4514-afe6-d60fd61e9f79-oauth-serving-cert\") pod \"console-7f7b6d8f8f-9sr59\" (UID: \"89015ec5-de1a-4514-afe6-d60fd61e9f79\") " pod="openshift-console/console-7f7b6d8f8f-9sr59" Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.673949 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/89015ec5-de1a-4514-afe6-d60fd61e9f79-console-serving-cert\") pod \"console-7f7b6d8f8f-9sr59\" (UID: \"89015ec5-de1a-4514-afe6-d60fd61e9f79\") " pod="openshift-console/console-7f7b6d8f8f-9sr59" Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.674054 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/89015ec5-de1a-4514-afe6-d60fd61e9f79-console-oauth-config\") pod \"console-7f7b6d8f8f-9sr59\" (UID: \"89015ec5-de1a-4514-afe6-d60fd61e9f79\") " pod="openshift-console/console-7f7b6d8f8f-9sr59" Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.674157 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/89015ec5-de1a-4514-afe6-d60fd61e9f79-console-config\") pod \"console-7f7b6d8f8f-9sr59\" (UID: \"89015ec5-de1a-4514-afe6-d60fd61e9f79\") " pod="openshift-console/console-7f7b6d8f8f-9sr59" Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.674260 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/89015ec5-de1a-4514-afe6-d60fd61e9f79-service-ca\") pod \"console-7f7b6d8f8f-9sr59\" (UID: \"89015ec5-de1a-4514-afe6-d60fd61e9f79\") " pod="openshift-console/console-7f7b6d8f8f-9sr59" Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.674380 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89015ec5-de1a-4514-afe6-d60fd61e9f79-trusted-ca-bundle\") pod \"console-7f7b6d8f8f-9sr59\" (UID: \"89015ec5-de1a-4514-afe6-d60fd61e9f79\") " pod="openshift-console/console-7f7b6d8f8f-9sr59" Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.676337 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7f7b6d8f8f-9sr59"] Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.775752 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/89015ec5-de1a-4514-afe6-d60fd61e9f79-oauth-serving-cert\") pod \"console-7f7b6d8f8f-9sr59\" (UID: \"89015ec5-de1a-4514-afe6-d60fd61e9f79\") " pod="openshift-console/console-7f7b6d8f8f-9sr59" Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.775889 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/89015ec5-de1a-4514-afe6-d60fd61e9f79-console-serving-cert\") pod \"console-7f7b6d8f8f-9sr59\" (UID: \"89015ec5-de1a-4514-afe6-d60fd61e9f79\") " pod="openshift-console/console-7f7b6d8f8f-9sr59" Dec 08 
09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.775922 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/89015ec5-de1a-4514-afe6-d60fd61e9f79-console-oauth-config\") pod \"console-7f7b6d8f8f-9sr59\" (UID: \"89015ec5-de1a-4514-afe6-d60fd61e9f79\") " pod="openshift-console/console-7f7b6d8f8f-9sr59" Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.775963 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/89015ec5-de1a-4514-afe6-d60fd61e9f79-console-config\") pod \"console-7f7b6d8f8f-9sr59\" (UID: \"89015ec5-de1a-4514-afe6-d60fd61e9f79\") " pod="openshift-console/console-7f7b6d8f8f-9sr59" Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.776004 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/89015ec5-de1a-4514-afe6-d60fd61e9f79-service-ca\") pod \"console-7f7b6d8f8f-9sr59\" (UID: \"89015ec5-de1a-4514-afe6-d60fd61e9f79\") " pod="openshift-console/console-7f7b6d8f8f-9sr59" Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.776041 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89015ec5-de1a-4514-afe6-d60fd61e9f79-trusted-ca-bundle\") pod \"console-7f7b6d8f8f-9sr59\" (UID: \"89015ec5-de1a-4514-afe6-d60fd61e9f79\") " pod="openshift-console/console-7f7b6d8f8f-9sr59" Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.776093 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hn7d\" (UniqueName: \"kubernetes.io/projected/89015ec5-de1a-4514-afe6-d60fd61e9f79-kube-api-access-8hn7d\") pod \"console-7f7b6d8f8f-9sr59\" (UID: \"89015ec5-de1a-4514-afe6-d60fd61e9f79\") " pod="openshift-console/console-7f7b6d8f8f-9sr59" Dec 08 09:21:10 crc kubenswrapper[4776]: 
I1208 09:21:10.777488 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/89015ec5-de1a-4514-afe6-d60fd61e9f79-oauth-serving-cert\") pod \"console-7f7b6d8f8f-9sr59\" (UID: \"89015ec5-de1a-4514-afe6-d60fd61e9f79\") " pod="openshift-console/console-7f7b6d8f8f-9sr59" Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.779091 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/89015ec5-de1a-4514-afe6-d60fd61e9f79-console-config\") pod \"console-7f7b6d8f8f-9sr59\" (UID: \"89015ec5-de1a-4514-afe6-d60fd61e9f79\") " pod="openshift-console/console-7f7b6d8f8f-9sr59" Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.781684 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/89015ec5-de1a-4514-afe6-d60fd61e9f79-console-serving-cert\") pod \"console-7f7b6d8f8f-9sr59\" (UID: \"89015ec5-de1a-4514-afe6-d60fd61e9f79\") " pod="openshift-console/console-7f7b6d8f8f-9sr59" Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.782853 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89015ec5-de1a-4514-afe6-d60fd61e9f79-trusted-ca-bundle\") pod \"console-7f7b6d8f8f-9sr59\" (UID: \"89015ec5-de1a-4514-afe6-d60fd61e9f79\") " pod="openshift-console/console-7f7b6d8f8f-9sr59" Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.783014 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/89015ec5-de1a-4514-afe6-d60fd61e9f79-service-ca\") pod \"console-7f7b6d8f8f-9sr59\" (UID: \"89015ec5-de1a-4514-afe6-d60fd61e9f79\") " pod="openshift-console/console-7f7b6d8f8f-9sr59" Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.804792 4776 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/89015ec5-de1a-4514-afe6-d60fd61e9f79-console-oauth-config\") pod \"console-7f7b6d8f8f-9sr59\" (UID: \"89015ec5-de1a-4514-afe6-d60fd61e9f79\") " pod="openshift-console/console-7f7b6d8f8f-9sr59"
Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.815250 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hn7d\" (UniqueName: \"kubernetes.io/projected/89015ec5-de1a-4514-afe6-d60fd61e9f79-kube-api-access-8hn7d\") pod \"console-7f7b6d8f8f-9sr59\" (UID: \"89015ec5-de1a-4514-afe6-d60fd61e9f79\") " pod="openshift-console/console-7f7b6d8f8f-9sr59"
Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.846405 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"]
Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.849937 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.853828 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config"
Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.861325 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file"
Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.862843 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-lplpr"
Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.862996 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage"
Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.871443 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0"
Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.885317 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0"
Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.889852 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.967054 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7f7b6d8f8f-9sr59"
Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.981877 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/786c0b37-638a-4b59-b149-628d9ad828bc-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"786c0b37-638a-4b59-b149-628d9ad828bc\") " pod="openstack/prometheus-metric-storage-0"
Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.981957 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/786c0b37-638a-4b59-b149-628d9ad828bc-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"786c0b37-638a-4b59-b149-628d9ad828bc\") " pod="openstack/prometheus-metric-storage-0"
Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.982017 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/251557fb-f870-4b8c-8725-648a8cd97fca-serving-cert\") pod \"observability-ui-dashboards-7d5fb4cbfb-w69rl\" (UID: \"251557fb-f870-4b8c-8725-648a8cd97fca\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-w69rl"
Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.982066 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/786c0b37-638a-4b59-b149-628d9ad828bc-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"786c0b37-638a-4b59-b149-628d9ad828bc\") " pod="openstack/prometheus-metric-storage-0"
Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.982113 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"prometheus-metric-storage-0\" (UID: \"786c0b37-638a-4b59-b149-628d9ad828bc\") " pod="openstack/prometheus-metric-storage-0"
Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.982226 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/786c0b37-638a-4b59-b149-628d9ad828bc-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"786c0b37-638a-4b59-b149-628d9ad828bc\") " pod="openstack/prometheus-metric-storage-0"
Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.982277 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f96j4\" (UniqueName: \"kubernetes.io/projected/786c0b37-638a-4b59-b149-628d9ad828bc-kube-api-access-f96j4\") pod \"prometheus-metric-storage-0\" (UID: \"786c0b37-638a-4b59-b149-628d9ad828bc\") " pod="openstack/prometheus-metric-storage-0"
Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.982308 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/786c0b37-638a-4b59-b149-628d9ad828bc-config\") pod \"prometheus-metric-storage-0\" (UID: \"786c0b37-638a-4b59-b149-628d9ad828bc\") " pod="openstack/prometheus-metric-storage-0"
Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.982372 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/786c0b37-638a-4b59-b149-628d9ad828bc-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"786c0b37-638a-4b59-b149-628d9ad828bc\") " pod="openstack/prometheus-metric-storage-0"
Dec 08 09:21:10 crc kubenswrapper[4776]: I1208 09:21:10.996581 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/251557fb-f870-4b8c-8725-648a8cd97fca-serving-cert\") pod \"observability-ui-dashboards-7d5fb4cbfb-w69rl\" (UID: \"251557fb-f870-4b8c-8725-648a8cd97fca\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-w69rl"
Dec 08 09:21:11 crc kubenswrapper[4776]: I1208 09:21:11.084156 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/786c0b37-638a-4b59-b149-628d9ad828bc-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"786c0b37-638a-4b59-b149-628d9ad828bc\") " pod="openstack/prometheus-metric-storage-0"
Dec 08 09:21:11 crc kubenswrapper[4776]: I1208 09:21:11.084250 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/786c0b37-638a-4b59-b149-628d9ad828bc-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"786c0b37-638a-4b59-b149-628d9ad828bc\") " pod="openstack/prometheus-metric-storage-0"
Dec 08 09:21:11 crc kubenswrapper[4776]: I1208 09:21:11.084317 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/786c0b37-638a-4b59-b149-628d9ad828bc-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"786c0b37-638a-4b59-b149-628d9ad828bc\") " pod="openstack/prometheus-metric-storage-0"
Dec 08 09:21:11 crc kubenswrapper[4776]: I1208 09:21:11.084388 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"prometheus-metric-storage-0\" (UID: \"786c0b37-638a-4b59-b149-628d9ad828bc\") " pod="openstack/prometheus-metric-storage-0"
Dec 08 09:21:11 crc kubenswrapper[4776]: I1208 09:21:11.084820 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"prometheus-metric-storage-0\" (UID: \"786c0b37-638a-4b59-b149-628d9ad828bc\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/prometheus-metric-storage-0"
Dec 08 09:21:11 crc kubenswrapper[4776]: I1208 09:21:11.085605 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/786c0b37-638a-4b59-b149-628d9ad828bc-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"786c0b37-638a-4b59-b149-628d9ad828bc\") " pod="openstack/prometheus-metric-storage-0"
Dec 08 09:21:11 crc kubenswrapper[4776]: I1208 09:21:11.085647 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f96j4\" (UniqueName: \"kubernetes.io/projected/786c0b37-638a-4b59-b149-628d9ad828bc-kube-api-access-f96j4\") pod \"prometheus-metric-storage-0\" (UID: \"786c0b37-638a-4b59-b149-628d9ad828bc\") " pod="openstack/prometheus-metric-storage-0"
Dec 08 09:21:11 crc kubenswrapper[4776]: I1208 09:21:11.085679 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/786c0b37-638a-4b59-b149-628d9ad828bc-config\") pod \"prometheus-metric-storage-0\" (UID: \"786c0b37-638a-4b59-b149-628d9ad828bc\") " pod="openstack/prometheus-metric-storage-0"
Dec 08 09:21:11 crc kubenswrapper[4776]: I1208 09:21:11.085744 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/786c0b37-638a-4b59-b149-628d9ad828bc-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"786c0b37-638a-4b59-b149-628d9ad828bc\") " pod="openstack/prometheus-metric-storage-0"
Dec 08 09:21:11 crc kubenswrapper[4776]: I1208 09:21:11.088326 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/786c0b37-638a-4b59-b149-628d9ad828bc-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"786c0b37-638a-4b59-b149-628d9ad828bc\") " pod="openstack/prometheus-metric-storage-0"
Dec 08 09:21:11 crc kubenswrapper[4776]: I1208 09:21:11.091148 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/786c0b37-638a-4b59-b149-628d9ad828bc-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"786c0b37-638a-4b59-b149-628d9ad828bc\") " pod="openstack/prometheus-metric-storage-0"
Dec 08 09:21:11 crc kubenswrapper[4776]: I1208 09:21:11.091733 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/786c0b37-638a-4b59-b149-628d9ad828bc-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"786c0b37-638a-4b59-b149-628d9ad828bc\") " pod="openstack/prometheus-metric-storage-0"
Dec 08 09:21:11 crc kubenswrapper[4776]: I1208 09:21:11.092439 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/786c0b37-638a-4b59-b149-628d9ad828bc-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"786c0b37-638a-4b59-b149-628d9ad828bc\") " pod="openstack/prometheus-metric-storage-0"
Dec 08 09:21:11 crc kubenswrapper[4776]: I1208 09:21:11.092558 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/786c0b37-638a-4b59-b149-628d9ad828bc-config\") pod \"prometheus-metric-storage-0\" (UID: \"786c0b37-638a-4b59-b149-628d9ad828bc\") " pod="openstack/prometheus-metric-storage-0"
Dec 08 09:21:11 crc kubenswrapper[4776]: I1208 09:21:11.104261 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/786c0b37-638a-4b59-b149-628d9ad828bc-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"786c0b37-638a-4b59-b149-628d9ad828bc\") " pod="openstack/prometheus-metric-storage-0"
Dec 08 09:21:11 crc kubenswrapper[4776]: I1208 09:21:11.106859 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f96j4\" (UniqueName: \"kubernetes.io/projected/786c0b37-638a-4b59-b149-628d9ad828bc-kube-api-access-f96j4\") pod \"prometheus-metric-storage-0\" (UID: \"786c0b37-638a-4b59-b149-628d9ad828bc\") " pod="openstack/prometheus-metric-storage-0"
Dec 08 09:21:11 crc kubenswrapper[4776]: I1208 09:21:11.118026 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"prometheus-metric-storage-0\" (UID: \"786c0b37-638a-4b59-b149-628d9ad828bc\") " pod="openstack/prometheus-metric-storage-0"
Dec 08 09:21:11 crc kubenswrapper[4776]: W1208 09:21:11.169668 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda01574f0_d8c8_404a_b822_7ce8e0af6fd4.slice/crio-72226f35dd79fbed00fd55d2c8fc6c7e59294d88693b0a6062f98e065f138be9 WatchSource:0}: Error finding container 72226f35dd79fbed00fd55d2c8fc6c7e59294d88693b0a6062f98e065f138be9: Status 404 returned error can't find the container with id 72226f35dd79fbed00fd55d2c8fc6c7e59294d88693b0a6062f98e065f138be9
Dec 08 09:21:11 crc kubenswrapper[4776]: W1208 09:21:11.171293 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc4ca0fd_48d8_4d6a_a7d6_2c0ad2d78994.slice/crio-cfce493ab73c718f447310fe516c8662a1077f8594cd914ec1ae9c90bb2656c2 WatchSource:0}: Error finding container cfce493ab73c718f447310fe516c8662a1077f8594cd914ec1ae9c90bb2656c2: Status 404 returned error can't find the container with id cfce493ab73c718f447310fe516c8662a1077f8594cd914ec1ae9c90bb2656c2
Dec 08 09:21:11 crc kubenswrapper[4776]: I1208 09:21:11.188396 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Dec 08 09:21:11 crc kubenswrapper[4776]: I1208 09:21:11.242236 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-w69rl"
Dec 08 09:21:11 crc kubenswrapper[4776]: I1208 09:21:11.398889 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 08 09:21:11 crc kubenswrapper[4776]: I1208 09:21:11.398945 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 08 09:21:11 crc kubenswrapper[4776]: I1208 09:21:11.950162 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994","Type":"ContainerStarted","Data":"cfce493ab73c718f447310fe516c8662a1077f8594cd914ec1ae9c90bb2656c2"}
Dec 08 09:21:11 crc kubenswrapper[4776]: I1208 09:21:11.952068 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a01574f0-d8c8-404a-b822-7ce8e0af6fd4","Type":"ContainerStarted","Data":"72226f35dd79fbed00fd55d2c8fc6c7e59294d88693b0a6062f98e065f138be9"}
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.264518 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"]
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.267711 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.270732 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-kwg6p"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.270876 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.271066 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.271185 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.273447 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.294355 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.352304 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0e4de746-d269-470c-b934-117aa4c73834-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"0e4de746-d269-470c-b934-117aa4c73834\") " pod="openstack/ovsdbserver-nb-0"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.352704 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e4de746-d269-470c-b934-117aa4c73834-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0e4de746-d269-470c-b934-117aa4c73834\") " pod="openstack/ovsdbserver-nb-0"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.352786 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e4de746-d269-470c-b934-117aa4c73834-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0e4de746-d269-470c-b934-117aa4c73834\") " pod="openstack/ovsdbserver-nb-0"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.352881 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e4de746-d269-470c-b934-117aa4c73834-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"0e4de746-d269-470c-b934-117aa4c73834\") " pod="openstack/ovsdbserver-nb-0"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.352909 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"0e4de746-d269-470c-b934-117aa4c73834\") " pod="openstack/ovsdbserver-nb-0"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.353106 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j7l4\" (UniqueName: \"kubernetes.io/projected/0e4de746-d269-470c-b934-117aa4c73834-kube-api-access-8j7l4\") pod \"ovsdbserver-nb-0\" (UID: \"0e4de746-d269-470c-b934-117aa4c73834\") " pod="openstack/ovsdbserver-nb-0"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.353156 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e4de746-d269-470c-b934-117aa4c73834-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"0e4de746-d269-470c-b934-117aa4c73834\") " pod="openstack/ovsdbserver-nb-0"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.353238 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e4de746-d269-470c-b934-117aa4c73834-config\") pod \"ovsdbserver-nb-0\" (UID: \"0e4de746-d269-470c-b934-117aa4c73834\") " pod="openstack/ovsdbserver-nb-0"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.455339 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e4de746-d269-470c-b934-117aa4c73834-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"0e4de746-d269-470c-b934-117aa4c73834\") " pod="openstack/ovsdbserver-nb-0"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.455548 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"0e4de746-d269-470c-b934-117aa4c73834\") " pod="openstack/ovsdbserver-nb-0"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.455842 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e4de746-d269-470c-b934-117aa4c73834-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"0e4de746-d269-470c-b934-117aa4c73834\") " pod="openstack/ovsdbserver-nb-0"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.455950 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8j7l4\" (UniqueName: \"kubernetes.io/projected/0e4de746-d269-470c-b934-117aa4c73834-kube-api-access-8j7l4\") pod \"ovsdbserver-nb-0\" (UID: \"0e4de746-d269-470c-b934-117aa4c73834\") " pod="openstack/ovsdbserver-nb-0"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.456058 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e4de746-d269-470c-b934-117aa4c73834-config\") pod \"ovsdbserver-nb-0\" (UID: \"0e4de746-d269-470c-b934-117aa4c73834\") " pod="openstack/ovsdbserver-nb-0"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.456210 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0e4de746-d269-470c-b934-117aa4c73834-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"0e4de746-d269-470c-b934-117aa4c73834\") " pod="openstack/ovsdbserver-nb-0"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.456313 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e4de746-d269-470c-b934-117aa4c73834-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0e4de746-d269-470c-b934-117aa4c73834\") " pod="openstack/ovsdbserver-nb-0"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.456440 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e4de746-d269-470c-b934-117aa4c73834-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0e4de746-d269-470c-b934-117aa4c73834\") " pod="openstack/ovsdbserver-nb-0"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.458701 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e4de746-d269-470c-b934-117aa4c73834-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"0e4de746-d269-470c-b934-117aa4c73834\") " pod="openstack/ovsdbserver-nb-0"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.459157 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"0e4de746-d269-470c-b934-117aa4c73834\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/ovsdbserver-nb-0"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.464563 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e4de746-d269-470c-b934-117aa4c73834-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0e4de746-d269-470c-b934-117aa4c73834\") " pod="openstack/ovsdbserver-nb-0"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.465436 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e4de746-d269-470c-b934-117aa4c73834-config\") pod \"ovsdbserver-nb-0\" (UID: \"0e4de746-d269-470c-b934-117aa4c73834\") " pod="openstack/ovsdbserver-nb-0"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.465808 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0e4de746-d269-470c-b934-117aa4c73834-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"0e4de746-d269-470c-b934-117aa4c73834\") " pod="openstack/ovsdbserver-nb-0"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.476292 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e4de746-d269-470c-b934-117aa4c73834-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"0e4de746-d269-470c-b934-117aa4c73834\") " pod="openstack/ovsdbserver-nb-0"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.488687 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e4de746-d269-470c-b934-117aa4c73834-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0e4de746-d269-470c-b934-117aa4c73834\") " pod="openstack/ovsdbserver-nb-0"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.504265 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j7l4\" (UniqueName: \"kubernetes.io/projected/0e4de746-d269-470c-b934-117aa4c73834-kube-api-access-8j7l4\") pod \"ovsdbserver-nb-0\" (UID: \"0e4de746-d269-470c-b934-117aa4c73834\") " pod="openstack/ovsdbserver-nb-0"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.511433 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"0e4de746-d269-470c-b934-117aa4c73834\") " pod="openstack/ovsdbserver-nb-0"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.599575 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.689574 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.710578 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-wpgmk"]
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.712460 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-wpgmk"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.718372 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.718632 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.718758 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-884ts"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.727449 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-5tfbk"]
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.730085 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-5tfbk"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.738566 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wpgmk"]
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.751566 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-5tfbk"]
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.867304 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a9a1b68-ec7e-4994-9bda-fd418747dbc5-ovn-controller-tls-certs\") pod \"ovn-controller-wpgmk\" (UID: \"9a9a1b68-ec7e-4994-9bda-fd418747dbc5\") " pod="openstack/ovn-controller-wpgmk"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.867391 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2k22\" (UniqueName: \"kubernetes.io/projected/9a9a1b68-ec7e-4994-9bda-fd418747dbc5-kube-api-access-k2k22\") pod \"ovn-controller-wpgmk\" (UID: \"9a9a1b68-ec7e-4994-9bda-fd418747dbc5\") " pod="openstack/ovn-controller-wpgmk"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.867452 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l54c\" (UniqueName: \"kubernetes.io/projected/215a9444-a545-491d-9eb6-02d98baff784-kube-api-access-9l54c\") pod \"ovn-controller-ovs-5tfbk\" (UID: \"215a9444-a545-491d-9eb6-02d98baff784\") " pod="openstack/ovn-controller-ovs-5tfbk"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.867471 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9a9a1b68-ec7e-4994-9bda-fd418747dbc5-var-run-ovn\") pod \"ovn-controller-wpgmk\" (UID: \"9a9a1b68-ec7e-4994-9bda-fd418747dbc5\") " pod="openstack/ovn-controller-wpgmk"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.867499 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/215a9444-a545-491d-9eb6-02d98baff784-var-lib\") pod \"ovn-controller-ovs-5tfbk\" (UID: \"215a9444-a545-491d-9eb6-02d98baff784\") " pod="openstack/ovn-controller-ovs-5tfbk"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.867519 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9a9a1b68-ec7e-4994-9bda-fd418747dbc5-var-log-ovn\") pod \"ovn-controller-wpgmk\" (UID: \"9a9a1b68-ec7e-4994-9bda-fd418747dbc5\") " pod="openstack/ovn-controller-wpgmk"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.867580 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/215a9444-a545-491d-9eb6-02d98baff784-scripts\") pod \"ovn-controller-ovs-5tfbk\" (UID: \"215a9444-a545-491d-9eb6-02d98baff784\") " pod="openstack/ovn-controller-ovs-5tfbk"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.867603 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a9a1b68-ec7e-4994-9bda-fd418747dbc5-combined-ca-bundle\") pod \"ovn-controller-wpgmk\" (UID: \"9a9a1b68-ec7e-4994-9bda-fd418747dbc5\") " pod="openstack/ovn-controller-wpgmk"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.867661 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/215a9444-a545-491d-9eb6-02d98baff784-var-log\") pod \"ovn-controller-ovs-5tfbk\" (UID: \"215a9444-a545-491d-9eb6-02d98baff784\") " pod="openstack/ovn-controller-ovs-5tfbk"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.867799 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a9a1b68-ec7e-4994-9bda-fd418747dbc5-scripts\") pod \"ovn-controller-wpgmk\" (UID: \"9a9a1b68-ec7e-4994-9bda-fd418747dbc5\") " pod="openstack/ovn-controller-wpgmk"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.867853 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/215a9444-a545-491d-9eb6-02d98baff784-var-run\") pod \"ovn-controller-ovs-5tfbk\" (UID: \"215a9444-a545-491d-9eb6-02d98baff784\") " pod="openstack/ovn-controller-ovs-5tfbk"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.867880 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9a9a1b68-ec7e-4994-9bda-fd418747dbc5-var-run\") pod \"ovn-controller-wpgmk\" (UID: \"9a9a1b68-ec7e-4994-9bda-fd418747dbc5\") " pod="openstack/ovn-controller-wpgmk"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.867931 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/215a9444-a545-491d-9eb6-02d98baff784-etc-ovs\") pod \"ovn-controller-ovs-5tfbk\" (UID: \"215a9444-a545-491d-9eb6-02d98baff784\") " pod="openstack/ovn-controller-ovs-5tfbk"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.971222 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/215a9444-a545-491d-9eb6-02d98baff784-var-run\") pod \"ovn-controller-ovs-5tfbk\" (UID: \"215a9444-a545-491d-9eb6-02d98baff784\") " pod="openstack/ovn-controller-ovs-5tfbk"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.971270 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9a9a1b68-ec7e-4994-9bda-fd418747dbc5-var-run\") pod \"ovn-controller-wpgmk\" (UID: \"9a9a1b68-ec7e-4994-9bda-fd418747dbc5\") " pod="openstack/ovn-controller-wpgmk"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.971325 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/215a9444-a545-491d-9eb6-02d98baff784-etc-ovs\") pod \"ovn-controller-ovs-5tfbk\" (UID: \"215a9444-a545-491d-9eb6-02d98baff784\") " pod="openstack/ovn-controller-ovs-5tfbk"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.971396 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a9a1b68-ec7e-4994-9bda-fd418747dbc5-ovn-controller-tls-certs\") pod \"ovn-controller-wpgmk\" (UID: \"9a9a1b68-ec7e-4994-9bda-fd418747dbc5\") " pod="openstack/ovn-controller-wpgmk"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.971419 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2k22\" (UniqueName: \"kubernetes.io/projected/9a9a1b68-ec7e-4994-9bda-fd418747dbc5-kube-api-access-k2k22\") pod \"ovn-controller-wpgmk\" (UID: \"9a9a1b68-ec7e-4994-9bda-fd418747dbc5\") " pod="openstack/ovn-controller-wpgmk"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.971456 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9l54c\" (UniqueName: \"kubernetes.io/projected/215a9444-a545-491d-9eb6-02d98baff784-kube-api-access-9l54c\") pod \"ovn-controller-ovs-5tfbk\" (UID: \"215a9444-a545-491d-9eb6-02d98baff784\") " pod="openstack/ovn-controller-ovs-5tfbk"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.971477 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9a9a1b68-ec7e-4994-9bda-fd418747dbc5-var-run-ovn\") pod \"ovn-controller-wpgmk\" (UID: \"9a9a1b68-ec7e-4994-9bda-fd418747dbc5\") " pod="openstack/ovn-controller-wpgmk"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.971499 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/215a9444-a545-491d-9eb6-02d98baff784-var-lib\") pod \"ovn-controller-ovs-5tfbk\" (UID: \"215a9444-a545-491d-9eb6-02d98baff784\") " pod="openstack/ovn-controller-ovs-5tfbk"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.971521 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9a9a1b68-ec7e-4994-9bda-fd418747dbc5-var-log-ovn\") pod \"ovn-controller-wpgmk\" (UID: \"9a9a1b68-ec7e-4994-9bda-fd418747dbc5\") " pod="openstack/ovn-controller-wpgmk"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.971557 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/215a9444-a545-491d-9eb6-02d98baff784-scripts\") pod \"ovn-controller-ovs-5tfbk\" (UID: \"215a9444-a545-491d-9eb6-02d98baff784\") " pod="openstack/ovn-controller-ovs-5tfbk"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.971580 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a9a1b68-ec7e-4994-9bda-fd418747dbc5-combined-ca-bundle\") pod \"ovn-controller-wpgmk\" (UID: \"9a9a1b68-ec7e-4994-9bda-fd418747dbc5\") " pod="openstack/ovn-controller-wpgmk"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.971643 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/215a9444-a545-491d-9eb6-02d98baff784-var-log\") pod \"ovn-controller-ovs-5tfbk\" (UID: \"215a9444-a545-491d-9eb6-02d98baff784\") " pod="openstack/ovn-controller-ovs-5tfbk"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.971693 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a9a1b68-ec7e-4994-9bda-fd418747dbc5-scripts\") pod \"ovn-controller-wpgmk\" (UID: \"9a9a1b68-ec7e-4994-9bda-fd418747dbc5\") " pod="openstack/ovn-controller-wpgmk"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.973774 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a9a1b68-ec7e-4994-9bda-fd418747dbc5-scripts\") pod \"ovn-controller-wpgmk\" (UID: \"9a9a1b68-ec7e-4994-9bda-fd418747dbc5\") " pod="openstack/ovn-controller-wpgmk"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.974860 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/215a9444-a545-491d-9eb6-02d98baff784-var-run\") pod \"ovn-controller-ovs-5tfbk\" (UID: \"215a9444-a545-491d-9eb6-02d98baff784\") " pod="openstack/ovn-controller-ovs-5tfbk"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.974942 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9a9a1b68-ec7e-4994-9bda-fd418747dbc5-var-log-ovn\") pod \"ovn-controller-wpgmk\" (UID: \"9a9a1b68-ec7e-4994-9bda-fd418747dbc5\") " pod="openstack/ovn-controller-wpgmk"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.975112 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/215a9444-a545-491d-9eb6-02d98baff784-etc-ovs\") pod \"ovn-controller-ovs-5tfbk\" (UID: \"215a9444-a545-491d-9eb6-02d98baff784\") " pod="openstack/ovn-controller-ovs-5tfbk"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.975364 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9a9a1b68-ec7e-4994-9bda-fd418747dbc5-var-run\") pod \"ovn-controller-wpgmk\" (UID: \"9a9a1b68-ec7e-4994-9bda-fd418747dbc5\") " pod="openstack/ovn-controller-wpgmk"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.975537 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9a9a1b68-ec7e-4994-9bda-fd418747dbc5-var-run-ovn\") pod \"ovn-controller-wpgmk\" (UID: \"9a9a1b68-ec7e-4994-9bda-fd418747dbc5\") " pod="openstack/ovn-controller-wpgmk"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.975677 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/215a9444-a545-491d-9eb6-02d98baff784-var-lib\") pod \"ovn-controller-ovs-5tfbk\" (UID: \"215a9444-a545-491d-9eb6-02d98baff784\") " pod="openstack/ovn-controller-ovs-5tfbk"
Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.976664 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/215a9444-a545-491d-9eb6-02d98baff784-scripts\") pod \"ovn-controller-ovs-5tfbk\" (UID: 
\"215a9444-a545-491d-9eb6-02d98baff784\") " pod="openstack/ovn-controller-ovs-5tfbk" Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.976804 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/215a9444-a545-491d-9eb6-02d98baff784-var-log\") pod \"ovn-controller-ovs-5tfbk\" (UID: \"215a9444-a545-491d-9eb6-02d98baff784\") " pod="openstack/ovn-controller-ovs-5tfbk" Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.980224 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a9a1b68-ec7e-4994-9bda-fd418747dbc5-ovn-controller-tls-certs\") pod \"ovn-controller-wpgmk\" (UID: \"9a9a1b68-ec7e-4994-9bda-fd418747dbc5\") " pod="openstack/ovn-controller-wpgmk" Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.987034 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2k22\" (UniqueName: \"kubernetes.io/projected/9a9a1b68-ec7e-4994-9bda-fd418747dbc5-kube-api-access-k2k22\") pod \"ovn-controller-wpgmk\" (UID: \"9a9a1b68-ec7e-4994-9bda-fd418747dbc5\") " pod="openstack/ovn-controller-wpgmk" Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.990867 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a9a1b68-ec7e-4994-9bda-fd418747dbc5-combined-ca-bundle\") pod \"ovn-controller-wpgmk\" (UID: \"9a9a1b68-ec7e-4994-9bda-fd418747dbc5\") " pod="openstack/ovn-controller-wpgmk" Dec 08 09:21:14 crc kubenswrapper[4776]: I1208 09:21:14.992245 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l54c\" (UniqueName: \"kubernetes.io/projected/215a9444-a545-491d-9eb6-02d98baff784-kube-api-access-9l54c\") pod \"ovn-controller-ovs-5tfbk\" (UID: \"215a9444-a545-491d-9eb6-02d98baff784\") " pod="openstack/ovn-controller-ovs-5tfbk" Dec 08 09:21:15 crc 
kubenswrapper[4776]: I1208 09:21:15.108265 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-wpgmk" Dec 08 09:21:15 crc kubenswrapper[4776]: I1208 09:21:15.119854 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-5tfbk" Dec 08 09:21:16 crc kubenswrapper[4776]: I1208 09:21:16.607879 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 08 09:21:16 crc kubenswrapper[4776]: I1208 09:21:16.613883 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 08 09:21:16 crc kubenswrapper[4776]: I1208 09:21:16.619077 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-bmf5w" Dec 08 09:21:16 crc kubenswrapper[4776]: I1208 09:21:16.619463 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 08 09:21:16 crc kubenswrapper[4776]: I1208 09:21:16.619742 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 08 09:21:16 crc kubenswrapper[4776]: I1208 09:21:16.619944 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 08 09:21:16 crc kubenswrapper[4776]: I1208 09:21:16.647896 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 08 09:21:16 crc kubenswrapper[4776]: I1208 09:21:16.718362 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3d941bbc-2271-4ec4-853f-57feaf6ace36-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"3d941bbc-2271-4ec4-853f-57feaf6ace36\") " pod="openstack/ovsdbserver-sb-0" Dec 08 09:21:16 crc kubenswrapper[4776]: I1208 09:21:16.718512 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d941bbc-2271-4ec4-853f-57feaf6ace36-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3d941bbc-2271-4ec4-853f-57feaf6ace36\") " pod="openstack/ovsdbserver-sb-0" Dec 08 09:21:16 crc kubenswrapper[4776]: I1208 09:21:16.718556 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghln6\" (UniqueName: \"kubernetes.io/projected/3d941bbc-2271-4ec4-853f-57feaf6ace36-kube-api-access-ghln6\") pod \"ovsdbserver-sb-0\" (UID: \"3d941bbc-2271-4ec4-853f-57feaf6ace36\") " pod="openstack/ovsdbserver-sb-0" Dec 08 09:21:16 crc kubenswrapper[4776]: I1208 09:21:16.718589 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"3d941bbc-2271-4ec4-853f-57feaf6ace36\") " pod="openstack/ovsdbserver-sb-0" Dec 08 09:21:16 crc kubenswrapper[4776]: I1208 09:21:16.718616 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d941bbc-2271-4ec4-853f-57feaf6ace36-config\") pod \"ovsdbserver-sb-0\" (UID: \"3d941bbc-2271-4ec4-853f-57feaf6ace36\") " pod="openstack/ovsdbserver-sb-0" Dec 08 09:21:16 crc kubenswrapper[4776]: I1208 09:21:16.718648 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d941bbc-2271-4ec4-853f-57feaf6ace36-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"3d941bbc-2271-4ec4-853f-57feaf6ace36\") " pod="openstack/ovsdbserver-sb-0" Dec 08 09:21:16 crc kubenswrapper[4776]: I1208 09:21:16.718678 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/3d941bbc-2271-4ec4-853f-57feaf6ace36-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"3d941bbc-2271-4ec4-853f-57feaf6ace36\") " pod="openstack/ovsdbserver-sb-0" Dec 08 09:21:16 crc kubenswrapper[4776]: I1208 09:21:16.718756 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d941bbc-2271-4ec4-853f-57feaf6ace36-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3d941bbc-2271-4ec4-853f-57feaf6ace36\") " pod="openstack/ovsdbserver-sb-0" Dec 08 09:21:16 crc kubenswrapper[4776]: I1208 09:21:16.821088 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d941bbc-2271-4ec4-853f-57feaf6ace36-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"3d941bbc-2271-4ec4-853f-57feaf6ace36\") " pod="openstack/ovsdbserver-sb-0" Dec 08 09:21:16 crc kubenswrapper[4776]: I1208 09:21:16.821151 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3d941bbc-2271-4ec4-853f-57feaf6ace36-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"3d941bbc-2271-4ec4-853f-57feaf6ace36\") " pod="openstack/ovsdbserver-sb-0" Dec 08 09:21:16 crc kubenswrapper[4776]: I1208 09:21:16.821255 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d941bbc-2271-4ec4-853f-57feaf6ace36-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3d941bbc-2271-4ec4-853f-57feaf6ace36\") " pod="openstack/ovsdbserver-sb-0" Dec 08 09:21:16 crc kubenswrapper[4776]: I1208 09:21:16.821343 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3d941bbc-2271-4ec4-853f-57feaf6ace36-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: 
\"3d941bbc-2271-4ec4-853f-57feaf6ace36\") " pod="openstack/ovsdbserver-sb-0" Dec 08 09:21:16 crc kubenswrapper[4776]: I1208 09:21:16.821437 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d941bbc-2271-4ec4-853f-57feaf6ace36-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3d941bbc-2271-4ec4-853f-57feaf6ace36\") " pod="openstack/ovsdbserver-sb-0" Dec 08 09:21:16 crc kubenswrapper[4776]: I1208 09:21:16.821470 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghln6\" (UniqueName: \"kubernetes.io/projected/3d941bbc-2271-4ec4-853f-57feaf6ace36-kube-api-access-ghln6\") pod \"ovsdbserver-sb-0\" (UID: \"3d941bbc-2271-4ec4-853f-57feaf6ace36\") " pod="openstack/ovsdbserver-sb-0" Dec 08 09:21:16 crc kubenswrapper[4776]: I1208 09:21:16.821499 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"3d941bbc-2271-4ec4-853f-57feaf6ace36\") " pod="openstack/ovsdbserver-sb-0" Dec 08 09:21:16 crc kubenswrapper[4776]: I1208 09:21:16.821522 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d941bbc-2271-4ec4-853f-57feaf6ace36-config\") pod \"ovsdbserver-sb-0\" (UID: \"3d941bbc-2271-4ec4-853f-57feaf6ace36\") " pod="openstack/ovsdbserver-sb-0" Dec 08 09:21:16 crc kubenswrapper[4776]: I1208 09:21:16.822752 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"3d941bbc-2271-4ec4-853f-57feaf6ace36\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/ovsdbserver-sb-0" Dec 08 09:21:16 crc kubenswrapper[4776]: I1208 09:21:16.822761 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d941bbc-2271-4ec4-853f-57feaf6ace36-config\") pod \"ovsdbserver-sb-0\" (UID: \"3d941bbc-2271-4ec4-853f-57feaf6ace36\") " pod="openstack/ovsdbserver-sb-0" Dec 08 09:21:16 crc kubenswrapper[4776]: I1208 09:21:16.823560 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3d941bbc-2271-4ec4-853f-57feaf6ace36-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"3d941bbc-2271-4ec4-853f-57feaf6ace36\") " pod="openstack/ovsdbserver-sb-0" Dec 08 09:21:16 crc kubenswrapper[4776]: I1208 09:21:16.823584 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3d941bbc-2271-4ec4-853f-57feaf6ace36-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"3d941bbc-2271-4ec4-853f-57feaf6ace36\") " pod="openstack/ovsdbserver-sb-0" Dec 08 09:21:16 crc kubenswrapper[4776]: I1208 09:21:16.826924 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d941bbc-2271-4ec4-853f-57feaf6ace36-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3d941bbc-2271-4ec4-853f-57feaf6ace36\") " pod="openstack/ovsdbserver-sb-0" Dec 08 09:21:16 crc kubenswrapper[4776]: I1208 09:21:16.826996 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d941bbc-2271-4ec4-853f-57feaf6ace36-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3d941bbc-2271-4ec4-853f-57feaf6ace36\") " pod="openstack/ovsdbserver-sb-0" Dec 08 09:21:16 crc kubenswrapper[4776]: I1208 09:21:16.839050 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d941bbc-2271-4ec4-853f-57feaf6ace36-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: 
\"3d941bbc-2271-4ec4-853f-57feaf6ace36\") " pod="openstack/ovsdbserver-sb-0" Dec 08 09:21:16 crc kubenswrapper[4776]: I1208 09:21:16.840571 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghln6\" (UniqueName: \"kubernetes.io/projected/3d941bbc-2271-4ec4-853f-57feaf6ace36-kube-api-access-ghln6\") pod \"ovsdbserver-sb-0\" (UID: \"3d941bbc-2271-4ec4-853f-57feaf6ace36\") " pod="openstack/ovsdbserver-sb-0" Dec 08 09:21:16 crc kubenswrapper[4776]: I1208 09:21:16.854132 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"3d941bbc-2271-4ec4-853f-57feaf6ace36\") " pod="openstack/ovsdbserver-sb-0" Dec 08 09:21:16 crc kubenswrapper[4776]: I1208 09:21:16.936737 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 08 09:21:20 crc kubenswrapper[4776]: I1208 09:21:20.706108 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 08 09:21:21 crc kubenswrapper[4776]: I1208 09:21:21.061290 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c26a16d6-aae4-4ce2-b1cf-2a26ab0bfced","Type":"ContainerStarted","Data":"beca45767eb528dc45d690740fbcb90c6a2db5a9bff33c7d2ff81af1017e1e80"} Dec 08 09:21:22 crc kubenswrapper[4776]: W1208 09:21:22.679140 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod425d947a_2a85_4a03_853f_a60f54515a57.slice/crio-b10ea4522e7a36977fe26336a22345fab0964e984f33cd87512ad7474797668d WatchSource:0}: Error finding container b10ea4522e7a36977fe26336a22345fab0964e984f33cd87512ad7474797668d: Status 404 returned error can't find the container with id b10ea4522e7a36977fe26336a22345fab0964e984f33cd87512ad7474797668d Dec 08 09:21:22 crc 
kubenswrapper[4776]: E1208 09:21:22.697225 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 08 09:21:22 crc kubenswrapper[4776]: E1208 09:21:22.697363 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t8k97,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,Wind
owsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-dpq7z_openstack(d119edfa-fe1a-4db4-bcd9-370adc8e9d0a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 08 09:21:22 crc kubenswrapper[4776]: E1208 09:21:22.703505 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-dpq7z" podUID="d119edfa-fe1a-4db4-bcd9-370adc8e9d0a" Dec 08 09:21:22 crc kubenswrapper[4776]: E1208 09:21:22.719726 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 08 09:21:22 crc kubenswrapper[4776]: E1208 09:21:22.719887 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ktb54,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-vwtnw_openstack(998af490-d762-4ddf-a8ae-be4b9a3da989): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 08 09:21:22 crc kubenswrapper[4776]: E1208 09:21:22.721944 4776 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-vwtnw" podUID="998af490-d762-4ddf-a8ae-be4b9a3da989" Dec 08 09:21:23 crc kubenswrapper[4776]: I1208 09:21:23.095438 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"425d947a-2a85-4a03-853f-a60f54515a57","Type":"ContainerStarted","Data":"b10ea4522e7a36977fe26336a22345fab0964e984f33cd87512ad7474797668d"} Dec 08 09:21:23 crc kubenswrapper[4776]: I1208 09:21:23.226005 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7f7b6d8f8f-9sr59"] Dec 08 09:21:23 crc kubenswrapper[4776]: I1208 09:21:23.909093 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-7d5fb4cbfb-w69rl"] Dec 08 09:21:23 crc kubenswrapper[4776]: I1208 09:21:23.918190 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 08 09:21:24 crc kubenswrapper[4776]: I1208 09:21:24.083488 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 08 09:21:24 crc kubenswrapper[4776]: I1208 09:21:24.098208 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 08 09:21:24 crc kubenswrapper[4776]: I1208 09:21:24.115532 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f7b6d8f8f-9sr59" event={"ID":"89015ec5-de1a-4514-afe6-d60fd61e9f79","Type":"ContainerStarted","Data":"b2c05c01dc99e9656b2e4055daee7a64f24505da09c0ac91dc143920da66c651"} Dec 08 09:21:24 crc kubenswrapper[4776]: I1208 09:21:24.118560 4776 generic.go:334] "Generic (PLEG): container finished" podID="408d75b6-d5bf-4dfe-8a22-7ba887229cac" containerID="acc5c399ebdf8fe7cef1d2ee8c8874fbdeacd7e807cfb47feeed26b61ce4cf10" exitCode=0 Dec 08 09:21:24 crc kubenswrapper[4776]: 
I1208 09:21:24.118686 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-n4jfg" event={"ID":"408d75b6-d5bf-4dfe-8a22-7ba887229cac","Type":"ContainerDied","Data":"acc5c399ebdf8fe7cef1d2ee8c8874fbdeacd7e807cfb47feeed26b61ce4cf10"} Dec 08 09:21:24 crc kubenswrapper[4776]: I1208 09:21:24.283798 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wpgmk"] Dec 08 09:21:24 crc kubenswrapper[4776]: I1208 09:21:24.334796 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 08 09:21:24 crc kubenswrapper[4776]: I1208 09:21:24.437390 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 08 09:21:24 crc kubenswrapper[4776]: I1208 09:21:24.484891 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-dpq7z" Dec 08 09:21:24 crc kubenswrapper[4776]: I1208 09:21:24.651258 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8k97\" (UniqueName: \"kubernetes.io/projected/d119edfa-fe1a-4db4-bcd9-370adc8e9d0a-kube-api-access-t8k97\") pod \"d119edfa-fe1a-4db4-bcd9-370adc8e9d0a\" (UID: \"d119edfa-fe1a-4db4-bcd9-370adc8e9d0a\") " Dec 08 09:21:24 crc kubenswrapper[4776]: I1208 09:21:24.651377 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d119edfa-fe1a-4db4-bcd9-370adc8e9d0a-config\") pod \"d119edfa-fe1a-4db4-bcd9-370adc8e9d0a\" (UID: \"d119edfa-fe1a-4db4-bcd9-370adc8e9d0a\") " Dec 08 09:21:24 crc kubenswrapper[4776]: I1208 09:21:24.652352 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d119edfa-fe1a-4db4-bcd9-370adc8e9d0a-config" (OuterVolumeSpecName: "config") pod "d119edfa-fe1a-4db4-bcd9-370adc8e9d0a" (UID: "d119edfa-fe1a-4db4-bcd9-370adc8e9d0a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:21:24 crc kubenswrapper[4776]: I1208 09:21:24.665767 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d119edfa-fe1a-4db4-bcd9-370adc8e9d0a-kube-api-access-t8k97" (OuterVolumeSpecName: "kube-api-access-t8k97") pod "d119edfa-fe1a-4db4-bcd9-370adc8e9d0a" (UID: "d119edfa-fe1a-4db4-bcd9-370adc8e9d0a"). InnerVolumeSpecName "kube-api-access-t8k97". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:21:24 crc kubenswrapper[4776]: I1208 09:21:24.754761 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8k97\" (UniqueName: \"kubernetes.io/projected/d119edfa-fe1a-4db4-bcd9-370adc8e9d0a-kube-api-access-t8k97\") on node \"crc\" DevicePath \"\"" Dec 08 09:21:24 crc kubenswrapper[4776]: I1208 09:21:24.754809 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d119edfa-fe1a-4db4-bcd9-370adc8e9d0a-config\") on node \"crc\" DevicePath \"\"" Dec 08 09:21:24 crc kubenswrapper[4776]: I1208 09:21:24.800575 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-vwtnw" Dec 08 09:21:24 crc kubenswrapper[4776]: I1208 09:21:24.959609 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktb54\" (UniqueName: \"kubernetes.io/projected/998af490-d762-4ddf-a8ae-be4b9a3da989-kube-api-access-ktb54\") pod \"998af490-d762-4ddf-a8ae-be4b9a3da989\" (UID: \"998af490-d762-4ddf-a8ae-be4b9a3da989\") " Dec 08 09:21:24 crc kubenswrapper[4776]: I1208 09:21:24.959772 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/998af490-d762-4ddf-a8ae-be4b9a3da989-config\") pod \"998af490-d762-4ddf-a8ae-be4b9a3da989\" (UID: \"998af490-d762-4ddf-a8ae-be4b9a3da989\") " Dec 08 09:21:24 crc kubenswrapper[4776]: I1208 09:21:24.959836 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/998af490-d762-4ddf-a8ae-be4b9a3da989-dns-svc\") pod \"998af490-d762-4ddf-a8ae-be4b9a3da989\" (UID: \"998af490-d762-4ddf-a8ae-be4b9a3da989\") " Dec 08 09:21:24 crc kubenswrapper[4776]: I1208 09:21:24.961035 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/998af490-d762-4ddf-a8ae-be4b9a3da989-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "998af490-d762-4ddf-a8ae-be4b9a3da989" (UID: "998af490-d762-4ddf-a8ae-be4b9a3da989"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:21:24 crc kubenswrapper[4776]: I1208 09:21:24.961567 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/998af490-d762-4ddf-a8ae-be4b9a3da989-config" (OuterVolumeSpecName: "config") pod "998af490-d762-4ddf-a8ae-be4b9a3da989" (UID: "998af490-d762-4ddf-a8ae-be4b9a3da989"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:21:24 crc kubenswrapper[4776]: I1208 09:21:24.964298 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/998af490-d762-4ddf-a8ae-be4b9a3da989-kube-api-access-ktb54" (OuterVolumeSpecName: "kube-api-access-ktb54") pod "998af490-d762-4ddf-a8ae-be4b9a3da989" (UID: "998af490-d762-4ddf-a8ae-be4b9a3da989"). InnerVolumeSpecName "kube-api-access-ktb54". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:21:25 crc kubenswrapper[4776]: I1208 09:21:25.061834 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktb54\" (UniqueName: \"kubernetes.io/projected/998af490-d762-4ddf-a8ae-be4b9a3da989-kube-api-access-ktb54\") on node \"crc\" DevicePath \"\"" Dec 08 09:21:25 crc kubenswrapper[4776]: I1208 09:21:25.061869 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/998af490-d762-4ddf-a8ae-be4b9a3da989-config\") on node \"crc\" DevicePath \"\"" Dec 08 09:21:25 crc kubenswrapper[4776]: I1208 09:21:25.061881 4776 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/998af490-d762-4ddf-a8ae-be4b9a3da989-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 08 09:21:25 crc kubenswrapper[4776]: I1208 09:21:25.111677 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-5tfbk"] Dec 08 09:21:25 crc kubenswrapper[4776]: I1208 09:21:25.129861 4776 generic.go:334] "Generic (PLEG): container finished" podID="639910a7-1d35-4535-b629-18fe52dacac3" containerID="95a4e3da775ed1ad21f34cbb0678b1fb4d2f998e946648eca2f6eafd5d97adce" exitCode=0 Dec 08 09:21:25 crc kubenswrapper[4776]: I1208 09:21:25.130074 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-9d9dp" 
event={"ID":"639910a7-1d35-4535-b629-18fe52dacac3","Type":"ContainerDied","Data":"95a4e3da775ed1ad21f34cbb0678b1fb4d2f998e946648eca2f6eafd5d97adce"} Dec 08 09:21:25 crc kubenswrapper[4776]: I1208 09:21:25.139416 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994","Type":"ContainerStarted","Data":"4c055d7aed43594abdf15af1327c7f77c2e5b1d61e62e8c3406e155a3f4672e3"} Dec 08 09:21:25 crc kubenswrapper[4776]: I1208 09:21:25.142610 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a01574f0-d8c8-404a-b822-7ce8e0af6fd4","Type":"ContainerStarted","Data":"b6fb7c3067a1dc9a57114d5f89cfbc05711c41bf27c557af79d1dbb9fcb89acd"} Dec 08 09:21:25 crc kubenswrapper[4776]: I1208 09:21:25.144658 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"3d941bbc-2271-4ec4-853f-57feaf6ace36","Type":"ContainerStarted","Data":"6756495c6a89b14fd753fd1d1857a3ef81b5ffa607bc153190bb2ff5e87b020f"} Dec 08 09:21:25 crc kubenswrapper[4776]: I1208 09:21:25.147496 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"981d14af-244f-4679-975d-58e11df95718","Type":"ContainerStarted","Data":"3766dae377c3df43c6cdf089790829b56618e1eb5ffc7848576ab4bf108958b0"} Dec 08 09:21:25 crc kubenswrapper[4776]: I1208 09:21:25.155382 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-w69rl" event={"ID":"251557fb-f870-4b8c-8725-648a8cd97fca","Type":"ContainerStarted","Data":"6accc246f15a415a5d7a99294962e06355e4d541213b2b6309bae2fd26c2c379"} Dec 08 09:21:25 crc kubenswrapper[4776]: I1208 09:21:25.157674 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-dpq7z" 
event={"ID":"d119edfa-fe1a-4db4-bcd9-370adc8e9d0a","Type":"ContainerDied","Data":"70273d65493b8fe92bec6aea20839d73faaef35ffa87fe167c4cda69185f8527"} Dec 08 09:21:25 crc kubenswrapper[4776]: I1208 09:21:25.157790 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-dpq7z" Dec 08 09:21:25 crc kubenswrapper[4776]: I1208 09:21:25.181844 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-vwtnw" event={"ID":"998af490-d762-4ddf-a8ae-be4b9a3da989","Type":"ContainerDied","Data":"ed276c9e9dd4877b4a99ec95924cb8c795201be0db44717568e2e5d11c5d9ab7"} Dec 08 09:21:25 crc kubenswrapper[4776]: I1208 09:21:25.181935 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-vwtnw" Dec 08 09:21:25 crc kubenswrapper[4776]: I1208 09:21:25.184290 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"0e4de746-d269-470c-b934-117aa4c73834","Type":"ContainerStarted","Data":"76cce8dd975912661a203fc9733213a2c1ac1005f0e09464d76b49fff9bf04fd"} Dec 08 09:21:25 crc kubenswrapper[4776]: I1208 09:21:25.188564 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"7df4120e-0e93-4000-8b6a-7823f3e89dac","Type":"ContainerStarted","Data":"e48d938af3c69882c9b825e2f0fdece12a82cef954e77ede5d5181dac928bc26"} Dec 08 09:21:25 crc kubenswrapper[4776]: I1208 09:21:25.196215 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"786c0b37-638a-4b59-b149-628d9ad828bc","Type":"ContainerStarted","Data":"b20b4c2bb3bcbc19107993d351b26cc2cb3321013049808d939c340dc34f8093"} Dec 08 09:21:25 crc kubenswrapper[4776]: I1208 09:21:25.206578 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wpgmk" 
event={"ID":"9a9a1b68-ec7e-4994-9bda-fd418747dbc5","Type":"ContainerStarted","Data":"80d87a5826d96cbc10918073120f98631f4463d3c3eea99ca214037e2511aa2b"} Dec 08 09:21:25 crc kubenswrapper[4776]: I1208 09:21:25.257236 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-dpq7z"] Dec 08 09:21:25 crc kubenswrapper[4776]: I1208 09:21:25.268120 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-dpq7z"] Dec 08 09:21:25 crc kubenswrapper[4776]: I1208 09:21:25.294410 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-vwtnw"] Dec 08 09:21:25 crc kubenswrapper[4776]: I1208 09:21:25.310705 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-vwtnw"] Dec 08 09:21:26 crc kubenswrapper[4776]: I1208 09:21:26.217488 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f7b6d8f8f-9sr59" event={"ID":"89015ec5-de1a-4514-afe6-d60fd61e9f79","Type":"ContainerStarted","Data":"f639267dee1eab2ee4c4159efbda2a728d42683cb37b8f2e025a8283d356d242"} Dec 08 09:21:26 crc kubenswrapper[4776]: I1208 09:21:26.219002 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5tfbk" event={"ID":"215a9444-a545-491d-9eb6-02d98baff784","Type":"ContainerStarted","Data":"5e940caf9313190ec45c73b205e97c07683818975cf37780e31eab8f1c14a746"} Dec 08 09:21:26 crc kubenswrapper[4776]: I1208 09:21:26.244393 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7f7b6d8f8f-9sr59" podStartSLOduration=16.244367744 podStartE2EDuration="16.244367744s" podCreationTimestamp="2025-12-08 09:21:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:21:26.236657797 +0000 UTC m=+1362.499882819" watchObservedRunningTime="2025-12-08 09:21:26.244367744 +0000 UTC 
m=+1362.507592766" Dec 08 09:21:26 crc kubenswrapper[4776]: I1208 09:21:26.355730 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="998af490-d762-4ddf-a8ae-be4b9a3da989" path="/var/lib/kubelet/pods/998af490-d762-4ddf-a8ae-be4b9a3da989/volumes" Dec 08 09:21:26 crc kubenswrapper[4776]: I1208 09:21:26.356101 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d119edfa-fe1a-4db4-bcd9-370adc8e9d0a" path="/var/lib/kubelet/pods/d119edfa-fe1a-4db4-bcd9-370adc8e9d0a/volumes" Dec 08 09:21:30 crc kubenswrapper[4776]: I1208 09:21:30.968930 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7f7b6d8f8f-9sr59" Dec 08 09:21:30 crc kubenswrapper[4776]: I1208 09:21:30.969497 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7f7b6d8f8f-9sr59" Dec 08 09:21:30 crc kubenswrapper[4776]: I1208 09:21:30.974319 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7f7b6d8f8f-9sr59" Dec 08 09:21:31 crc kubenswrapper[4776]: I1208 09:21:31.283752 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-n4jfg" event={"ID":"408d75b6-d5bf-4dfe-8a22-7ba887229cac","Type":"ContainerStarted","Data":"af55f04f76e7c1c673e1bc91ad1bdf487a7511e129a136cc3114fdd075d9eef1"} Dec 08 09:21:31 crc kubenswrapper[4776]: I1208 09:21:31.284530 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-n4jfg" Dec 08 09:21:31 crc kubenswrapper[4776]: I1208 09:21:31.286846 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-9d9dp" event={"ID":"639910a7-1d35-4535-b629-18fe52dacac3","Type":"ContainerStarted","Data":"79f9c555ec508b9fafcfc74237127a52faca425323d81a57441c56b165238912"} Dec 08 09:21:31 crc kubenswrapper[4776]: I1208 09:21:31.286999 4776 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-9d9dp" Dec 08 09:21:31 crc kubenswrapper[4776]: I1208 09:21:31.290977 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7f7b6d8f8f-9sr59" Dec 08 09:21:31 crc kubenswrapper[4776]: I1208 09:21:31.303759 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-n4jfg" podStartSLOduration=9.333617503 podStartE2EDuration="28.303739831s" podCreationTimestamp="2025-12-08 09:21:03 +0000 UTC" firstStartedPulling="2025-12-08 09:21:03.99733851 +0000 UTC m=+1340.260563532" lastFinishedPulling="2025-12-08 09:21:22.967460838 +0000 UTC m=+1359.230685860" observedRunningTime="2025-12-08 09:21:31.299153447 +0000 UTC m=+1367.562378469" watchObservedRunningTime="2025-12-08 09:21:31.303739831 +0000 UTC m=+1367.566964853" Dec 08 09:21:31 crc kubenswrapper[4776]: I1208 09:21:31.354764 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-9d9dp" podStartSLOduration=9.728453543 podStartE2EDuration="28.354745759s" podCreationTimestamp="2025-12-08 09:21:03 +0000 UTC" firstStartedPulling="2025-12-08 09:21:04.340849113 +0000 UTC m=+1340.604074135" lastFinishedPulling="2025-12-08 09:21:22.967141329 +0000 UTC m=+1359.230366351" observedRunningTime="2025-12-08 09:21:31.351375089 +0000 UTC m=+1367.614600111" watchObservedRunningTime="2025-12-08 09:21:31.354745759 +0000 UTC m=+1367.617970781" Dec 08 09:21:31 crc kubenswrapper[4776]: I1208 09:21:31.371905 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-588757d595-b54s9"] Dec 08 09:21:31 crc kubenswrapper[4776]: I1208 09:21:31.639048 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-zn4qk"] Dec 08 09:21:31 crc kubenswrapper[4776]: I1208 09:21:31.640575 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-zn4qk" Dec 08 09:21:31 crc kubenswrapper[4776]: I1208 09:21:31.644546 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 08 09:21:31 crc kubenswrapper[4776]: I1208 09:21:31.667279 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-zn4qk"] Dec 08 09:21:31 crc kubenswrapper[4776]: I1208 09:21:31.734718 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e843ce72-b4b1-4603-8876-05dc121793ed-ovs-rundir\") pod \"ovn-controller-metrics-zn4qk\" (UID: \"e843ce72-b4b1-4603-8876-05dc121793ed\") " pod="openstack/ovn-controller-metrics-zn4qk" Dec 08 09:21:31 crc kubenswrapper[4776]: I1208 09:21:31.734777 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e843ce72-b4b1-4603-8876-05dc121793ed-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-zn4qk\" (UID: \"e843ce72-b4b1-4603-8876-05dc121793ed\") " pod="openstack/ovn-controller-metrics-zn4qk" Dec 08 09:21:31 crc kubenswrapper[4776]: I1208 09:21:31.734884 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e843ce72-b4b1-4603-8876-05dc121793ed-ovn-rundir\") pod \"ovn-controller-metrics-zn4qk\" (UID: \"e843ce72-b4b1-4603-8876-05dc121793ed\") " pod="openstack/ovn-controller-metrics-zn4qk" Dec 08 09:21:31 crc kubenswrapper[4776]: I1208 09:21:31.734915 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e843ce72-b4b1-4603-8876-05dc121793ed-config\") pod \"ovn-controller-metrics-zn4qk\" (UID: \"e843ce72-b4b1-4603-8876-05dc121793ed\") " 
pod="openstack/ovn-controller-metrics-zn4qk" Dec 08 09:21:31 crc kubenswrapper[4776]: I1208 09:21:31.734942 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e843ce72-b4b1-4603-8876-05dc121793ed-combined-ca-bundle\") pod \"ovn-controller-metrics-zn4qk\" (UID: \"e843ce72-b4b1-4603-8876-05dc121793ed\") " pod="openstack/ovn-controller-metrics-zn4qk" Dec 08 09:21:31 crc kubenswrapper[4776]: I1208 09:21:31.735063 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czcsl\" (UniqueName: \"kubernetes.io/projected/e843ce72-b4b1-4603-8876-05dc121793ed-kube-api-access-czcsl\") pod \"ovn-controller-metrics-zn4qk\" (UID: \"e843ce72-b4b1-4603-8876-05dc121793ed\") " pod="openstack/ovn-controller-metrics-zn4qk" Dec 08 09:21:31 crc kubenswrapper[4776]: I1208 09:21:31.837757 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e843ce72-b4b1-4603-8876-05dc121793ed-ovn-rundir\") pod \"ovn-controller-metrics-zn4qk\" (UID: \"e843ce72-b4b1-4603-8876-05dc121793ed\") " pod="openstack/ovn-controller-metrics-zn4qk" Dec 08 09:21:31 crc kubenswrapper[4776]: I1208 09:21:31.837804 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e843ce72-b4b1-4603-8876-05dc121793ed-config\") pod \"ovn-controller-metrics-zn4qk\" (UID: \"e843ce72-b4b1-4603-8876-05dc121793ed\") " pod="openstack/ovn-controller-metrics-zn4qk" Dec 08 09:21:31 crc kubenswrapper[4776]: I1208 09:21:31.837828 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e843ce72-b4b1-4603-8876-05dc121793ed-combined-ca-bundle\") pod \"ovn-controller-metrics-zn4qk\" (UID: \"e843ce72-b4b1-4603-8876-05dc121793ed\") " 
pod="openstack/ovn-controller-metrics-zn4qk" Dec 08 09:21:31 crc kubenswrapper[4776]: I1208 09:21:31.837923 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czcsl\" (UniqueName: \"kubernetes.io/projected/e843ce72-b4b1-4603-8876-05dc121793ed-kube-api-access-czcsl\") pod \"ovn-controller-metrics-zn4qk\" (UID: \"e843ce72-b4b1-4603-8876-05dc121793ed\") " pod="openstack/ovn-controller-metrics-zn4qk" Dec 08 09:21:31 crc kubenswrapper[4776]: I1208 09:21:31.837950 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e843ce72-b4b1-4603-8876-05dc121793ed-ovs-rundir\") pod \"ovn-controller-metrics-zn4qk\" (UID: \"e843ce72-b4b1-4603-8876-05dc121793ed\") " pod="openstack/ovn-controller-metrics-zn4qk" Dec 08 09:21:31 crc kubenswrapper[4776]: I1208 09:21:31.837966 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e843ce72-b4b1-4603-8876-05dc121793ed-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-zn4qk\" (UID: \"e843ce72-b4b1-4603-8876-05dc121793ed\") " pod="openstack/ovn-controller-metrics-zn4qk" Dec 08 09:21:31 crc kubenswrapper[4776]: I1208 09:21:31.839542 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e843ce72-b4b1-4603-8876-05dc121793ed-ovn-rundir\") pod \"ovn-controller-metrics-zn4qk\" (UID: \"e843ce72-b4b1-4603-8876-05dc121793ed\") " pod="openstack/ovn-controller-metrics-zn4qk" Dec 08 09:21:31 crc kubenswrapper[4776]: I1208 09:21:31.839645 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e843ce72-b4b1-4603-8876-05dc121793ed-ovs-rundir\") pod \"ovn-controller-metrics-zn4qk\" (UID: \"e843ce72-b4b1-4603-8876-05dc121793ed\") " pod="openstack/ovn-controller-metrics-zn4qk" Dec 08 09:21:31 crc 
kubenswrapper[4776]: I1208 09:21:31.840570 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e843ce72-b4b1-4603-8876-05dc121793ed-config\") pod \"ovn-controller-metrics-zn4qk\" (UID: \"e843ce72-b4b1-4603-8876-05dc121793ed\") " pod="openstack/ovn-controller-metrics-zn4qk" Dec 08 09:21:31 crc kubenswrapper[4776]: I1208 09:21:31.844752 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e843ce72-b4b1-4603-8876-05dc121793ed-combined-ca-bundle\") pod \"ovn-controller-metrics-zn4qk\" (UID: \"e843ce72-b4b1-4603-8876-05dc121793ed\") " pod="openstack/ovn-controller-metrics-zn4qk" Dec 08 09:21:31 crc kubenswrapper[4776]: I1208 09:21:31.846838 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e843ce72-b4b1-4603-8876-05dc121793ed-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-zn4qk\" (UID: \"e843ce72-b4b1-4603-8876-05dc121793ed\") " pod="openstack/ovn-controller-metrics-zn4qk" Dec 08 09:21:31 crc kubenswrapper[4776]: I1208 09:21:31.857676 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czcsl\" (UniqueName: \"kubernetes.io/projected/e843ce72-b4b1-4603-8876-05dc121793ed-kube-api-access-czcsl\") pod \"ovn-controller-metrics-zn4qk\" (UID: \"e843ce72-b4b1-4603-8876-05dc121793ed\") " pod="openstack/ovn-controller-metrics-zn4qk" Dec 08 09:21:31 crc kubenswrapper[4776]: I1208 09:21:31.974843 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-9d9dp"] Dec 08 09:21:31 crc kubenswrapper[4776]: I1208 09:21:31.977245 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-zn4qk" Dec 08 09:21:32 crc kubenswrapper[4776]: I1208 09:21:32.007640 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-cc9cp"] Dec 08 09:21:32 crc kubenswrapper[4776]: I1208 09:21:32.009677 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-cc9cp" Dec 08 09:21:32 crc kubenswrapper[4776]: I1208 09:21:32.012522 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 08 09:21:32 crc kubenswrapper[4776]: I1208 09:21:32.021848 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-cc9cp"] Dec 08 09:21:32 crc kubenswrapper[4776]: I1208 09:21:32.143039 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/251ffea1-23ed-46e0-8683-0600c4176e26-config\") pod \"dnsmasq-dns-7fd796d7df-cc9cp\" (UID: \"251ffea1-23ed-46e0-8683-0600c4176e26\") " pod="openstack/dnsmasq-dns-7fd796d7df-cc9cp" Dec 08 09:21:32 crc kubenswrapper[4776]: I1208 09:21:32.143085 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/251ffea1-23ed-46e0-8683-0600c4176e26-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-cc9cp\" (UID: \"251ffea1-23ed-46e0-8683-0600c4176e26\") " pod="openstack/dnsmasq-dns-7fd796d7df-cc9cp" Dec 08 09:21:32 crc kubenswrapper[4776]: I1208 09:21:32.143234 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/251ffea1-23ed-46e0-8683-0600c4176e26-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-cc9cp\" (UID: \"251ffea1-23ed-46e0-8683-0600c4176e26\") " pod="openstack/dnsmasq-dns-7fd796d7df-cc9cp" Dec 08 09:21:32 crc kubenswrapper[4776]: I1208 09:21:32.143282 
4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgkwz\" (UniqueName: \"kubernetes.io/projected/251ffea1-23ed-46e0-8683-0600c4176e26-kube-api-access-rgkwz\") pod \"dnsmasq-dns-7fd796d7df-cc9cp\" (UID: \"251ffea1-23ed-46e0-8683-0600c4176e26\") " pod="openstack/dnsmasq-dns-7fd796d7df-cc9cp" Dec 08 09:21:32 crc kubenswrapper[4776]: I1208 09:21:32.179451 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-n4jfg"] Dec 08 09:21:32 crc kubenswrapper[4776]: I1208 09:21:32.213429 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-kx9kd"] Dec 08 09:21:32 crc kubenswrapper[4776]: I1208 09:21:32.215114 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-kx9kd" Dec 08 09:21:32 crc kubenswrapper[4776]: I1208 09:21:32.220198 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 08 09:21:32 crc kubenswrapper[4776]: I1208 09:21:32.243893 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-kx9kd"] Dec 08 09:21:32 crc kubenswrapper[4776]: I1208 09:21:32.244872 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/251ffea1-23ed-46e0-8683-0600c4176e26-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-cc9cp\" (UID: \"251ffea1-23ed-46e0-8683-0600c4176e26\") " pod="openstack/dnsmasq-dns-7fd796d7df-cc9cp" Dec 08 09:21:32 crc kubenswrapper[4776]: I1208 09:21:32.244948 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgkwz\" (UniqueName: \"kubernetes.io/projected/251ffea1-23ed-46e0-8683-0600c4176e26-kube-api-access-rgkwz\") pod \"dnsmasq-dns-7fd796d7df-cc9cp\" (UID: \"251ffea1-23ed-46e0-8683-0600c4176e26\") " pod="openstack/dnsmasq-dns-7fd796d7df-cc9cp" Dec 08 09:21:32 crc 
kubenswrapper[4776]: I1208 09:21:32.245017 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/251ffea1-23ed-46e0-8683-0600c4176e26-config\") pod \"dnsmasq-dns-7fd796d7df-cc9cp\" (UID: \"251ffea1-23ed-46e0-8683-0600c4176e26\") " pod="openstack/dnsmasq-dns-7fd796d7df-cc9cp" Dec 08 09:21:32 crc kubenswrapper[4776]: I1208 09:21:32.245041 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/251ffea1-23ed-46e0-8683-0600c4176e26-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-cc9cp\" (UID: \"251ffea1-23ed-46e0-8683-0600c4176e26\") " pod="openstack/dnsmasq-dns-7fd796d7df-cc9cp" Dec 08 09:21:32 crc kubenswrapper[4776]: I1208 09:21:32.245881 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/251ffea1-23ed-46e0-8683-0600c4176e26-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-cc9cp\" (UID: \"251ffea1-23ed-46e0-8683-0600c4176e26\") " pod="openstack/dnsmasq-dns-7fd796d7df-cc9cp" Dec 08 09:21:32 crc kubenswrapper[4776]: I1208 09:21:32.246566 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/251ffea1-23ed-46e0-8683-0600c4176e26-config\") pod \"dnsmasq-dns-7fd796d7df-cc9cp\" (UID: \"251ffea1-23ed-46e0-8683-0600c4176e26\") " pod="openstack/dnsmasq-dns-7fd796d7df-cc9cp" Dec 08 09:21:32 crc kubenswrapper[4776]: I1208 09:21:32.246565 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/251ffea1-23ed-46e0-8683-0600c4176e26-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-cc9cp\" (UID: \"251ffea1-23ed-46e0-8683-0600c4176e26\") " pod="openstack/dnsmasq-dns-7fd796d7df-cc9cp" Dec 08 09:21:32 crc kubenswrapper[4776]: I1208 09:21:32.280269 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-rgkwz\" (UniqueName: \"kubernetes.io/projected/251ffea1-23ed-46e0-8683-0600c4176e26-kube-api-access-rgkwz\") pod \"dnsmasq-dns-7fd796d7df-cc9cp\" (UID: \"251ffea1-23ed-46e0-8683-0600c4176e26\") " pod="openstack/dnsmasq-dns-7fd796d7df-cc9cp" Dec 08 09:21:32 crc kubenswrapper[4776]: I1208 09:21:32.341993 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-cc9cp" Dec 08 09:21:32 crc kubenswrapper[4776]: I1208 09:21:32.356455 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5znql\" (UniqueName: \"kubernetes.io/projected/fd497a67-79f7-4a1a-b0de-eb6fdcc524ae-kube-api-access-5znql\") pod \"dnsmasq-dns-86db49b7ff-kx9kd\" (UID: \"fd497a67-79f7-4a1a-b0de-eb6fdcc524ae\") " pod="openstack/dnsmasq-dns-86db49b7ff-kx9kd" Dec 08 09:21:32 crc kubenswrapper[4776]: I1208 09:21:32.356695 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd497a67-79f7-4a1a-b0de-eb6fdcc524ae-config\") pod \"dnsmasq-dns-86db49b7ff-kx9kd\" (UID: \"fd497a67-79f7-4a1a-b0de-eb6fdcc524ae\") " pod="openstack/dnsmasq-dns-86db49b7ff-kx9kd" Dec 08 09:21:32 crc kubenswrapper[4776]: I1208 09:21:32.356785 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd497a67-79f7-4a1a-b0de-eb6fdcc524ae-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-kx9kd\" (UID: \"fd497a67-79f7-4a1a-b0de-eb6fdcc524ae\") " pod="openstack/dnsmasq-dns-86db49b7ff-kx9kd" Dec 08 09:21:32 crc kubenswrapper[4776]: I1208 09:21:32.356810 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fd497a67-79f7-4a1a-b0de-eb6fdcc524ae-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-kx9kd\" (UID: 
\"fd497a67-79f7-4a1a-b0de-eb6fdcc524ae\") " pod="openstack/dnsmasq-dns-86db49b7ff-kx9kd" Dec 08 09:21:32 crc kubenswrapper[4776]: I1208 09:21:32.357756 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fd497a67-79f7-4a1a-b0de-eb6fdcc524ae-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-kx9kd\" (UID: \"fd497a67-79f7-4a1a-b0de-eb6fdcc524ae\") " pod="openstack/dnsmasq-dns-86db49b7ff-kx9kd" Dec 08 09:21:32 crc kubenswrapper[4776]: I1208 09:21:32.459609 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fd497a67-79f7-4a1a-b0de-eb6fdcc524ae-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-kx9kd\" (UID: \"fd497a67-79f7-4a1a-b0de-eb6fdcc524ae\") " pod="openstack/dnsmasq-dns-86db49b7ff-kx9kd" Dec 08 09:21:32 crc kubenswrapper[4776]: I1208 09:21:32.459738 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5znql\" (UniqueName: \"kubernetes.io/projected/fd497a67-79f7-4a1a-b0de-eb6fdcc524ae-kube-api-access-5znql\") pod \"dnsmasq-dns-86db49b7ff-kx9kd\" (UID: \"fd497a67-79f7-4a1a-b0de-eb6fdcc524ae\") " pod="openstack/dnsmasq-dns-86db49b7ff-kx9kd" Dec 08 09:21:32 crc kubenswrapper[4776]: I1208 09:21:32.459893 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd497a67-79f7-4a1a-b0de-eb6fdcc524ae-config\") pod \"dnsmasq-dns-86db49b7ff-kx9kd\" (UID: \"fd497a67-79f7-4a1a-b0de-eb6fdcc524ae\") " pod="openstack/dnsmasq-dns-86db49b7ff-kx9kd" Dec 08 09:21:32 crc kubenswrapper[4776]: I1208 09:21:32.459949 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd497a67-79f7-4a1a-b0de-eb6fdcc524ae-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-kx9kd\" (UID: \"fd497a67-79f7-4a1a-b0de-eb6fdcc524ae\") " 
pod="openstack/dnsmasq-dns-86db49b7ff-kx9kd" Dec 08 09:21:32 crc kubenswrapper[4776]: I1208 09:21:32.459967 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fd497a67-79f7-4a1a-b0de-eb6fdcc524ae-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-kx9kd\" (UID: \"fd497a67-79f7-4a1a-b0de-eb6fdcc524ae\") " pod="openstack/dnsmasq-dns-86db49b7ff-kx9kd" Dec 08 09:21:32 crc kubenswrapper[4776]: I1208 09:21:32.461896 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fd497a67-79f7-4a1a-b0de-eb6fdcc524ae-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-kx9kd\" (UID: \"fd497a67-79f7-4a1a-b0de-eb6fdcc524ae\") " pod="openstack/dnsmasq-dns-86db49b7ff-kx9kd" Dec 08 09:21:32 crc kubenswrapper[4776]: I1208 09:21:32.462257 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd497a67-79f7-4a1a-b0de-eb6fdcc524ae-config\") pod \"dnsmasq-dns-86db49b7ff-kx9kd\" (UID: \"fd497a67-79f7-4a1a-b0de-eb6fdcc524ae\") " pod="openstack/dnsmasq-dns-86db49b7ff-kx9kd" Dec 08 09:21:32 crc kubenswrapper[4776]: I1208 09:21:32.463108 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fd497a67-79f7-4a1a-b0de-eb6fdcc524ae-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-kx9kd\" (UID: \"fd497a67-79f7-4a1a-b0de-eb6fdcc524ae\") " pod="openstack/dnsmasq-dns-86db49b7ff-kx9kd" Dec 08 09:21:32 crc kubenswrapper[4776]: I1208 09:21:32.463473 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd497a67-79f7-4a1a-b0de-eb6fdcc524ae-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-kx9kd\" (UID: \"fd497a67-79f7-4a1a-b0de-eb6fdcc524ae\") " pod="openstack/dnsmasq-dns-86db49b7ff-kx9kd" Dec 08 09:21:32 crc kubenswrapper[4776]: I1208 09:21:32.491815 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5znql\" (UniqueName: \"kubernetes.io/projected/fd497a67-79f7-4a1a-b0de-eb6fdcc524ae-kube-api-access-5znql\") pod \"dnsmasq-dns-86db49b7ff-kx9kd\" (UID: \"fd497a67-79f7-4a1a-b0de-eb6fdcc524ae\") " pod="openstack/dnsmasq-dns-86db49b7ff-kx9kd" Dec 08 09:21:32 crc kubenswrapper[4776]: I1208 09:21:32.546338 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-kx9kd" Dec 08 09:21:33 crc kubenswrapper[4776]: I1208 09:21:33.308512 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-9d9dp" podUID="639910a7-1d35-4535-b629-18fe52dacac3" containerName="dnsmasq-dns" containerID="cri-o://79f9c555ec508b9fafcfc74237127a52faca425323d81a57441c56b165238912" gracePeriod=10 Dec 08 09:21:33 crc kubenswrapper[4776]: I1208 09:21:33.308445 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-n4jfg" podUID="408d75b6-d5bf-4dfe-8a22-7ba887229cac" containerName="dnsmasq-dns" containerID="cri-o://af55f04f76e7c1c673e1bc91ad1bdf487a7511e129a136cc3114fdd075d9eef1" gracePeriod=10 Dec 08 09:21:34 crc kubenswrapper[4776]: I1208 09:21:34.319739 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-n4jfg" event={"ID":"408d75b6-d5bf-4dfe-8a22-7ba887229cac","Type":"ContainerDied","Data":"af55f04f76e7c1c673e1bc91ad1bdf487a7511e129a136cc3114fdd075d9eef1"} Dec 08 09:21:34 crc kubenswrapper[4776]: I1208 09:21:34.320048 4776 generic.go:334] "Generic (PLEG): container finished" podID="408d75b6-d5bf-4dfe-8a22-7ba887229cac" containerID="af55f04f76e7c1c673e1bc91ad1bdf487a7511e129a136cc3114fdd075d9eef1" exitCode=0 Dec 08 09:21:34 crc kubenswrapper[4776]: I1208 09:21:34.324620 4776 generic.go:334] "Generic (PLEG): container finished" podID="639910a7-1d35-4535-b629-18fe52dacac3" 
containerID="79f9c555ec508b9fafcfc74237127a52faca425323d81a57441c56b165238912" exitCode=0 Dec 08 09:21:34 crc kubenswrapper[4776]: I1208 09:21:34.324661 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-9d9dp" event={"ID":"639910a7-1d35-4535-b629-18fe52dacac3","Type":"ContainerDied","Data":"79f9c555ec508b9fafcfc74237127a52faca425323d81a57441c56b165238912"} Dec 08 09:21:38 crc kubenswrapper[4776]: I1208 09:21:38.431971 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-666b6646f7-n4jfg" podUID="408d75b6-d5bf-4dfe-8a22-7ba887229cac" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.126:5353: connect: connection refused" Dec 08 09:21:38 crc kubenswrapper[4776]: E1208 09:21:38.693269 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-memcached:current-podified" Dec 08 09:21:38 crc kubenswrapper[4776]: E1208 09:21:38.693888 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,Command:[/usr/bin/dumb-init -- 
/usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:n96h655hdch566h698h5d6hdfh566h54dh59ch67dh96h84h74h684h677h8fh645h9ch89h57fh56bh86h76h77h59bh5c7h657h67bh64fh58hccq,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/memcached.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/memcached.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m7tfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 
},Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_openstack(981d14af-244f-4679-975d-58e11df95718): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 08 09:21:38 crc kubenswrapper[4776]: E1208 09:21:38.695417 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/memcached-0" podUID="981d14af-244f-4679-975d-58e11df95718" Dec 08 09:21:38 crc kubenswrapper[4776]: I1208 09:21:38.743303 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57d769cc4f-9d9dp" podUID="639910a7-1d35-4535-b629-18fe52dacac3" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: connect: connection refused" Dec 08 09:21:38 crc kubenswrapper[4776]: E1208 09:21:38.914982 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled 
desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified" Dec 08 09:21:38 crc kubenswrapper[4776]: E1208 09:21:38.915349 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ovsdbserver-sb,Image:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,Command:[/usr/bin/dumb-init],Args:[/usr/local/bin/container-scripts/setup.sh],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n564h97h5ddhb7h5c8h57bh8bh67h5b5h5d5h676h58bh599h686h66fh566h54ch68h65fhchb9hdfhb5h57ch97h9bhb5h56dhb4h69h77h654q,ValueFrom:nil,},EnvVar{Name:OVN_LOGDIR,Value:/tmp,ValueFrom:nil,},EnvVar{Name:OVN_RUNDIR,Value:/tmp,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovndbcluster-sb-etc-ovn,ReadOnly:false,MountPath:/etc/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdb-rundir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveR
eadOnly:nil,},VolumeMount{Name:kube-api-access-ghln6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/cleanup.sh],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:20,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovsdbserver-sb-0_openstack(3d941bbc-2271-4ec4-853f-57feaf6ace36): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 08 09:21:39 crc kubenswrapper[4776]: E1208 
09:21:39.082314 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified" Dec 08 09:21:39 crc kubenswrapper[4776]: E1208 09:21:39.082476 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ovn-controller,Image:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,Command:[ovn-controller --pidfile unix:/run/openvswitch/db.sock --certificate=/etc/pki/tls/certs/ovndb.crt --private-key=/etc/pki/tls/private/ovndb.key --ca-cert=/etc/pki/tls/certs/ovndbca.crt],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n9bhb4h558h5b5h65chcch9h5fch584h58h5c8h68bhf4h6ch664h559hc8h68fhb4hf8hddh668h5f9h6dh9bh576h65dh59h658hcfh88h5d5q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:var-run,ReadOnly:false,MountPath:/var/run/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-run-ovn,ReadOnly:false,MountPath:/var/run/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-log-ovn,ReadOnly:false,MountPath:/var/log/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca
.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k2k22,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/ovn_controller_liveness.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/ovn_controller_readiness.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/share/ovn/scripts/ovn-ctl stop_controller],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN SYS_ADMIN SYS_NICE],Drop:[],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-controller-wpgmk_openstack(9a9a1b68-ec7e-4994-9bda-fd418747dbc5): ErrImagePull: rpc error: code = Canceled desc = copying config: 
context canceled" logger="UnhandledError" Dec 08 09:21:39 crc kubenswrapper[4776]: E1208 09:21:39.083790 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-controller\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovn-controller-wpgmk" podUID="9a9a1b68-ec7e-4994-9bda-fd418747dbc5" Dec 08 09:21:39 crc kubenswrapper[4776]: E1208 09:21:39.384662 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-controller\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified\\\"\"" pod="openstack/ovn-controller-wpgmk" podUID="9a9a1b68-ec7e-4994-9bda-fd418747dbc5" Dec 08 09:21:39 crc kubenswrapper[4776]: E1208 09:21:39.399627 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-memcached:current-podified\\\"\"" pod="openstack/memcached-0" podUID="981d14af-244f-4679-975d-58e11df95718" Dec 08 09:21:39 crc kubenswrapper[4776]: I1208 09:21:39.536813 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-9d9dp" Dec 08 09:21:39 crc kubenswrapper[4776]: I1208 09:21:39.620547 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhpzr\" (UniqueName: \"kubernetes.io/projected/639910a7-1d35-4535-b629-18fe52dacac3-kube-api-access-jhpzr\") pod \"639910a7-1d35-4535-b629-18fe52dacac3\" (UID: \"639910a7-1d35-4535-b629-18fe52dacac3\") " Dec 08 09:21:39 crc kubenswrapper[4776]: I1208 09:21:39.620755 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/639910a7-1d35-4535-b629-18fe52dacac3-config\") pod \"639910a7-1d35-4535-b629-18fe52dacac3\" (UID: \"639910a7-1d35-4535-b629-18fe52dacac3\") " Dec 08 09:21:39 crc kubenswrapper[4776]: I1208 09:21:39.620786 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/639910a7-1d35-4535-b629-18fe52dacac3-dns-svc\") pod \"639910a7-1d35-4535-b629-18fe52dacac3\" (UID: \"639910a7-1d35-4535-b629-18fe52dacac3\") " Dec 08 09:21:39 crc kubenswrapper[4776]: I1208 09:21:39.628425 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/639910a7-1d35-4535-b629-18fe52dacac3-kube-api-access-jhpzr" (OuterVolumeSpecName: "kube-api-access-jhpzr") pod "639910a7-1d35-4535-b629-18fe52dacac3" (UID: "639910a7-1d35-4535-b629-18fe52dacac3"). InnerVolumeSpecName "kube-api-access-jhpzr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:21:39 crc kubenswrapper[4776]: I1208 09:21:39.673666 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-n4jfg" Dec 08 09:21:39 crc kubenswrapper[4776]: I1208 09:21:39.724185 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhpzr\" (UniqueName: \"kubernetes.io/projected/639910a7-1d35-4535-b629-18fe52dacac3-kube-api-access-jhpzr\") on node \"crc\" DevicePath \"\"" Dec 08 09:21:39 crc kubenswrapper[4776]: I1208 09:21:39.750374 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/639910a7-1d35-4535-b629-18fe52dacac3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "639910a7-1d35-4535-b629-18fe52dacac3" (UID: "639910a7-1d35-4535-b629-18fe52dacac3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:21:39 crc kubenswrapper[4776]: I1208 09:21:39.824891 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/408d75b6-d5bf-4dfe-8a22-7ba887229cac-dns-svc\") pod \"408d75b6-d5bf-4dfe-8a22-7ba887229cac\" (UID: \"408d75b6-d5bf-4dfe-8a22-7ba887229cac\") " Dec 08 09:21:39 crc kubenswrapper[4776]: I1208 09:21:39.824961 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/408d75b6-d5bf-4dfe-8a22-7ba887229cac-config\") pod \"408d75b6-d5bf-4dfe-8a22-7ba887229cac\" (UID: \"408d75b6-d5bf-4dfe-8a22-7ba887229cac\") " Dec 08 09:21:39 crc kubenswrapper[4776]: I1208 09:21:39.825187 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnxbm\" (UniqueName: \"kubernetes.io/projected/408d75b6-d5bf-4dfe-8a22-7ba887229cac-kube-api-access-nnxbm\") pod \"408d75b6-d5bf-4dfe-8a22-7ba887229cac\" (UID: \"408d75b6-d5bf-4dfe-8a22-7ba887229cac\") " Dec 08 09:21:39 crc kubenswrapper[4776]: I1208 09:21:39.825733 4776 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/639910a7-1d35-4535-b629-18fe52dacac3-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 08 09:21:39 crc kubenswrapper[4776]: I1208 09:21:39.876741 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/639910a7-1d35-4535-b629-18fe52dacac3-config" (OuterVolumeSpecName: "config") pod "639910a7-1d35-4535-b629-18fe52dacac3" (UID: "639910a7-1d35-4535-b629-18fe52dacac3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:21:39 crc kubenswrapper[4776]: I1208 09:21:39.880666 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/408d75b6-d5bf-4dfe-8a22-7ba887229cac-kube-api-access-nnxbm" (OuterVolumeSpecName: "kube-api-access-nnxbm") pod "408d75b6-d5bf-4dfe-8a22-7ba887229cac" (UID: "408d75b6-d5bf-4dfe-8a22-7ba887229cac"). InnerVolumeSpecName "kube-api-access-nnxbm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:21:39 crc kubenswrapper[4776]: I1208 09:21:39.926987 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnxbm\" (UniqueName: \"kubernetes.io/projected/408d75b6-d5bf-4dfe-8a22-7ba887229cac-kube-api-access-nnxbm\") on node \"crc\" DevicePath \"\"" Dec 08 09:21:39 crc kubenswrapper[4776]: I1208 09:21:39.927019 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/639910a7-1d35-4535-b629-18fe52dacac3-config\") on node \"crc\" DevicePath \"\"" Dec 08 09:21:39 crc kubenswrapper[4776]: I1208 09:21:39.954920 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/408d75b6-d5bf-4dfe-8a22-7ba887229cac-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "408d75b6-d5bf-4dfe-8a22-7ba887229cac" (UID: "408d75b6-d5bf-4dfe-8a22-7ba887229cac"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:21:40 crc kubenswrapper[4776]: I1208 09:21:40.004438 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-zn4qk"] Dec 08 09:21:40 crc kubenswrapper[4776]: I1208 09:21:40.013567 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-cc9cp"] Dec 08 09:21:40 crc kubenswrapper[4776]: I1208 09:21:40.043565 4776 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 08 09:21:40 crc kubenswrapper[4776]: I1208 09:21:40.048450 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/408d75b6-d5bf-4dfe-8a22-7ba887229cac-config" (OuterVolumeSpecName: "config") pod "408d75b6-d5bf-4dfe-8a22-7ba887229cac" (UID: "408d75b6-d5bf-4dfe-8a22-7ba887229cac"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:21:40 crc kubenswrapper[4776]: I1208 09:21:40.051553 4776 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/408d75b6-d5bf-4dfe-8a22-7ba887229cac-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 08 09:21:40 crc kubenswrapper[4776]: I1208 09:21:40.051587 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/408d75b6-d5bf-4dfe-8a22-7ba887229cac-config\") on node \"crc\" DevicePath \"\"" Dec 08 09:21:40 crc kubenswrapper[4776]: I1208 09:21:40.054432 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-kx9kd"] Dec 08 09:21:40 crc kubenswrapper[4776]: I1208 09:21:40.395160 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-w69rl" event={"ID":"251557fb-f870-4b8c-8725-648a8cd97fca","Type":"ContainerStarted","Data":"c1dff5eb42dfdd36b9502616964f13efb76a83a5e786ed4d04f10d34e68b4e99"} Dec 08 09:21:40 crc 
kubenswrapper[4776]: I1208 09:21:40.400314 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-9d9dp" event={"ID":"639910a7-1d35-4535-b629-18fe52dacac3","Type":"ContainerDied","Data":"f5e04ad30b14b48118a23eb4f540730d7cae2136a699d7a556e1a5b3c9f61709"} Dec 08 09:21:40 crc kubenswrapper[4776]: I1208 09:21:40.400339 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-9d9dp" Dec 08 09:21:40 crc kubenswrapper[4776]: I1208 09:21:40.400406 4776 scope.go:117] "RemoveContainer" containerID="79f9c555ec508b9fafcfc74237127a52faca425323d81a57441c56b165238912" Dec 08 09:21:40 crc kubenswrapper[4776]: I1208 09:21:40.402275 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-cc9cp" event={"ID":"251ffea1-23ed-46e0-8683-0600c4176e26","Type":"ContainerStarted","Data":"c6675fbaae0811d37f0acad25ef80f3136e33b7386042d01be84bd4f57677a89"} Dec 08 09:21:40 crc kubenswrapper[4776]: I1208 09:21:40.402329 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-cc9cp" event={"ID":"251ffea1-23ed-46e0-8683-0600c4176e26","Type":"ContainerStarted","Data":"8726fcd2390b27f9fee709f9039dc2ee0cff95fc170a8ad1202ad6294e71088c"} Dec 08 09:21:40 crc kubenswrapper[4776]: I1208 09:21:40.412404 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"425d947a-2a85-4a03-853f-a60f54515a57","Type":"ContainerStarted","Data":"4eda23ffa9b631ab6ab393bf163d0e1e5b7305f07ecbc0e973543bf7905d3354"} Dec 08 09:21:40 crc kubenswrapper[4776]: I1208 09:21:40.415402 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-w69rl" podStartSLOduration=15.802536664 podStartE2EDuration="30.415346296s" podCreationTimestamp="2025-12-08 09:21:10 +0000 UTC" firstStartedPulling="2025-12-08 09:21:24.546917324 +0000 UTC 
m=+1360.810142346" lastFinishedPulling="2025-12-08 09:21:39.159726946 +0000 UTC m=+1375.422951978" observedRunningTime="2025-12-08 09:21:40.409475848 +0000 UTC m=+1376.672700870" watchObservedRunningTime="2025-12-08 09:21:40.415346296 +0000 UTC m=+1376.678571318" Dec 08 09:21:40 crc kubenswrapper[4776]: I1208 09:21:40.420764 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"7df4120e-0e93-4000-8b6a-7823f3e89dac","Type":"ContainerStarted","Data":"5d6a3c8d703f5d364e074bda2aa877f975b3e861980b28ae0cf17a0781266ab2"} Dec 08 09:21:40 crc kubenswrapper[4776]: I1208 09:21:40.426652 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c26a16d6-aae4-4ce2-b1cf-2a26ab0bfced","Type":"ContainerStarted","Data":"84c8fc46c043ccf7fee065dbe52ce4ffd0165e484045d856e3eb4532ff2ef2b7"} Dec 08 09:21:40 crc kubenswrapper[4776]: I1208 09:21:40.427573 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 08 09:21:40 crc kubenswrapper[4776]: I1208 09:21:40.429933 4776 generic.go:334] "Generic (PLEG): container finished" podID="fd497a67-79f7-4a1a-b0de-eb6fdcc524ae" containerID="cd7f76bf86baaaf8da852eee2fe0fbc3f3b36843c6569306fa8aafd612e4fe45" exitCode=0 Dec 08 09:21:40 crc kubenswrapper[4776]: I1208 09:21:40.429985 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-kx9kd" event={"ID":"fd497a67-79f7-4a1a-b0de-eb6fdcc524ae","Type":"ContainerDied","Data":"cd7f76bf86baaaf8da852eee2fe0fbc3f3b36843c6569306fa8aafd612e4fe45"} Dec 08 09:21:40 crc kubenswrapper[4776]: I1208 09:21:40.430007 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-kx9kd" event={"ID":"fd497a67-79f7-4a1a-b0de-eb6fdcc524ae","Type":"ContainerStarted","Data":"281f530ea9c8c6457ebf54e535b09e5e50a73d483f1bc522592749c7c42280bf"} Dec 08 09:21:40 crc kubenswrapper[4776]: I1208 09:21:40.435370 4776 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5tfbk" event={"ID":"215a9444-a545-491d-9eb6-02d98baff784","Type":"ContainerStarted","Data":"ef0a8b5937bb2801996f0d288b8cf424f5aca2d840394828e08c51c92a4e0ee9"} Dec 08 09:21:40 crc kubenswrapper[4776]: I1208 09:21:40.445886 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-zn4qk" event={"ID":"e843ce72-b4b1-4603-8876-05dc121793ed","Type":"ContainerStarted","Data":"eb5d8bb3364ecc697a05be82819bb43764f67a900eb71ff399ac2820865809b3"} Dec 08 09:21:40 crc kubenswrapper[4776]: I1208 09:21:40.452645 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"0e4de746-d269-470c-b934-117aa4c73834","Type":"ContainerStarted","Data":"4958669aab962b53bc9f52a065a353a032081324572f8c583e480904fceca7f2"} Dec 08 09:21:40 crc kubenswrapper[4776]: I1208 09:21:40.459450 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-n4jfg" event={"ID":"408d75b6-d5bf-4dfe-8a22-7ba887229cac","Type":"ContainerDied","Data":"7780156ed9f0082ec68a0f78b4e143e0933b9cd2871b2967918905066257d5ae"} Dec 08 09:21:40 crc kubenswrapper[4776]: I1208 09:21:40.459877 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-n4jfg" Dec 08 09:21:40 crc kubenswrapper[4776]: I1208 09:21:40.506138 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=12.581804027 podStartE2EDuration="31.506115392s" podCreationTimestamp="2025-12-08 09:21:09 +0000 UTC" firstStartedPulling="2025-12-08 09:21:20.23648574 +0000 UTC m=+1356.499710762" lastFinishedPulling="2025-12-08 09:21:39.160797105 +0000 UTC m=+1375.424022127" observedRunningTime="2025-12-08 09:21:40.498699342 +0000 UTC m=+1376.761924384" watchObservedRunningTime="2025-12-08 09:21:40.506115392 +0000 UTC m=+1376.769340414" Dec 08 09:21:40 crc kubenswrapper[4776]: I1208 09:21:40.743373 4776 scope.go:117] "RemoveContainer" containerID="95a4e3da775ed1ad21f34cbb0678b1fb4d2f998e946648eca2f6eafd5d97adce" Dec 08 09:21:40 crc kubenswrapper[4776]: I1208 09:21:40.787041 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-9d9dp"] Dec 08 09:21:40 crc kubenswrapper[4776]: I1208 09:21:40.798461 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-9d9dp"] Dec 08 09:21:40 crc kubenswrapper[4776]: I1208 09:21:40.800562 4776 scope.go:117] "RemoveContainer" containerID="af55f04f76e7c1c673e1bc91ad1bdf487a7511e129a136cc3114fdd075d9eef1" Dec 08 09:21:40 crc kubenswrapper[4776]: I1208 09:21:40.809300 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-n4jfg"] Dec 08 09:21:40 crc kubenswrapper[4776]: I1208 09:21:40.818996 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-n4jfg"] Dec 08 09:21:40 crc kubenswrapper[4776]: I1208 09:21:40.820698 4776 scope.go:117] "RemoveContainer" containerID="acc5c399ebdf8fe7cef1d2ee8c8874fbdeacd7e807cfb47feeed26b61ce4cf10" Dec 08 09:21:41 crc kubenswrapper[4776]: I1208 09:21:41.398924 4776 patch_prober.go:28] interesting 
pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 09:21:41 crc kubenswrapper[4776]: I1208 09:21:41.399218 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 09:21:41 crc kubenswrapper[4776]: I1208 09:21:41.399269 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" Dec 08 09:21:41 crc kubenswrapper[4776]: I1208 09:21:41.400090 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6a5351febb0de8fddebf4555b73007dffb77eb52f317fae03ed23b485a212557"} pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 08 09:21:41 crc kubenswrapper[4776]: I1208 09:21:41.400147 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" containerID="cri-o://6a5351febb0de8fddebf4555b73007dffb77eb52f317fae03ed23b485a212557" gracePeriod=600 Dec 08 09:21:41 crc kubenswrapper[4776]: I1208 09:21:41.478216 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-kx9kd" event={"ID":"fd497a67-79f7-4a1a-b0de-eb6fdcc524ae","Type":"ContainerStarted","Data":"1d1a164851d77a165e18b51c820ff4b17f201a02f4bdfec74fb329f03f7b767b"} Dec 
08 09:21:41 crc kubenswrapper[4776]: I1208 09:21:41.478411 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-kx9kd"
Dec 08 09:21:41 crc kubenswrapper[4776]: I1208 09:21:41.481758 4776 generic.go:334] "Generic (PLEG): container finished" podID="251ffea1-23ed-46e0-8683-0600c4176e26" containerID="c6675fbaae0811d37f0acad25ef80f3136e33b7386042d01be84bd4f57677a89" exitCode=0
Dec 08 09:21:41 crc kubenswrapper[4776]: I1208 09:21:41.481851 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-cc9cp" event={"ID":"251ffea1-23ed-46e0-8683-0600c4176e26","Type":"ContainerDied","Data":"c6675fbaae0811d37f0acad25ef80f3136e33b7386042d01be84bd4f57677a89"}
Dec 08 09:21:41 crc kubenswrapper[4776]: I1208 09:21:41.481877 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-cc9cp" event={"ID":"251ffea1-23ed-46e0-8683-0600c4176e26","Type":"ContainerStarted","Data":"575fcb517f5f578510692787fa56c0d8c39d03ee4f201ceb1897c0b8046ba7f9"}
Dec 08 09:21:41 crc kubenswrapper[4776]: I1208 09:21:41.481967 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fd796d7df-cc9cp"
Dec 08 09:21:41 crc kubenswrapper[4776]: I1208 09:21:41.487951 4776 generic.go:334] "Generic (PLEG): container finished" podID="215a9444-a545-491d-9eb6-02d98baff784" containerID="ef0a8b5937bb2801996f0d288b8cf424f5aca2d840394828e08c51c92a4e0ee9" exitCode=0
Dec 08 09:21:41 crc kubenswrapper[4776]: I1208 09:21:41.488077 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5tfbk" event={"ID":"215a9444-a545-491d-9eb6-02d98baff784","Type":"ContainerDied","Data":"ef0a8b5937bb2801996f0d288b8cf424f5aca2d840394828e08c51c92a4e0ee9"}
Dec 08 09:21:41 crc kubenswrapper[4776]: I1208 09:21:41.488113 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5tfbk" event={"ID":"215a9444-a545-491d-9eb6-02d98baff784","Type":"ContainerStarted","Data":"6bf214aa04898fdc0ccd47e78e7a9d916111795b2783b6423faf36515a0167fa"}
Dec 08 09:21:41 crc kubenswrapper[4776]: I1208 09:21:41.512300 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-kx9kd" podStartSLOduration=9.512271938 podStartE2EDuration="9.512271938s" podCreationTimestamp="2025-12-08 09:21:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:21:41.504080868 +0000 UTC m=+1377.767305890" watchObservedRunningTime="2025-12-08 09:21:41.512271938 +0000 UTC m=+1377.775496970"
Dec 08 09:21:41 crc kubenswrapper[4776]: I1208 09:21:41.530165 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fd796d7df-cc9cp" podStartSLOduration=10.530139657 podStartE2EDuration="10.530139657s" podCreationTimestamp="2025-12-08 09:21:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:21:41.517781046 +0000 UTC m=+1377.781006068" watchObservedRunningTime="2025-12-08 09:21:41.530139657 +0000 UTC m=+1377.793364679"
Dec 08 09:21:42 crc kubenswrapper[4776]: I1208 09:21:42.359879 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="408d75b6-d5bf-4dfe-8a22-7ba887229cac" path="/var/lib/kubelet/pods/408d75b6-d5bf-4dfe-8a22-7ba887229cac/volumes"
Dec 08 09:21:42 crc kubenswrapper[4776]: I1208 09:21:42.360900 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="639910a7-1d35-4535-b629-18fe52dacac3" path="/var/lib/kubelet/pods/639910a7-1d35-4535-b629-18fe52dacac3/volumes"
Dec 08 09:21:42 crc kubenswrapper[4776]: I1208 09:21:42.498948 4776 generic.go:334] "Generic (PLEG): container finished" podID="c9788ab1-1031-4103-a769-a4b3177c7268" containerID="6a5351febb0de8fddebf4555b73007dffb77eb52f317fae03ed23b485a212557" exitCode=0
Dec 08 09:21:42 crc kubenswrapper[4776]: I1208 09:21:42.499095 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" event={"ID":"c9788ab1-1031-4103-a769-a4b3177c7268","Type":"ContainerDied","Data":"6a5351febb0de8fddebf4555b73007dffb77eb52f317fae03ed23b485a212557"}
Dec 08 09:21:42 crc kubenswrapper[4776]: I1208 09:21:42.499152 4776 scope.go:117] "RemoveContainer" containerID="409acf0371b6644dc04fbcc1653de1b5f75319f5fcc98f856ec232671ed68b71"
Dec 08 09:21:42 crc kubenswrapper[4776]: I1208 09:21:42.501264 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"786c0b37-638a-4b59-b149-628d9ad828bc","Type":"ContainerStarted","Data":"3eac7d79f6ff5424dfa03b1bdfb4a80f7008792eb0ac01a3399847952fb39221"}
Dec 08 09:21:42 crc kubenswrapper[4776]: I1208 09:21:42.504685 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5tfbk" event={"ID":"215a9444-a545-491d-9eb6-02d98baff784","Type":"ContainerStarted","Data":"fee8648f0f31d628d3815308d57b84611aafb569bd1e2fde7fb6ccb605a656ec"}
Dec 08 09:21:42 crc kubenswrapper[4776]: I1208 09:21:42.505511 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-5tfbk"
Dec 08 09:21:42 crc kubenswrapper[4776]: I1208 09:21:42.505549 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-5tfbk"
Dec 08 09:21:42 crc kubenswrapper[4776]: I1208 09:21:42.554722 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-5tfbk" podStartSLOduration=14.703220759 podStartE2EDuration="28.554699177s" podCreationTimestamp="2025-12-08 09:21:14 +0000 UTC" firstStartedPulling="2025-12-08 09:21:25.309273305 +0000 UTC m=+1361.572498327" lastFinishedPulling="2025-12-08 09:21:39.160751723 +0000 UTC m=+1375.423976745" observedRunningTime="2025-12-08 09:21:42.546919048 +0000 UTC m=+1378.810144090" watchObservedRunningTime="2025-12-08 09:21:42.554699177 +0000 UTC m=+1378.817924199"
Dec 08 09:21:43 crc kubenswrapper[4776]: I1208 09:21:43.519263 4776 generic.go:334] "Generic (PLEG): container finished" podID="425d947a-2a85-4a03-853f-a60f54515a57" containerID="4eda23ffa9b631ab6ab393bf163d0e1e5b7305f07ecbc0e973543bf7905d3354" exitCode=0
Dec 08 09:21:43 crc kubenswrapper[4776]: I1208 09:21:43.519359 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"425d947a-2a85-4a03-853f-a60f54515a57","Type":"ContainerDied","Data":"4eda23ffa9b631ab6ab393bf163d0e1e5b7305f07ecbc0e973543bf7905d3354"}
Dec 08 09:21:43 crc kubenswrapper[4776]: I1208 09:21:43.521998 4776 generic.go:334] "Generic (PLEG): container finished" podID="7df4120e-0e93-4000-8b6a-7823f3e89dac" containerID="5d6a3c8d703f5d364e074bda2aa877f975b3e861980b28ae0cf17a0781266ab2" exitCode=0
Dec 08 09:21:43 crc kubenswrapper[4776]: I1208 09:21:43.522064 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"7df4120e-0e93-4000-8b6a-7823f3e89dac","Type":"ContainerDied","Data":"5d6a3c8d703f5d364e074bda2aa877f975b3e861980b28ae0cf17a0781266ab2"}
Dec 08 09:21:43 crc kubenswrapper[4776]: E1208 09:21:43.666275 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-sb\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovsdbserver-sb-0" podUID="3d941bbc-2271-4ec4-853f-57feaf6ace36"
Dec 08 09:21:44 crc kubenswrapper[4776]: I1208 09:21:44.535594 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"7df4120e-0e93-4000-8b6a-7823f3e89dac","Type":"ContainerStarted","Data":"b648917d2070bc1c851e9190d2d5cd59abd38788f1b13e2c937a71be66b4a0d1"}
Dec 08 09:21:44 crc kubenswrapper[4776]: I1208 09:21:44.539011 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-zn4qk" event={"ID":"e843ce72-b4b1-4603-8876-05dc121793ed","Type":"ContainerStarted","Data":"ec1b49ec2bdc4b7e6956b2868ec2bafef49f160b542dc5c61aa26f8410bd04b6"}
Dec 08 09:21:44 crc kubenswrapper[4776]: I1208 09:21:44.543068 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" event={"ID":"c9788ab1-1031-4103-a769-a4b3177c7268","Type":"ContainerStarted","Data":"bd00cf9685c68dc97ea68681aad0bfeeb2d56965a763cd3cffb3c7d3789a7341"}
Dec 08 09:21:44 crc kubenswrapper[4776]: I1208 09:21:44.545481 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"3d941bbc-2271-4ec4-853f-57feaf6ace36","Type":"ContainerStarted","Data":"d8cccac34d40a28601e68cd2e735b4f029149fc5bac29be04ebb2a56ef6d0a36"}
Dec 08 09:21:44 crc kubenswrapper[4776]: E1208 09:21:44.547637 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-sb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified\\\"\"" pod="openstack/ovsdbserver-sb-0" podUID="3d941bbc-2271-4ec4-853f-57feaf6ace36"
Dec 08 09:21:44 crc kubenswrapper[4776]: I1208 09:21:44.548441 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"0e4de746-d269-470c-b934-117aa4c73834","Type":"ContainerStarted","Data":"b631ce06b083dc6723ca8851d7cb2135c939b89bb64646278fed8097f4e12578"}
Dec 08 09:21:44 crc kubenswrapper[4776]: I1208 09:21:44.550705 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"425d947a-2a85-4a03-853f-a60f54515a57","Type":"ContainerStarted","Data":"d35c06c154ae33ec11317d951d4060be152db3834851b0f5fd146c9126f38dca"}
Dec 08 09:21:44 crc kubenswrapper[4776]: I1208 09:21:44.567925 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=25.051813733 podStartE2EDuration="39.567906064s" podCreationTimestamp="2025-12-08 09:21:05 +0000 UTC" firstStartedPulling="2025-12-08 09:21:24.644801867 +0000 UTC m=+1360.908026889" lastFinishedPulling="2025-12-08 09:21:39.160894208 +0000 UTC m=+1375.424119220" observedRunningTime="2025-12-08 09:21:44.555729227 +0000 UTC m=+1380.818954249" watchObservedRunningTime="2025-12-08 09:21:44.567906064 +0000 UTC m=+1380.831131076"
Dec 08 09:21:44 crc kubenswrapper[4776]: I1208 09:21:44.594455 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=22.111076106 podStartE2EDuration="38.594433116s" podCreationTimestamp="2025-12-08 09:21:06 +0000 UTC" firstStartedPulling="2025-12-08 09:21:22.688562853 +0000 UTC m=+1358.951787875" lastFinishedPulling="2025-12-08 09:21:39.171919863 +0000 UTC m=+1375.435144885" observedRunningTime="2025-12-08 09:21:44.589129623 +0000 UTC m=+1380.852354655" watchObservedRunningTime="2025-12-08 09:21:44.594433116 +0000 UTC m=+1380.857658138"
Dec 08 09:21:44 crc kubenswrapper[4776]: I1208 09:21:44.599954 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Dec 08 09:21:44 crc kubenswrapper[4776]: I1208 09:21:44.600004 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Dec 08 09:21:44 crc kubenswrapper[4776]: I1208 09:21:44.659334 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-zn4qk" podStartSLOduration=10.479129648 podStartE2EDuration="13.659307586s" podCreationTimestamp="2025-12-08 09:21:31 +0000 UTC" firstStartedPulling="2025-12-08 09:21:40.043332965 +0000 UTC m=+1376.306557977" lastFinishedPulling="2025-12-08 09:21:43.223510883 +0000 UTC m=+1379.486735915" observedRunningTime="2025-12-08 09:21:44.652486884 +0000 UTC m=+1380.915711906" watchObservedRunningTime="2025-12-08 09:21:44.659307586 +0000 UTC m=+1380.922532608"
Dec 08 09:21:44 crc kubenswrapper[4776]: I1208 09:21:44.702556 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Dec 08 09:21:44 crc kubenswrapper[4776]: I1208 09:21:44.728380 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=13.259117107 podStartE2EDuration="31.728362889s" podCreationTimestamp="2025-12-08 09:21:13 +0000 UTC" firstStartedPulling="2025-12-08 09:21:24.724019099 +0000 UTC m=+1360.987244121" lastFinishedPulling="2025-12-08 09:21:43.193264881 +0000 UTC m=+1379.456489903" observedRunningTime="2025-12-08 09:21:44.722014089 +0000 UTC m=+1380.985239111" watchObservedRunningTime="2025-12-08 09:21:44.728362889 +0000 UTC m=+1380.991587911"
Dec 08 09:21:45 crc kubenswrapper[4776]: E1208 09:21:45.564730 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-sb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified\\\"\"" pod="openstack/ovsdbserver-sb-0" podUID="3d941bbc-2271-4ec4-853f-57feaf6ace36"
Dec 08 09:21:45 crc kubenswrapper[4776]: I1208 09:21:45.609155 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Dec 08 09:21:46 crc kubenswrapper[4776]: I1208 09:21:46.488384 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Dec 08 09:21:46 crc kubenswrapper[4776]: I1208 09:21:46.488699 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Dec 08 09:21:47 crc kubenswrapper[4776]: I1208 09:21:47.344537 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7fd796d7df-cc9cp"
Dec 08 09:21:47 crc kubenswrapper[4776]: I1208 09:21:47.547358 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-kx9kd"
Dec 08 09:21:47 crc kubenswrapper[4776]: I1208 09:21:47.607200 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-cc9cp"]
Dec 08 09:21:47 crc kubenswrapper[4776]: I1208 09:21:47.607395 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fd796d7df-cc9cp" podUID="251ffea1-23ed-46e0-8683-0600c4176e26" containerName="dnsmasq-dns" containerID="cri-o://575fcb517f5f578510692787fa56c0d8c39d03ee4f201ceb1897c0b8046ba7f9" gracePeriod=10
Dec 08 09:21:47 crc kubenswrapper[4776]: I1208 09:21:47.934877 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Dec 08 09:21:47 crc kubenswrapper[4776]: I1208 09:21:47.935250 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Dec 08 09:21:48 crc kubenswrapper[4776]: I1208 09:21:48.586243 4776 generic.go:334] "Generic (PLEG): container finished" podID="251ffea1-23ed-46e0-8683-0600c4176e26" containerID="575fcb517f5f578510692787fa56c0d8c39d03ee4f201ceb1897c0b8046ba7f9" exitCode=0
Dec 08 09:21:48 crc kubenswrapper[4776]: I1208 09:21:48.586263 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-cc9cp" event={"ID":"251ffea1-23ed-46e0-8683-0600c4176e26","Type":"ContainerDied","Data":"575fcb517f5f578510692787fa56c0d8c39d03ee4f201ceb1897c0b8046ba7f9"}
Dec 08 09:21:48 crc kubenswrapper[4776]: I1208 09:21:48.727653 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Dec 08 09:21:48 crc kubenswrapper[4776]: I1208 09:21:48.816251 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Dec 08 09:21:49 crc kubenswrapper[4776]: I1208 09:21:49.231470 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-cc9cp"
Dec 08 09:21:49 crc kubenswrapper[4776]: I1208 09:21:49.376819 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgkwz\" (UniqueName: \"kubernetes.io/projected/251ffea1-23ed-46e0-8683-0600c4176e26-kube-api-access-rgkwz\") pod \"251ffea1-23ed-46e0-8683-0600c4176e26\" (UID: \"251ffea1-23ed-46e0-8683-0600c4176e26\") "
Dec 08 09:21:49 crc kubenswrapper[4776]: I1208 09:21:49.376976 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/251ffea1-23ed-46e0-8683-0600c4176e26-ovsdbserver-nb\") pod \"251ffea1-23ed-46e0-8683-0600c4176e26\" (UID: \"251ffea1-23ed-46e0-8683-0600c4176e26\") "
Dec 08 09:21:49 crc kubenswrapper[4776]: I1208 09:21:49.377006 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/251ffea1-23ed-46e0-8683-0600c4176e26-dns-svc\") pod \"251ffea1-23ed-46e0-8683-0600c4176e26\" (UID: \"251ffea1-23ed-46e0-8683-0600c4176e26\") "
Dec 08 09:21:49 crc kubenswrapper[4776]: I1208 09:21:49.377062 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/251ffea1-23ed-46e0-8683-0600c4176e26-config\") pod \"251ffea1-23ed-46e0-8683-0600c4176e26\" (UID: \"251ffea1-23ed-46e0-8683-0600c4176e26\") "
Dec 08 09:21:49 crc kubenswrapper[4776]: I1208 09:21:49.397455 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/251ffea1-23ed-46e0-8683-0600c4176e26-kube-api-access-rgkwz" (OuterVolumeSpecName: "kube-api-access-rgkwz") pod "251ffea1-23ed-46e0-8683-0600c4176e26" (UID: "251ffea1-23ed-46e0-8683-0600c4176e26"). InnerVolumeSpecName "kube-api-access-rgkwz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 08 09:21:49 crc kubenswrapper[4776]: I1208 09:21:49.479398 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgkwz\" (UniqueName: \"kubernetes.io/projected/251ffea1-23ed-46e0-8683-0600c4176e26-kube-api-access-rgkwz\") on node \"crc\" DevicePath \"\""
Dec 08 09:21:49 crc kubenswrapper[4776]: I1208 09:21:49.491775 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/251ffea1-23ed-46e0-8683-0600c4176e26-config" (OuterVolumeSpecName: "config") pod "251ffea1-23ed-46e0-8683-0600c4176e26" (UID: "251ffea1-23ed-46e0-8683-0600c4176e26"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 08 09:21:49 crc kubenswrapper[4776]: I1208 09:21:49.497967 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/251ffea1-23ed-46e0-8683-0600c4176e26-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "251ffea1-23ed-46e0-8683-0600c4176e26" (UID: "251ffea1-23ed-46e0-8683-0600c4176e26"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 08 09:21:49 crc kubenswrapper[4776]: I1208 09:21:49.503898 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/251ffea1-23ed-46e0-8683-0600c4176e26-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "251ffea1-23ed-46e0-8683-0600c4176e26" (UID: "251ffea1-23ed-46e0-8683-0600c4176e26"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 08 09:21:49 crc kubenswrapper[4776]: I1208 09:21:49.581822 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/251ffea1-23ed-46e0-8683-0600c4176e26-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 08 09:21:49 crc kubenswrapper[4776]: I1208 09:21:49.581857 4776 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/251ffea1-23ed-46e0-8683-0600c4176e26-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 08 09:21:49 crc kubenswrapper[4776]: I1208 09:21:49.581866 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/251ffea1-23ed-46e0-8683-0600c4176e26-config\") on node \"crc\" DevicePath \"\""
Dec 08 09:21:49 crc kubenswrapper[4776]: I1208 09:21:49.595836 4776 generic.go:334] "Generic (PLEG): container finished" podID="786c0b37-638a-4b59-b149-628d9ad828bc" containerID="3eac7d79f6ff5424dfa03b1bdfb4a80f7008792eb0ac01a3399847952fb39221" exitCode=0
Dec 08 09:21:49 crc kubenswrapper[4776]: I1208 09:21:49.595893 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"786c0b37-638a-4b59-b149-628d9ad828bc","Type":"ContainerDied","Data":"3eac7d79f6ff5424dfa03b1bdfb4a80f7008792eb0ac01a3399847952fb39221"}
Dec 08 09:21:49 crc kubenswrapper[4776]: I1208 09:21:49.598543 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-cc9cp" event={"ID":"251ffea1-23ed-46e0-8683-0600c4176e26","Type":"ContainerDied","Data":"8726fcd2390b27f9fee709f9039dc2ee0cff95fc170a8ad1202ad6294e71088c"}
Dec 08 09:21:49 crc kubenswrapper[4776]: I1208 09:21:49.598604 4776 scope.go:117] "RemoveContainer" containerID="575fcb517f5f578510692787fa56c0d8c39d03ee4f201ceb1897c0b8046ba7f9"
Dec 08 09:21:49 crc kubenswrapper[4776]: I1208 09:21:49.598713 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-cc9cp"
Dec 08 09:21:49 crc kubenswrapper[4776]: I1208 09:21:49.630530 4776 scope.go:117] "RemoveContainer" containerID="c6675fbaae0811d37f0acad25ef80f3136e33b7386042d01be84bd4f57677a89"
Dec 08 09:21:49 crc kubenswrapper[4776]: I1208 09:21:49.653568 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-cc9cp"]
Dec 08 09:21:49 crc kubenswrapper[4776]: I1208 09:21:49.661398 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-cc9cp"]
Dec 08 09:21:49 crc kubenswrapper[4776]: I1208 09:21:49.889205 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-rfbws"]
Dec 08 09:21:49 crc kubenswrapper[4776]: E1208 09:21:49.889979 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="251ffea1-23ed-46e0-8683-0600c4176e26" containerName="init"
Dec 08 09:21:49 crc kubenswrapper[4776]: I1208 09:21:49.889999 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="251ffea1-23ed-46e0-8683-0600c4176e26" containerName="init"
Dec 08 09:21:49 crc kubenswrapper[4776]: E1208 09:21:49.890030 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="251ffea1-23ed-46e0-8683-0600c4176e26" containerName="dnsmasq-dns"
Dec 08 09:21:49 crc kubenswrapper[4776]: I1208 09:21:49.890057 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="251ffea1-23ed-46e0-8683-0600c4176e26" containerName="dnsmasq-dns"
Dec 08 09:21:49 crc kubenswrapper[4776]: E1208 09:21:49.890072 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="408d75b6-d5bf-4dfe-8a22-7ba887229cac" containerName="dnsmasq-dns"
Dec 08 09:21:49 crc kubenswrapper[4776]: I1208 09:21:49.890079 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="408d75b6-d5bf-4dfe-8a22-7ba887229cac" containerName="dnsmasq-dns"
Dec 08 09:21:49 crc kubenswrapper[4776]: E1208 09:21:49.890103 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="639910a7-1d35-4535-b629-18fe52dacac3" containerName="dnsmasq-dns"
Dec 08 09:21:49 crc kubenswrapper[4776]: I1208 09:21:49.890110 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="639910a7-1d35-4535-b629-18fe52dacac3" containerName="dnsmasq-dns"
Dec 08 09:21:49 crc kubenswrapper[4776]: E1208 09:21:49.890154 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="408d75b6-d5bf-4dfe-8a22-7ba887229cac" containerName="init"
Dec 08 09:21:49 crc kubenswrapper[4776]: I1208 09:21:49.890161 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="408d75b6-d5bf-4dfe-8a22-7ba887229cac" containerName="init"
Dec 08 09:21:49 crc kubenswrapper[4776]: E1208 09:21:49.890186 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="639910a7-1d35-4535-b629-18fe52dacac3" containerName="init"
Dec 08 09:21:49 crc kubenswrapper[4776]: I1208 09:21:49.890192 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="639910a7-1d35-4535-b629-18fe52dacac3" containerName="init"
Dec 08 09:21:49 crc kubenswrapper[4776]: I1208 09:21:49.890895 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="251ffea1-23ed-46e0-8683-0600c4176e26" containerName="dnsmasq-dns"
Dec 08 09:21:49 crc kubenswrapper[4776]: I1208 09:21:49.890945 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="408d75b6-d5bf-4dfe-8a22-7ba887229cac" containerName="dnsmasq-dns"
Dec 08 09:21:49 crc kubenswrapper[4776]: I1208 09:21:49.890972 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="639910a7-1d35-4535-b629-18fe52dacac3" containerName="dnsmasq-dns"
Dec 08 09:21:49 crc kubenswrapper[4776]: I1208 09:21:49.891928 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-rfbws"
Dec 08 09:21:49 crc kubenswrapper[4776]: I1208 09:21:49.925214 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-rfbws"]
Dec 08 09:21:49 crc kubenswrapper[4776]: I1208 09:21:49.970302 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-3e4f-account-create-update-27jsm"]
Dec 08 09:21:49 crc kubenswrapper[4776]: I1208 09:21:49.972232 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-3e4f-account-create-update-27jsm"
Dec 08 09:21:49 crc kubenswrapper[4776]: I1208 09:21:49.978414 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-db-secret"
Dec 08 09:21:49 crc kubenswrapper[4776]: I1208 09:21:49.990950 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-3e4f-account-create-update-27jsm"]
Dec 08 09:21:49 crc kubenswrapper[4776]: I1208 09:21:49.994771 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81ff1f5b-522f-4e63-84b6-2462e19419e7-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-rfbws\" (UID: \"81ff1f5b-522f-4e63-84b6-2462e19419e7\") " pod="openstack/mysqld-exporter-openstack-db-create-rfbws"
Dec 08 09:21:49 crc kubenswrapper[4776]: I1208 09:21:49.994887 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkbzw\" (UniqueName: \"kubernetes.io/projected/81ff1f5b-522f-4e63-84b6-2462e19419e7-kube-api-access-xkbzw\") pod \"mysqld-exporter-openstack-db-create-rfbws\" (UID: \"81ff1f5b-522f-4e63-84b6-2462e19419e7\") " pod="openstack/mysqld-exporter-openstack-db-create-rfbws"
Dec 08 09:21:50 crc kubenswrapper[4776]: I1208 09:21:50.033186 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Dec 08 09:21:50 crc kubenswrapper[4776]: I1208 09:21:50.096804 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81ff1f5b-522f-4e63-84b6-2462e19419e7-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-rfbws\" (UID: \"81ff1f5b-522f-4e63-84b6-2462e19419e7\") " pod="openstack/mysqld-exporter-openstack-db-create-rfbws"
Dec 08 09:21:50 crc kubenswrapper[4776]: I1208 09:21:50.097051 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpt29\" (UniqueName: \"kubernetes.io/projected/285be34f-c8bd-45c4-9e7c-da900aaa3fd2-kube-api-access-jpt29\") pod \"mysqld-exporter-3e4f-account-create-update-27jsm\" (UID: \"285be34f-c8bd-45c4-9e7c-da900aaa3fd2\") " pod="openstack/mysqld-exporter-3e4f-account-create-update-27jsm"
Dec 08 09:21:50 crc kubenswrapper[4776]: I1208 09:21:50.097146 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/285be34f-c8bd-45c4-9e7c-da900aaa3fd2-operator-scripts\") pod \"mysqld-exporter-3e4f-account-create-update-27jsm\" (UID: \"285be34f-c8bd-45c4-9e7c-da900aaa3fd2\") " pod="openstack/mysqld-exporter-3e4f-account-create-update-27jsm"
Dec 08 09:21:50 crc kubenswrapper[4776]: I1208 09:21:50.097254 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkbzw\" (UniqueName: \"kubernetes.io/projected/81ff1f5b-522f-4e63-84b6-2462e19419e7-kube-api-access-xkbzw\") pod \"mysqld-exporter-openstack-db-create-rfbws\" (UID: \"81ff1f5b-522f-4e63-84b6-2462e19419e7\") " pod="openstack/mysqld-exporter-openstack-db-create-rfbws"
Dec 08 09:21:50 crc kubenswrapper[4776]: I1208 09:21:50.098875 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81ff1f5b-522f-4e63-84b6-2462e19419e7-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-rfbws\" (UID: \"81ff1f5b-522f-4e63-84b6-2462e19419e7\") " pod="openstack/mysqld-exporter-openstack-db-create-rfbws"
Dec 08 09:21:50 crc kubenswrapper[4776]: I1208 09:21:50.122194 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkbzw\" (UniqueName: \"kubernetes.io/projected/81ff1f5b-522f-4e63-84b6-2462e19419e7-kube-api-access-xkbzw\") pod \"mysqld-exporter-openstack-db-create-rfbws\" (UID: \"81ff1f5b-522f-4e63-84b6-2462e19419e7\") " pod="openstack/mysqld-exporter-openstack-db-create-rfbws"
Dec 08 09:21:50 crc kubenswrapper[4776]: I1208 09:21:50.199995 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpt29\" (UniqueName: \"kubernetes.io/projected/285be34f-c8bd-45c4-9e7c-da900aaa3fd2-kube-api-access-jpt29\") pod \"mysqld-exporter-3e4f-account-create-update-27jsm\" (UID: \"285be34f-c8bd-45c4-9e7c-da900aaa3fd2\") " pod="openstack/mysqld-exporter-3e4f-account-create-update-27jsm"
Dec 08 09:21:50 crc kubenswrapper[4776]: I1208 09:21:50.200093 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/285be34f-c8bd-45c4-9e7c-da900aaa3fd2-operator-scripts\") pod \"mysqld-exporter-3e4f-account-create-update-27jsm\" (UID: \"285be34f-c8bd-45c4-9e7c-da900aaa3fd2\") " pod="openstack/mysqld-exporter-3e4f-account-create-update-27jsm"
Dec 08 09:21:50 crc kubenswrapper[4776]: I1208 09:21:50.201067 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/285be34f-c8bd-45c4-9e7c-da900aaa3fd2-operator-scripts\") pod \"mysqld-exporter-3e4f-account-create-update-27jsm\" (UID: \"285be34f-c8bd-45c4-9e7c-da900aaa3fd2\") " pod="openstack/mysqld-exporter-3e4f-account-create-update-27jsm"
Dec 08 09:21:50 crc kubenswrapper[4776]: I1208 09:21:50.219578 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpt29\" (UniqueName: \"kubernetes.io/projected/285be34f-c8bd-45c4-9e7c-da900aaa3fd2-kube-api-access-jpt29\") pod \"mysqld-exporter-3e4f-account-create-update-27jsm\" (UID: \"285be34f-c8bd-45c4-9e7c-da900aaa3fd2\") " pod="openstack/mysqld-exporter-3e4f-account-create-update-27jsm"
Dec 08 09:21:50 crc kubenswrapper[4776]: I1208 09:21:50.228617 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-rfbws"
Dec 08 09:21:50 crc kubenswrapper[4776]: I1208 09:21:50.289988 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-3e4f-account-create-update-27jsm"
Dec 08 09:21:50 crc kubenswrapper[4776]: I1208 09:21:50.362680 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="251ffea1-23ed-46e0-8683-0600c4176e26" path="/var/lib/kubelet/pods/251ffea1-23ed-46e0-8683-0600c4176e26/volumes"
Dec 08 09:21:50 crc kubenswrapper[4776]: I1208 09:21:50.693534 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-rfbws"]
Dec 08 09:21:50 crc kubenswrapper[4776]: I1208 09:21:50.912803 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-3e4f-account-create-update-27jsm"]
Dec 08 09:21:51 crc kubenswrapper[4776]: I1208 09:21:51.639240 4776 generic.go:334] "Generic (PLEG): container finished" podID="285be34f-c8bd-45c4-9e7c-da900aaa3fd2" containerID="79235d51ab147737e701d48cff1bddb761b498511a609c2fdd70ef52f335b620" exitCode=0
Dec 08 09:21:51 crc kubenswrapper[4776]: I1208 09:21:51.639308 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-3e4f-account-create-update-27jsm" event={"ID":"285be34f-c8bd-45c4-9e7c-da900aaa3fd2","Type":"ContainerDied","Data":"79235d51ab147737e701d48cff1bddb761b498511a609c2fdd70ef52f335b620"}
Dec 08 09:21:51 crc kubenswrapper[4776]: I1208 09:21:51.639789 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-3e4f-account-create-update-27jsm" event={"ID":"285be34f-c8bd-45c4-9e7c-da900aaa3fd2","Type":"ContainerStarted","Data":"6e18116cb5c7f3f16ab156c7632aae2db5750a06f752bcae5a1c86646dc26426"}
Dec 08 09:21:51 crc kubenswrapper[4776]: I1208 09:21:51.642655 4776 generic.go:334] "Generic (PLEG): container finished" podID="81ff1f5b-522f-4e63-84b6-2462e19419e7" containerID="418601c2d5742bbf0373b37729b1a080a517f3ed1ef2a0d4bfcbd4ec1ad1d0c8" exitCode=0
Dec 08 09:21:51 crc kubenswrapper[4776]: I1208 09:21:51.642703 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-rfbws" event={"ID":"81ff1f5b-522f-4e63-84b6-2462e19419e7","Type":"ContainerDied","Data":"418601c2d5742bbf0373b37729b1a080a517f3ed1ef2a0d4bfcbd4ec1ad1d0c8"}
Dec 08 09:21:51 crc kubenswrapper[4776]: I1208 09:21:51.642738 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-rfbws" event={"ID":"81ff1f5b-522f-4e63-84b6-2462e19419e7","Type":"ContainerStarted","Data":"75723d29c0416ead1e74746cb4ee249dd4c2d1efe6dae640ea49b2ceeaee0c9d"}
Dec 08 09:21:52 crc kubenswrapper[4776]: I1208 09:21:52.042347 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Dec 08 09:21:52 crc kubenswrapper[4776]: I1208 09:21:52.138960 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Dec 08 09:21:52 crc kubenswrapper[4776]: I1208 09:21:52.803213 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-w8l8r"]
Dec 08 09:21:52 crc kubenswrapper[4776]: I1208 09:21:52.806828 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w8l8r"
Dec 08 09:21:52 crc kubenswrapper[4776]: I1208 09:21:52.834030 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w8l8r"]
Dec 08 09:21:52 crc kubenswrapper[4776]: I1208 09:21:52.915237 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fc47007-7b1d-458c-b1ee-f561fff88bd7-utilities\") pod \"redhat-operators-w8l8r\" (UID: \"8fc47007-7b1d-458c-b1ee-f561fff88bd7\") " pod="openshift-marketplace/redhat-operators-w8l8r"
Dec 08 09:21:52 crc kubenswrapper[4776]: I1208 09:21:52.915334 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fc47007-7b1d-458c-b1ee-f561fff88bd7-catalog-content\") pod \"redhat-operators-w8l8r\" (UID: \"8fc47007-7b1d-458c-b1ee-f561fff88bd7\") " pod="openshift-marketplace/redhat-operators-w8l8r"
Dec 08 09:21:52 crc kubenswrapper[4776]: I1208 09:21:52.915369 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srklq\" (UniqueName: \"kubernetes.io/projected/8fc47007-7b1d-458c-b1ee-f561fff88bd7-kube-api-access-srklq\") pod \"redhat-operators-w8l8r\" (UID: \"8fc47007-7b1d-458c-b1ee-f561fff88bd7\") " pod="openshift-marketplace/redhat-operators-w8l8r"
Dec 08 09:21:53 crc kubenswrapper[4776]: I1208 09:21:53.017193 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fc47007-7b1d-458c-b1ee-f561fff88bd7-utilities\") pod \"redhat-operators-w8l8r\" (UID: \"8fc47007-7b1d-458c-b1ee-f561fff88bd7\") " pod="openshift-marketplace/redhat-operators-w8l8r"
Dec 08 09:21:53 crc kubenswrapper[4776]: I1208 09:21:53.017428 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fc47007-7b1d-458c-b1ee-f561fff88bd7-catalog-content\") pod \"redhat-operators-w8l8r\" (UID: \"8fc47007-7b1d-458c-b1ee-f561fff88bd7\") " pod="openshift-marketplace/redhat-operators-w8l8r"
Dec 08 09:21:53 crc kubenswrapper[4776]: I1208 09:21:53.017454 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srklq\" (UniqueName: \"kubernetes.io/projected/8fc47007-7b1d-458c-b1ee-f561fff88bd7-kube-api-access-srklq\") pod \"redhat-operators-w8l8r\" (UID: \"8fc47007-7b1d-458c-b1ee-f561fff88bd7\") " pod="openshift-marketplace/redhat-operators-w8l8r"
Dec 08 09:21:53 crc kubenswrapper[4776]: I1208 09:21:53.017714 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fc47007-7b1d-458c-b1ee-f561fff88bd7-utilities\") pod \"redhat-operators-w8l8r\" (UID: \"8fc47007-7b1d-458c-b1ee-f561fff88bd7\") " pod="openshift-marketplace/redhat-operators-w8l8r"
Dec 08 09:21:53 crc kubenswrapper[4776]: I1208 09:21:53.017782 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fc47007-7b1d-458c-b1ee-f561fff88bd7-catalog-content\") pod \"redhat-operators-w8l8r\" (UID: \"8fc47007-7b1d-458c-b1ee-f561fff88bd7\") " pod="openshift-marketplace/redhat-operators-w8l8r"
Dec 08 09:21:53 crc kubenswrapper[4776]: I1208 09:21:53.038094 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srklq\" (UniqueName: \"kubernetes.io/projected/8fc47007-7b1d-458c-b1ee-f561fff88bd7-kube-api-access-srklq\") pod \"redhat-operators-w8l8r\" (UID: \"8fc47007-7b1d-458c-b1ee-f561fff88bd7\") " pod="openshift-marketplace/redhat-operators-w8l8r"
Dec 08 09:21:53 crc kubenswrapper[4776]: I1208 09:21:53.129495 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-rfbws"
Dec 08 09:21:53 crc kubenswrapper[4776]: I1208 09:21:53.175231 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w8l8r"
Dec 08 09:21:53 crc kubenswrapper[4776]: I1208 09:21:53.223970 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkbzw\" (UniqueName: \"kubernetes.io/projected/81ff1f5b-522f-4e63-84b6-2462e19419e7-kube-api-access-xkbzw\") pod \"81ff1f5b-522f-4e63-84b6-2462e19419e7\" (UID: \"81ff1f5b-522f-4e63-84b6-2462e19419e7\") "
Dec 08 09:21:53 crc kubenswrapper[4776]: I1208 09:21:53.224374 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81ff1f5b-522f-4e63-84b6-2462e19419e7-operator-scripts\") pod \"81ff1f5b-522f-4e63-84b6-2462e19419e7\" (UID: \"81ff1f5b-522f-4e63-84b6-2462e19419e7\") "
Dec 08 09:21:53 crc kubenswrapper[4776]: I1208 09:21:53.225410 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81ff1f5b-522f-4e63-84b6-2462e19419e7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "81ff1f5b-522f-4e63-84b6-2462e19419e7" (UID: "81ff1f5b-522f-4e63-84b6-2462e19419e7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 08 09:21:53 crc kubenswrapper[4776]: I1208 09:21:53.231330 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81ff1f5b-522f-4e63-84b6-2462e19419e7-kube-api-access-xkbzw" (OuterVolumeSpecName: "kube-api-access-xkbzw") pod "81ff1f5b-522f-4e63-84b6-2462e19419e7" (UID: "81ff1f5b-522f-4e63-84b6-2462e19419e7"). InnerVolumeSpecName "kube-api-access-xkbzw".
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:21:53 crc kubenswrapper[4776]: I1208 09:21:53.262520 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-3e4f-account-create-update-27jsm" Dec 08 09:21:53 crc kubenswrapper[4776]: I1208 09:21:53.329231 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81ff1f5b-522f-4e63-84b6-2462e19419e7-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 09:21:53 crc kubenswrapper[4776]: I1208 09:21:53.329256 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkbzw\" (UniqueName: \"kubernetes.io/projected/81ff1f5b-522f-4e63-84b6-2462e19419e7-kube-api-access-xkbzw\") on node \"crc\" DevicePath \"\"" Dec 08 09:21:54 crc kubenswrapper[4776]: I1208 09:21:53.431114 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpt29\" (UniqueName: \"kubernetes.io/projected/285be34f-c8bd-45c4-9e7c-da900aaa3fd2-kube-api-access-jpt29\") pod \"285be34f-c8bd-45c4-9e7c-da900aaa3fd2\" (UID: \"285be34f-c8bd-45c4-9e7c-da900aaa3fd2\") " Dec 08 09:21:54 crc kubenswrapper[4776]: I1208 09:21:53.431600 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/285be34f-c8bd-45c4-9e7c-da900aaa3fd2-operator-scripts\") pod \"285be34f-c8bd-45c4-9e7c-da900aaa3fd2\" (UID: \"285be34f-c8bd-45c4-9e7c-da900aaa3fd2\") " Dec 08 09:21:54 crc kubenswrapper[4776]: I1208 09:21:53.438426 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/285be34f-c8bd-45c4-9e7c-da900aaa3fd2-kube-api-access-jpt29" (OuterVolumeSpecName: "kube-api-access-jpt29") pod "285be34f-c8bd-45c4-9e7c-da900aaa3fd2" (UID: "285be34f-c8bd-45c4-9e7c-da900aaa3fd2"). InnerVolumeSpecName "kube-api-access-jpt29". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:21:54 crc kubenswrapper[4776]: I1208 09:21:53.456769 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/285be34f-c8bd-45c4-9e7c-da900aaa3fd2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "285be34f-c8bd-45c4-9e7c-da900aaa3fd2" (UID: "285be34f-c8bd-45c4-9e7c-da900aaa3fd2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:21:54 crc kubenswrapper[4776]: I1208 09:21:53.535451 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpt29\" (UniqueName: \"kubernetes.io/projected/285be34f-c8bd-45c4-9e7c-da900aaa3fd2-kube-api-access-jpt29\") on node \"crc\" DevicePath \"\"" Dec 08 09:21:54 crc kubenswrapper[4776]: I1208 09:21:53.535496 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/285be34f-c8bd-45c4-9e7c-da900aaa3fd2-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 09:21:54 crc kubenswrapper[4776]: I1208 09:21:53.664775 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-rfbws" event={"ID":"81ff1f5b-522f-4e63-84b6-2462e19419e7","Type":"ContainerDied","Data":"75723d29c0416ead1e74746cb4ee249dd4c2d1efe6dae640ea49b2ceeaee0c9d"} Dec 08 09:21:54 crc kubenswrapper[4776]: I1208 09:21:53.664825 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75723d29c0416ead1e74746cb4ee249dd4c2d1efe6dae640ea49b2ceeaee0c9d" Dec 08 09:21:54 crc kubenswrapper[4776]: I1208 09:21:53.664906 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-rfbws" Dec 08 09:21:54 crc kubenswrapper[4776]: I1208 09:21:53.671995 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-3e4f-account-create-update-27jsm" event={"ID":"285be34f-c8bd-45c4-9e7c-da900aaa3fd2","Type":"ContainerDied","Data":"6e18116cb5c7f3f16ab156c7632aae2db5750a06f752bcae5a1c86646dc26426"} Dec 08 09:21:54 crc kubenswrapper[4776]: I1208 09:21:53.672042 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e18116cb5c7f3f16ab156c7632aae2db5750a06f752bcae5a1c86646dc26426" Dec 08 09:21:54 crc kubenswrapper[4776]: I1208 09:21:53.672107 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-3e4f-account-create-update-27jsm" Dec 08 09:21:54 crc kubenswrapper[4776]: W1208 09:21:53.729137 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fc47007_7b1d_458c_b1ee_f561fff88bd7.slice/crio-f6b0583fa8f49dd932079f7c7ffd124c5e663077f826e6ebd2d7a5cc2cc5f0fe WatchSource:0}: Error finding container f6b0583fa8f49dd932079f7c7ffd124c5e663077f826e6ebd2d7a5cc2cc5f0fe: Status 404 returned error can't find the container with id f6b0583fa8f49dd932079f7c7ffd124c5e663077f826e6ebd2d7a5cc2cc5f0fe Dec 08 09:21:54 crc kubenswrapper[4776]: I1208 09:21:53.738494 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w8l8r"] Dec 08 09:21:54 crc kubenswrapper[4776]: I1208 09:21:54.684272 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wpgmk" event={"ID":"9a9a1b68-ec7e-4994-9bda-fd418747dbc5","Type":"ContainerStarted","Data":"bca15b30a398e4a0fa64ef918fcb9485bfde50433c53539967b123c5d573c122"} Dec 08 09:21:54 crc kubenswrapper[4776]: I1208 09:21:54.685891 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ovn-controller-wpgmk" Dec 08 09:21:54 crc kubenswrapper[4776]: I1208 09:21:54.687970 4776 generic.go:334] "Generic (PLEG): container finished" podID="8fc47007-7b1d-458c-b1ee-f561fff88bd7" containerID="71bf9bcd196e3a24f9f5769f157fb8a856da68c96d13c89c446ba0afcf74d052" exitCode=0 Dec 08 09:21:54 crc kubenswrapper[4776]: I1208 09:21:54.688006 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w8l8r" event={"ID":"8fc47007-7b1d-458c-b1ee-f561fff88bd7","Type":"ContainerDied","Data":"71bf9bcd196e3a24f9f5769f157fb8a856da68c96d13c89c446ba0afcf74d052"} Dec 08 09:21:54 crc kubenswrapper[4776]: I1208 09:21:54.688024 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w8l8r" event={"ID":"8fc47007-7b1d-458c-b1ee-f561fff88bd7","Type":"ContainerStarted","Data":"f6b0583fa8f49dd932079f7c7ffd124c5e663077f826e6ebd2d7a5cc2cc5f0fe"} Dec 08 09:21:54 crc kubenswrapper[4776]: I1208 09:21:54.718057 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-wpgmk" podStartSLOduration=11.588628477 podStartE2EDuration="40.718036714s" podCreationTimestamp="2025-12-08 09:21:14 +0000 UTC" firstStartedPulling="2025-12-08 09:21:24.724039439 +0000 UTC m=+1360.987264461" lastFinishedPulling="2025-12-08 09:21:53.853447676 +0000 UTC m=+1390.116672698" observedRunningTime="2025-12-08 09:21:54.703674019 +0000 UTC m=+1390.966899041" watchObservedRunningTime="2025-12-08 09:21:54.718036714 +0000 UTC m=+1390.981261736" Dec 08 09:21:55 crc kubenswrapper[4776]: I1208 09:21:55.249465 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-ps8vp"] Dec 08 09:21:55 crc kubenswrapper[4776]: E1208 09:21:55.249869 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81ff1f5b-522f-4e63-84b6-2462e19419e7" containerName="mariadb-database-create" Dec 08 09:21:55 crc kubenswrapper[4776]: I1208 
09:21:55.249884 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="81ff1f5b-522f-4e63-84b6-2462e19419e7" containerName="mariadb-database-create" Dec 08 09:21:55 crc kubenswrapper[4776]: E1208 09:21:55.249931 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="285be34f-c8bd-45c4-9e7c-da900aaa3fd2" containerName="mariadb-account-create-update" Dec 08 09:21:55 crc kubenswrapper[4776]: I1208 09:21:55.249939 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="285be34f-c8bd-45c4-9e7c-da900aaa3fd2" containerName="mariadb-account-create-update" Dec 08 09:21:55 crc kubenswrapper[4776]: I1208 09:21:55.250129 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="81ff1f5b-522f-4e63-84b6-2462e19419e7" containerName="mariadb-database-create" Dec 08 09:21:55 crc kubenswrapper[4776]: I1208 09:21:55.250145 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="285be34f-c8bd-45c4-9e7c-da900aaa3fd2" containerName="mariadb-account-create-update" Dec 08 09:21:55 crc kubenswrapper[4776]: I1208 09:21:55.250930 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-ps8vp" Dec 08 09:21:55 crc kubenswrapper[4776]: I1208 09:21:55.260601 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-ps8vp"] Dec 08 09:21:55 crc kubenswrapper[4776]: I1208 09:21:55.299472 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c45c98e-e12c-404f-8684-6d34481e2cee-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-ps8vp\" (UID: \"5c45c98e-e12c-404f-8684-6d34481e2cee\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-ps8vp" Dec 08 09:21:55 crc kubenswrapper[4776]: I1208 09:21:55.300168 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7knf\" (UniqueName: \"kubernetes.io/projected/5c45c98e-e12c-404f-8684-6d34481e2cee-kube-api-access-w7knf\") pod \"mysqld-exporter-openstack-cell1-db-create-ps8vp\" (UID: \"5c45c98e-e12c-404f-8684-6d34481e2cee\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-ps8vp" Dec 08 09:21:55 crc kubenswrapper[4776]: I1208 09:21:55.402600 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7knf\" (UniqueName: \"kubernetes.io/projected/5c45c98e-e12c-404f-8684-6d34481e2cee-kube-api-access-w7knf\") pod \"mysqld-exporter-openstack-cell1-db-create-ps8vp\" (UID: \"5c45c98e-e12c-404f-8684-6d34481e2cee\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-ps8vp" Dec 08 09:21:55 crc kubenswrapper[4776]: I1208 09:21:55.403323 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c45c98e-e12c-404f-8684-6d34481e2cee-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-ps8vp\" (UID: \"5c45c98e-e12c-404f-8684-6d34481e2cee\") " 
pod="openstack/mysqld-exporter-openstack-cell1-db-create-ps8vp" Dec 08 09:21:55 crc kubenswrapper[4776]: I1208 09:21:55.404023 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c45c98e-e12c-404f-8684-6d34481e2cee-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-ps8vp\" (UID: \"5c45c98e-e12c-404f-8684-6d34481e2cee\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-ps8vp" Dec 08 09:21:55 crc kubenswrapper[4776]: I1208 09:21:55.436153 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7knf\" (UniqueName: \"kubernetes.io/projected/5c45c98e-e12c-404f-8684-6d34481e2cee-kube-api-access-w7knf\") pod \"mysqld-exporter-openstack-cell1-db-create-ps8vp\" (UID: \"5c45c98e-e12c-404f-8684-6d34481e2cee\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-ps8vp" Dec 08 09:21:55 crc kubenswrapper[4776]: I1208 09:21:55.477287 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-827c-account-create-update-2zj8b"] Dec 08 09:21:55 crc kubenswrapper[4776]: I1208 09:21:55.478937 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-827c-account-create-update-2zj8b" Dec 08 09:21:55 crc kubenswrapper[4776]: I1208 09:21:55.481556 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-cell1-db-secret" Dec 08 09:21:55 crc kubenswrapper[4776]: I1208 09:21:55.491807 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-827c-account-create-update-2zj8b"] Dec 08 09:21:55 crc kubenswrapper[4776]: I1208 09:21:55.514531 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbsx7\" (UniqueName: \"kubernetes.io/projected/0920722c-0e43-40c4-8c74-63d6bbb9b419-kube-api-access-bbsx7\") pod \"mysqld-exporter-827c-account-create-update-2zj8b\" (UID: \"0920722c-0e43-40c4-8c74-63d6bbb9b419\") " pod="openstack/mysqld-exporter-827c-account-create-update-2zj8b" Dec 08 09:21:55 crc kubenswrapper[4776]: I1208 09:21:55.515102 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0920722c-0e43-40c4-8c74-63d6bbb9b419-operator-scripts\") pod \"mysqld-exporter-827c-account-create-update-2zj8b\" (UID: \"0920722c-0e43-40c4-8c74-63d6bbb9b419\") " pod="openstack/mysqld-exporter-827c-account-create-update-2zj8b" Dec 08 09:21:55 crc kubenswrapper[4776]: I1208 09:21:55.576861 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-ps8vp" Dec 08 09:21:55 crc kubenswrapper[4776]: I1208 09:21:55.616468 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0920722c-0e43-40c4-8c74-63d6bbb9b419-operator-scripts\") pod \"mysqld-exporter-827c-account-create-update-2zj8b\" (UID: \"0920722c-0e43-40c4-8c74-63d6bbb9b419\") " pod="openstack/mysqld-exporter-827c-account-create-update-2zj8b" Dec 08 09:21:55 crc kubenswrapper[4776]: I1208 09:21:55.616521 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbsx7\" (UniqueName: \"kubernetes.io/projected/0920722c-0e43-40c4-8c74-63d6bbb9b419-kube-api-access-bbsx7\") pod \"mysqld-exporter-827c-account-create-update-2zj8b\" (UID: \"0920722c-0e43-40c4-8c74-63d6bbb9b419\") " pod="openstack/mysqld-exporter-827c-account-create-update-2zj8b" Dec 08 09:21:55 crc kubenswrapper[4776]: I1208 09:21:55.617226 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0920722c-0e43-40c4-8c74-63d6bbb9b419-operator-scripts\") pod \"mysqld-exporter-827c-account-create-update-2zj8b\" (UID: \"0920722c-0e43-40c4-8c74-63d6bbb9b419\") " pod="openstack/mysqld-exporter-827c-account-create-update-2zj8b" Dec 08 09:21:55 crc kubenswrapper[4776]: I1208 09:21:55.632354 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbsx7\" (UniqueName: \"kubernetes.io/projected/0920722c-0e43-40c4-8c74-63d6bbb9b419-kube-api-access-bbsx7\") pod \"mysqld-exporter-827c-account-create-update-2zj8b\" (UID: \"0920722c-0e43-40c4-8c74-63d6bbb9b419\") " pod="openstack/mysqld-exporter-827c-account-create-update-2zj8b" Dec 08 09:21:55 crc kubenswrapper[4776]: I1208 09:21:55.817242 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-827c-account-create-update-2zj8b" Dec 08 09:21:56 crc kubenswrapper[4776]: I1208 09:21:56.428576 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-588757d595-b54s9" podUID="f6afc1e0-d554-4135-a114-4cc735150c43" containerName="console" containerID="cri-o://32928cbe8a4b53150016bc031db50b46bfbdeedf8ebddfdde86638d7d5774545" gracePeriod=15 Dec 08 09:21:56 crc kubenswrapper[4776]: I1208 09:21:56.460418 4776 patch_prober.go:28] interesting pod/console-588757d595-b54s9 container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.90:8443/health\": dial tcp 10.217.0.90:8443: connect: connection refused" start-of-body= Dec 08 09:21:56 crc kubenswrapper[4776]: I1208 09:21:56.460534 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-588757d595-b54s9" podUID="f6afc1e0-d554-4135-a114-4cc735150c43" containerName="console" probeResult="failure" output="Get \"https://10.217.0.90:8443/health\": dial tcp 10.217.0.90:8443: connect: connection refused" Dec 08 09:21:56 crc kubenswrapper[4776]: I1208 09:21:56.712839 4776 generic.go:334] "Generic (PLEG): container finished" podID="a01574f0-d8c8-404a-b822-7ce8e0af6fd4" containerID="b6fb7c3067a1dc9a57114d5f89cfbc05711c41bf27c557af79d1dbb9fcb89acd" exitCode=0 Dec 08 09:21:56 crc kubenswrapper[4776]: I1208 09:21:56.712907 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a01574f0-d8c8-404a-b822-7ce8e0af6fd4","Type":"ContainerDied","Data":"b6fb7c3067a1dc9a57114d5f89cfbc05711c41bf27c557af79d1dbb9fcb89acd"} Dec 08 09:21:56 crc kubenswrapper[4776]: I1208 09:21:56.717761 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-588757d595-b54s9_f6afc1e0-d554-4135-a114-4cc735150c43/console/0.log" Dec 08 09:21:56 crc kubenswrapper[4776]: I1208 09:21:56.717803 4776 
generic.go:334] "Generic (PLEG): container finished" podID="f6afc1e0-d554-4135-a114-4cc735150c43" containerID="32928cbe8a4b53150016bc031db50b46bfbdeedf8ebddfdde86638d7d5774545" exitCode=2 Dec 08 09:21:56 crc kubenswrapper[4776]: I1208 09:21:56.717828 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-588757d595-b54s9" event={"ID":"f6afc1e0-d554-4135-a114-4cc735150c43","Type":"ContainerDied","Data":"32928cbe8a4b53150016bc031db50b46bfbdeedf8ebddfdde86638d7d5774545"} Dec 08 09:21:57 crc kubenswrapper[4776]: I1208 09:21:57.730596 4776 generic.go:334] "Generic (PLEG): container finished" podID="bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994" containerID="4c055d7aed43594abdf15af1327c7f77c2e5b1d61e62e8c3406e155a3f4672e3" exitCode=0 Dec 08 09:21:57 crc kubenswrapper[4776]: I1208 09:21:57.730669 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994","Type":"ContainerDied","Data":"4c055d7aed43594abdf15af1327c7f77c2e5b1d61e62e8c3406e155a3f4672e3"} Dec 08 09:21:57 crc kubenswrapper[4776]: I1208 09:21:57.829023 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-ww8jk"] Dec 08 09:21:57 crc kubenswrapper[4776]: I1208 09:21:57.830526 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-ww8jk" Dec 08 09:21:57 crc kubenswrapper[4776]: I1208 09:21:57.836763 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-ww8jk"] Dec 08 09:21:57 crc kubenswrapper[4776]: I1208 09:21:57.864861 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8b6a12b-b8f7-4754-9bc0-5f0ba81fe746-operator-scripts\") pod \"keystone-db-create-ww8jk\" (UID: \"e8b6a12b-b8f7-4754-9bc0-5f0ba81fe746\") " pod="openstack/keystone-db-create-ww8jk" Dec 08 09:21:57 crc kubenswrapper[4776]: I1208 09:21:57.864939 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djplb\" (UniqueName: \"kubernetes.io/projected/e8b6a12b-b8f7-4754-9bc0-5f0ba81fe746-kube-api-access-djplb\") pod \"keystone-db-create-ww8jk\" (UID: \"e8b6a12b-b8f7-4754-9bc0-5f0ba81fe746\") " pod="openstack/keystone-db-create-ww8jk" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.015910 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8b6a12b-b8f7-4754-9bc0-5f0ba81fe746-operator-scripts\") pod \"keystone-db-create-ww8jk\" (UID: \"e8b6a12b-b8f7-4754-9bc0-5f0ba81fe746\") " pod="openstack/keystone-db-create-ww8jk" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.016597 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djplb\" (UniqueName: \"kubernetes.io/projected/e8b6a12b-b8f7-4754-9bc0-5f0ba81fe746-kube-api-access-djplb\") pod \"keystone-db-create-ww8jk\" (UID: \"e8b6a12b-b8f7-4754-9bc0-5f0ba81fe746\") " pod="openstack/keystone-db-create-ww8jk" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.034742 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/e8b6a12b-b8f7-4754-9bc0-5f0ba81fe746-operator-scripts\") pod \"keystone-db-create-ww8jk\" (UID: \"e8b6a12b-b8f7-4754-9bc0-5f0ba81fe746\") " pod="openstack/keystone-db-create-ww8jk" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.045902 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djplb\" (UniqueName: \"kubernetes.io/projected/e8b6a12b-b8f7-4754-9bc0-5f0ba81fe746-kube-api-access-djplb\") pod \"keystone-db-create-ww8jk\" (UID: \"e8b6a12b-b8f7-4754-9bc0-5f0ba81fe746\") " pod="openstack/keystone-db-create-ww8jk" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.052012 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-3646-account-create-update-9fgcf"] Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.053388 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3646-account-create-update-9fgcf" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.055949 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.074371 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-3646-account-create-update-9fgcf"] Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.089664 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-ww8jk" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.132119 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de62cc1f-e51f-4ae9-9de7-df5d8cb27cdf-operator-scripts\") pod \"keystone-3646-account-create-update-9fgcf\" (UID: \"de62cc1f-e51f-4ae9-9de7-df5d8cb27cdf\") " pod="openstack/keystone-3646-account-create-update-9fgcf" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.132283 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp2sf\" (UniqueName: \"kubernetes.io/projected/de62cc1f-e51f-4ae9-9de7-df5d8cb27cdf-kube-api-access-vp2sf\") pod \"keystone-3646-account-create-update-9fgcf\" (UID: \"de62cc1f-e51f-4ae9-9de7-df5d8cb27cdf\") " pod="openstack/keystone-3646-account-create-update-9fgcf" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.168979 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-xlh2r"] Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.170518 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-xlh2r" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.204233 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-xlh2r"] Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.238503 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de62cc1f-e51f-4ae9-9de7-df5d8cb27cdf-operator-scripts\") pod \"keystone-3646-account-create-update-9fgcf\" (UID: \"de62cc1f-e51f-4ae9-9de7-df5d8cb27cdf\") " pod="openstack/keystone-3646-account-create-update-9fgcf" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.238573 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vp2sf\" (UniqueName: \"kubernetes.io/projected/de62cc1f-e51f-4ae9-9de7-df5d8cb27cdf-kube-api-access-vp2sf\") pod \"keystone-3646-account-create-update-9fgcf\" (UID: \"de62cc1f-e51f-4ae9-9de7-df5d8cb27cdf\") " pod="openstack/keystone-3646-account-create-update-9fgcf" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.239791 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de62cc1f-e51f-4ae9-9de7-df5d8cb27cdf-operator-scripts\") pod \"keystone-3646-account-create-update-9fgcf\" (UID: \"de62cc1f-e51f-4ae9-9de7-df5d8cb27cdf\") " pod="openstack/keystone-3646-account-create-update-9fgcf" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.268805 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-588757d595-b54s9_f6afc1e0-d554-4135-a114-4cc735150c43/console/0.log" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.268866 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-588757d595-b54s9" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.277629 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7b88-account-create-update-kxcml"] Dec 08 09:21:58 crc kubenswrapper[4776]: E1208 09:21:58.278046 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6afc1e0-d554-4135-a114-4cc735150c43" containerName="console" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.278062 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6afc1e0-d554-4135-a114-4cc735150c43" containerName="console" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.278231 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6afc1e0-d554-4135-a114-4cc735150c43" containerName="console" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.281603 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7b88-account-create-update-kxcml" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.283649 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.286814 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp2sf\" (UniqueName: \"kubernetes.io/projected/de62cc1f-e51f-4ae9-9de7-df5d8cb27cdf-kube-api-access-vp2sf\") pod \"keystone-3646-account-create-update-9fgcf\" (UID: \"de62cc1f-e51f-4ae9-9de7-df5d8cb27cdf\") " pod="openstack/keystone-3646-account-create-update-9fgcf" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.304209 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7b88-account-create-update-kxcml"] Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.363038 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/189aea5a-3eb2-41b7-9431-21a0acf13db7-operator-scripts\") pod \"placement-db-create-xlh2r\" (UID: \"189aea5a-3eb2-41b7-9431-21a0acf13db7\") " pod="openstack/placement-db-create-xlh2r" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.363519 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvfjc\" (UniqueName: \"kubernetes.io/projected/189aea5a-3eb2-41b7-9431-21a0acf13db7-kube-api-access-bvfjc\") pod \"placement-db-create-xlh2r\" (UID: \"189aea5a-3eb2-41b7-9431-21a0acf13db7\") " pod="openstack/placement-db-create-xlh2r" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.423269 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3646-account-create-update-9fgcf" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.468483 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f6afc1e0-d554-4135-a114-4cc735150c43-console-config\") pod \"f6afc1e0-d554-4135-a114-4cc735150c43\" (UID: \"f6afc1e0-d554-4135-a114-4cc735150c43\") " Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.468529 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6afc1e0-d554-4135-a114-4cc735150c43-trusted-ca-bundle\") pod \"f6afc1e0-d554-4135-a114-4cc735150c43\" (UID: \"f6afc1e0-d554-4135-a114-4cc735150c43\") " Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.468639 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f6afc1e0-d554-4135-a114-4cc735150c43-console-oauth-config\") pod \"f6afc1e0-d554-4135-a114-4cc735150c43\" (UID: \"f6afc1e0-d554-4135-a114-4cc735150c43\") " Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.468684 4776 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f6afc1e0-d554-4135-a114-4cc735150c43-oauth-serving-cert\") pod \"f6afc1e0-d554-4135-a114-4cc735150c43\" (UID: \"f6afc1e0-d554-4135-a114-4cc735150c43\") " Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.468725 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cm7nd\" (UniqueName: \"kubernetes.io/projected/f6afc1e0-d554-4135-a114-4cc735150c43-kube-api-access-cm7nd\") pod \"f6afc1e0-d554-4135-a114-4cc735150c43\" (UID: \"f6afc1e0-d554-4135-a114-4cc735150c43\") " Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.468792 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f6afc1e0-d554-4135-a114-4cc735150c43-console-serving-cert\") pod \"f6afc1e0-d554-4135-a114-4cc735150c43\" (UID: \"f6afc1e0-d554-4135-a114-4cc735150c43\") " Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.468885 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f6afc1e0-d554-4135-a114-4cc735150c43-service-ca\") pod \"f6afc1e0-d554-4135-a114-4cc735150c43\" (UID: \"f6afc1e0-d554-4135-a114-4cc735150c43\") " Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.469206 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76ca76af-146b-4bb6-b676-1f9c8fd7f512-operator-scripts\") pod \"placement-7b88-account-create-update-kxcml\" (UID: \"76ca76af-146b-4bb6-b676-1f9c8fd7f512\") " pod="openstack/placement-7b88-account-create-update-kxcml" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.469327 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvfjc\" (UniqueName: 
\"kubernetes.io/projected/189aea5a-3eb2-41b7-9431-21a0acf13db7-kube-api-access-bvfjc\") pod \"placement-db-create-xlh2r\" (UID: \"189aea5a-3eb2-41b7-9431-21a0acf13db7\") " pod="openstack/placement-db-create-xlh2r" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.469404 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5sz9\" (UniqueName: \"kubernetes.io/projected/76ca76af-146b-4bb6-b676-1f9c8fd7f512-kube-api-access-d5sz9\") pod \"placement-7b88-account-create-update-kxcml\" (UID: \"76ca76af-146b-4bb6-b676-1f9c8fd7f512\") " pod="openstack/placement-7b88-account-create-update-kxcml" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.469461 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/189aea5a-3eb2-41b7-9431-21a0acf13db7-operator-scripts\") pod \"placement-db-create-xlh2r\" (UID: \"189aea5a-3eb2-41b7-9431-21a0acf13db7\") " pod="openstack/placement-db-create-xlh2r" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.470089 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/189aea5a-3eb2-41b7-9431-21a0acf13db7-operator-scripts\") pod \"placement-db-create-xlh2r\" (UID: \"189aea5a-3eb2-41b7-9431-21a0acf13db7\") " pod="openstack/placement-db-create-xlh2r" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.470679 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6afc1e0-d554-4135-a114-4cc735150c43-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f6afc1e0-d554-4135-a114-4cc735150c43" (UID: "f6afc1e0-d554-4135-a114-4cc735150c43"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.471503 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6afc1e0-d554-4135-a114-4cc735150c43-console-config" (OuterVolumeSpecName: "console-config") pod "f6afc1e0-d554-4135-a114-4cc735150c43" (UID: "f6afc1e0-d554-4135-a114-4cc735150c43"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.472378 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6afc1e0-d554-4135-a114-4cc735150c43-service-ca" (OuterVolumeSpecName: "service-ca") pod "f6afc1e0-d554-4135-a114-4cc735150c43" (UID: "f6afc1e0-d554-4135-a114-4cc735150c43"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.477811 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6afc1e0-d554-4135-a114-4cc735150c43-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f6afc1e0-d554-4135-a114-4cc735150c43" (UID: "f6afc1e0-d554-4135-a114-4cc735150c43"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.484315 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6afc1e0-d554-4135-a114-4cc735150c43-kube-api-access-cm7nd" (OuterVolumeSpecName: "kube-api-access-cm7nd") pod "f6afc1e0-d554-4135-a114-4cc735150c43" (UID: "f6afc1e0-d554-4135-a114-4cc735150c43"). InnerVolumeSpecName "kube-api-access-cm7nd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.484414 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6afc1e0-d554-4135-a114-4cc735150c43-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f6afc1e0-d554-4135-a114-4cc735150c43" (UID: "f6afc1e0-d554-4135-a114-4cc735150c43"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.486288 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6afc1e0-d554-4135-a114-4cc735150c43-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f6afc1e0-d554-4135-a114-4cc735150c43" (UID: "f6afc1e0-d554-4135-a114-4cc735150c43"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.495739 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvfjc\" (UniqueName: \"kubernetes.io/projected/189aea5a-3eb2-41b7-9431-21a0acf13db7-kube-api-access-bvfjc\") pod \"placement-db-create-xlh2r\" (UID: \"189aea5a-3eb2-41b7-9431-21a0acf13db7\") " pod="openstack/placement-db-create-xlh2r" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.509227 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-xlh2r" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.570900 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5sz9\" (UniqueName: \"kubernetes.io/projected/76ca76af-146b-4bb6-b676-1f9c8fd7f512-kube-api-access-d5sz9\") pod \"placement-7b88-account-create-update-kxcml\" (UID: \"76ca76af-146b-4bb6-b676-1f9c8fd7f512\") " pod="openstack/placement-7b88-account-create-update-kxcml" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.571008 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76ca76af-146b-4bb6-b676-1f9c8fd7f512-operator-scripts\") pod \"placement-7b88-account-create-update-kxcml\" (UID: \"76ca76af-146b-4bb6-b676-1f9c8fd7f512\") " pod="openstack/placement-7b88-account-create-update-kxcml" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.571120 4776 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f6afc1e0-d554-4135-a114-4cc735150c43-console-config\") on node \"crc\" DevicePath \"\"" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.571134 4776 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6afc1e0-d554-4135-a114-4cc735150c43-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.571145 4776 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f6afc1e0-d554-4135-a114-4cc735150c43-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.571442 4776 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f6afc1e0-d554-4135-a114-4cc735150c43-oauth-serving-cert\") on node \"crc\" 
DevicePath \"\"" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.571547 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cm7nd\" (UniqueName: \"kubernetes.io/projected/f6afc1e0-d554-4135-a114-4cc735150c43-kube-api-access-cm7nd\") on node \"crc\" DevicePath \"\"" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.571575 4776 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f6afc1e0-d554-4135-a114-4cc735150c43-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.571586 4776 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f6afc1e0-d554-4135-a114-4cc735150c43-service-ca\") on node \"crc\" DevicePath \"\"" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.572149 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76ca76af-146b-4bb6-b676-1f9c8fd7f512-operator-scripts\") pod \"placement-7b88-account-create-update-kxcml\" (UID: \"76ca76af-146b-4bb6-b676-1f9c8fd7f512\") " pod="openstack/placement-7b88-account-create-update-kxcml" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.581182 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-ps8vp"] Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.587014 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5sz9\" (UniqueName: \"kubernetes.io/projected/76ca76af-146b-4bb6-b676-1f9c8fd7f512-kube-api-access-d5sz9\") pod \"placement-7b88-account-create-update-kxcml\" (UID: \"76ca76af-146b-4bb6-b676-1f9c8fd7f512\") " pod="openstack/placement-7b88-account-create-update-kxcml" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.609796 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7b88-account-create-update-kxcml" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.704380 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-827c-account-create-update-2zj8b"] Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.741369 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-ps8vp" event={"ID":"5c45c98e-e12c-404f-8684-6d34481e2cee","Type":"ContainerStarted","Data":"e6b214ede1d98b018aafa5355a93ad998915690830581e61db2a1c7b449c54e4"} Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.748246 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"786c0b37-638a-4b59-b149-628d9ad828bc","Type":"ContainerStarted","Data":"a6ea82a0a0fdbce89463cbb259477af5f32226f11448cdce50567a591f2cc6f2"} Dec 08 09:21:58 crc kubenswrapper[4776]: W1208 09:21:58.755579 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0920722c_0e43_40c4_8c74_63d6bbb9b419.slice/crio-83a21bc007d936f44305d25e35e882b2d26cf36a686b5401289541f0279c496c WatchSource:0}: Error finding container 83a21bc007d936f44305d25e35e882b2d26cf36a686b5401289541f0279c496c: Status 404 returned error can't find the container with id 83a21bc007d936f44305d25e35e882b2d26cf36a686b5401289541f0279c496c Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.757591 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-588757d595-b54s9_f6afc1e0-d554-4135-a114-4cc735150c43/console/0.log" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.757760 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-588757d595-b54s9" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.758431 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-588757d595-b54s9" event={"ID":"f6afc1e0-d554-4135-a114-4cc735150c43","Type":"ContainerDied","Data":"0bfb45fca9e0916bc267d5cff8e5432583aa7aeab826768222041f483c3383a8"} Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.758476 4776 scope.go:117] "RemoveContainer" containerID="32928cbe8a4b53150016bc031db50b46bfbdeedf8ebddfdde86638d7d5774545" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.778123 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994","Type":"ContainerStarted","Data":"44a9f7ec71ea62b7d079dba7205175959bf86b790c3ffbd0bd9ec7d8f84229bb"} Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.778424 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.783623 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a01574f0-d8c8-404a-b822-7ce8e0af6fd4","Type":"ContainerStarted","Data":"2890fe37e7829396223b419f85f8fdf135c4324b82cd2bab822c2cb2fb7fbd3a"} Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.784451 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.807396 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=44.020895076 podStartE2EDuration="55.807378986s" podCreationTimestamp="2025-12-08 09:21:03 +0000 UTC" firstStartedPulling="2025-12-08 09:21:11.179582431 +0000 UTC m=+1347.442807453" lastFinishedPulling="2025-12-08 09:21:22.966066341 +0000 UTC m=+1359.229291363" 
observedRunningTime="2025-12-08 09:21:58.807014857 +0000 UTC m=+1395.070239879" watchObservedRunningTime="2025-12-08 09:21:58.807378986 +0000 UTC m=+1395.070604008" Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.859454 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-ww8jk"] Dec 08 09:21:58 crc kubenswrapper[4776]: I1208 09:21:58.861571 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=44.129694159 podStartE2EDuration="55.861550359s" podCreationTimestamp="2025-12-08 09:21:03 +0000 UTC" firstStartedPulling="2025-12-08 09:21:11.179142979 +0000 UTC m=+1347.442368011" lastFinishedPulling="2025-12-08 09:21:22.910999189 +0000 UTC m=+1359.174224211" observedRunningTime="2025-12-08 09:21:58.837974247 +0000 UTC m=+1395.101199289" watchObservedRunningTime="2025-12-08 09:21:58.861550359 +0000 UTC m=+1395.124775381" Dec 08 09:21:59 crc kubenswrapper[4776]: I1208 09:21:59.008845 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-588757d595-b54s9"] Dec 08 09:21:59 crc kubenswrapper[4776]: I1208 09:21:59.023131 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-588757d595-b54s9"] Dec 08 09:21:59 crc kubenswrapper[4776]: I1208 09:21:59.046613 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-3646-account-create-update-9fgcf"] Dec 08 09:21:59 crc kubenswrapper[4776]: I1208 09:21:59.219151 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-xlh2r"] Dec 08 09:21:59 crc kubenswrapper[4776]: W1208 09:21:59.221308 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod189aea5a_3eb2_41b7_9431_21a0acf13db7.slice/crio-9c91ebc7448a6702168dcce973a2aa3a78c91d074a22a0113ba64df855f9c758 WatchSource:0}: Error finding container 
9c91ebc7448a6702168dcce973a2aa3a78c91d074a22a0113ba64df855f9c758: Status 404 returned error can't find the container with id 9c91ebc7448a6702168dcce973a2aa3a78c91d074a22a0113ba64df855f9c758 Dec 08 09:21:59 crc kubenswrapper[4776]: I1208 09:21:59.405051 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7b88-account-create-update-kxcml"] Dec 08 09:21:59 crc kubenswrapper[4776]: W1208 09:21:59.408414 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76ca76af_146b_4bb6_b676_1f9c8fd7f512.slice/crio-665e4d90d5b9ebbc538b8d62ce7a482340d3ad87623dd2003ee2513a1130c867 WatchSource:0}: Error finding container 665e4d90d5b9ebbc538b8d62ce7a482340d3ad87623dd2003ee2513a1130c867: Status 404 returned error can't find the container with id 665e4d90d5b9ebbc538b8d62ce7a482340d3ad87623dd2003ee2513a1130c867 Dec 08 09:21:59 crc kubenswrapper[4776]: I1208 09:21:59.791259 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3646-account-create-update-9fgcf" event={"ID":"de62cc1f-e51f-4ae9-9de7-df5d8cb27cdf","Type":"ContainerStarted","Data":"6b9dd83b81a7d8013c1bf74b478009c9479c41e2c080c879bd26f66c66e76281"} Dec 08 09:21:59 crc kubenswrapper[4776]: I1208 09:21:59.792451 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-xlh2r" event={"ID":"189aea5a-3eb2-41b7-9431-21a0acf13db7","Type":"ContainerStarted","Data":"9c91ebc7448a6702168dcce973a2aa3a78c91d074a22a0113ba64df855f9c758"} Dec 08 09:21:59 crc kubenswrapper[4776]: I1208 09:21:59.793599 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-ww8jk" event={"ID":"e8b6a12b-b8f7-4754-9bc0-5f0ba81fe746","Type":"ContainerStarted","Data":"dc39798b7c6745f6cc035bcf0b8259b4a2ec3edb0751d950cdf3b3f2f6627c1e"} Dec 08 09:21:59 crc kubenswrapper[4776]: I1208 09:21:59.794884 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-7b88-account-create-update-kxcml" event={"ID":"76ca76af-146b-4bb6-b676-1f9c8fd7f512","Type":"ContainerStarted","Data":"665e4d90d5b9ebbc538b8d62ce7a482340d3ad87623dd2003ee2513a1130c867"} Dec 08 09:21:59 crc kubenswrapper[4776]: I1208 09:21:59.795996 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-827c-account-create-update-2zj8b" event={"ID":"0920722c-0e43-40c4-8c74-63d6bbb9b419","Type":"ContainerStarted","Data":"83a21bc007d936f44305d25e35e882b2d26cf36a686b5401289541f0279c496c"} Dec 08 09:22:00 crc kubenswrapper[4776]: I1208 09:22:00.355092 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6afc1e0-d554-4135-a114-4cc735150c43" path="/var/lib/kubelet/pods/f6afc1e0-d554-4135-a114-4cc735150c43/volumes" Dec 08 09:22:00 crc kubenswrapper[4776]: I1208 09:22:00.810889 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3646-account-create-update-9fgcf" event={"ID":"de62cc1f-e51f-4ae9-9de7-df5d8cb27cdf","Type":"ContainerStarted","Data":"6f68a56f22b1c104f29ecde6f5bfdc13dba2bb4af424c0415f68b59480d396e3"} Dec 08 09:22:00 crc kubenswrapper[4776]: I1208 09:22:00.812312 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-xlh2r" event={"ID":"189aea5a-3eb2-41b7-9431-21a0acf13db7","Type":"ContainerStarted","Data":"dc78358dc9ace872fae809d857c55d4d957daddc58146d5b31fe46d84b67bf5c"} Dec 08 09:22:00 crc kubenswrapper[4776]: I1208 09:22:00.813562 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7b88-account-create-update-kxcml" event={"ID":"76ca76af-146b-4bb6-b676-1f9c8fd7f512","Type":"ContainerStarted","Data":"6ae3cc48666cc8b589590c920969cd7ed45e832b0fd68e32e257c45dac047dc8"} Dec 08 09:22:00 crc kubenswrapper[4776]: I1208 09:22:00.815056 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-827c-account-create-update-2zj8b" 
event={"ID":"0920722c-0e43-40c4-8c74-63d6bbb9b419","Type":"ContainerStarted","Data":"215c94b1341337698ac9fc9c0e9326d513646925688a984dd263fd1fb1f219e0"} Dec 08 09:22:00 crc kubenswrapper[4776]: I1208 09:22:00.816983 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w8l8r" event={"ID":"8fc47007-7b1d-458c-b1ee-f561fff88bd7","Type":"ContainerStarted","Data":"e4db7bb2f46397a744e23d7ff7362310822592c317b2c16c4bbc3a47b8ca9134"} Dec 08 09:22:00 crc kubenswrapper[4776]: I1208 09:22:00.818794 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"3d941bbc-2271-4ec4-853f-57feaf6ace36","Type":"ContainerStarted","Data":"90f0ba75b5fdd4e48c13558958358ed3a2b1fbcb29b322f1dbd987ffcfc55c33"} Dec 08 09:22:00 crc kubenswrapper[4776]: I1208 09:22:00.820600 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"981d14af-244f-4679-975d-58e11df95718","Type":"ContainerStarted","Data":"6e5db0d16ef8149ece04445ab66efb711ec4f14c5d805dabaeebb0d7febb33dc"} Dec 08 09:22:00 crc kubenswrapper[4776]: I1208 09:22:00.820836 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 08 09:22:00 crc kubenswrapper[4776]: I1208 09:22:00.821665 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-ww8jk" event={"ID":"e8b6a12b-b8f7-4754-9bc0-5f0ba81fe746","Type":"ContainerStarted","Data":"b7b95498f53d7901e339f9c83deba1b8e748ae9b78778385c70e68e7ac9e203b"} Dec 08 09:22:00 crc kubenswrapper[4776]: I1208 09:22:00.823235 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-ps8vp" event={"ID":"5c45c98e-e12c-404f-8684-6d34481e2cee","Type":"ContainerStarted","Data":"154c3c886ca8ad01f326fa76abb60ed578bd04ef518c989b81fa61e986454cee"} Dec 08 09:22:00 crc kubenswrapper[4776]: I1208 09:22:00.841363 4776 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/keystone-3646-account-create-update-9fgcf" podStartSLOduration=2.84134578 podStartE2EDuration="2.84134578s" podCreationTimestamp="2025-12-08 09:21:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:22:00.840699713 +0000 UTC m=+1397.103924735" watchObservedRunningTime="2025-12-08 09:22:00.84134578 +0000 UTC m=+1397.104570802" Dec 08 09:22:00 crc kubenswrapper[4776]: I1208 09:22:00.874832 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=20.465988091 podStartE2EDuration="53.874814968s" podCreationTimestamp="2025-12-08 09:21:07 +0000 UTC" firstStartedPulling="2025-12-08 09:21:24.619503257 +0000 UTC m=+1360.882728279" lastFinishedPulling="2025-12-08 09:21:58.028330134 +0000 UTC m=+1394.291555156" observedRunningTime="2025-12-08 09:22:00.873134253 +0000 UTC m=+1397.136359275" watchObservedRunningTime="2025-12-08 09:22:00.874814968 +0000 UTC m=+1397.138039990" Dec 08 09:22:00 crc kubenswrapper[4776]: I1208 09:22:00.928505 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-827c-account-create-update-2zj8b" podStartSLOduration=5.928490978 podStartE2EDuration="5.928490978s" podCreationTimestamp="2025-12-08 09:21:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:22:00.915972822 +0000 UTC m=+1397.179197844" watchObservedRunningTime="2025-12-08 09:22:00.928490978 +0000 UTC m=+1397.191716000" Dec 08 09:22:00 crc kubenswrapper[4776]: I1208 09:22:00.960496 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-ww8jk" podStartSLOduration=3.960479646 podStartE2EDuration="3.960479646s" podCreationTimestamp="2025-12-08 09:21:57 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:22:00.959795898 +0000 UTC m=+1397.223020920" watchObservedRunningTime="2025-12-08 09:22:00.960479646 +0000 UTC m=+1397.223704668" Dec 08 09:22:00 crc kubenswrapper[4776]: I1208 09:22:00.965468 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=12.661717941 podStartE2EDuration="45.96545979s" podCreationTimestamp="2025-12-08 09:21:15 +0000 UTC" firstStartedPulling="2025-12-08 09:21:24.724265805 +0000 UTC m=+1360.987490827" lastFinishedPulling="2025-12-08 09:21:58.028007654 +0000 UTC m=+1394.291232676" observedRunningTime="2025-12-08 09:22:00.943507221 +0000 UTC m=+1397.206732243" watchObservedRunningTime="2025-12-08 09:22:00.96545979 +0000 UTC m=+1397.228684812" Dec 08 09:22:00 crc kubenswrapper[4776]: I1208 09:22:00.984382 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-openstack-cell1-db-create-ps8vp" podStartSLOduration=5.984364737 podStartE2EDuration="5.984364737s" podCreationTimestamp="2025-12-08 09:21:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:22:00.978142851 +0000 UTC m=+1397.241367873" watchObservedRunningTime="2025-12-08 09:22:00.984364737 +0000 UTC m=+1397.247589759" Dec 08 09:22:01 crc kubenswrapper[4776]: I1208 09:22:01.002136 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-xlh2r" podStartSLOduration=3.002116893 podStartE2EDuration="3.002116893s" podCreationTimestamp="2025-12-08 09:21:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:22:00.994154499 +0000 UTC m=+1397.257379521" watchObservedRunningTime="2025-12-08 09:22:01.002116893 +0000 UTC 
m=+1397.265341915" Dec 08 09:22:01 crc kubenswrapper[4776]: I1208 09:22:01.021516 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7b88-account-create-update-kxcml" podStartSLOduration=3.021495383 podStartE2EDuration="3.021495383s" podCreationTimestamp="2025-12-08 09:21:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:22:01.011135055 +0000 UTC m=+1397.274360077" watchObservedRunningTime="2025-12-08 09:22:01.021495383 +0000 UTC m=+1397.284720405" Dec 08 09:22:01 crc kubenswrapper[4776]: I1208 09:22:01.842540 4776 generic.go:334] "Generic (PLEG): container finished" podID="8fc47007-7b1d-458c-b1ee-f561fff88bd7" containerID="e4db7bb2f46397a744e23d7ff7362310822592c317b2c16c4bbc3a47b8ca9134" exitCode=0 Dec 08 09:22:01 crc kubenswrapper[4776]: I1208 09:22:01.842633 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w8l8r" event={"ID":"8fc47007-7b1d-458c-b1ee-f561fff88bd7","Type":"ContainerDied","Data":"e4db7bb2f46397a744e23d7ff7362310822592c317b2c16c4bbc3a47b8ca9134"} Dec 08 09:22:01 crc kubenswrapper[4776]: I1208 09:22:01.844863 4776 generic.go:334] "Generic (PLEG): container finished" podID="189aea5a-3eb2-41b7-9431-21a0acf13db7" containerID="dc78358dc9ace872fae809d857c55d4d957daddc58146d5b31fe46d84b67bf5c" exitCode=0 Dec 08 09:22:01 crc kubenswrapper[4776]: I1208 09:22:01.844930 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-xlh2r" event={"ID":"189aea5a-3eb2-41b7-9431-21a0acf13db7","Type":"ContainerDied","Data":"dc78358dc9ace872fae809d857c55d4d957daddc58146d5b31fe46d84b67bf5c"} Dec 08 09:22:01 crc kubenswrapper[4776]: I1208 09:22:01.846884 4776 generic.go:334] "Generic (PLEG): container finished" podID="e8b6a12b-b8f7-4754-9bc0-5f0ba81fe746" containerID="b7b95498f53d7901e339f9c83deba1b8e748ae9b78778385c70e68e7ac9e203b" exitCode=0 
Dec 08 09:22:01 crc kubenswrapper[4776]: I1208 09:22:01.846930 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-ww8jk" event={"ID":"e8b6a12b-b8f7-4754-9bc0-5f0ba81fe746","Type":"ContainerDied","Data":"b7b95498f53d7901e339f9c83deba1b8e748ae9b78778385c70e68e7ac9e203b"}
Dec 08 09:22:01 crc kubenswrapper[4776]: I1208 09:22:01.850247 4776 generic.go:334] "Generic (PLEG): container finished" podID="5c45c98e-e12c-404f-8684-6d34481e2cee" containerID="154c3c886ca8ad01f326fa76abb60ed578bd04ef518c989b81fa61e986454cee" exitCode=0
Dec 08 09:22:01 crc kubenswrapper[4776]: I1208 09:22:01.850323 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-ps8vp" event={"ID":"5c45c98e-e12c-404f-8684-6d34481e2cee","Type":"ContainerDied","Data":"154c3c886ca8ad01f326fa76abb60ed578bd04ef518c989b81fa61e986454cee"}
Dec 08 09:22:01 crc kubenswrapper[4776]: I1208 09:22:01.853287 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"786c0b37-638a-4b59-b149-628d9ad828bc","Type":"ContainerStarted","Data":"d647656c5e94e4b5bcc9ec6ef7af6889f977f3689f9074bd4d8bab32d9fcc049"}
Dec 08 09:22:01 crc kubenswrapper[4776]: I1208 09:22:01.855850 4776 generic.go:334] "Generic (PLEG): container finished" podID="0920722c-0e43-40c4-8c74-63d6bbb9b419" containerID="215c94b1341337698ac9fc9c0e9326d513646925688a984dd263fd1fb1f219e0" exitCode=0
Dec 08 09:22:01 crc kubenswrapper[4776]: I1208 09:22:01.855899 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-827c-account-create-update-2zj8b" event={"ID":"0920722c-0e43-40c4-8c74-63d6bbb9b419","Type":"ContainerDied","Data":"215c94b1341337698ac9fc9c0e9326d513646925688a984dd263fd1fb1f219e0"}
Dec 08 09:22:01 crc kubenswrapper[4776]: I1208 09:22:01.937522 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Dec 08 09:22:01 crc kubenswrapper[4776]: I1208 09:22:01.937565 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Dec 08 09:22:02 crc kubenswrapper[4776]: I1208 09:22:02.867821 4776 generic.go:334] "Generic (PLEG): container finished" podID="76ca76af-146b-4bb6-b676-1f9c8fd7f512" containerID="6ae3cc48666cc8b589590c920969cd7ed45e832b0fd68e32e257c45dac047dc8" exitCode=0
Dec 08 09:22:02 crc kubenswrapper[4776]: I1208 09:22:02.867883 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7b88-account-create-update-kxcml" event={"ID":"76ca76af-146b-4bb6-b676-1f9c8fd7f512","Type":"ContainerDied","Data":"6ae3cc48666cc8b589590c920969cd7ed45e832b0fd68e32e257c45dac047dc8"}
Dec 08 09:22:02 crc kubenswrapper[4776]: I1208 09:22:02.873682 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w8l8r" event={"ID":"8fc47007-7b1d-458c-b1ee-f561fff88bd7","Type":"ContainerStarted","Data":"60f3369aeb65ec7d79d738bd4cc339f62dfff7705f56cba17ee799e904726704"}
Dec 08 09:22:02 crc kubenswrapper[4776]: I1208 09:22:02.896688 4776 generic.go:334] "Generic (PLEG): container finished" podID="de62cc1f-e51f-4ae9-9de7-df5d8cb27cdf" containerID="6f68a56f22b1c104f29ecde6f5bfdc13dba2bb4af424c0415f68b59480d396e3" exitCode=0
Dec 08 09:22:02 crc kubenswrapper[4776]: I1208 09:22:02.902351 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3646-account-create-update-9fgcf" event={"ID":"de62cc1f-e51f-4ae9-9de7-df5d8cb27cdf","Type":"ContainerDied","Data":"6f68a56f22b1c104f29ecde6f5bfdc13dba2bb4af424c0415f68b59480d396e3"}
Dec 08 09:22:02 crc kubenswrapper[4776]: I1208 09:22:02.909992 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-w8l8r" podStartSLOduration=6.145117016 podStartE2EDuration="10.909971433s" podCreationTimestamp="2025-12-08 09:21:52 +0000 UTC" firstStartedPulling="2025-12-08 09:21:57.659425525 +0000 UTC m=+1393.922650547" lastFinishedPulling="2025-12-08 09:22:02.424279942 +0000 UTC m=+1398.687504964" observedRunningTime="2025-12-08 09:22:02.906927132 +0000 UTC m=+1399.170152174" watchObservedRunningTime="2025-12-08 09:22:02.909971433 +0000 UTC m=+1399.173196455"
Dec 08 09:22:03 crc kubenswrapper[4776]: I1208 09:22:03.175776 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-w8l8r"
Dec 08 09:22:03 crc kubenswrapper[4776]: I1208 09:22:03.175941 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-w8l8r"
Dec 08 09:22:03 crc kubenswrapper[4776]: I1208 09:22:03.463393 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-ps8vp"
Dec 08 09:22:03 crc kubenswrapper[4776]: I1208 09:22:03.576116 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c45c98e-e12c-404f-8684-6d34481e2cee-operator-scripts\") pod \"5c45c98e-e12c-404f-8684-6d34481e2cee\" (UID: \"5c45c98e-e12c-404f-8684-6d34481e2cee\") "
Dec 08 09:22:03 crc kubenswrapper[4776]: I1208 09:22:03.576263 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7knf\" (UniqueName: \"kubernetes.io/projected/5c45c98e-e12c-404f-8684-6d34481e2cee-kube-api-access-w7knf\") pod \"5c45c98e-e12c-404f-8684-6d34481e2cee\" (UID: \"5c45c98e-e12c-404f-8684-6d34481e2cee\") "
Dec 08 09:22:03 crc kubenswrapper[4776]: I1208 09:22:03.577587 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c45c98e-e12c-404f-8684-6d34481e2cee-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5c45c98e-e12c-404f-8684-6d34481e2cee" (UID: "5c45c98e-e12c-404f-8684-6d34481e2cee"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 08 09:22:03 crc kubenswrapper[4776]: I1208 09:22:03.583330 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c45c98e-e12c-404f-8684-6d34481e2cee-kube-api-access-w7knf" (OuterVolumeSpecName: "kube-api-access-w7knf") pod "5c45c98e-e12c-404f-8684-6d34481e2cee" (UID: "5c45c98e-e12c-404f-8684-6d34481e2cee"). InnerVolumeSpecName "kube-api-access-w7knf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 08 09:22:03 crc kubenswrapper[4776]: I1208 09:22:03.678848 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c45c98e-e12c-404f-8684-6d34481e2cee-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 08 09:22:03 crc kubenswrapper[4776]: I1208 09:22:03.678889 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7knf\" (UniqueName: \"kubernetes.io/projected/5c45c98e-e12c-404f-8684-6d34481e2cee-kube-api-access-w7knf\") on node \"crc\" DevicePath \"\""
Dec 08 09:22:03 crc kubenswrapper[4776]: I1208 09:22:03.678934 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-ww8jk"
Dec 08 09:22:03 crc kubenswrapper[4776]: I1208 09:22:03.686033 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-xlh2r"
Dec 08 09:22:03 crc kubenswrapper[4776]: I1208 09:22:03.690104 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-827c-account-create-update-2zj8b"
Dec 08 09:22:03 crc kubenswrapper[4776]: I1208 09:22:03.780142 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvfjc\" (UniqueName: \"kubernetes.io/projected/189aea5a-3eb2-41b7-9431-21a0acf13db7-kube-api-access-bvfjc\") pod \"189aea5a-3eb2-41b7-9431-21a0acf13db7\" (UID: \"189aea5a-3eb2-41b7-9431-21a0acf13db7\") "
Dec 08 09:22:03 crc kubenswrapper[4776]: I1208 09:22:03.780220 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8b6a12b-b8f7-4754-9bc0-5f0ba81fe746-operator-scripts\") pod \"e8b6a12b-b8f7-4754-9bc0-5f0ba81fe746\" (UID: \"e8b6a12b-b8f7-4754-9bc0-5f0ba81fe746\") "
Dec 08 09:22:03 crc kubenswrapper[4776]: I1208 09:22:03.780279 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0920722c-0e43-40c4-8c74-63d6bbb9b419-operator-scripts\") pod \"0920722c-0e43-40c4-8c74-63d6bbb9b419\" (UID: \"0920722c-0e43-40c4-8c74-63d6bbb9b419\") "
Dec 08 09:22:03 crc kubenswrapper[4776]: I1208 09:22:03.780355 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/189aea5a-3eb2-41b7-9431-21a0acf13db7-operator-scripts\") pod \"189aea5a-3eb2-41b7-9431-21a0acf13db7\" (UID: \"189aea5a-3eb2-41b7-9431-21a0acf13db7\") "
Dec 08 09:22:03 crc kubenswrapper[4776]: I1208 09:22:03.780410 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbsx7\" (UniqueName: \"kubernetes.io/projected/0920722c-0e43-40c4-8c74-63d6bbb9b419-kube-api-access-bbsx7\") pod \"0920722c-0e43-40c4-8c74-63d6bbb9b419\" (UID: \"0920722c-0e43-40c4-8c74-63d6bbb9b419\") "
Dec 08 09:22:03 crc kubenswrapper[4776]: I1208 09:22:03.780449 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djplb\" (UniqueName: \"kubernetes.io/projected/e8b6a12b-b8f7-4754-9bc0-5f0ba81fe746-kube-api-access-djplb\") pod \"e8b6a12b-b8f7-4754-9bc0-5f0ba81fe746\" (UID: \"e8b6a12b-b8f7-4754-9bc0-5f0ba81fe746\") "
Dec 08 09:22:03 crc kubenswrapper[4776]: I1208 09:22:03.781752 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0920722c-0e43-40c4-8c74-63d6bbb9b419-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0920722c-0e43-40c4-8c74-63d6bbb9b419" (UID: "0920722c-0e43-40c4-8c74-63d6bbb9b419"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 08 09:22:03 crc kubenswrapper[4776]: I1208 09:22:03.785963 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8b6a12b-b8f7-4754-9bc0-5f0ba81fe746-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e8b6a12b-b8f7-4754-9bc0-5f0ba81fe746" (UID: "e8b6a12b-b8f7-4754-9bc0-5f0ba81fe746"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 08 09:22:03 crc kubenswrapper[4776]: I1208 09:22:03.787072 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/189aea5a-3eb2-41b7-9431-21a0acf13db7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "189aea5a-3eb2-41b7-9431-21a0acf13db7" (UID: "189aea5a-3eb2-41b7-9431-21a0acf13db7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 08 09:22:03 crc kubenswrapper[4776]: I1208 09:22:03.788458 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8b6a12b-b8f7-4754-9bc0-5f0ba81fe746-kube-api-access-djplb" (OuterVolumeSpecName: "kube-api-access-djplb") pod "e8b6a12b-b8f7-4754-9bc0-5f0ba81fe746" (UID: "e8b6a12b-b8f7-4754-9bc0-5f0ba81fe746"). InnerVolumeSpecName "kube-api-access-djplb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 08 09:22:03 crc kubenswrapper[4776]: I1208 09:22:03.789504 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0920722c-0e43-40c4-8c74-63d6bbb9b419-kube-api-access-bbsx7" (OuterVolumeSpecName: "kube-api-access-bbsx7") pod "0920722c-0e43-40c4-8c74-63d6bbb9b419" (UID: "0920722c-0e43-40c4-8c74-63d6bbb9b419"). InnerVolumeSpecName "kube-api-access-bbsx7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 08 09:22:03 crc kubenswrapper[4776]: I1208 09:22:03.791250 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/189aea5a-3eb2-41b7-9431-21a0acf13db7-kube-api-access-bvfjc" (OuterVolumeSpecName: "kube-api-access-bvfjc") pod "189aea5a-3eb2-41b7-9431-21a0acf13db7" (UID: "189aea5a-3eb2-41b7-9431-21a0acf13db7"). InnerVolumeSpecName "kube-api-access-bvfjc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 08 09:22:03 crc kubenswrapper[4776]: I1208 09:22:03.883989 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0920722c-0e43-40c4-8c74-63d6bbb9b419-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 08 09:22:03 crc kubenswrapper[4776]: I1208 09:22:03.884021 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/189aea5a-3eb2-41b7-9431-21a0acf13db7-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 08 09:22:03 crc kubenswrapper[4776]: I1208 09:22:03.884031 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbsx7\" (UniqueName: \"kubernetes.io/projected/0920722c-0e43-40c4-8c74-63d6bbb9b419-kube-api-access-bbsx7\") on node \"crc\" DevicePath \"\""
Dec 08 09:22:03 crc kubenswrapper[4776]: I1208 09:22:03.884043 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djplb\" (UniqueName: \"kubernetes.io/projected/e8b6a12b-b8f7-4754-9bc0-5f0ba81fe746-kube-api-access-djplb\") on node \"crc\" DevicePath \"\""
Dec 08 09:22:03 crc kubenswrapper[4776]: I1208 09:22:03.884052 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvfjc\" (UniqueName: \"kubernetes.io/projected/189aea5a-3eb2-41b7-9431-21a0acf13db7-kube-api-access-bvfjc\") on node \"crc\" DevicePath \"\""
Dec 08 09:22:03 crc kubenswrapper[4776]: I1208 09:22:03.884061 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8b6a12b-b8f7-4754-9bc0-5f0ba81fe746-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 08 09:22:03 crc kubenswrapper[4776]: I1208 09:22:03.926192 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-xlh2r" event={"ID":"189aea5a-3eb2-41b7-9431-21a0acf13db7","Type":"ContainerDied","Data":"9c91ebc7448a6702168dcce973a2aa3a78c91d074a22a0113ba64df855f9c758"}
Dec 08 09:22:03 crc kubenswrapper[4776]: I1208 09:22:03.926233 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c91ebc7448a6702168dcce973a2aa3a78c91d074a22a0113ba64df855f9c758"
Dec 08 09:22:03 crc kubenswrapper[4776]: I1208 09:22:03.926236 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-xlh2r"
Dec 08 09:22:03 crc kubenswrapper[4776]: I1208 09:22:03.930207 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-ww8jk" event={"ID":"e8b6a12b-b8f7-4754-9bc0-5f0ba81fe746","Type":"ContainerDied","Data":"dc39798b7c6745f6cc035bcf0b8259b4a2ec3edb0751d950cdf3b3f2f6627c1e"}
Dec 08 09:22:03 crc kubenswrapper[4776]: I1208 09:22:03.930251 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc39798b7c6745f6cc035bcf0b8259b4a2ec3edb0751d950cdf3b3f2f6627c1e"
Dec 08 09:22:03 crc kubenswrapper[4776]: I1208 09:22:03.931358 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-ww8jk"
Dec 08 09:22:03 crc kubenswrapper[4776]: I1208 09:22:03.940660 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-ps8vp" event={"ID":"5c45c98e-e12c-404f-8684-6d34481e2cee","Type":"ContainerDied","Data":"e6b214ede1d98b018aafa5355a93ad998915690830581e61db2a1c7b449c54e4"}
Dec 08 09:22:03 crc kubenswrapper[4776]: I1208 09:22:03.940698 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6b214ede1d98b018aafa5355a93ad998915690830581e61db2a1c7b449c54e4"
Dec 08 09:22:03 crc kubenswrapper[4776]: I1208 09:22:03.940751 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-ps8vp"
Dec 08 09:22:03 crc kubenswrapper[4776]: I1208 09:22:03.945542 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-827c-account-create-update-2zj8b" event={"ID":"0920722c-0e43-40c4-8c74-63d6bbb9b419","Type":"ContainerDied","Data":"83a21bc007d936f44305d25e35e882b2d26cf36a686b5401289541f0279c496c"}
Dec 08 09:22:03 crc kubenswrapper[4776]: I1208 09:22:03.945586 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83a21bc007d936f44305d25e35e882b2d26cf36a686b5401289541f0279c496c"
Dec 08 09:22:03 crc kubenswrapper[4776]: I1208 09:22:03.945749 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-827c-account-create-update-2zj8b"
Dec 08 09:22:04 crc kubenswrapper[4776]: I1208 09:22:04.240274 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-w8l8r" podUID="8fc47007-7b1d-458c-b1ee-f561fff88bd7" containerName="registry-server" probeResult="failure" output=<
Dec 08 09:22:04 crc kubenswrapper[4776]: timeout: failed to connect service ":50051" within 1s
Dec 08 09:22:04 crc kubenswrapper[4776]: >
Dec 08 09:22:04 crc kubenswrapper[4776]: I1208 09:22:04.340290 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3646-account-create-update-9fgcf"
Dec 08 09:22:04 crc kubenswrapper[4776]: I1208 09:22:04.343812 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7b88-account-create-update-kxcml"
Dec 08 09:22:04 crc kubenswrapper[4776]: I1208 09:22:04.394163 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de62cc1f-e51f-4ae9-9de7-df5d8cb27cdf-operator-scripts\") pod \"de62cc1f-e51f-4ae9-9de7-df5d8cb27cdf\" (UID: \"de62cc1f-e51f-4ae9-9de7-df5d8cb27cdf\") "
Dec 08 09:22:04 crc kubenswrapper[4776]: I1208 09:22:04.394258 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vp2sf\" (UniqueName: \"kubernetes.io/projected/de62cc1f-e51f-4ae9-9de7-df5d8cb27cdf-kube-api-access-vp2sf\") pod \"de62cc1f-e51f-4ae9-9de7-df5d8cb27cdf\" (UID: \"de62cc1f-e51f-4ae9-9de7-df5d8cb27cdf\") "
Dec 08 09:22:04 crc kubenswrapper[4776]: I1208 09:22:04.394372 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5sz9\" (UniqueName: \"kubernetes.io/projected/76ca76af-146b-4bb6-b676-1f9c8fd7f512-kube-api-access-d5sz9\") pod \"76ca76af-146b-4bb6-b676-1f9c8fd7f512\" (UID: \"76ca76af-146b-4bb6-b676-1f9c8fd7f512\") "
Dec 08 09:22:04 crc kubenswrapper[4776]: I1208 09:22:04.394434 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76ca76af-146b-4bb6-b676-1f9c8fd7f512-operator-scripts\") pod \"76ca76af-146b-4bb6-b676-1f9c8fd7f512\" (UID: \"76ca76af-146b-4bb6-b676-1f9c8fd7f512\") "
Dec 08 09:22:04 crc kubenswrapper[4776]: I1208 09:22:04.395097 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de62cc1f-e51f-4ae9-9de7-df5d8cb27cdf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "de62cc1f-e51f-4ae9-9de7-df5d8cb27cdf" (UID: "de62cc1f-e51f-4ae9-9de7-df5d8cb27cdf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 08 09:22:04 crc kubenswrapper[4776]: I1208 09:22:04.395537 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76ca76af-146b-4bb6-b676-1f9c8fd7f512-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "76ca76af-146b-4bb6-b676-1f9c8fd7f512" (UID: "76ca76af-146b-4bb6-b676-1f9c8fd7f512"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 08 09:22:04 crc kubenswrapper[4776]: I1208 09:22:04.398513 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76ca76af-146b-4bb6-b676-1f9c8fd7f512-kube-api-access-d5sz9" (OuterVolumeSpecName: "kube-api-access-d5sz9") pod "76ca76af-146b-4bb6-b676-1f9c8fd7f512" (UID: "76ca76af-146b-4bb6-b676-1f9c8fd7f512"). InnerVolumeSpecName "kube-api-access-d5sz9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 08 09:22:04 crc kubenswrapper[4776]: I1208 09:22:04.398619 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de62cc1f-e51f-4ae9-9de7-df5d8cb27cdf-kube-api-access-vp2sf" (OuterVolumeSpecName: "kube-api-access-vp2sf") pod "de62cc1f-e51f-4ae9-9de7-df5d8cb27cdf" (UID: "de62cc1f-e51f-4ae9-9de7-df5d8cb27cdf"). InnerVolumeSpecName "kube-api-access-vp2sf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 08 09:22:04 crc kubenswrapper[4776]: I1208 09:22:04.497020 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de62cc1f-e51f-4ae9-9de7-df5d8cb27cdf-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 08 09:22:04 crc kubenswrapper[4776]: I1208 09:22:04.497065 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vp2sf\" (UniqueName: \"kubernetes.io/projected/de62cc1f-e51f-4ae9-9de7-df5d8cb27cdf-kube-api-access-vp2sf\") on node \"crc\" DevicePath \"\""
Dec 08 09:22:04 crc kubenswrapper[4776]: I1208 09:22:04.497079 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5sz9\" (UniqueName: \"kubernetes.io/projected/76ca76af-146b-4bb6-b676-1f9c8fd7f512-kube-api-access-d5sz9\") on node \"crc\" DevicePath \"\""
Dec 08 09:22:04 crc kubenswrapper[4776]: I1208 09:22:04.497089 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76ca76af-146b-4bb6-b676-1f9c8fd7f512-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 08 09:22:04 crc kubenswrapper[4776]: I1208 09:22:04.959210 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7b88-account-create-update-kxcml" event={"ID":"76ca76af-146b-4bb6-b676-1f9c8fd7f512","Type":"ContainerDied","Data":"665e4d90d5b9ebbc538b8d62ce7a482340d3ad87623dd2003ee2513a1130c867"}
Dec 08 09:22:04 crc kubenswrapper[4776]: I1208 09:22:04.959255 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="665e4d90d5b9ebbc538b8d62ce7a482340d3ad87623dd2003ee2513a1130c867"
Dec 08 09:22:04 crc kubenswrapper[4776]: I1208 09:22:04.959305 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7b88-account-create-update-kxcml"
Dec 08 09:22:04 crc kubenswrapper[4776]: I1208 09:22:04.962778 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3646-account-create-update-9fgcf" event={"ID":"de62cc1f-e51f-4ae9-9de7-df5d8cb27cdf","Type":"ContainerDied","Data":"6b9dd83b81a7d8013c1bf74b478009c9479c41e2c080c879bd26f66c66e76281"}
Dec 08 09:22:04 crc kubenswrapper[4776]: I1208 09:22:04.962815 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3646-account-create-update-9fgcf"
Dec 08 09:22:04 crc kubenswrapper[4776]: I1208 09:22:04.962854 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b9dd83b81a7d8013c1bf74b478009c9479c41e2c080c879bd26f66c66e76281"
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.004661 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.117522 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.422577 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Dec 08 09:22:05 crc kubenswrapper[4776]: E1208 09:22:05.423345 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de62cc1f-e51f-4ae9-9de7-df5d8cb27cdf" containerName="mariadb-account-create-update"
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.423367 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="de62cc1f-e51f-4ae9-9de7-df5d8cb27cdf" containerName="mariadb-account-create-update"
Dec 08 09:22:05 crc kubenswrapper[4776]: E1208 09:22:05.423389 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="189aea5a-3eb2-41b7-9431-21a0acf13db7" containerName="mariadb-database-create"
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.423396 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="189aea5a-3eb2-41b7-9431-21a0acf13db7" containerName="mariadb-database-create"
Dec 08 09:22:05 crc kubenswrapper[4776]: E1208 09:22:05.423416 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0920722c-0e43-40c4-8c74-63d6bbb9b419" containerName="mariadb-account-create-update"
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.423422 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="0920722c-0e43-40c4-8c74-63d6bbb9b419" containerName="mariadb-account-create-update"
Dec 08 09:22:05 crc kubenswrapper[4776]: E1208 09:22:05.423431 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c45c98e-e12c-404f-8684-6d34481e2cee" containerName="mariadb-database-create"
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.423437 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c45c98e-e12c-404f-8684-6d34481e2cee" containerName="mariadb-database-create"
Dec 08 09:22:05 crc kubenswrapper[4776]: E1208 09:22:05.423447 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8b6a12b-b8f7-4754-9bc0-5f0ba81fe746" containerName="mariadb-database-create"
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.423453 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8b6a12b-b8f7-4754-9bc0-5f0ba81fe746" containerName="mariadb-database-create"
Dec 08 09:22:05 crc kubenswrapper[4776]: E1208 09:22:05.423469 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76ca76af-146b-4bb6-b676-1f9c8fd7f512" containerName="mariadb-account-create-update"
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.423475 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="76ca76af-146b-4bb6-b676-1f9c8fd7f512" containerName="mariadb-account-create-update"
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.423701 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8b6a12b-b8f7-4754-9bc0-5f0ba81fe746" containerName="mariadb-database-create"
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.423720 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="76ca76af-146b-4bb6-b676-1f9c8fd7f512" containerName="mariadb-account-create-update"
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.423737 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="189aea5a-3eb2-41b7-9431-21a0acf13db7" containerName="mariadb-database-create"
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.423760 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="0920722c-0e43-40c4-8c74-63d6bbb9b419" containerName="mariadb-account-create-update"
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.423771 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="de62cc1f-e51f-4ae9-9de7-df5d8cb27cdf" containerName="mariadb-account-create-update"
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.423788 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c45c98e-e12c-404f-8684-6d34481e2cee" containerName="mariadb-database-create"
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.424861 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.428242 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.429675 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs"
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.429678 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.430301 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.431478 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-9zbsd"
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.528208 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/96dd2435-6c8f-4ac2-9b72-43f82d2eeb52-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"96dd2435-6c8f-4ac2-9b72-43f82d2eeb52\") " pod="openstack/ovn-northd-0"
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.528253 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96dd2435-6c8f-4ac2-9b72-43f82d2eeb52-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"96dd2435-6c8f-4ac2-9b72-43f82d2eeb52\") " pod="openstack/ovn-northd-0"
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.528503 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/96dd2435-6c8f-4ac2-9b72-43f82d2eeb52-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"96dd2435-6c8f-4ac2-9b72-43f82d2eeb52\") " pod="openstack/ovn-northd-0"
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.528593 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/96dd2435-6c8f-4ac2-9b72-43f82d2eeb52-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"96dd2435-6c8f-4ac2-9b72-43f82d2eeb52\") " pod="openstack/ovn-northd-0"
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.528669 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q58hq\" (UniqueName: \"kubernetes.io/projected/96dd2435-6c8f-4ac2-9b72-43f82d2eeb52-kube-api-access-q58hq\") pod \"ovn-northd-0\" (UID: \"96dd2435-6c8f-4ac2-9b72-43f82d2eeb52\") " pod="openstack/ovn-northd-0"
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.528849 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/96dd2435-6c8f-4ac2-9b72-43f82d2eeb52-scripts\") pod \"ovn-northd-0\" (UID: \"96dd2435-6c8f-4ac2-9b72-43f82d2eeb52\") " pod="openstack/ovn-northd-0"
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.528876 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96dd2435-6c8f-4ac2-9b72-43f82d2eeb52-config\") pod \"ovn-northd-0\" (UID: \"96dd2435-6c8f-4ac2-9b72-43f82d2eeb52\") " pod="openstack/ovn-northd-0"
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.630815 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/96dd2435-6c8f-4ac2-9b72-43f82d2eeb52-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"96dd2435-6c8f-4ac2-9b72-43f82d2eeb52\") " pod="openstack/ovn-northd-0"
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.630865 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96dd2435-6c8f-4ac2-9b72-43f82d2eeb52-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"96dd2435-6c8f-4ac2-9b72-43f82d2eeb52\") " pod="openstack/ovn-northd-0"
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.630970 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/96dd2435-6c8f-4ac2-9b72-43f82d2eeb52-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"96dd2435-6c8f-4ac2-9b72-43f82d2eeb52\") " pod="openstack/ovn-northd-0"
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.630994 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/96dd2435-6c8f-4ac2-9b72-43f82d2eeb52-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"96dd2435-6c8f-4ac2-9b72-43f82d2eeb52\") " pod="openstack/ovn-northd-0"
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.631018 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q58hq\" (UniqueName: \"kubernetes.io/projected/96dd2435-6c8f-4ac2-9b72-43f82d2eeb52-kube-api-access-q58hq\") pod \"ovn-northd-0\" (UID: \"96dd2435-6c8f-4ac2-9b72-43f82d2eeb52\") " pod="openstack/ovn-northd-0"
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.631072 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/96dd2435-6c8f-4ac2-9b72-43f82d2eeb52-scripts\") pod \"ovn-northd-0\" (UID: \"96dd2435-6c8f-4ac2-9b72-43f82d2eeb52\") " pod="openstack/ovn-northd-0"
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.631089 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96dd2435-6c8f-4ac2-9b72-43f82d2eeb52-config\") pod \"ovn-northd-0\" (UID: \"96dd2435-6c8f-4ac2-9b72-43f82d2eeb52\") " pod="openstack/ovn-northd-0"
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.632064 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/96dd2435-6c8f-4ac2-9b72-43f82d2eeb52-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"96dd2435-6c8f-4ac2-9b72-43f82d2eeb52\") " pod="openstack/ovn-northd-0"
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.632150 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96dd2435-6c8f-4ac2-9b72-43f82d2eeb52-config\") pod \"ovn-northd-0\" (UID: \"96dd2435-6c8f-4ac2-9b72-43f82d2eeb52\") " pod="openstack/ovn-northd-0"
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.632870 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/96dd2435-6c8f-4ac2-9b72-43f82d2eeb52-scripts\") pod \"ovn-northd-0\" (UID: \"96dd2435-6c8f-4ac2-9b72-43f82d2eeb52\") " pod="openstack/ovn-northd-0"
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.637011 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96dd2435-6c8f-4ac2-9b72-43f82d2eeb52-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"96dd2435-6c8f-4ac2-9b72-43f82d2eeb52\") " pod="openstack/ovn-northd-0"
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.637219 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/96dd2435-6c8f-4ac2-9b72-43f82d2eeb52-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"96dd2435-6c8f-4ac2-9b72-43f82d2eeb52\") " pod="openstack/ovn-northd-0"
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.637677 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/96dd2435-6c8f-4ac2-9b72-43f82d2eeb52-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"96dd2435-6c8f-4ac2-9b72-43f82d2eeb52\") " pod="openstack/ovn-northd-0"
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.652904 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q58hq\" (UniqueName: \"kubernetes.io/projected/96dd2435-6c8f-4ac2-9b72-43f82d2eeb52-kube-api-access-q58hq\") pod \"ovn-northd-0\" (UID: \"96dd2435-6c8f-4ac2-9b72-43f82d2eeb52\") " pod="openstack/ovn-northd-0"
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.697185 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"]
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.698570 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0"
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.703757 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data"
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.713309 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"]
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.732645 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54ed126e-a923-408f-9ab3-f939a1e74374-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"54ed126e-a923-408f-9ab3-f939a1e74374\") " pod="openstack/mysqld-exporter-0"
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.732776 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khmjg\" (UniqueName: \"kubernetes.io/projected/54ed126e-a923-408f-9ab3-f939a1e74374-kube-api-access-khmjg\") pod \"mysqld-exporter-0\" (UID: \"54ed126e-a923-408f-9ab3-f939a1e74374\") " pod="openstack/mysqld-exporter-0"
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.732902 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54ed126e-a923-408f-9ab3-f939a1e74374-config-data\") pod \"mysqld-exporter-0\" (UID: \"54ed126e-a923-408f-9ab3-f939a1e74374\") " pod="openstack/mysqld-exporter-0"
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.740534 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.834568 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khmjg\" (UniqueName: \"kubernetes.io/projected/54ed126e-a923-408f-9ab3-f939a1e74374-kube-api-access-khmjg\") pod \"mysqld-exporter-0\" (UID: \"54ed126e-a923-408f-9ab3-f939a1e74374\") " pod="openstack/mysqld-exporter-0"
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.835044 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54ed126e-a923-408f-9ab3-f939a1e74374-config-data\") pod \"mysqld-exporter-0\" (UID: \"54ed126e-a923-408f-9ab3-f939a1e74374\") " pod="openstack/mysqld-exporter-0"
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.835187 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54ed126e-a923-408f-9ab3-f939a1e74374-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"54ed126e-a923-408f-9ab3-f939a1e74374\") " pod="openstack/mysqld-exporter-0"
Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.840989 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54ed126e-a923-408f-9ab3-f939a1e74374-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"54ed126e-a923-408f-9ab3-f939a1e74374\") "
pod="openstack/mysqld-exporter-0" Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.857891 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54ed126e-a923-408f-9ab3-f939a1e74374-config-data\") pod \"mysqld-exporter-0\" (UID: \"54ed126e-a923-408f-9ab3-f939a1e74374\") " pod="openstack/mysqld-exporter-0" Dec 08 09:22:05 crc kubenswrapper[4776]: I1208 09:22:05.877591 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khmjg\" (UniqueName: \"kubernetes.io/projected/54ed126e-a923-408f-9ab3-f939a1e74374-kube-api-access-khmjg\") pod \"mysqld-exporter-0\" (UID: \"54ed126e-a923-408f-9ab3-f939a1e74374\") " pod="openstack/mysqld-exporter-0" Dec 08 09:22:06 crc kubenswrapper[4776]: I1208 09:22:06.046305 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Dec 08 09:22:07 crc kubenswrapper[4776]: I1208 09:22:07.261824 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Dec 08 09:22:07 crc kubenswrapper[4776]: W1208 09:22:07.267501 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54ed126e_a923_408f_9ab3_f939a1e74374.slice/crio-343371fcd574618a02056362a1ab892606fc3b8b955ee394fb4318cc9089eb9f WatchSource:0}: Error finding container 343371fcd574618a02056362a1ab892606fc3b8b955ee394fb4318cc9089eb9f: Status 404 returned error can't find the container with id 343371fcd574618a02056362a1ab892606fc3b8b955ee394fb4318cc9089eb9f Dec 08 09:22:07 crc kubenswrapper[4776]: I1208 09:22:07.360007 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 08 09:22:07 crc kubenswrapper[4776]: W1208 09:22:07.360936 4776 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96dd2435_6c8f_4ac2_9b72_43f82d2eeb52.slice/crio-d170b221bf1bb9af21825ce6ed95e0e7a3883b110c943e4a5028114d5d222685 WatchSource:0}: Error finding container d170b221bf1bb9af21825ce6ed95e0e7a3883b110c943e4a5028114d5d222685: Status 404 returned error can't find the container with id d170b221bf1bb9af21825ce6ed95e0e7a3883b110c943e4a5028114d5d222685 Dec 08 09:22:08 crc kubenswrapper[4776]: I1208 09:22:08.015700 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"96dd2435-6c8f-4ac2-9b72-43f82d2eeb52","Type":"ContainerStarted","Data":"d170b221bf1bb9af21825ce6ed95e0e7a3883b110c943e4a5028114d5d222685"} Dec 08 09:22:08 crc kubenswrapper[4776]: I1208 09:22:08.019965 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"54ed126e-a923-408f-9ab3-f939a1e74374","Type":"ContainerStarted","Data":"343371fcd574618a02056362a1ab892606fc3b8b955ee394fb4318cc9089eb9f"} Dec 08 09:22:08 crc kubenswrapper[4776]: I1208 09:22:08.024144 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"786c0b37-638a-4b59-b149-628d9ad828bc","Type":"ContainerStarted","Data":"2ef939e0e3310016b12741d23ba72538c85bdced4c4ddb306d993e02317c1238"} Dec 08 09:22:08 crc kubenswrapper[4776]: I1208 09:22:08.055416 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=16.931888662 podStartE2EDuration="59.055388371s" podCreationTimestamp="2025-12-08 09:21:09 +0000 UTC" firstStartedPulling="2025-12-08 09:21:24.645783714 +0000 UTC m=+1360.909008736" lastFinishedPulling="2025-12-08 09:22:06.769283423 +0000 UTC m=+1403.032508445" observedRunningTime="2025-12-08 09:22:08.047037077 +0000 UTC m=+1404.310262109" watchObservedRunningTime="2025-12-08 09:22:08.055388371 +0000 UTC m=+1404.318613393" Dec 08 09:22:08 crc kubenswrapper[4776]: 
I1208 09:22:08.256245 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 08 09:22:08 crc kubenswrapper[4776]: I1208 09:22:08.495303 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-d26a-account-create-update-wr6lj"] Dec 08 09:22:08 crc kubenswrapper[4776]: I1208 09:22:08.496851 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-d26a-account-create-update-wr6lj" Dec 08 09:22:08 crc kubenswrapper[4776]: I1208 09:22:08.499274 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 08 09:22:08 crc kubenswrapper[4776]: I1208 09:22:08.511337 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-xm2wh"] Dec 08 09:22:08 crc kubenswrapper[4776]: I1208 09:22:08.512946 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-xm2wh" Dec 08 09:22:08 crc kubenswrapper[4776]: I1208 09:22:08.537328 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-d26a-account-create-update-wr6lj"] Dec 08 09:22:08 crc kubenswrapper[4776]: I1208 09:22:08.555088 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-xm2wh"] Dec 08 09:22:08 crc kubenswrapper[4776]: I1208 09:22:08.606704 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlds4\" (UniqueName: \"kubernetes.io/projected/e0571661-99e6-43e0-b2ea-1924e5437a7f-kube-api-access-zlds4\") pod \"glance-d26a-account-create-update-wr6lj\" (UID: \"e0571661-99e6-43e0-b2ea-1924e5437a7f\") " pod="openstack/glance-d26a-account-create-update-wr6lj" Dec 08 09:22:08 crc kubenswrapper[4776]: I1208 09:22:08.606756 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/a94fb6fb-5b7e-4034-a8e4-9c40e269409e-operator-scripts\") pod \"glance-db-create-xm2wh\" (UID: \"a94fb6fb-5b7e-4034-a8e4-9c40e269409e\") " pod="openstack/glance-db-create-xm2wh" Dec 08 09:22:08 crc kubenswrapper[4776]: I1208 09:22:08.606824 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf7md\" (UniqueName: \"kubernetes.io/projected/a94fb6fb-5b7e-4034-a8e4-9c40e269409e-kube-api-access-gf7md\") pod \"glance-db-create-xm2wh\" (UID: \"a94fb6fb-5b7e-4034-a8e4-9c40e269409e\") " pod="openstack/glance-db-create-xm2wh" Dec 08 09:22:08 crc kubenswrapper[4776]: I1208 09:22:08.606853 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0571661-99e6-43e0-b2ea-1924e5437a7f-operator-scripts\") pod \"glance-d26a-account-create-update-wr6lj\" (UID: \"e0571661-99e6-43e0-b2ea-1924e5437a7f\") " pod="openstack/glance-d26a-account-create-update-wr6lj" Dec 08 09:22:08 crc kubenswrapper[4776]: I1208 09:22:08.709070 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gf7md\" (UniqueName: \"kubernetes.io/projected/a94fb6fb-5b7e-4034-a8e4-9c40e269409e-kube-api-access-gf7md\") pod \"glance-db-create-xm2wh\" (UID: \"a94fb6fb-5b7e-4034-a8e4-9c40e269409e\") " pod="openstack/glance-db-create-xm2wh" Dec 08 09:22:08 crc kubenswrapper[4776]: I1208 09:22:08.709130 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0571661-99e6-43e0-b2ea-1924e5437a7f-operator-scripts\") pod \"glance-d26a-account-create-update-wr6lj\" (UID: \"e0571661-99e6-43e0-b2ea-1924e5437a7f\") " pod="openstack/glance-d26a-account-create-update-wr6lj" Dec 08 09:22:08 crc kubenswrapper[4776]: I1208 09:22:08.709279 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-zlds4\" (UniqueName: \"kubernetes.io/projected/e0571661-99e6-43e0-b2ea-1924e5437a7f-kube-api-access-zlds4\") pod \"glance-d26a-account-create-update-wr6lj\" (UID: \"e0571661-99e6-43e0-b2ea-1924e5437a7f\") " pod="openstack/glance-d26a-account-create-update-wr6lj" Dec 08 09:22:08 crc kubenswrapper[4776]: I1208 09:22:08.709308 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a94fb6fb-5b7e-4034-a8e4-9c40e269409e-operator-scripts\") pod \"glance-db-create-xm2wh\" (UID: \"a94fb6fb-5b7e-4034-a8e4-9c40e269409e\") " pod="openstack/glance-db-create-xm2wh" Dec 08 09:22:08 crc kubenswrapper[4776]: I1208 09:22:08.710060 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a94fb6fb-5b7e-4034-a8e4-9c40e269409e-operator-scripts\") pod \"glance-db-create-xm2wh\" (UID: \"a94fb6fb-5b7e-4034-a8e4-9c40e269409e\") " pod="openstack/glance-db-create-xm2wh" Dec 08 09:22:08 crc kubenswrapper[4776]: I1208 09:22:08.710106 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0571661-99e6-43e0-b2ea-1924e5437a7f-operator-scripts\") pod \"glance-d26a-account-create-update-wr6lj\" (UID: \"e0571661-99e6-43e0-b2ea-1924e5437a7f\") " pod="openstack/glance-d26a-account-create-update-wr6lj" Dec 08 09:22:08 crc kubenswrapper[4776]: I1208 09:22:08.727282 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf7md\" (UniqueName: \"kubernetes.io/projected/a94fb6fb-5b7e-4034-a8e4-9c40e269409e-kube-api-access-gf7md\") pod \"glance-db-create-xm2wh\" (UID: \"a94fb6fb-5b7e-4034-a8e4-9c40e269409e\") " pod="openstack/glance-db-create-xm2wh" Dec 08 09:22:08 crc kubenswrapper[4776]: I1208 09:22:08.735665 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlds4\" (UniqueName: 
\"kubernetes.io/projected/e0571661-99e6-43e0-b2ea-1924e5437a7f-kube-api-access-zlds4\") pod \"glance-d26a-account-create-update-wr6lj\" (UID: \"e0571661-99e6-43e0-b2ea-1924e5437a7f\") " pod="openstack/glance-d26a-account-create-update-wr6lj" Dec 08 09:22:08 crc kubenswrapper[4776]: I1208 09:22:08.876297 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-d26a-account-create-update-wr6lj" Dec 08 09:22:08 crc kubenswrapper[4776]: I1208 09:22:08.917876 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-xm2wh" Dec 08 09:22:09 crc kubenswrapper[4776]: I1208 09:22:09.863975 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-d26a-account-create-update-wr6lj"] Dec 08 09:22:09 crc kubenswrapper[4776]: I1208 09:22:09.918891 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-5ss7n"] Dec 08 09:22:09 crc kubenswrapper[4776]: I1208 09:22:09.921009 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-5ss7n" Dec 08 09:22:09 crc kubenswrapper[4776]: I1208 09:22:09.963504 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-5ss7n"] Dec 08 09:22:09 crc kubenswrapper[4776]: I1208 09:22:09.977665 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-xm2wh"] Dec 08 09:22:10 crc kubenswrapper[4776]: I1208 09:22:10.042464 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cq95\" (UniqueName: \"kubernetes.io/projected/8054440d-20b3-498d-80a6-da7ee23c9864-kube-api-access-9cq95\") pod \"dnsmasq-dns-698758b865-5ss7n\" (UID: \"8054440d-20b3-498d-80a6-da7ee23c9864\") " pod="openstack/dnsmasq-dns-698758b865-5ss7n" Dec 08 09:22:10 crc kubenswrapper[4776]: I1208 09:22:10.042665 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8054440d-20b3-498d-80a6-da7ee23c9864-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-5ss7n\" (UID: \"8054440d-20b3-498d-80a6-da7ee23c9864\") " pod="openstack/dnsmasq-dns-698758b865-5ss7n" Dec 08 09:22:10 crc kubenswrapper[4776]: I1208 09:22:10.042756 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8054440d-20b3-498d-80a6-da7ee23c9864-config\") pod \"dnsmasq-dns-698758b865-5ss7n\" (UID: \"8054440d-20b3-498d-80a6-da7ee23c9864\") " pod="openstack/dnsmasq-dns-698758b865-5ss7n" Dec 08 09:22:10 crc kubenswrapper[4776]: I1208 09:22:10.042863 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8054440d-20b3-498d-80a6-da7ee23c9864-dns-svc\") pod \"dnsmasq-dns-698758b865-5ss7n\" (UID: \"8054440d-20b3-498d-80a6-da7ee23c9864\") " 
pod="openstack/dnsmasq-dns-698758b865-5ss7n" Dec 08 09:22:10 crc kubenswrapper[4776]: I1208 09:22:10.043006 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8054440d-20b3-498d-80a6-da7ee23c9864-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-5ss7n\" (UID: \"8054440d-20b3-498d-80a6-da7ee23c9864\") " pod="openstack/dnsmasq-dns-698758b865-5ss7n" Dec 08 09:22:10 crc kubenswrapper[4776]: I1208 09:22:10.073952 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-xm2wh" event={"ID":"a94fb6fb-5b7e-4034-a8e4-9c40e269409e","Type":"ContainerStarted","Data":"f766ddc6ad6315adeb2eb425d90cbee94cfbc89f3dd801fa09cdf67491483dd9"} Dec 08 09:22:10 crc kubenswrapper[4776]: I1208 09:22:10.075305 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d26a-account-create-update-wr6lj" event={"ID":"e0571661-99e6-43e0-b2ea-1924e5437a7f","Type":"ContainerStarted","Data":"d225ced0c32711ea69aa91bff616204f69f061971afddfb295d614bc039aecb9"} Dec 08 09:22:10 crc kubenswrapper[4776]: I1208 09:22:10.076978 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"96dd2435-6c8f-4ac2-9b72-43f82d2eeb52","Type":"ContainerStarted","Data":"f29c0115c319874d459cce82174ca4707033d327f773a6d0b7a013503edeaf64"} Dec 08 09:22:10 crc kubenswrapper[4776]: I1208 09:22:10.076997 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"96dd2435-6c8f-4ac2-9b72-43f82d2eeb52","Type":"ContainerStarted","Data":"4d658ac1e522a0e72e0953b32c7e4d2d96a6d46b7dca070543ed128dfdb64945"} Dec 08 09:22:10 crc kubenswrapper[4776]: I1208 09:22:10.077299 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 08 09:22:10 crc kubenswrapper[4776]: I1208 09:22:10.082334 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" 
event={"ID":"54ed126e-a923-408f-9ab3-f939a1e74374","Type":"ContainerStarted","Data":"2389731669fe473d7a6e651ee6717ac54faf4f7cacce306be97cebc8a42c2d9e"} Dec 08 09:22:10 crc kubenswrapper[4776]: I1208 09:22:10.108707 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.160590343 podStartE2EDuration="5.108685563s" podCreationTimestamp="2025-12-08 09:22:05 +0000 UTC" firstStartedPulling="2025-12-08 09:22:07.363025904 +0000 UTC m=+1403.626250926" lastFinishedPulling="2025-12-08 09:22:09.311121124 +0000 UTC m=+1405.574346146" observedRunningTime="2025-12-08 09:22:10.097030471 +0000 UTC m=+1406.360255513" watchObservedRunningTime="2025-12-08 09:22:10.108685563 +0000 UTC m=+1406.371910585" Dec 08 09:22:10 crc kubenswrapper[4776]: I1208 09:22:10.131023 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=3.021829191 podStartE2EDuration="5.131004792s" podCreationTimestamp="2025-12-08 09:22:05 +0000 UTC" firstStartedPulling="2025-12-08 09:22:07.27006335 +0000 UTC m=+1403.533288372" lastFinishedPulling="2025-12-08 09:22:09.379238951 +0000 UTC m=+1405.642463973" observedRunningTime="2025-12-08 09:22:10.121084946 +0000 UTC m=+1406.384309968" watchObservedRunningTime="2025-12-08 09:22:10.131004792 +0000 UTC m=+1406.394229804" Dec 08 09:22:10 crc kubenswrapper[4776]: I1208 09:22:10.160394 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8054440d-20b3-498d-80a6-da7ee23c9864-dns-svc\") pod \"dnsmasq-dns-698758b865-5ss7n\" (UID: \"8054440d-20b3-498d-80a6-da7ee23c9864\") " pod="openstack/dnsmasq-dns-698758b865-5ss7n" Dec 08 09:22:10 crc kubenswrapper[4776]: I1208 09:22:10.160565 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/8054440d-20b3-498d-80a6-da7ee23c9864-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-5ss7n\" (UID: \"8054440d-20b3-498d-80a6-da7ee23c9864\") " pod="openstack/dnsmasq-dns-698758b865-5ss7n" Dec 08 09:22:10 crc kubenswrapper[4776]: I1208 09:22:10.160913 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cq95\" (UniqueName: \"kubernetes.io/projected/8054440d-20b3-498d-80a6-da7ee23c9864-kube-api-access-9cq95\") pod \"dnsmasq-dns-698758b865-5ss7n\" (UID: \"8054440d-20b3-498d-80a6-da7ee23c9864\") " pod="openstack/dnsmasq-dns-698758b865-5ss7n" Dec 08 09:22:10 crc kubenswrapper[4776]: I1208 09:22:10.160986 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8054440d-20b3-498d-80a6-da7ee23c9864-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-5ss7n\" (UID: \"8054440d-20b3-498d-80a6-da7ee23c9864\") " pod="openstack/dnsmasq-dns-698758b865-5ss7n" Dec 08 09:22:10 crc kubenswrapper[4776]: I1208 09:22:10.161029 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8054440d-20b3-498d-80a6-da7ee23c9864-config\") pod \"dnsmasq-dns-698758b865-5ss7n\" (UID: \"8054440d-20b3-498d-80a6-da7ee23c9864\") " pod="openstack/dnsmasq-dns-698758b865-5ss7n" Dec 08 09:22:10 crc kubenswrapper[4776]: I1208 09:22:10.162017 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8054440d-20b3-498d-80a6-da7ee23c9864-config\") pod \"dnsmasq-dns-698758b865-5ss7n\" (UID: \"8054440d-20b3-498d-80a6-da7ee23c9864\") " pod="openstack/dnsmasq-dns-698758b865-5ss7n" Dec 08 09:22:10 crc kubenswrapper[4776]: I1208 09:22:10.162667 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8054440d-20b3-498d-80a6-da7ee23c9864-dns-svc\") pod 
\"dnsmasq-dns-698758b865-5ss7n\" (UID: \"8054440d-20b3-498d-80a6-da7ee23c9864\") " pod="openstack/dnsmasq-dns-698758b865-5ss7n" Dec 08 09:22:10 crc kubenswrapper[4776]: I1208 09:22:10.164376 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8054440d-20b3-498d-80a6-da7ee23c9864-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-5ss7n\" (UID: \"8054440d-20b3-498d-80a6-da7ee23c9864\") " pod="openstack/dnsmasq-dns-698758b865-5ss7n" Dec 08 09:22:10 crc kubenswrapper[4776]: I1208 09:22:10.164627 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8054440d-20b3-498d-80a6-da7ee23c9864-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-5ss7n\" (UID: \"8054440d-20b3-498d-80a6-da7ee23c9864\") " pod="openstack/dnsmasq-dns-698758b865-5ss7n" Dec 08 09:22:10 crc kubenswrapper[4776]: I1208 09:22:10.210469 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cq95\" (UniqueName: \"kubernetes.io/projected/8054440d-20b3-498d-80a6-da7ee23c9864-kube-api-access-9cq95\") pod \"dnsmasq-dns-698758b865-5ss7n\" (UID: \"8054440d-20b3-498d-80a6-da7ee23c9864\") " pod="openstack/dnsmasq-dns-698758b865-5ss7n" Dec 08 09:22:10 crc kubenswrapper[4776]: I1208 09:22:10.239013 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-5tfbk" Dec 08 09:22:10 crc kubenswrapper[4776]: I1208 09:22:10.268976 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-5ss7n" Dec 08 09:22:10 crc kubenswrapper[4776]: I1208 09:22:10.732348 4776 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod639910a7-1d35-4535-b629-18fe52dacac3"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod639910a7-1d35-4535-b629-18fe52dacac3] : Timed out while waiting for systemd to remove kubepods-besteffort-pod639910a7_1d35_4535_b629_18fe52dacac3.slice" Dec 08 09:22:10 crc kubenswrapper[4776]: I1208 09:22:10.793549 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-5ss7n"] Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.020402 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.027328 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.034650 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.035011 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.034757 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.035232 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-28xbd" Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.039433 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.092449 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: 
\"kubernetes.io/empty-dir/cb640491-a8e7-4f8d-b4bb-1d0124f5727f-cache\") pod \"swift-storage-0\" (UID: \"cb640491-a8e7-4f8d-b4bb-1d0124f5727f\") " pod="openstack/swift-storage-0" Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.092546 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"cb640491-a8e7-4f8d-b4bb-1d0124f5727f\") " pod="openstack/swift-storage-0" Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.092582 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp2v6\" (UniqueName: \"kubernetes.io/projected/cb640491-a8e7-4f8d-b4bb-1d0124f5727f-kube-api-access-sp2v6\") pod \"swift-storage-0\" (UID: \"cb640491-a8e7-4f8d-b4bb-1d0124f5727f\") " pod="openstack/swift-storage-0" Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.092617 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cb640491-a8e7-4f8d-b4bb-1d0124f5727f-etc-swift\") pod \"swift-storage-0\" (UID: \"cb640491-a8e7-4f8d-b4bb-1d0124f5727f\") " pod="openstack/swift-storage-0" Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.092652 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/cb640491-a8e7-4f8d-b4bb-1d0124f5727f-lock\") pod \"swift-storage-0\" (UID: \"cb640491-a8e7-4f8d-b4bb-1d0124f5727f\") " pod="openstack/swift-storage-0" Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.151750 4776 generic.go:334] "Generic (PLEG): container finished" podID="a94fb6fb-5b7e-4034-a8e4-9c40e269409e" containerID="7436f875e146e77ea0960ce63ed08ec6c5456dcab6a8bfc87d62e0c1da12f314" exitCode=0 Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.151811 4776 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-xm2wh" event={"ID":"a94fb6fb-5b7e-4034-a8e4-9c40e269409e","Type":"ContainerDied","Data":"7436f875e146e77ea0960ce63ed08ec6c5456dcab6a8bfc87d62e0c1da12f314"} Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.167730 4776 generic.go:334] "Generic (PLEG): container finished" podID="8054440d-20b3-498d-80a6-da7ee23c9864" containerID="24269e54387f3d988a0353b4b6e6be70c886f9a8f303c9ff71cc0dd5ff8ac2ed" exitCode=0 Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.167789 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-5ss7n" event={"ID":"8054440d-20b3-498d-80a6-da7ee23c9864","Type":"ContainerDied","Data":"24269e54387f3d988a0353b4b6e6be70c886f9a8f303c9ff71cc0dd5ff8ac2ed"} Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.167810 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-5ss7n" event={"ID":"8054440d-20b3-498d-80a6-da7ee23c9864","Type":"ContainerStarted","Data":"dd44f9d3151110b78963bfee56ccd62294aee5568dca6a211d85a23a80d6b9a9"} Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.185477 4776 generic.go:334] "Generic (PLEG): container finished" podID="e0571661-99e6-43e0-b2ea-1924e5437a7f" containerID="90a74cf7f85524f89d537da4e7d9238ea9b3c5ee80872b88825143f5ca3c333c" exitCode=0 Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.186545 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d26a-account-create-update-wr6lj" event={"ID":"e0571661-99e6-43e0-b2ea-1924e5437a7f","Type":"ContainerDied","Data":"90a74cf7f85524f89d537da4e7d9238ea9b3c5ee80872b88825143f5ca3c333c"} Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.189086 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.189697 4776 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.202085 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/cb640491-a8e7-4f8d-b4bb-1d0124f5727f-cache\") pod \"swift-storage-0\" (UID: \"cb640491-a8e7-4f8d-b4bb-1d0124f5727f\") " pod="openstack/swift-storage-0" Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.202227 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"cb640491-a8e7-4f8d-b4bb-1d0124f5727f\") " pod="openstack/swift-storage-0" Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.202266 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sp2v6\" (UniqueName: \"kubernetes.io/projected/cb640491-a8e7-4f8d-b4bb-1d0124f5727f-kube-api-access-sp2v6\") pod \"swift-storage-0\" (UID: \"cb640491-a8e7-4f8d-b4bb-1d0124f5727f\") " pod="openstack/swift-storage-0" Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.202295 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cb640491-a8e7-4f8d-b4bb-1d0124f5727f-etc-swift\") pod \"swift-storage-0\" (UID: \"cb640491-a8e7-4f8d-b4bb-1d0124f5727f\") " pod="openstack/swift-storage-0" Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.202322 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/cb640491-a8e7-4f8d-b4bb-1d0124f5727f-lock\") pod \"swift-storage-0\" (UID: \"cb640491-a8e7-4f8d-b4bb-1d0124f5727f\") " pod="openstack/swift-storage-0" Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.202608 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"cb640491-a8e7-4f8d-b4bb-1d0124f5727f\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/swift-storage-0" Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.202749 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/cb640491-a8e7-4f8d-b4bb-1d0124f5727f-cache\") pod \"swift-storage-0\" (UID: \"cb640491-a8e7-4f8d-b4bb-1d0124f5727f\") " pod="openstack/swift-storage-0" Dec 08 09:22:11 crc kubenswrapper[4776]: E1208 09:22:11.202800 4776 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 08 09:22:11 crc kubenswrapper[4776]: E1208 09:22:11.202815 4776 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.202829 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/cb640491-a8e7-4f8d-b4bb-1d0124f5727f-lock\") pod \"swift-storage-0\" (UID: \"cb640491-a8e7-4f8d-b4bb-1d0124f5727f\") " pod="openstack/swift-storage-0" Dec 08 09:22:11 crc kubenswrapper[4776]: E1208 09:22:11.202850 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cb640491-a8e7-4f8d-b4bb-1d0124f5727f-etc-swift podName:cb640491-a8e7-4f8d-b4bb-1d0124f5727f nodeName:}" failed. No retries permitted until 2025-12-08 09:22:11.70283582 +0000 UTC m=+1407.966060842 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/cb640491-a8e7-4f8d-b4bb-1d0124f5727f-etc-swift") pod "swift-storage-0" (UID: "cb640491-a8e7-4f8d-b4bb-1d0124f5727f") : configmap "swift-ring-files" not found Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.232431 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.253179 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp2v6\" (UniqueName: \"kubernetes.io/projected/cb640491-a8e7-4f8d-b4bb-1d0124f5727f-kube-api-access-sp2v6\") pod \"swift-storage-0\" (UID: \"cb640491-a8e7-4f8d-b4bb-1d0124f5727f\") " pod="openstack/swift-storage-0" Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.264105 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"cb640491-a8e7-4f8d-b4bb-1d0124f5727f\") " pod="openstack/swift-storage-0" Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.566544 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-mmp8z"] Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.567796 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-mmp8z" Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.569883 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.570057 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.576400 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.584295 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-mmp8z"] Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.613445 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0436afba-d4b2-47d8-ac4d-c621e029333d-combined-ca-bundle\") pod \"swift-ring-rebalance-mmp8z\" (UID: \"0436afba-d4b2-47d8-ac4d-c621e029333d\") " pod="openstack/swift-ring-rebalance-mmp8z" Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.613813 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0436afba-d4b2-47d8-ac4d-c621e029333d-scripts\") pod \"swift-ring-rebalance-mmp8z\" (UID: \"0436afba-d4b2-47d8-ac4d-c621e029333d\") " pod="openstack/swift-ring-rebalance-mmp8z" Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.613847 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqmh7\" (UniqueName: \"kubernetes.io/projected/0436afba-d4b2-47d8-ac4d-c621e029333d-kube-api-access-qqmh7\") pod \"swift-ring-rebalance-mmp8z\" (UID: \"0436afba-d4b2-47d8-ac4d-c621e029333d\") " pod="openstack/swift-ring-rebalance-mmp8z" Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 
09:22:11.613908 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0436afba-d4b2-47d8-ac4d-c621e029333d-etc-swift\") pod \"swift-ring-rebalance-mmp8z\" (UID: \"0436afba-d4b2-47d8-ac4d-c621e029333d\") " pod="openstack/swift-ring-rebalance-mmp8z" Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.613946 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0436afba-d4b2-47d8-ac4d-c621e029333d-dispersionconf\") pod \"swift-ring-rebalance-mmp8z\" (UID: \"0436afba-d4b2-47d8-ac4d-c621e029333d\") " pod="openstack/swift-ring-rebalance-mmp8z" Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.613966 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0436afba-d4b2-47d8-ac4d-c621e029333d-ring-data-devices\") pod \"swift-ring-rebalance-mmp8z\" (UID: \"0436afba-d4b2-47d8-ac4d-c621e029333d\") " pod="openstack/swift-ring-rebalance-mmp8z" Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.613990 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0436afba-d4b2-47d8-ac4d-c621e029333d-swiftconf\") pod \"swift-ring-rebalance-mmp8z\" (UID: \"0436afba-d4b2-47d8-ac4d-c621e029333d\") " pod="openstack/swift-ring-rebalance-mmp8z" Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.715931 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0436afba-d4b2-47d8-ac4d-c621e029333d-scripts\") pod \"swift-ring-rebalance-mmp8z\" (UID: \"0436afba-d4b2-47d8-ac4d-c621e029333d\") " pod="openstack/swift-ring-rebalance-mmp8z" Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.715987 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqmh7\" (UniqueName: \"kubernetes.io/projected/0436afba-d4b2-47d8-ac4d-c621e029333d-kube-api-access-qqmh7\") pod \"swift-ring-rebalance-mmp8z\" (UID: \"0436afba-d4b2-47d8-ac4d-c621e029333d\") " pod="openstack/swift-ring-rebalance-mmp8z" Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.716035 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cb640491-a8e7-4f8d-b4bb-1d0124f5727f-etc-swift\") pod \"swift-storage-0\" (UID: \"cb640491-a8e7-4f8d-b4bb-1d0124f5727f\") " pod="openstack/swift-storage-0" Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.716095 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0436afba-d4b2-47d8-ac4d-c621e029333d-etc-swift\") pod \"swift-ring-rebalance-mmp8z\" (UID: \"0436afba-d4b2-47d8-ac4d-c621e029333d\") " pod="openstack/swift-ring-rebalance-mmp8z" Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.716129 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0436afba-d4b2-47d8-ac4d-c621e029333d-dispersionconf\") pod \"swift-ring-rebalance-mmp8z\" (UID: \"0436afba-d4b2-47d8-ac4d-c621e029333d\") " pod="openstack/swift-ring-rebalance-mmp8z" Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.716149 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0436afba-d4b2-47d8-ac4d-c621e029333d-ring-data-devices\") pod \"swift-ring-rebalance-mmp8z\" (UID: \"0436afba-d4b2-47d8-ac4d-c621e029333d\") " pod="openstack/swift-ring-rebalance-mmp8z" Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.716188 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" 
(UniqueName: \"kubernetes.io/secret/0436afba-d4b2-47d8-ac4d-c621e029333d-swiftconf\") pod \"swift-ring-rebalance-mmp8z\" (UID: \"0436afba-d4b2-47d8-ac4d-c621e029333d\") " pod="openstack/swift-ring-rebalance-mmp8z" Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.716259 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0436afba-d4b2-47d8-ac4d-c621e029333d-combined-ca-bundle\") pod \"swift-ring-rebalance-mmp8z\" (UID: \"0436afba-d4b2-47d8-ac4d-c621e029333d\") " pod="openstack/swift-ring-rebalance-mmp8z" Dec 08 09:22:11 crc kubenswrapper[4776]: E1208 09:22:11.716361 4776 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 08 09:22:11 crc kubenswrapper[4776]: E1208 09:22:11.716396 4776 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 08 09:22:11 crc kubenswrapper[4776]: E1208 09:22:11.716461 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cb640491-a8e7-4f8d-b4bb-1d0124f5727f-etc-swift podName:cb640491-a8e7-4f8d-b4bb-1d0124f5727f nodeName:}" failed. No retries permitted until 2025-12-08 09:22:12.716436151 +0000 UTC m=+1408.979661173 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/cb640491-a8e7-4f8d-b4bb-1d0124f5727f-etc-swift") pod "swift-storage-0" (UID: "cb640491-a8e7-4f8d-b4bb-1d0124f5727f") : configmap "swift-ring-files" not found Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.716523 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0436afba-d4b2-47d8-ac4d-c621e029333d-etc-swift\") pod \"swift-ring-rebalance-mmp8z\" (UID: \"0436afba-d4b2-47d8-ac4d-c621e029333d\") " pod="openstack/swift-ring-rebalance-mmp8z" Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.717034 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0436afba-d4b2-47d8-ac4d-c621e029333d-scripts\") pod \"swift-ring-rebalance-mmp8z\" (UID: \"0436afba-d4b2-47d8-ac4d-c621e029333d\") " pod="openstack/swift-ring-rebalance-mmp8z" Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.717054 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0436afba-d4b2-47d8-ac4d-c621e029333d-ring-data-devices\") pod \"swift-ring-rebalance-mmp8z\" (UID: \"0436afba-d4b2-47d8-ac4d-c621e029333d\") " pod="openstack/swift-ring-rebalance-mmp8z" Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.721601 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0436afba-d4b2-47d8-ac4d-c621e029333d-combined-ca-bundle\") pod \"swift-ring-rebalance-mmp8z\" (UID: \"0436afba-d4b2-47d8-ac4d-c621e029333d\") " pod="openstack/swift-ring-rebalance-mmp8z" Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.723150 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0436afba-d4b2-47d8-ac4d-c621e029333d-dispersionconf\") pod 
\"swift-ring-rebalance-mmp8z\" (UID: \"0436afba-d4b2-47d8-ac4d-c621e029333d\") " pod="openstack/swift-ring-rebalance-mmp8z" Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.723500 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0436afba-d4b2-47d8-ac4d-c621e029333d-swiftconf\") pod \"swift-ring-rebalance-mmp8z\" (UID: \"0436afba-d4b2-47d8-ac4d-c621e029333d\") " pod="openstack/swift-ring-rebalance-mmp8z" Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.732770 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqmh7\" (UniqueName: \"kubernetes.io/projected/0436afba-d4b2-47d8-ac4d-c621e029333d-kube-api-access-qqmh7\") pod \"swift-ring-rebalance-mmp8z\" (UID: \"0436afba-d4b2-47d8-ac4d-c621e029333d\") " pod="openstack/swift-ring-rebalance-mmp8z" Dec 08 09:22:11 crc kubenswrapper[4776]: I1208 09:22:11.888219 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-mmp8z" Dec 08 09:22:12 crc kubenswrapper[4776]: I1208 09:22:12.200673 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-5ss7n" event={"ID":"8054440d-20b3-498d-80a6-da7ee23c9864","Type":"ContainerStarted","Data":"8988c6790b9c2597ec460d682361880bb2dc95081eec3e20e7d055bdf75ec56f"} Dec 08 09:22:12 crc kubenswrapper[4776]: I1208 09:22:12.201759 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-5ss7n" Dec 08 09:22:12 crc kubenswrapper[4776]: I1208 09:22:12.202537 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 08 09:22:12 crc kubenswrapper[4776]: I1208 09:22:12.289748 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-5ss7n" podStartSLOduration=3.289720693 podStartE2EDuration="3.289720693s" 
podCreationTimestamp="2025-12-08 09:22:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:22:12.22287107 +0000 UTC m=+1408.486096112" watchObservedRunningTime="2025-12-08 09:22:12.289720693 +0000 UTC m=+1408.552945715" Dec 08 09:22:12 crc kubenswrapper[4776]: I1208 09:22:12.428706 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-mmp8z"] Dec 08 09:22:12 crc kubenswrapper[4776]: I1208 09:22:12.716621 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-xm2wh" Dec 08 09:22:12 crc kubenswrapper[4776]: I1208 09:22:12.733809 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-d26a-account-create-update-wr6lj" Dec 08 09:22:12 crc kubenswrapper[4776]: I1208 09:22:12.745132 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a94fb6fb-5b7e-4034-a8e4-9c40e269409e-operator-scripts\") pod \"a94fb6fb-5b7e-4034-a8e4-9c40e269409e\" (UID: \"a94fb6fb-5b7e-4034-a8e4-9c40e269409e\") " Dec 08 09:22:12 crc kubenswrapper[4776]: I1208 09:22:12.745334 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf7md\" (UniqueName: \"kubernetes.io/projected/a94fb6fb-5b7e-4034-a8e4-9c40e269409e-kube-api-access-gf7md\") pod \"a94fb6fb-5b7e-4034-a8e4-9c40e269409e\" (UID: \"a94fb6fb-5b7e-4034-a8e4-9c40e269409e\") " Dec 08 09:22:12 crc kubenswrapper[4776]: I1208 09:22:12.745961 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a94fb6fb-5b7e-4034-a8e4-9c40e269409e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a94fb6fb-5b7e-4034-a8e4-9c40e269409e" (UID: "a94fb6fb-5b7e-4034-a8e4-9c40e269409e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:22:12 crc kubenswrapper[4776]: I1208 09:22:12.746000 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cb640491-a8e7-4f8d-b4bb-1d0124f5727f-etc-swift\") pod \"swift-storage-0\" (UID: \"cb640491-a8e7-4f8d-b4bb-1d0124f5727f\") " pod="openstack/swift-storage-0" Dec 08 09:22:12 crc kubenswrapper[4776]: I1208 09:22:12.746109 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a94fb6fb-5b7e-4034-a8e4-9c40e269409e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 09:22:12 crc kubenswrapper[4776]: E1208 09:22:12.746150 4776 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 08 09:22:12 crc kubenswrapper[4776]: E1208 09:22:12.746164 4776 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 08 09:22:12 crc kubenswrapper[4776]: E1208 09:22:12.746220 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cb640491-a8e7-4f8d-b4bb-1d0124f5727f-etc-swift podName:cb640491-a8e7-4f8d-b4bb-1d0124f5727f nodeName:}" failed. No retries permitted until 2025-12-08 09:22:14.746206301 +0000 UTC m=+1411.009431323 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/cb640491-a8e7-4f8d-b4bb-1d0124f5727f-etc-swift") pod "swift-storage-0" (UID: "cb640491-a8e7-4f8d-b4bb-1d0124f5727f") : configmap "swift-ring-files" not found Dec 08 09:22:12 crc kubenswrapper[4776]: I1208 09:22:12.753317 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a94fb6fb-5b7e-4034-a8e4-9c40e269409e-kube-api-access-gf7md" (OuterVolumeSpecName: "kube-api-access-gf7md") pod "a94fb6fb-5b7e-4034-a8e4-9c40e269409e" (UID: "a94fb6fb-5b7e-4034-a8e4-9c40e269409e"). InnerVolumeSpecName "kube-api-access-gf7md". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:22:12 crc kubenswrapper[4776]: I1208 09:22:12.847706 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0571661-99e6-43e0-b2ea-1924e5437a7f-operator-scripts\") pod \"e0571661-99e6-43e0-b2ea-1924e5437a7f\" (UID: \"e0571661-99e6-43e0-b2ea-1924e5437a7f\") " Dec 08 09:22:12 crc kubenswrapper[4776]: I1208 09:22:12.848282 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlds4\" (UniqueName: \"kubernetes.io/projected/e0571661-99e6-43e0-b2ea-1924e5437a7f-kube-api-access-zlds4\") pod \"e0571661-99e6-43e0-b2ea-1924e5437a7f\" (UID: \"e0571661-99e6-43e0-b2ea-1924e5437a7f\") " Dec 08 09:22:12 crc kubenswrapper[4776]: I1208 09:22:12.848693 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0571661-99e6-43e0-b2ea-1924e5437a7f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e0571661-99e6-43e0-b2ea-1924e5437a7f" (UID: "e0571661-99e6-43e0-b2ea-1924e5437a7f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:22:12 crc kubenswrapper[4776]: I1208 09:22:12.848851 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf7md\" (UniqueName: \"kubernetes.io/projected/a94fb6fb-5b7e-4034-a8e4-9c40e269409e-kube-api-access-gf7md\") on node \"crc\" DevicePath \"\"" Dec 08 09:22:12 crc kubenswrapper[4776]: I1208 09:22:12.848880 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0571661-99e6-43e0-b2ea-1924e5437a7f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 09:22:12 crc kubenswrapper[4776]: I1208 09:22:12.851701 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0571661-99e6-43e0-b2ea-1924e5437a7f-kube-api-access-zlds4" (OuterVolumeSpecName: "kube-api-access-zlds4") pod "e0571661-99e6-43e0-b2ea-1924e5437a7f" (UID: "e0571661-99e6-43e0-b2ea-1924e5437a7f"). InnerVolumeSpecName "kube-api-access-zlds4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:22:12 crc kubenswrapper[4776]: I1208 09:22:12.952238 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlds4\" (UniqueName: \"kubernetes.io/projected/e0571661-99e6-43e0-b2ea-1924e5437a7f-kube-api-access-zlds4\") on node \"crc\" DevicePath \"\"" Dec 08 09:22:13 crc kubenswrapper[4776]: I1208 09:22:13.211012 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d26a-account-create-update-wr6lj" event={"ID":"e0571661-99e6-43e0-b2ea-1924e5437a7f","Type":"ContainerDied","Data":"d225ced0c32711ea69aa91bff616204f69f061971afddfb295d614bc039aecb9"} Dec 08 09:22:13 crc kubenswrapper[4776]: I1208 09:22:13.211058 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d225ced0c32711ea69aa91bff616204f69f061971afddfb295d614bc039aecb9" Dec 08 09:22:13 crc kubenswrapper[4776]: I1208 09:22:13.211125 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-d26a-account-create-update-wr6lj" Dec 08 09:22:13 crc kubenswrapper[4776]: I1208 09:22:13.220329 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mmp8z" event={"ID":"0436afba-d4b2-47d8-ac4d-c621e029333d","Type":"ContainerStarted","Data":"589cb69edd0cc7bd908a1088128edb1e7acd73e7f75a8f700aa5d167d6ff36fb"} Dec 08 09:22:13 crc kubenswrapper[4776]: I1208 09:22:13.221771 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-xm2wh" event={"ID":"a94fb6fb-5b7e-4034-a8e4-9c40e269409e","Type":"ContainerDied","Data":"f766ddc6ad6315adeb2eb425d90cbee94cfbc89f3dd801fa09cdf67491483dd9"} Dec 08 09:22:13 crc kubenswrapper[4776]: I1208 09:22:13.221813 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f766ddc6ad6315adeb2eb425d90cbee94cfbc89f3dd801fa09cdf67491483dd9" Dec 08 09:22:13 crc kubenswrapper[4776]: I1208 09:22:13.221943 4776 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-xm2wh" Dec 08 09:22:13 crc kubenswrapper[4776]: I1208 09:22:13.236990 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-w8l8r" Dec 08 09:22:13 crc kubenswrapper[4776]: I1208 09:22:13.299158 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-w8l8r" Dec 08 09:22:13 crc kubenswrapper[4776]: I1208 09:22:13.475850 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w8l8r"] Dec 08 09:22:14 crc kubenswrapper[4776]: I1208 09:22:14.806307 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cb640491-a8e7-4f8d-b4bb-1d0124f5727f-etc-swift\") pod \"swift-storage-0\" (UID: \"cb640491-a8e7-4f8d-b4bb-1d0124f5727f\") " pod="openstack/swift-storage-0" Dec 08 09:22:14 crc kubenswrapper[4776]: E1208 09:22:14.806536 4776 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 08 09:22:14 crc kubenswrapper[4776]: E1208 09:22:14.806787 4776 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 08 09:22:14 crc kubenswrapper[4776]: E1208 09:22:14.806863 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cb640491-a8e7-4f8d-b4bb-1d0124f5727f-etc-swift podName:cb640491-a8e7-4f8d-b4bb-1d0124f5727f nodeName:}" failed. No retries permitted until 2025-12-08 09:22:18.80682582 +0000 UTC m=+1415.070050832 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/cb640491-a8e7-4f8d-b4bb-1d0124f5727f-etc-swift") pod "swift-storage-0" (UID: "cb640491-a8e7-4f8d-b4bb-1d0124f5727f") : configmap "swift-ring-files" not found Dec 08 09:22:14 crc kubenswrapper[4776]: I1208 09:22:14.850423 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 08 09:22:14 crc kubenswrapper[4776]: I1208 09:22:14.979828 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 08 09:22:15 crc kubenswrapper[4776]: I1208 09:22:15.139357 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 08 09:22:15 crc kubenswrapper[4776]: I1208 09:22:15.210203 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-5tfbk" Dec 08 09:22:15 crc kubenswrapper[4776]: I1208 09:22:15.248618 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-w8l8r" podUID="8fc47007-7b1d-458c-b1ee-f561fff88bd7" containerName="registry-server" containerID="cri-o://60f3369aeb65ec7d79d738bd4cc339f62dfff7705f56cba17ee799e904726704" gracePeriod=2 Dec 08 09:22:15 crc kubenswrapper[4776]: I1208 09:22:15.248714 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="786c0b37-638a-4b59-b149-628d9ad828bc" containerName="prometheus" containerID="cri-o://a6ea82a0a0fdbce89463cbb259477af5f32226f11448cdce50567a591f2cc6f2" gracePeriod=600 Dec 08 09:22:15 crc kubenswrapper[4776]: I1208 09:22:15.248809 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="786c0b37-638a-4b59-b149-628d9ad828bc" containerName="thanos-sidecar" 
containerID="cri-o://2ef939e0e3310016b12741d23ba72538c85bdced4c4ddb306d993e02317c1238" gracePeriod=600 Dec 08 09:22:15 crc kubenswrapper[4776]: I1208 09:22:15.248844 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="786c0b37-638a-4b59-b149-628d9ad828bc" containerName="config-reloader" containerID="cri-o://d647656c5e94e4b5bcc9ec6ef7af6889f977f3689f9074bd4d8bab32d9fcc049" gracePeriod=600 Dec 08 09:22:15 crc kubenswrapper[4776]: I1208 09:22:15.435741 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-wpgmk-config-h6865"] Dec 08 09:22:15 crc kubenswrapper[4776]: E1208 09:22:15.436514 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0571661-99e6-43e0-b2ea-1924e5437a7f" containerName="mariadb-account-create-update" Dec 08 09:22:15 crc kubenswrapper[4776]: I1208 09:22:15.436532 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0571661-99e6-43e0-b2ea-1924e5437a7f" containerName="mariadb-account-create-update" Dec 08 09:22:15 crc kubenswrapper[4776]: E1208 09:22:15.436556 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a94fb6fb-5b7e-4034-a8e4-9c40e269409e" containerName="mariadb-database-create" Dec 08 09:22:15 crc kubenswrapper[4776]: I1208 09:22:15.436562 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="a94fb6fb-5b7e-4034-a8e4-9c40e269409e" containerName="mariadb-database-create" Dec 08 09:22:15 crc kubenswrapper[4776]: I1208 09:22:15.436755 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0571661-99e6-43e0-b2ea-1924e5437a7f" containerName="mariadb-account-create-update" Dec 08 09:22:15 crc kubenswrapper[4776]: I1208 09:22:15.436771 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="a94fb6fb-5b7e-4034-a8e4-9c40e269409e" containerName="mariadb-database-create" Dec 08 09:22:15 crc kubenswrapper[4776]: I1208 09:22:15.438214 4776 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/ovn-controller-wpgmk-config-h6865" Dec 08 09:22:15 crc kubenswrapper[4776]: I1208 09:22:15.440150 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 08 09:22:15 crc kubenswrapper[4776]: I1208 09:22:15.462480 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wpgmk-config-h6865"] Dec 08 09:22:15 crc kubenswrapper[4776]: I1208 09:22:15.537438 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/26a171b2-4d8f-4596-a541-74e514aa25af-var-log-ovn\") pod \"ovn-controller-wpgmk-config-h6865\" (UID: \"26a171b2-4d8f-4596-a541-74e514aa25af\") " pod="openstack/ovn-controller-wpgmk-config-h6865" Dec 08 09:22:15 crc kubenswrapper[4776]: I1208 09:22:15.537481 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/26a171b2-4d8f-4596-a541-74e514aa25af-var-run-ovn\") pod \"ovn-controller-wpgmk-config-h6865\" (UID: \"26a171b2-4d8f-4596-a541-74e514aa25af\") " pod="openstack/ovn-controller-wpgmk-config-h6865" Dec 08 09:22:15 crc kubenswrapper[4776]: I1208 09:22:15.537703 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/26a171b2-4d8f-4596-a541-74e514aa25af-var-run\") pod \"ovn-controller-wpgmk-config-h6865\" (UID: \"26a171b2-4d8f-4596-a541-74e514aa25af\") " pod="openstack/ovn-controller-wpgmk-config-h6865" Dec 08 09:22:15 crc kubenswrapper[4776]: I1208 09:22:15.537781 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26a171b2-4d8f-4596-a541-74e514aa25af-scripts\") pod \"ovn-controller-wpgmk-config-h6865\" (UID: 
\"26a171b2-4d8f-4596-a541-74e514aa25af\") " pod="openstack/ovn-controller-wpgmk-config-h6865" Dec 08 09:22:15 crc kubenswrapper[4776]: I1208 09:22:15.537801 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/26a171b2-4d8f-4596-a541-74e514aa25af-additional-scripts\") pod \"ovn-controller-wpgmk-config-h6865\" (UID: \"26a171b2-4d8f-4596-a541-74e514aa25af\") " pod="openstack/ovn-controller-wpgmk-config-h6865" Dec 08 09:22:15 crc kubenswrapper[4776]: I1208 09:22:15.537848 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnln4\" (UniqueName: \"kubernetes.io/projected/26a171b2-4d8f-4596-a541-74e514aa25af-kube-api-access-mnln4\") pod \"ovn-controller-wpgmk-config-h6865\" (UID: \"26a171b2-4d8f-4596-a541-74e514aa25af\") " pod="openstack/ovn-controller-wpgmk-config-h6865" Dec 08 09:22:15 crc kubenswrapper[4776]: I1208 09:22:15.639432 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/26a171b2-4d8f-4596-a541-74e514aa25af-var-log-ovn\") pod \"ovn-controller-wpgmk-config-h6865\" (UID: \"26a171b2-4d8f-4596-a541-74e514aa25af\") " pod="openstack/ovn-controller-wpgmk-config-h6865" Dec 08 09:22:15 crc kubenswrapper[4776]: I1208 09:22:15.639482 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/26a171b2-4d8f-4596-a541-74e514aa25af-var-run-ovn\") pod \"ovn-controller-wpgmk-config-h6865\" (UID: \"26a171b2-4d8f-4596-a541-74e514aa25af\") " pod="openstack/ovn-controller-wpgmk-config-h6865" Dec 08 09:22:15 crc kubenswrapper[4776]: I1208 09:22:15.639576 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/26a171b2-4d8f-4596-a541-74e514aa25af-var-run\") pod 
\"ovn-controller-wpgmk-config-h6865\" (UID: \"26a171b2-4d8f-4596-a541-74e514aa25af\") " pod="openstack/ovn-controller-wpgmk-config-h6865" Dec 08 09:22:15 crc kubenswrapper[4776]: I1208 09:22:15.639601 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26a171b2-4d8f-4596-a541-74e514aa25af-scripts\") pod \"ovn-controller-wpgmk-config-h6865\" (UID: \"26a171b2-4d8f-4596-a541-74e514aa25af\") " pod="openstack/ovn-controller-wpgmk-config-h6865" Dec 08 09:22:15 crc kubenswrapper[4776]: I1208 09:22:15.639617 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/26a171b2-4d8f-4596-a541-74e514aa25af-additional-scripts\") pod \"ovn-controller-wpgmk-config-h6865\" (UID: \"26a171b2-4d8f-4596-a541-74e514aa25af\") " pod="openstack/ovn-controller-wpgmk-config-h6865" Dec 08 09:22:15 crc kubenswrapper[4776]: I1208 09:22:15.639641 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnln4\" (UniqueName: \"kubernetes.io/projected/26a171b2-4d8f-4596-a541-74e514aa25af-kube-api-access-mnln4\") pod \"ovn-controller-wpgmk-config-h6865\" (UID: \"26a171b2-4d8f-4596-a541-74e514aa25af\") " pod="openstack/ovn-controller-wpgmk-config-h6865" Dec 08 09:22:15 crc kubenswrapper[4776]: I1208 09:22:15.639817 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/26a171b2-4d8f-4596-a541-74e514aa25af-var-run-ovn\") pod \"ovn-controller-wpgmk-config-h6865\" (UID: \"26a171b2-4d8f-4596-a541-74e514aa25af\") " pod="openstack/ovn-controller-wpgmk-config-h6865" Dec 08 09:22:15 crc kubenswrapper[4776]: I1208 09:22:15.639866 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/26a171b2-4d8f-4596-a541-74e514aa25af-var-run\") pod 
\"ovn-controller-wpgmk-config-h6865\" (UID: \"26a171b2-4d8f-4596-a541-74e514aa25af\") " pod="openstack/ovn-controller-wpgmk-config-h6865" Dec 08 09:22:15 crc kubenswrapper[4776]: I1208 09:22:15.639945 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/26a171b2-4d8f-4596-a541-74e514aa25af-var-log-ovn\") pod \"ovn-controller-wpgmk-config-h6865\" (UID: \"26a171b2-4d8f-4596-a541-74e514aa25af\") " pod="openstack/ovn-controller-wpgmk-config-h6865" Dec 08 09:22:15 crc kubenswrapper[4776]: I1208 09:22:15.640504 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/26a171b2-4d8f-4596-a541-74e514aa25af-additional-scripts\") pod \"ovn-controller-wpgmk-config-h6865\" (UID: \"26a171b2-4d8f-4596-a541-74e514aa25af\") " pod="openstack/ovn-controller-wpgmk-config-h6865" Dec 08 09:22:15 crc kubenswrapper[4776]: I1208 09:22:15.641806 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26a171b2-4d8f-4596-a541-74e514aa25af-scripts\") pod \"ovn-controller-wpgmk-config-h6865\" (UID: \"26a171b2-4d8f-4596-a541-74e514aa25af\") " pod="openstack/ovn-controller-wpgmk-config-h6865" Dec 08 09:22:15 crc kubenswrapper[4776]: I1208 09:22:15.662914 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnln4\" (UniqueName: \"kubernetes.io/projected/26a171b2-4d8f-4596-a541-74e514aa25af-kube-api-access-mnln4\") pod \"ovn-controller-wpgmk-config-h6865\" (UID: \"26a171b2-4d8f-4596-a541-74e514aa25af\") " pod="openstack/ovn-controller-wpgmk-config-h6865" Dec 08 09:22:15 crc kubenswrapper[4776]: I1208 09:22:15.757949 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-wpgmk-config-h6865" Dec 08 09:22:16 crc kubenswrapper[4776]: I1208 09:22:16.209453 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="786c0b37-638a-4b59-b149-628d9ad828bc" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.136:9090/-/ready\": dial tcp 10.217.0.136:9090: connect: connection refused" Dec 08 09:22:16 crc kubenswrapper[4776]: I1208 09:22:16.274780 4776 generic.go:334] "Generic (PLEG): container finished" podID="786c0b37-638a-4b59-b149-628d9ad828bc" containerID="2ef939e0e3310016b12741d23ba72538c85bdced4c4ddb306d993e02317c1238" exitCode=0 Dec 08 09:22:16 crc kubenswrapper[4776]: I1208 09:22:16.274826 4776 generic.go:334] "Generic (PLEG): container finished" podID="786c0b37-638a-4b59-b149-628d9ad828bc" containerID="d647656c5e94e4b5bcc9ec6ef7af6889f977f3689f9074bd4d8bab32d9fcc049" exitCode=0 Dec 08 09:22:16 crc kubenswrapper[4776]: I1208 09:22:16.274838 4776 generic.go:334] "Generic (PLEG): container finished" podID="786c0b37-638a-4b59-b149-628d9ad828bc" containerID="a6ea82a0a0fdbce89463cbb259477af5f32226f11448cdce50567a591f2cc6f2" exitCode=0 Dec 08 09:22:16 crc kubenswrapper[4776]: I1208 09:22:16.274867 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"786c0b37-638a-4b59-b149-628d9ad828bc","Type":"ContainerDied","Data":"2ef939e0e3310016b12741d23ba72538c85bdced4c4ddb306d993e02317c1238"} Dec 08 09:22:16 crc kubenswrapper[4776]: I1208 09:22:16.274919 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"786c0b37-638a-4b59-b149-628d9ad828bc","Type":"ContainerDied","Data":"d647656c5e94e4b5bcc9ec6ef7af6889f977f3689f9074bd4d8bab32d9fcc049"} Dec 08 09:22:16 crc kubenswrapper[4776]: I1208 09:22:16.274930 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"786c0b37-638a-4b59-b149-628d9ad828bc","Type":"ContainerDied","Data":"a6ea82a0a0fdbce89463cbb259477af5f32226f11448cdce50567a591f2cc6f2"} Dec 08 09:22:16 crc kubenswrapper[4776]: I1208 09:22:16.277533 4776 generic.go:334] "Generic (PLEG): container finished" podID="8fc47007-7b1d-458c-b1ee-f561fff88bd7" containerID="60f3369aeb65ec7d79d738bd4cc339f62dfff7705f56cba17ee799e904726704" exitCode=0 Dec 08 09:22:16 crc kubenswrapper[4776]: I1208 09:22:16.277561 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w8l8r" event={"ID":"8fc47007-7b1d-458c-b1ee-f561fff88bd7","Type":"ContainerDied","Data":"60f3369aeb65ec7d79d738bd4cc339f62dfff7705f56cba17ee799e904726704"} Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.289596 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mmp8z" event={"ID":"0436afba-d4b2-47d8-ac4d-c621e029333d","Type":"ContainerStarted","Data":"560cd3c0ea1c2892fc88c94dc8d330a36bceef5699defa1744ac58c98215b725"} Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.298331 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w8l8r" event={"ID":"8fc47007-7b1d-458c-b1ee-f561fff88bd7","Type":"ContainerDied","Data":"f6b0583fa8f49dd932079f7c7ffd124c5e663077f826e6ebd2d7a5cc2cc5f0fe"} Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.298371 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6b0583fa8f49dd932079f7c7ffd124c5e663077f826e6ebd2d7a5cc2cc5f0fe" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.367394 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-mmp8z" podStartSLOduration=2.179922198 podStartE2EDuration="6.367380723s" podCreationTimestamp="2025-12-08 09:22:11 +0000 UTC" firstStartedPulling="2025-12-08 09:22:12.445517933 +0000 UTC m=+1408.708742955" lastFinishedPulling="2025-12-08 
09:22:16.632976458 +0000 UTC m=+1412.896201480" observedRunningTime="2025-12-08 09:22:17.366344026 +0000 UTC m=+1413.629569048" watchObservedRunningTime="2025-12-08 09:22:17.367380723 +0000 UTC m=+1413.630605745" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.381965 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w8l8r" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.426300 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.451355 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-a752-account-create-update-c65jn"] Dec 08 09:22:17 crc kubenswrapper[4776]: E1208 09:22:17.451901 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="786c0b37-638a-4b59-b149-628d9ad828bc" containerName="prometheus" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.451922 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="786c0b37-638a-4b59-b149-628d9ad828bc" containerName="prometheus" Dec 08 09:22:17 crc kubenswrapper[4776]: E1208 09:22:17.451934 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fc47007-7b1d-458c-b1ee-f561fff88bd7" containerName="extract-utilities" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.451941 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fc47007-7b1d-458c-b1ee-f561fff88bd7" containerName="extract-utilities" Dec 08 09:22:17 crc kubenswrapper[4776]: E1208 09:22:17.451958 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="786c0b37-638a-4b59-b149-628d9ad828bc" containerName="init-config-reloader" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.451964 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="786c0b37-638a-4b59-b149-628d9ad828bc" containerName="init-config-reloader" Dec 08 09:22:17 crc 
kubenswrapper[4776]: E1208 09:22:17.451975 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fc47007-7b1d-458c-b1ee-f561fff88bd7" containerName="extract-content" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.451987 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fc47007-7b1d-458c-b1ee-f561fff88bd7" containerName="extract-content" Dec 08 09:22:17 crc kubenswrapper[4776]: E1208 09:22:17.452007 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fc47007-7b1d-458c-b1ee-f561fff88bd7" containerName="registry-server" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.452013 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fc47007-7b1d-458c-b1ee-f561fff88bd7" containerName="registry-server" Dec 08 09:22:17 crc kubenswrapper[4776]: E1208 09:22:17.452026 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="786c0b37-638a-4b59-b149-628d9ad828bc" containerName="thanos-sidecar" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.452038 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="786c0b37-638a-4b59-b149-628d9ad828bc" containerName="thanos-sidecar" Dec 08 09:22:17 crc kubenswrapper[4776]: E1208 09:22:17.452053 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="786c0b37-638a-4b59-b149-628d9ad828bc" containerName="config-reloader" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.452059 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="786c0b37-638a-4b59-b149-628d9ad828bc" containerName="config-reloader" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.452270 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="786c0b37-638a-4b59-b149-628d9ad828bc" containerName="prometheus" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.452312 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fc47007-7b1d-458c-b1ee-f561fff88bd7" containerName="registry-server" Dec 08 09:22:17 crc 
kubenswrapper[4776]: I1208 09:22:17.452322 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="786c0b37-638a-4b59-b149-628d9ad828bc" containerName="thanos-sidecar" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.452331 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="786c0b37-638a-4b59-b149-628d9ad828bc" containerName="config-reloader" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.453185 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-a752-account-create-update-c65jn" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.459044 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.481390 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fc47007-7b1d-458c-b1ee-f561fff88bd7-catalog-content\") pod \"8fc47007-7b1d-458c-b1ee-f561fff88bd7\" (UID: \"8fc47007-7b1d-458c-b1ee-f561fff88bd7\") " Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.481487 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fc47007-7b1d-458c-b1ee-f561fff88bd7-utilities\") pod \"8fc47007-7b1d-458c-b1ee-f561fff88bd7\" (UID: \"8fc47007-7b1d-458c-b1ee-f561fff88bd7\") " Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.481771 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srklq\" (UniqueName: \"kubernetes.io/projected/8fc47007-7b1d-458c-b1ee-f561fff88bd7-kube-api-access-srklq\") pod \"8fc47007-7b1d-458c-b1ee-f561fff88bd7\" (UID: \"8fc47007-7b1d-458c-b1ee-f561fff88bd7\") " Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.484298 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/8fc47007-7b1d-458c-b1ee-f561fff88bd7-utilities" (OuterVolumeSpecName: "utilities") pod "8fc47007-7b1d-458c-b1ee-f561fff88bd7" (UID: "8fc47007-7b1d-458c-b1ee-f561fff88bd7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.499762 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fc47007-7b1d-458c-b1ee-f561fff88bd7-kube-api-access-srklq" (OuterVolumeSpecName: "kube-api-access-srklq") pod "8fc47007-7b1d-458c-b1ee-f561fff88bd7" (UID: "8fc47007-7b1d-458c-b1ee-f561fff88bd7"). InnerVolumeSpecName "kube-api-access-srklq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:22:17 crc kubenswrapper[4776]: W1208 09:22:17.502892 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26a171b2_4d8f_4596_a541_74e514aa25af.slice/crio-8477862713ae906d6d7c03752baccc32f4eb56635100a04512de1e126030f1af WatchSource:0}: Error finding container 8477862713ae906d6d7c03752baccc32f4eb56635100a04512de1e126030f1af: Status 404 returned error can't find the container with id 8477862713ae906d6d7c03752baccc32f4eb56635100a04512de1e126030f1af Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.526613 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-a752-account-create-update-c65jn"] Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.582059 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wpgmk-config-h6865"] Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.583127 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/786c0b37-638a-4b59-b149-628d9ad828bc-config\") pod \"786c0b37-638a-4b59-b149-628d9ad828bc\" (UID: \"786c0b37-638a-4b59-b149-628d9ad828bc\") " Dec 08 09:22:17 crc 
kubenswrapper[4776]: I1208 09:22:17.583252 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f96j4\" (UniqueName: \"kubernetes.io/projected/786c0b37-638a-4b59-b149-628d9ad828bc-kube-api-access-f96j4\") pod \"786c0b37-638a-4b59-b149-628d9ad828bc\" (UID: \"786c0b37-638a-4b59-b149-628d9ad828bc\") " Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.583350 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/786c0b37-638a-4b59-b149-628d9ad828bc-prometheus-metric-storage-rulefiles-0\") pod \"786c0b37-638a-4b59-b149-628d9ad828bc\" (UID: \"786c0b37-638a-4b59-b149-628d9ad828bc\") " Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.583413 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/786c0b37-638a-4b59-b149-628d9ad828bc-tls-assets\") pod \"786c0b37-638a-4b59-b149-628d9ad828bc\" (UID: \"786c0b37-638a-4b59-b149-628d9ad828bc\") " Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.583479 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/786c0b37-638a-4b59-b149-628d9ad828bc-web-config\") pod \"786c0b37-638a-4b59-b149-628d9ad828bc\" (UID: \"786c0b37-638a-4b59-b149-628d9ad828bc\") " Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.583524 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/786c0b37-638a-4b59-b149-628d9ad828bc-config-out\") pod \"786c0b37-638a-4b59-b149-628d9ad828bc\" (UID: \"786c0b37-638a-4b59-b149-628d9ad828bc\") " Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.583580 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") pod \"786c0b37-638a-4b59-b149-628d9ad828bc\" (UID: \"786c0b37-638a-4b59-b149-628d9ad828bc\") " Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.583654 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/786c0b37-638a-4b59-b149-628d9ad828bc-thanos-prometheus-http-client-file\") pod \"786c0b37-638a-4b59-b149-628d9ad828bc\" (UID: \"786c0b37-638a-4b59-b149-628d9ad828bc\") " Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.583941 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de008089-a913-4a62-85e1-f0ec597514ab-operator-scripts\") pod \"heat-a752-account-create-update-c65jn\" (UID: \"de008089-a913-4a62-85e1-f0ec597514ab\") " pod="openstack/heat-a752-account-create-update-c65jn" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.584070 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvqkd\" (UniqueName: \"kubernetes.io/projected/de008089-a913-4a62-85e1-f0ec597514ab-kube-api-access-dvqkd\") pod \"heat-a752-account-create-update-c65jn\" (UID: \"de008089-a913-4a62-85e1-f0ec597514ab\") " pod="openstack/heat-a752-account-create-update-c65jn" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.584128 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srklq\" (UniqueName: \"kubernetes.io/projected/8fc47007-7b1d-458c-b1ee-f561fff88bd7-kube-api-access-srklq\") on node \"crc\" DevicePath \"\"" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.584140 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fc47007-7b1d-458c-b1ee-f561fff88bd7-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 
09:22:17.584986 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/786c0b37-638a-4b59-b149-628d9ad828bc-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "786c0b37-638a-4b59-b149-628d9ad828bc" (UID: "786c0b37-638a-4b59-b149-628d9ad828bc"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.587207 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/786c0b37-638a-4b59-b149-628d9ad828bc-config" (OuterVolumeSpecName: "config") pod "786c0b37-638a-4b59-b149-628d9ad828bc" (UID: "786c0b37-638a-4b59-b149-628d9ad828bc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.600342 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/786c0b37-638a-4b59-b149-628d9ad828bc-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "786c0b37-638a-4b59-b149-628d9ad828bc" (UID: "786c0b37-638a-4b59-b149-628d9ad828bc"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.604713 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/786c0b37-638a-4b59-b149-628d9ad828bc-config-out" (OuterVolumeSpecName: "config-out") pod "786c0b37-638a-4b59-b149-628d9ad828bc" (UID: "786c0b37-638a-4b59-b149-628d9ad828bc"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.606323 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "786c0b37-638a-4b59-b149-628d9ad828bc" (UID: "786c0b37-638a-4b59-b149-628d9ad828bc"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.611375 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/786c0b37-638a-4b59-b149-628d9ad828bc-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "786c0b37-638a-4b59-b149-628d9ad828bc" (UID: "786c0b37-638a-4b59-b149-628d9ad828bc"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.614943 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fc47007-7b1d-458c-b1ee-f561fff88bd7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8fc47007-7b1d-458c-b1ee-f561fff88bd7" (UID: "8fc47007-7b1d-458c-b1ee-f561fff88bd7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.619451 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/786c0b37-638a-4b59-b149-628d9ad828bc-kube-api-access-f96j4" (OuterVolumeSpecName: "kube-api-access-f96j4") pod "786c0b37-638a-4b59-b149-628d9ad828bc" (UID: "786c0b37-638a-4b59-b149-628d9ad828bc"). InnerVolumeSpecName "kube-api-access-f96j4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.644837 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-wk8pc"] Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.646341 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-wk8pc" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.649260 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/786c0b37-638a-4b59-b149-628d9ad828bc-web-config" (OuterVolumeSpecName: "web-config") pod "786c0b37-638a-4b59-b149-628d9ad828bc" (UID: "786c0b37-638a-4b59-b149-628d9ad828bc"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.662276 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-wk8pc"] Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.673310 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-gjfw9"] Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.674919 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-gjfw9" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.686197 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de008089-a913-4a62-85e1-f0ec597514ab-operator-scripts\") pod \"heat-a752-account-create-update-c65jn\" (UID: \"de008089-a913-4a62-85e1-f0ec597514ab\") " pod="openstack/heat-a752-account-create-update-c65jn" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.686922 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvqkd\" (UniqueName: \"kubernetes.io/projected/de008089-a913-4a62-85e1-f0ec597514ab-kube-api-access-dvqkd\") pod \"heat-a752-account-create-update-c65jn\" (UID: \"de008089-a913-4a62-85e1-f0ec597514ab\") " pod="openstack/heat-a752-account-create-update-c65jn" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.687026 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/786c0b37-638a-4b59-b149-628d9ad828bc-config\") on node \"crc\" DevicePath \"\"" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.687086 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f96j4\" (UniqueName: \"kubernetes.io/projected/786c0b37-638a-4b59-b149-628d9ad828bc-kube-api-access-f96j4\") on node \"crc\" DevicePath \"\"" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.687159 4776 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/786c0b37-638a-4b59-b149-628d9ad828bc-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.687233 4776 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/786c0b37-638a-4b59-b149-628d9ad828bc-tls-assets\") on node \"crc\" DevicePath 
\"\"" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.687286 4776 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/786c0b37-638a-4b59-b149-628d9ad828bc-web-config\") on node \"crc\" DevicePath \"\"" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.687336 4776 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/786c0b37-638a-4b59-b149-628d9ad828bc-config-out\") on node \"crc\" DevicePath \"\"" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.687396 4776 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.687456 4776 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/786c0b37-638a-4b59-b149-628d9ad828bc-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.687512 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fc47007-7b1d-458c-b1ee-f561fff88bd7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.702279 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de008089-a913-4a62-85e1-f0ec597514ab-operator-scripts\") pod \"heat-a752-account-create-update-c65jn\" (UID: \"de008089-a913-4a62-85e1-f0ec597514ab\") " pod="openstack/heat-a752-account-create-update-c65jn" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.702547 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-gjfw9"] Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.741041 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvqkd\" (UniqueName: \"kubernetes.io/projected/de008089-a913-4a62-85e1-f0ec597514ab-kube-api-access-dvqkd\") pod \"heat-a752-account-create-update-c65jn\" (UID: \"de008089-a913-4a62-85e1-f0ec597514ab\") " pod="openstack/heat-a752-account-create-update-c65jn" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.745333 4776 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.753989 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-hxwvw"] Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.755335 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-hxwvw" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.775020 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-hxwvw"] Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.792752 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3dbd4182-75a0-42dd-97c8-a1cb8fee96f2-operator-scripts\") pod \"heat-db-create-wk8pc\" (UID: \"3dbd4182-75a0-42dd-97c8-a1cb8fee96f2\") " pod="openstack/heat-db-create-wk8pc" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.792830 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnl2q\" (UniqueName: \"kubernetes.io/projected/3dbd4182-75a0-42dd-97c8-a1cb8fee96f2-kube-api-access-cnl2q\") pod \"heat-db-create-wk8pc\" (UID: \"3dbd4182-75a0-42dd-97c8-a1cb8fee96f2\") " pod="openstack/heat-db-create-wk8pc" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.792876 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmfg7\" (UniqueName: \"kubernetes.io/projected/a862b37a-f96b-495a-8d8e-b3640d2f0609-kube-api-access-xmfg7\") pod \"cinder-db-create-gjfw9\" (UID: \"a862b37a-f96b-495a-8d8e-b3640d2f0609\") " pod="openstack/cinder-db-create-gjfw9" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.792900 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a862b37a-f96b-495a-8d8e-b3640d2f0609-operator-scripts\") pod \"cinder-db-create-gjfw9\" (UID: \"a862b37a-f96b-495a-8d8e-b3640d2f0609\") " pod="openstack/cinder-db-create-gjfw9" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.792989 4776 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.808340 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-a752-account-create-update-c65jn" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.808807 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-f2b3-account-create-update-7jtlp"] Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.810446 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-f2b3-account-create-update-7jtlp" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.812493 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.814907 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-f2b3-account-create-update-7jtlp"] Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.843259 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-b17b-account-create-update-2sxng"] Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.844638 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b17b-account-create-update-2sxng" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.854022 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.857383 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-b17b-account-create-update-2sxng"] Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.877315 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-tz8ks"] Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.878617 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-tz8ks" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.882140 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.882399 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.883368 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-wz4bj" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.883555 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.896836 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3dbd4182-75a0-42dd-97c8-a1cb8fee96f2-operator-scripts\") pod \"heat-db-create-wk8pc\" (UID: \"3dbd4182-75a0-42dd-97c8-a1cb8fee96f2\") " pod="openstack/heat-db-create-wk8pc" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.896878 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6677467f-5abc-4914-949d-bd6541aadeef-operator-scripts\") pod \"cinder-f2b3-account-create-update-7jtlp\" (UID: \"6677467f-5abc-4914-949d-bd6541aadeef\") " pod="openstack/cinder-f2b3-account-create-update-7jtlp" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.896947 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnl2q\" (UniqueName: \"kubernetes.io/projected/3dbd4182-75a0-42dd-97c8-a1cb8fee96f2-kube-api-access-cnl2q\") pod \"heat-db-create-wk8pc\" (UID: \"3dbd4182-75a0-42dd-97c8-a1cb8fee96f2\") " pod="openstack/heat-db-create-wk8pc" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.896987 4776 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ffd8d26-fc44-453f-ad6b-9bce2b83252e-operator-scripts\") pod \"barbican-db-create-hxwvw\" (UID: \"7ffd8d26-fc44-453f-ad6b-9bce2b83252e\") " pod="openstack/barbican-db-create-hxwvw" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.897004 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khkfg\" (UniqueName: \"kubernetes.io/projected/7ffd8d26-fc44-453f-ad6b-9bce2b83252e-kube-api-access-khkfg\") pod \"barbican-db-create-hxwvw\" (UID: \"7ffd8d26-fc44-453f-ad6b-9bce2b83252e\") " pod="openstack/barbican-db-create-hxwvw" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.897028 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmfg7\" (UniqueName: \"kubernetes.io/projected/a862b37a-f96b-495a-8d8e-b3640d2f0609-kube-api-access-xmfg7\") pod \"cinder-db-create-gjfw9\" (UID: \"a862b37a-f96b-495a-8d8e-b3640d2f0609\") " pod="openstack/cinder-db-create-gjfw9" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.897057 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a862b37a-f96b-495a-8d8e-b3640d2f0609-operator-scripts\") pod \"cinder-db-create-gjfw9\" (UID: \"a862b37a-f96b-495a-8d8e-b3640d2f0609\") " pod="openstack/cinder-db-create-gjfw9" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.897107 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g99sf\" (UniqueName: \"kubernetes.io/projected/6677467f-5abc-4914-949d-bd6541aadeef-kube-api-access-g99sf\") pod \"cinder-f2b3-account-create-update-7jtlp\" (UID: \"6677467f-5abc-4914-949d-bd6541aadeef\") " pod="openstack/cinder-f2b3-account-create-update-7jtlp" Dec 08 09:22:17 crc 
kubenswrapper[4776]: I1208 09:22:17.898127 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a862b37a-f96b-495a-8d8e-b3640d2f0609-operator-scripts\") pod \"cinder-db-create-gjfw9\" (UID: \"a862b37a-f96b-495a-8d8e-b3640d2f0609\") " pod="openstack/cinder-db-create-gjfw9" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.917119 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3dbd4182-75a0-42dd-97c8-a1cb8fee96f2-operator-scripts\") pod \"heat-db-create-wk8pc\" (UID: \"3dbd4182-75a0-42dd-97c8-a1cb8fee96f2\") " pod="openstack/heat-db-create-wk8pc" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.918895 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnl2q\" (UniqueName: \"kubernetes.io/projected/3dbd4182-75a0-42dd-97c8-a1cb8fee96f2-kube-api-access-cnl2q\") pod \"heat-db-create-wk8pc\" (UID: \"3dbd4182-75a0-42dd-97c8-a1cb8fee96f2\") " pod="openstack/heat-db-create-wk8pc" Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.922358 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-tz8ks"] Dec 08 09:22:17 crc kubenswrapper[4776]: I1208 09:22:17.922621 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmfg7\" (UniqueName: \"kubernetes.io/projected/a862b37a-f96b-495a-8d8e-b3640d2f0609-kube-api-access-xmfg7\") pod \"cinder-db-create-gjfw9\" (UID: \"a862b37a-f96b-495a-8d8e-b3640d2f0609\") " pod="openstack/cinder-db-create-gjfw9" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.001737 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ffd8d26-fc44-453f-ad6b-9bce2b83252e-operator-scripts\") pod \"barbican-db-create-hxwvw\" (UID: \"7ffd8d26-fc44-453f-ad6b-9bce2b83252e\") " 
pod="openstack/barbican-db-create-hxwvw" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.006129 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khkfg\" (UniqueName: \"kubernetes.io/projected/7ffd8d26-fc44-453f-ad6b-9bce2b83252e-kube-api-access-khkfg\") pod \"barbican-db-create-hxwvw\" (UID: \"7ffd8d26-fc44-453f-ad6b-9bce2b83252e\") " pod="openstack/barbican-db-create-hxwvw" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.006232 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d1decbe-db5e-4910-9604-aca62ec47099-config-data\") pod \"keystone-db-sync-tz8ks\" (UID: \"1d1decbe-db5e-4910-9604-aca62ec47099\") " pod="openstack/keystone-db-sync-tz8ks" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.006421 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d68gm\" (UniqueName: \"kubernetes.io/projected/1d1decbe-db5e-4910-9604-aca62ec47099-kube-api-access-d68gm\") pod \"keystone-db-sync-tz8ks\" (UID: \"1d1decbe-db5e-4910-9604-aca62ec47099\") " pod="openstack/keystone-db-sync-tz8ks" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.006459 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g99sf\" (UniqueName: \"kubernetes.io/projected/6677467f-5abc-4914-949d-bd6541aadeef-kube-api-access-g99sf\") pod \"cinder-f2b3-account-create-update-7jtlp\" (UID: \"6677467f-5abc-4914-949d-bd6541aadeef\") " pod="openstack/cinder-f2b3-account-create-update-7jtlp" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.006520 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4811e2fe-5855-47ee-b742-ec6c481936a2-operator-scripts\") pod \"barbican-b17b-account-create-update-2sxng\" (UID: 
\"4811e2fe-5855-47ee-b742-ec6c481936a2\") " pod="openstack/barbican-b17b-account-create-update-2sxng" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.006560 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d1decbe-db5e-4910-9604-aca62ec47099-combined-ca-bundle\") pod \"keystone-db-sync-tz8ks\" (UID: \"1d1decbe-db5e-4910-9604-aca62ec47099\") " pod="openstack/keystone-db-sync-tz8ks" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.006674 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6677467f-5abc-4914-949d-bd6541aadeef-operator-scripts\") pod \"cinder-f2b3-account-create-update-7jtlp\" (UID: \"6677467f-5abc-4914-949d-bd6541aadeef\") " pod="openstack/cinder-f2b3-account-create-update-7jtlp" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.006724 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pxtn\" (UniqueName: \"kubernetes.io/projected/4811e2fe-5855-47ee-b742-ec6c481936a2-kube-api-access-5pxtn\") pod \"barbican-b17b-account-create-update-2sxng\" (UID: \"4811e2fe-5855-47ee-b742-ec6c481936a2\") " pod="openstack/barbican-b17b-account-create-update-2sxng" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.007917 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ffd8d26-fc44-453f-ad6b-9bce2b83252e-operator-scripts\") pod \"barbican-db-create-hxwvw\" (UID: \"7ffd8d26-fc44-453f-ad6b-9bce2b83252e\") " pod="openstack/barbican-db-create-hxwvw" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.008004 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6677467f-5abc-4914-949d-bd6541aadeef-operator-scripts\") pod 
\"cinder-f2b3-account-create-update-7jtlp\" (UID: \"6677467f-5abc-4914-949d-bd6541aadeef\") " pod="openstack/cinder-f2b3-account-create-update-7jtlp" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.041933 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-wxkbx"] Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.044688 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-wxkbx" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.070243 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-wxkbx"] Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.075331 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g99sf\" (UniqueName: \"kubernetes.io/projected/6677467f-5abc-4914-949d-bd6541aadeef-kube-api-access-g99sf\") pod \"cinder-f2b3-account-create-update-7jtlp\" (UID: \"6677467f-5abc-4914-949d-bd6541aadeef\") " pod="openstack/cinder-f2b3-account-create-update-7jtlp" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.075532 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khkfg\" (UniqueName: \"kubernetes.io/projected/7ffd8d26-fc44-453f-ad6b-9bce2b83252e-kube-api-access-khkfg\") pod \"barbican-db-create-hxwvw\" (UID: \"7ffd8d26-fc44-453f-ad6b-9bce2b83252e\") " pod="openstack/barbican-db-create-hxwvw" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.084409 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-wk8pc" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.109243 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d68gm\" (UniqueName: \"kubernetes.io/projected/1d1decbe-db5e-4910-9604-aca62ec47099-kube-api-access-d68gm\") pod \"keystone-db-sync-tz8ks\" (UID: \"1d1decbe-db5e-4910-9604-aca62ec47099\") " pod="openstack/keystone-db-sync-tz8ks" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.109301 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4811e2fe-5855-47ee-b742-ec6c481936a2-operator-scripts\") pod \"barbican-b17b-account-create-update-2sxng\" (UID: \"4811e2fe-5855-47ee-b742-ec6c481936a2\") " pod="openstack/barbican-b17b-account-create-update-2sxng" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.109323 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d1decbe-db5e-4910-9604-aca62ec47099-combined-ca-bundle\") pod \"keystone-db-sync-tz8ks\" (UID: \"1d1decbe-db5e-4910-9604-aca62ec47099\") " pod="openstack/keystone-db-sync-tz8ks" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.109374 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pxtn\" (UniqueName: \"kubernetes.io/projected/4811e2fe-5855-47ee-b742-ec6c481936a2-kube-api-access-5pxtn\") pod \"barbican-b17b-account-create-update-2sxng\" (UID: \"4811e2fe-5855-47ee-b742-ec6c481936a2\") " pod="openstack/barbican-b17b-account-create-update-2sxng" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.109443 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d1decbe-db5e-4910-9604-aca62ec47099-config-data\") pod \"keystone-db-sync-tz8ks\" (UID: 
\"1d1decbe-db5e-4910-9604-aca62ec47099\") " pod="openstack/keystone-db-sync-tz8ks" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.113831 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-gjfw9" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.114100 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4811e2fe-5855-47ee-b742-ec6c481936a2-operator-scripts\") pod \"barbican-b17b-account-create-update-2sxng\" (UID: \"4811e2fe-5855-47ee-b742-ec6c481936a2\") " pod="openstack/barbican-b17b-account-create-update-2sxng" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.115757 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-732a-account-create-update-mm924"] Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.117279 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-732a-account-create-update-mm924" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.132055 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.132448 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d1decbe-db5e-4910-9604-aca62ec47099-combined-ca-bundle\") pod \"keystone-db-sync-tz8ks\" (UID: \"1d1decbe-db5e-4910-9604-aca62ec47099\") " pod="openstack/keystone-db-sync-tz8ks" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.152598 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-hxwvw" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.161343 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d68gm\" (UniqueName: \"kubernetes.io/projected/1d1decbe-db5e-4910-9604-aca62ec47099-kube-api-access-d68gm\") pod \"keystone-db-sync-tz8ks\" (UID: \"1d1decbe-db5e-4910-9604-aca62ec47099\") " pod="openstack/keystone-db-sync-tz8ks" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.161902 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d1decbe-db5e-4910-9604-aca62ec47099-config-data\") pod \"keystone-db-sync-tz8ks\" (UID: \"1d1decbe-db5e-4910-9604-aca62ec47099\") " pod="openstack/keystone-db-sync-tz8ks" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.162424 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pxtn\" (UniqueName: \"kubernetes.io/projected/4811e2fe-5855-47ee-b742-ec6c481936a2-kube-api-access-5pxtn\") pod \"barbican-b17b-account-create-update-2sxng\" (UID: \"4811e2fe-5855-47ee-b742-ec6c481936a2\") " pod="openstack/barbican-b17b-account-create-update-2sxng" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.166089 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-732a-account-create-update-mm924"] Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.176648 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-f2b3-account-create-update-7jtlp" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.217306 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54af9994-75b9-457a-8b67-5687e91d698a-operator-scripts\") pod \"neutron-732a-account-create-update-mm924\" (UID: \"54af9994-75b9-457a-8b67-5687e91d698a\") " pod="openstack/neutron-732a-account-create-update-mm924" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.217680 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd8cf615-20fc-42d9-bb77-cdeebbfcdb64-operator-scripts\") pod \"neutron-db-create-wxkbx\" (UID: \"dd8cf615-20fc-42d9-bb77-cdeebbfcdb64\") " pod="openstack/neutron-db-create-wxkbx" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.217795 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sxsd\" (UniqueName: \"kubernetes.io/projected/dd8cf615-20fc-42d9-bb77-cdeebbfcdb64-kube-api-access-7sxsd\") pod \"neutron-db-create-wxkbx\" (UID: \"dd8cf615-20fc-42d9-bb77-cdeebbfcdb64\") " pod="openstack/neutron-db-create-wxkbx" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.217970 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzm5m\" (UniqueName: \"kubernetes.io/projected/54af9994-75b9-457a-8b67-5687e91d698a-kube-api-access-gzm5m\") pod \"neutron-732a-account-create-update-mm924\" (UID: \"54af9994-75b9-457a-8b67-5687e91d698a\") " pod="openstack/neutron-732a-account-create-update-mm924" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.319690 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wpgmk-config-h6865" 
event={"ID":"26a171b2-4d8f-4596-a541-74e514aa25af","Type":"ContainerStarted","Data":"b87dab899deb8f6d16bdbf6c9ef247958c47eecf16405da6fccc045cbf52b0d0"} Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.319736 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wpgmk-config-h6865" event={"ID":"26a171b2-4d8f-4596-a541-74e514aa25af","Type":"ContainerStarted","Data":"8477862713ae906d6d7c03752baccc32f4eb56635100a04512de1e126030f1af"} Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.320878 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54af9994-75b9-457a-8b67-5687e91d698a-operator-scripts\") pod \"neutron-732a-account-create-update-mm924\" (UID: \"54af9994-75b9-457a-8b67-5687e91d698a\") " pod="openstack/neutron-732a-account-create-update-mm924" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.320929 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd8cf615-20fc-42d9-bb77-cdeebbfcdb64-operator-scripts\") pod \"neutron-db-create-wxkbx\" (UID: \"dd8cf615-20fc-42d9-bb77-cdeebbfcdb64\") " pod="openstack/neutron-db-create-wxkbx" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.320967 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sxsd\" (UniqueName: \"kubernetes.io/projected/dd8cf615-20fc-42d9-bb77-cdeebbfcdb64-kube-api-access-7sxsd\") pod \"neutron-db-create-wxkbx\" (UID: \"dd8cf615-20fc-42d9-bb77-cdeebbfcdb64\") " pod="openstack/neutron-db-create-wxkbx" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.321044 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzm5m\" (UniqueName: \"kubernetes.io/projected/54af9994-75b9-457a-8b67-5687e91d698a-kube-api-access-gzm5m\") pod \"neutron-732a-account-create-update-mm924\" (UID: 
\"54af9994-75b9-457a-8b67-5687e91d698a\") " pod="openstack/neutron-732a-account-create-update-mm924" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.321309 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-tz8ks" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.321748 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b17b-account-create-update-2sxng" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.322099 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54af9994-75b9-457a-8b67-5687e91d698a-operator-scripts\") pod \"neutron-732a-account-create-update-mm924\" (UID: \"54af9994-75b9-457a-8b67-5687e91d698a\") " pod="openstack/neutron-732a-account-create-update-mm924" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.325254 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd8cf615-20fc-42d9-bb77-cdeebbfcdb64-operator-scripts\") pod \"neutron-db-create-wxkbx\" (UID: \"dd8cf615-20fc-42d9-bb77-cdeebbfcdb64\") " pod="openstack/neutron-db-create-wxkbx" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.344963 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-wpgmk-config-h6865" podStartSLOduration=3.3449435530000002 podStartE2EDuration="3.344943553s" podCreationTimestamp="2025-12-08 09:22:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:22:18.340793421 +0000 UTC m=+1414.604018443" watchObservedRunningTime="2025-12-08 09:22:18.344943553 +0000 UTC m=+1414.608168565" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.362237 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-gzm5m\" (UniqueName: \"kubernetes.io/projected/54af9994-75b9-457a-8b67-5687e91d698a-kube-api-access-gzm5m\") pod \"neutron-732a-account-create-update-mm924\" (UID: \"54af9994-75b9-457a-8b67-5687e91d698a\") " pod="openstack/neutron-732a-account-create-update-mm924" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.364136 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sxsd\" (UniqueName: \"kubernetes.io/projected/dd8cf615-20fc-42d9-bb77-cdeebbfcdb64-kube-api-access-7sxsd\") pod \"neutron-db-create-wxkbx\" (UID: \"dd8cf615-20fc-42d9-bb77-cdeebbfcdb64\") " pod="openstack/neutron-db-create-wxkbx" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.386685 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w8l8r" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.389074 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"786c0b37-638a-4b59-b149-628d9ad828bc","Type":"ContainerDied","Data":"b20b4c2bb3bcbc19107993d351b26cc2cb3321013049808d939c340dc34f8093"} Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.389129 4776 scope.go:117] "RemoveContainer" containerID="2ef939e0e3310016b12741d23ba72538c85bdced4c4ddb306d993e02317c1238" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.391743 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.398834 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-wxkbx" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.454604 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-732a-account-create-update-mm924" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.464859 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w8l8r"] Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.480704 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-w8l8r"] Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.491462 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.512367 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.528493 4776 scope.go:117] "RemoveContainer" containerID="d647656c5e94e4b5bcc9ec6ef7af6889f977f3689f9074bd4d8bab32d9fcc049" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.536414 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.542504 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.548710 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.548717 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-lplpr" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.548758 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.548831 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.548977 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.549035 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.556306 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.565893 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.596039 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-a752-account-create-update-c65jn"] Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.627710 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/95be142a-2a8f-4f5c-97e0-2e64e108fb8b-tls-assets\") pod \"prometheus-metric-storage-0\" 
(UID: \"95be142a-2a8f-4f5c-97e0-2e64e108fb8b\") " pod="openstack/prometheus-metric-storage-0" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.627885 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/95be142a-2a8f-4f5c-97e0-2e64e108fb8b-config\") pod \"prometheus-metric-storage-0\" (UID: \"95be142a-2a8f-4f5c-97e0-2e64e108fb8b\") " pod="openstack/prometheus-metric-storage-0" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.627976 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/95be142a-2a8f-4f5c-97e0-2e64e108fb8b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"95be142a-2a8f-4f5c-97e0-2e64e108fb8b\") " pod="openstack/prometheus-metric-storage-0" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.628067 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8w2f\" (UniqueName: \"kubernetes.io/projected/95be142a-2a8f-4f5c-97e0-2e64e108fb8b-kube-api-access-v8w2f\") pod \"prometheus-metric-storage-0\" (UID: \"95be142a-2a8f-4f5c-97e0-2e64e108fb8b\") " pod="openstack/prometheus-metric-storage-0" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.628244 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/95be142a-2a8f-4f5c-97e0-2e64e108fb8b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"95be142a-2a8f-4f5c-97e0-2e64e108fb8b\") " pod="openstack/prometheus-metric-storage-0" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.628325 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/95be142a-2a8f-4f5c-97e0-2e64e108fb8b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"95be142a-2a8f-4f5c-97e0-2e64e108fb8b\") " pod="openstack/prometheus-metric-storage-0" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.628393 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95be142a-2a8f-4f5c-97e0-2e64e108fb8b-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"95be142a-2a8f-4f5c-97e0-2e64e108fb8b\") " pod="openstack/prometheus-metric-storage-0" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.628490 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/95be142a-2a8f-4f5c-97e0-2e64e108fb8b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"95be142a-2a8f-4f5c-97e0-2e64e108fb8b\") " pod="openstack/prometheus-metric-storage-0" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.628624 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"prometheus-metric-storage-0\" (UID: \"95be142a-2a8f-4f5c-97e0-2e64e108fb8b\") " pod="openstack/prometheus-metric-storage-0" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.628712 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/95be142a-2a8f-4f5c-97e0-2e64e108fb8b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"95be142a-2a8f-4f5c-97e0-2e64e108fb8b\") " 
pod="openstack/prometheus-metric-storage-0" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.628858 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/95be142a-2a8f-4f5c-97e0-2e64e108fb8b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"95be142a-2a8f-4f5c-97e0-2e64e108fb8b\") " pod="openstack/prometheus-metric-storage-0" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.639346 4776 scope.go:117] "RemoveContainer" containerID="a6ea82a0a0fdbce89463cbb259477af5f32226f11448cdce50567a591f2cc6f2" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.694635 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-wk8pc"] Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.731720 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/95be142a-2a8f-4f5c-97e0-2e64e108fb8b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"95be142a-2a8f-4f5c-97e0-2e64e108fb8b\") " pod="openstack/prometheus-metric-storage-0" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.732012 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"prometheus-metric-storage-0\" (UID: \"95be142a-2a8f-4f5c-97e0-2e64e108fb8b\") " pod="openstack/prometheus-metric-storage-0" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.732047 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/95be142a-2a8f-4f5c-97e0-2e64e108fb8b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"95be142a-2a8f-4f5c-97e0-2e64e108fb8b\") " pod="openstack/prometheus-metric-storage-0" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.732069 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/95be142a-2a8f-4f5c-97e0-2e64e108fb8b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"95be142a-2a8f-4f5c-97e0-2e64e108fb8b\") " pod="openstack/prometheus-metric-storage-0" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.732121 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/95be142a-2a8f-4f5c-97e0-2e64e108fb8b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"95be142a-2a8f-4f5c-97e0-2e64e108fb8b\") " pod="openstack/prometheus-metric-storage-0" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.732141 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/95be142a-2a8f-4f5c-97e0-2e64e108fb8b-config\") pod \"prometheus-metric-storage-0\" (UID: \"95be142a-2a8f-4f5c-97e0-2e64e108fb8b\") " pod="openstack/prometheus-metric-storage-0" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.732190 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/95be142a-2a8f-4f5c-97e0-2e64e108fb8b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"95be142a-2a8f-4f5c-97e0-2e64e108fb8b\") " pod="openstack/prometheus-metric-storage-0" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.732231 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8w2f\" (UniqueName: 
\"kubernetes.io/projected/95be142a-2a8f-4f5c-97e0-2e64e108fb8b-kube-api-access-v8w2f\") pod \"prometheus-metric-storage-0\" (UID: \"95be142a-2a8f-4f5c-97e0-2e64e108fb8b\") " pod="openstack/prometheus-metric-storage-0" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.732270 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/95be142a-2a8f-4f5c-97e0-2e64e108fb8b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"95be142a-2a8f-4f5c-97e0-2e64e108fb8b\") " pod="openstack/prometheus-metric-storage-0" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.732295 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/95be142a-2a8f-4f5c-97e0-2e64e108fb8b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"95be142a-2a8f-4f5c-97e0-2e64e108fb8b\") " pod="openstack/prometheus-metric-storage-0" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.732318 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95be142a-2a8f-4f5c-97e0-2e64e108fb8b-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"95be142a-2a8f-4f5c-97e0-2e64e108fb8b\") " pod="openstack/prometheus-metric-storage-0" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.741997 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/95be142a-2a8f-4f5c-97e0-2e64e108fb8b-config\") pod \"prometheus-metric-storage-0\" (UID: \"95be142a-2a8f-4f5c-97e0-2e64e108fb8b\") " pod="openstack/prometheus-metric-storage-0" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.744524 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/95be142a-2a8f-4f5c-97e0-2e64e108fb8b-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"95be142a-2a8f-4f5c-97e0-2e64e108fb8b\") " pod="openstack/prometheus-metric-storage-0" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.744823 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"prometheus-metric-storage-0\" (UID: \"95be142a-2a8f-4f5c-97e0-2e64e108fb8b\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/prometheus-metric-storage-0" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.746202 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/95be142a-2a8f-4f5c-97e0-2e64e108fb8b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"95be142a-2a8f-4f5c-97e0-2e64e108fb8b\") " pod="openstack/prometheus-metric-storage-0" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.748787 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/95be142a-2a8f-4f5c-97e0-2e64e108fb8b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"95be142a-2a8f-4f5c-97e0-2e64e108fb8b\") " pod="openstack/prometheus-metric-storage-0" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.749792 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/95be142a-2a8f-4f5c-97e0-2e64e108fb8b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"95be142a-2a8f-4f5c-97e0-2e64e108fb8b\") " pod="openstack/prometheus-metric-storage-0" Dec 08 
09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.754747 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/95be142a-2a8f-4f5c-97e0-2e64e108fb8b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"95be142a-2a8f-4f5c-97e0-2e64e108fb8b\") " pod="openstack/prometheus-metric-storage-0" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.758062 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/95be142a-2a8f-4f5c-97e0-2e64e108fb8b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"95be142a-2a8f-4f5c-97e0-2e64e108fb8b\") " pod="openstack/prometheus-metric-storage-0" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.778457 4776 scope.go:117] "RemoveContainer" containerID="3eac7d79f6ff5424dfa03b1bdfb4a80f7008792eb0ac01a3399847952fb39221" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.779686 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/95be142a-2a8f-4f5c-97e0-2e64e108fb8b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"95be142a-2a8f-4f5c-97e0-2e64e108fb8b\") " pod="openstack/prometheus-metric-storage-0" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.791669 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/95be142a-2a8f-4f5c-97e0-2e64e108fb8b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"95be142a-2a8f-4f5c-97e0-2e64e108fb8b\") " pod="openstack/prometheus-metric-storage-0" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.806720 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8w2f\" (UniqueName: \"kubernetes.io/projected/95be142a-2a8f-4f5c-97e0-2e64e108fb8b-kube-api-access-v8w2f\") pod 
\"prometheus-metric-storage-0\" (UID: \"95be142a-2a8f-4f5c-97e0-2e64e108fb8b\") " pod="openstack/prometheus-metric-storage-0" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.833607 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cb640491-a8e7-4f8d-b4bb-1d0124f5727f-etc-swift\") pod \"swift-storage-0\" (UID: \"cb640491-a8e7-4f8d-b4bb-1d0124f5727f\") " pod="openstack/swift-storage-0" Dec 08 09:22:18 crc kubenswrapper[4776]: E1208 09:22:18.833848 4776 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 08 09:22:18 crc kubenswrapper[4776]: E1208 09:22:18.833861 4776 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 08 09:22:18 crc kubenswrapper[4776]: E1208 09:22:18.833901 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cb640491-a8e7-4f8d-b4bb-1d0124f5727f-etc-swift podName:cb640491-a8e7-4f8d-b4bb-1d0124f5727f nodeName:}" failed. No retries permitted until 2025-12-08 09:22:26.833887531 +0000 UTC m=+1423.097112543 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/cb640491-a8e7-4f8d-b4bb-1d0124f5727f-etc-swift") pod "swift-storage-0" (UID: "cb640491-a8e7-4f8d-b4bb-1d0124f5727f") : configmap "swift-ring-files" not found Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.839836 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-9hw46"] Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.841358 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-9hw46" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.857166 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.857519 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-hszn2" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.859670 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-9hw46"] Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.878324 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"prometheus-metric-storage-0\" (UID: \"95be142a-2a8f-4f5c-97e0-2e64e108fb8b\") " pod="openstack/prometheus-metric-storage-0" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.936350 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a920788-f8d6-4c42-84f6-d842d9bf9a17-config-data\") pod \"glance-db-sync-9hw46\" (UID: \"4a920788-f8d6-4c42-84f6-d842d9bf9a17\") " pod="openstack/glance-db-sync-9hw46" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.936474 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a920788-f8d6-4c42-84f6-d842d9bf9a17-combined-ca-bundle\") pod \"glance-db-sync-9hw46\" (UID: \"4a920788-f8d6-4c42-84f6-d842d9bf9a17\") " pod="openstack/glance-db-sync-9hw46" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.936560 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4a920788-f8d6-4c42-84f6-d842d9bf9a17-db-sync-config-data\") pod 
\"glance-db-sync-9hw46\" (UID: \"4a920788-f8d6-4c42-84f6-d842d9bf9a17\") " pod="openstack/glance-db-sync-9hw46" Dec 08 09:22:18 crc kubenswrapper[4776]: I1208 09:22:18.936664 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sck9g\" (UniqueName: \"kubernetes.io/projected/4a920788-f8d6-4c42-84f6-d842d9bf9a17-kube-api-access-sck9g\") pod \"glance-db-sync-9hw46\" (UID: \"4a920788-f8d6-4c42-84f6-d842d9bf9a17\") " pod="openstack/glance-db-sync-9hw46" Dec 08 09:22:19 crc kubenswrapper[4776]: I1208 09:22:19.042384 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a920788-f8d6-4c42-84f6-d842d9bf9a17-config-data\") pod \"glance-db-sync-9hw46\" (UID: \"4a920788-f8d6-4c42-84f6-d842d9bf9a17\") " pod="openstack/glance-db-sync-9hw46" Dec 08 09:22:19 crc kubenswrapper[4776]: I1208 09:22:19.042694 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a920788-f8d6-4c42-84f6-d842d9bf9a17-combined-ca-bundle\") pod \"glance-db-sync-9hw46\" (UID: \"4a920788-f8d6-4c42-84f6-d842d9bf9a17\") " pod="openstack/glance-db-sync-9hw46" Dec 08 09:22:19 crc kubenswrapper[4776]: I1208 09:22:19.042721 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4a920788-f8d6-4c42-84f6-d842d9bf9a17-db-sync-config-data\") pod \"glance-db-sync-9hw46\" (UID: \"4a920788-f8d6-4c42-84f6-d842d9bf9a17\") " pod="openstack/glance-db-sync-9hw46" Dec 08 09:22:19 crc kubenswrapper[4776]: I1208 09:22:19.042751 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sck9g\" (UniqueName: \"kubernetes.io/projected/4a920788-f8d6-4c42-84f6-d842d9bf9a17-kube-api-access-sck9g\") pod \"glance-db-sync-9hw46\" (UID: \"4a920788-f8d6-4c42-84f6-d842d9bf9a17\") " 
pod="openstack/glance-db-sync-9hw46" Dec 08 09:22:19 crc kubenswrapper[4776]: I1208 09:22:19.051465 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a920788-f8d6-4c42-84f6-d842d9bf9a17-combined-ca-bundle\") pod \"glance-db-sync-9hw46\" (UID: \"4a920788-f8d6-4c42-84f6-d842d9bf9a17\") " pod="openstack/glance-db-sync-9hw46" Dec 08 09:22:19 crc kubenswrapper[4776]: I1208 09:22:19.052056 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4a920788-f8d6-4c42-84f6-d842d9bf9a17-db-sync-config-data\") pod \"glance-db-sync-9hw46\" (UID: \"4a920788-f8d6-4c42-84f6-d842d9bf9a17\") " pod="openstack/glance-db-sync-9hw46" Dec 08 09:22:19 crc kubenswrapper[4776]: I1208 09:22:19.055889 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a920788-f8d6-4c42-84f6-d842d9bf9a17-config-data\") pod \"glance-db-sync-9hw46\" (UID: \"4a920788-f8d6-4c42-84f6-d842d9bf9a17\") " pod="openstack/glance-db-sync-9hw46" Dec 08 09:22:19 crc kubenswrapper[4776]: I1208 09:22:19.065899 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sck9g\" (UniqueName: \"kubernetes.io/projected/4a920788-f8d6-4c42-84f6-d842d9bf9a17-kube-api-access-sck9g\") pod \"glance-db-sync-9hw46\" (UID: \"4a920788-f8d6-4c42-84f6-d842d9bf9a17\") " pod="openstack/glance-db-sync-9hw46" Dec 08 09:22:19 crc kubenswrapper[4776]: I1208 09:22:19.104066 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-9hw46" Dec 08 09:22:19 crc kubenswrapper[4776]: I1208 09:22:19.178767 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-gjfw9"] Dec 08 09:22:19 crc kubenswrapper[4776]: I1208 09:22:19.185048 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 08 09:22:19 crc kubenswrapper[4776]: I1208 09:22:19.436373 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-a752-account-create-update-c65jn" event={"ID":"de008089-a913-4a62-85e1-f0ec597514ab","Type":"ContainerStarted","Data":"97fd9b78f2d6c01581bb30734f632888082b29198c56289c6fbd841a11b49eaa"} Dec 08 09:22:19 crc kubenswrapper[4776]: I1208 09:22:19.446708 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-gjfw9" event={"ID":"a862b37a-f96b-495a-8d8e-b3640d2f0609","Type":"ContainerStarted","Data":"5a7210732579613a341a247210755bae8e6f7b834ba12c99ad34f0047633a41b"} Dec 08 09:22:19 crc kubenswrapper[4776]: I1208 09:22:19.452366 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-wk8pc" event={"ID":"3dbd4182-75a0-42dd-97c8-a1cb8fee96f2","Type":"ContainerStarted","Data":"961e427d177fd18f93264df8ac77d85ab8e45051a0265c2a83716c5a6d5bcd60"} Dec 08 09:22:19 crc kubenswrapper[4776]: I1208 09:22:19.452704 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-wk8pc" event={"ID":"3dbd4182-75a0-42dd-97c8-a1cb8fee96f2","Type":"ContainerStarted","Data":"a3b11a17a8abcb80bb560a7a50f634b03a774ecb9f7e55a05a5e95e987304498"} Dec 08 09:22:19 crc kubenswrapper[4776]: I1208 09:22:19.526350 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-create-wk8pc" podStartSLOduration=2.526326411 podStartE2EDuration="2.526326411s" podCreationTimestamp="2025-12-08 09:22:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:22:19.492138043 +0000 UTC m=+1415.755363065" watchObservedRunningTime="2025-12-08 09:22:19.526326411 +0000 UTC m=+1415.789551433" Dec 08 09:22:19 crc kubenswrapper[4776]: I1208 09:22:19.618077 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/barbican-db-create-hxwvw"] Dec 08 09:22:19 crc kubenswrapper[4776]: I1208 09:22:19.658204 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-f2b3-account-create-update-7jtlp"] Dec 08 09:22:19 crc kubenswrapper[4776]: W1208 09:22:19.660345 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ffd8d26_fc44_453f_ad6b_9bce2b83252e.slice/crio-cd745b7166e6fc9557b815b66ab3c6392ffc90146bd9157bf1b97feb47b116d6 WatchSource:0}: Error finding container cd745b7166e6fc9557b815b66ab3c6392ffc90146bd9157bf1b97feb47b116d6: Status 404 returned error can't find the container with id cd745b7166e6fc9557b815b66ab3c6392ffc90146bd9157bf1b97feb47b116d6 Dec 08 09:22:19 crc kubenswrapper[4776]: I1208 09:22:19.955232 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-wxkbx"] Dec 08 09:22:19 crc kubenswrapper[4776]: I1208 09:22:19.974328 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-b17b-account-create-update-2sxng"] Dec 08 09:22:19 crc kubenswrapper[4776]: W1208 09:22:19.980517 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4811e2fe_5855_47ee_b742_ec6c481936a2.slice/crio-2d235beab8073fce81e692c23a21de61f194da25f7f1a50587470046bb56d48d WatchSource:0}: Error finding container 2d235beab8073fce81e692c23a21de61f194da25f7f1a50587470046bb56d48d: Status 404 returned error can't find the container with id 2d235beab8073fce81e692c23a21de61f194da25f7f1a50587470046bb56d48d Dec 08 09:22:19 crc kubenswrapper[4776]: I1208 09:22:19.991936 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-732a-account-create-update-mm924"] Dec 08 09:22:20 crc kubenswrapper[4776]: W1208 09:22:20.001542 4776 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd8cf615_20fc_42d9_bb77_cdeebbfcdb64.slice/crio-9d2f63d7015922c540c5c059f3dd3be83388a6acd3f81749036a0fe11f4030cc WatchSource:0}: Error finding container 9d2f63d7015922c540c5c059f3dd3be83388a6acd3f81749036a0fe11f4030cc: Status 404 returned error can't find the container with id 9d2f63d7015922c540c5c059f3dd3be83388a6acd3f81749036a0fe11f4030cc Dec 08 09:22:20 crc kubenswrapper[4776]: I1208 09:22:20.072388 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-tz8ks"] Dec 08 09:22:20 crc kubenswrapper[4776]: I1208 09:22:20.271920 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-5ss7n" Dec 08 09:22:20 crc kubenswrapper[4776]: I1208 09:22:20.287693 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 08 09:22:20 crc kubenswrapper[4776]: I1208 09:22:20.395727 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="786c0b37-638a-4b59-b149-628d9ad828bc" path="/var/lib/kubelet/pods/786c0b37-638a-4b59-b149-628d9ad828bc/volumes" Dec 08 09:22:20 crc kubenswrapper[4776]: I1208 09:22:20.397191 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fc47007-7b1d-458c-b1ee-f561fff88bd7" path="/var/lib/kubelet/pods/8fc47007-7b1d-458c-b1ee-f561fff88bd7/volumes" Dec 08 09:22:20 crc kubenswrapper[4776]: I1208 09:22:20.399584 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-kx9kd"] Dec 08 09:22:20 crc kubenswrapper[4776]: I1208 09:22:20.400923 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-kx9kd" podUID="fd497a67-79f7-4a1a-b0de-eb6fdcc524ae" containerName="dnsmasq-dns" containerID="cri-o://1d1a164851d77a165e18b51c820ff4b17f201a02f4bdfec74fb329f03f7b767b" gracePeriod=10 Dec 08 09:22:20 crc kubenswrapper[4776]: I1208 
09:22:20.408911 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-9hw46"] Dec 08 09:22:20 crc kubenswrapper[4776]: I1208 09:22:20.467526 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-tz8ks" event={"ID":"1d1decbe-db5e-4910-9604-aca62ec47099","Type":"ContainerStarted","Data":"6cc128b64f7b80a13f658b124395a4491001528aa60f0bd48e33ca3d824e36b3"} Dec 08 09:22:20 crc kubenswrapper[4776]: I1208 09:22:20.473862 4776 generic.go:334] "Generic (PLEG): container finished" podID="6677467f-5abc-4914-949d-bd6541aadeef" containerID="6912c00417baf631d68734ffcb5f0478237df070eac8d6f5935f9907788219b2" exitCode=0 Dec 08 09:22:20 crc kubenswrapper[4776]: I1208 09:22:20.473925 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f2b3-account-create-update-7jtlp" event={"ID":"6677467f-5abc-4914-949d-bd6541aadeef","Type":"ContainerDied","Data":"6912c00417baf631d68734ffcb5f0478237df070eac8d6f5935f9907788219b2"} Dec 08 09:22:20 crc kubenswrapper[4776]: I1208 09:22:20.473947 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f2b3-account-create-update-7jtlp" event={"ID":"6677467f-5abc-4914-949d-bd6541aadeef","Type":"ContainerStarted","Data":"792688683f6f7fa22b3fb361ce26d925ecdeff97ce70ba997049570f37827582"} Dec 08 09:22:20 crc kubenswrapper[4776]: I1208 09:22:20.480244 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-wxkbx" event={"ID":"dd8cf615-20fc-42d9-bb77-cdeebbfcdb64","Type":"ContainerStarted","Data":"cc9a7cc6e55b19ccf3c2d1dc9540054478bcf15926f146c619b2f7febda98012"} Dec 08 09:22:20 crc kubenswrapper[4776]: I1208 09:22:20.480288 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-wxkbx" event={"ID":"dd8cf615-20fc-42d9-bb77-cdeebbfcdb64","Type":"ContainerStarted","Data":"9d2f63d7015922c540c5c059f3dd3be83388a6acd3f81749036a0fe11f4030cc"} Dec 08 09:22:20 crc kubenswrapper[4776]: I1208 
09:22:20.482238 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b17b-account-create-update-2sxng" event={"ID":"4811e2fe-5855-47ee-b742-ec6c481936a2","Type":"ContainerStarted","Data":"8c65a5eb0be9c3f321f7345766a15b2e8c3efebf6526552db0b83edf93856a8b"} Dec 08 09:22:20 crc kubenswrapper[4776]: I1208 09:22:20.482263 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b17b-account-create-update-2sxng" event={"ID":"4811e2fe-5855-47ee-b742-ec6c481936a2","Type":"ContainerStarted","Data":"2d235beab8073fce81e692c23a21de61f194da25f7f1a50587470046bb56d48d"} Dec 08 09:22:20 crc kubenswrapper[4776]: I1208 09:22:20.486634 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"95be142a-2a8f-4f5c-97e0-2e64e108fb8b","Type":"ContainerStarted","Data":"17f856e2bbfd2d7701da1fa57a3a7811333de1ffbdabb03467f3d970ad66a9c7"} Dec 08 09:22:20 crc kubenswrapper[4776]: I1208 09:22:20.488254 4776 generic.go:334] "Generic (PLEG): container finished" podID="a862b37a-f96b-495a-8d8e-b3640d2f0609" containerID="3db8d71ba1e6f8c4cf863e874a05573315ee07ed461d095d5ea9928a00e6b73e" exitCode=0 Dec 08 09:22:20 crc kubenswrapper[4776]: I1208 09:22:20.488295 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-gjfw9" event={"ID":"a862b37a-f96b-495a-8d8e-b3640d2f0609","Type":"ContainerDied","Data":"3db8d71ba1e6f8c4cf863e874a05573315ee07ed461d095d5ea9928a00e6b73e"} Dec 08 09:22:20 crc kubenswrapper[4776]: I1208 09:22:20.489940 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-732a-account-create-update-mm924" event={"ID":"54af9994-75b9-457a-8b67-5687e91d698a","Type":"ContainerStarted","Data":"bb58957decf1944799406833c849e43b523968c3344c72a1efe230f6ed07b9b5"} Dec 08 09:22:20 crc kubenswrapper[4776]: I1208 09:22:20.490043 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-732a-account-create-update-mm924" 
event={"ID":"54af9994-75b9-457a-8b67-5687e91d698a","Type":"ContainerStarted","Data":"6a98c57e58170fe58d003496e7c6e8520a655de7642714273b485ac6a1fc323e"} Dec 08 09:22:20 crc kubenswrapper[4776]: I1208 09:22:20.495491 4776 generic.go:334] "Generic (PLEG): container finished" podID="3dbd4182-75a0-42dd-97c8-a1cb8fee96f2" containerID="961e427d177fd18f93264df8ac77d85ab8e45051a0265c2a83716c5a6d5bcd60" exitCode=0 Dec 08 09:22:20 crc kubenswrapper[4776]: I1208 09:22:20.495555 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-wk8pc" event={"ID":"3dbd4182-75a0-42dd-97c8-a1cb8fee96f2","Type":"ContainerDied","Data":"961e427d177fd18f93264df8ac77d85ab8e45051a0265c2a83716c5a6d5bcd60"} Dec 08 09:22:20 crc kubenswrapper[4776]: I1208 09:22:20.509550 4776 generic.go:334] "Generic (PLEG): container finished" podID="7ffd8d26-fc44-453f-ad6b-9bce2b83252e" containerID="ed082df2e4827cb316173d4dc8d75bf7d652cf36cf29300dbeee48c9a9899bdc" exitCode=0 Dec 08 09:22:20 crc kubenswrapper[4776]: I1208 09:22:20.509637 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-hxwvw" event={"ID":"7ffd8d26-fc44-453f-ad6b-9bce2b83252e","Type":"ContainerDied","Data":"ed082df2e4827cb316173d4dc8d75bf7d652cf36cf29300dbeee48c9a9899bdc"} Dec 08 09:22:20 crc kubenswrapper[4776]: I1208 09:22:20.509662 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-hxwvw" event={"ID":"7ffd8d26-fc44-453f-ad6b-9bce2b83252e","Type":"ContainerStarted","Data":"cd745b7166e6fc9557b815b66ab3c6392ffc90146bd9157bf1b97feb47b116d6"} Dec 08 09:22:20 crc kubenswrapper[4776]: I1208 09:22:20.516464 4776 generic.go:334] "Generic (PLEG): container finished" podID="26a171b2-4d8f-4596-a541-74e514aa25af" containerID="b87dab899deb8f6d16bdbf6c9ef247958c47eecf16405da6fccc045cbf52b0d0" exitCode=0 Dec 08 09:22:20 crc kubenswrapper[4776]: I1208 09:22:20.516550 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-wpgmk-config-h6865" event={"ID":"26a171b2-4d8f-4596-a541-74e514aa25af","Type":"ContainerDied","Data":"b87dab899deb8f6d16bdbf6c9ef247958c47eecf16405da6fccc045cbf52b0d0"} Dec 08 09:22:20 crc kubenswrapper[4776]: I1208 09:22:20.520916 4776 generic.go:334] "Generic (PLEG): container finished" podID="de008089-a913-4a62-85e1-f0ec597514ab" containerID="ca7a98e4014a1bb8047ff9843f80b3e17c9f36c0a0a3326dcb9f034987c8ab25" exitCode=0 Dec 08 09:22:20 crc kubenswrapper[4776]: I1208 09:22:20.520968 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-a752-account-create-update-c65jn" event={"ID":"de008089-a913-4a62-85e1-f0ec597514ab","Type":"ContainerDied","Data":"ca7a98e4014a1bb8047ff9843f80b3e17c9f36c0a0a3326dcb9f034987c8ab25"} Dec 08 09:22:20 crc kubenswrapper[4776]: I1208 09:22:20.531984 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-wxkbx" podStartSLOduration=3.531962953 podStartE2EDuration="3.531962953s" podCreationTimestamp="2025-12-08 09:22:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:22:20.517734961 +0000 UTC m=+1416.780959983" watchObservedRunningTime="2025-12-08 09:22:20.531962953 +0000 UTC m=+1416.795187975" Dec 08 09:22:20 crc kubenswrapper[4776]: I1208 09:22:20.551671 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-732a-account-create-update-mm924" podStartSLOduration=3.551649121 podStartE2EDuration="3.551649121s" podCreationTimestamp="2025-12-08 09:22:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:22:20.543342458 +0000 UTC m=+1416.806567470" watchObservedRunningTime="2025-12-08 09:22:20.551649121 +0000 UTC m=+1416.814874143" Dec 08 09:22:20 crc kubenswrapper[4776]: I1208 09:22:20.616251 4776 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-b17b-account-create-update-2sxng" podStartSLOduration=3.616231084 podStartE2EDuration="3.616231084s" podCreationTimestamp="2025-12-08 09:22:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:22:20.583477335 +0000 UTC m=+1416.846702367" watchObservedRunningTime="2025-12-08 09:22:20.616231084 +0000 UTC m=+1416.879456106" Dec 08 09:22:20 crc kubenswrapper[4776]: I1208 09:22:20.835387 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 08 09:22:21 crc kubenswrapper[4776]: E1208 09:22:21.006242 4776 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54af9994_75b9_457a_8b67_5687e91d698a.slice/crio-bb58957decf1944799406833c849e43b523968c3344c72a1efe230f6ed07b9b5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd8cf615_20fc_42d9_bb77_cdeebbfcdb64.slice/crio-conmon-cc9a7cc6e55b19ccf3c2d1dc9540054478bcf15926f146c619b2f7febda98012.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd8cf615_20fc_42d9_bb77_cdeebbfcdb64.slice/crio-cc9a7cc6e55b19ccf3c2d1dc9540054478bcf15926f146c619b2f7febda98012.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54af9994_75b9_457a_8b67_5687e91d698a.slice/crio-conmon-bb58957decf1944799406833c849e43b523968c3344c72a1efe230f6ed07b9b5.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4811e2fe_5855_47ee_b742_ec6c481936a2.slice/crio-conmon-8c65a5eb0be9c3f321f7345766a15b2e8c3efebf6526552db0b83edf93856a8b.scope\": RecentStats: unable to find data in memory cache]" Dec 08 09:22:21 crc kubenswrapper[4776]: I1208 09:22:21.061807 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-kx9kd" Dec 08 09:22:21 crc kubenswrapper[4776]: I1208 09:22:21.127053 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd497a67-79f7-4a1a-b0de-eb6fdcc524ae-dns-svc\") pod \"fd497a67-79f7-4a1a-b0de-eb6fdcc524ae\" (UID: \"fd497a67-79f7-4a1a-b0de-eb6fdcc524ae\") " Dec 08 09:22:21 crc kubenswrapper[4776]: I1208 09:22:21.127113 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5znql\" (UniqueName: \"kubernetes.io/projected/fd497a67-79f7-4a1a-b0de-eb6fdcc524ae-kube-api-access-5znql\") pod \"fd497a67-79f7-4a1a-b0de-eb6fdcc524ae\" (UID: \"fd497a67-79f7-4a1a-b0de-eb6fdcc524ae\") " Dec 08 09:22:21 crc kubenswrapper[4776]: I1208 09:22:21.127246 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fd497a67-79f7-4a1a-b0de-eb6fdcc524ae-ovsdbserver-nb\") pod \"fd497a67-79f7-4a1a-b0de-eb6fdcc524ae\" (UID: \"fd497a67-79f7-4a1a-b0de-eb6fdcc524ae\") " Dec 08 09:22:21 crc kubenswrapper[4776]: I1208 09:22:21.127296 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fd497a67-79f7-4a1a-b0de-eb6fdcc524ae-ovsdbserver-sb\") pod \"fd497a67-79f7-4a1a-b0de-eb6fdcc524ae\" (UID: \"fd497a67-79f7-4a1a-b0de-eb6fdcc524ae\") " Dec 08 09:22:21 crc kubenswrapper[4776]: I1208 09:22:21.127341 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/fd497a67-79f7-4a1a-b0de-eb6fdcc524ae-config\") pod \"fd497a67-79f7-4a1a-b0de-eb6fdcc524ae\" (UID: \"fd497a67-79f7-4a1a-b0de-eb6fdcc524ae\") " Dec 08 09:22:21 crc kubenswrapper[4776]: I1208 09:22:21.133223 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd497a67-79f7-4a1a-b0de-eb6fdcc524ae-kube-api-access-5znql" (OuterVolumeSpecName: "kube-api-access-5znql") pod "fd497a67-79f7-4a1a-b0de-eb6fdcc524ae" (UID: "fd497a67-79f7-4a1a-b0de-eb6fdcc524ae"). InnerVolumeSpecName "kube-api-access-5znql". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:22:21 crc kubenswrapper[4776]: I1208 09:22:21.179270 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd497a67-79f7-4a1a-b0de-eb6fdcc524ae-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fd497a67-79f7-4a1a-b0de-eb6fdcc524ae" (UID: "fd497a67-79f7-4a1a-b0de-eb6fdcc524ae"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:22:21 crc kubenswrapper[4776]: I1208 09:22:21.181745 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd497a67-79f7-4a1a-b0de-eb6fdcc524ae-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fd497a67-79f7-4a1a-b0de-eb6fdcc524ae" (UID: "fd497a67-79f7-4a1a-b0de-eb6fdcc524ae"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:22:21 crc kubenswrapper[4776]: I1208 09:22:21.184068 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd497a67-79f7-4a1a-b0de-eb6fdcc524ae-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fd497a67-79f7-4a1a-b0de-eb6fdcc524ae" (UID: "fd497a67-79f7-4a1a-b0de-eb6fdcc524ae"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:22:21 crc kubenswrapper[4776]: I1208 09:22:21.188462 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd497a67-79f7-4a1a-b0de-eb6fdcc524ae-config" (OuterVolumeSpecName: "config") pod "fd497a67-79f7-4a1a-b0de-eb6fdcc524ae" (UID: "fd497a67-79f7-4a1a-b0de-eb6fdcc524ae"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:22:21 crc kubenswrapper[4776]: I1208 09:22:21.231019 4776 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd497a67-79f7-4a1a-b0de-eb6fdcc524ae-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 08 09:22:21 crc kubenswrapper[4776]: I1208 09:22:21.231215 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5znql\" (UniqueName: \"kubernetes.io/projected/fd497a67-79f7-4a1a-b0de-eb6fdcc524ae-kube-api-access-5znql\") on node \"crc\" DevicePath \"\"" Dec 08 09:22:21 crc kubenswrapper[4776]: I1208 09:22:21.231279 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fd497a67-79f7-4a1a-b0de-eb6fdcc524ae-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 08 09:22:21 crc kubenswrapper[4776]: I1208 09:22:21.231332 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fd497a67-79f7-4a1a-b0de-eb6fdcc524ae-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 08 09:22:21 crc kubenswrapper[4776]: I1208 09:22:21.231384 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd497a67-79f7-4a1a-b0de-eb6fdcc524ae-config\") on node \"crc\" DevicePath \"\"" Dec 08 09:22:21 crc kubenswrapper[4776]: I1208 09:22:21.538697 4776 generic.go:334] "Generic (PLEG): container finished" podID="fd497a67-79f7-4a1a-b0de-eb6fdcc524ae" 
containerID="1d1a164851d77a165e18b51c820ff4b17f201a02f4bdfec74fb329f03f7b767b" exitCode=0 Dec 08 09:22:21 crc kubenswrapper[4776]: I1208 09:22:21.538767 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-kx9kd" event={"ID":"fd497a67-79f7-4a1a-b0de-eb6fdcc524ae","Type":"ContainerDied","Data":"1d1a164851d77a165e18b51c820ff4b17f201a02f4bdfec74fb329f03f7b767b"} Dec 08 09:22:21 crc kubenswrapper[4776]: I1208 09:22:21.538796 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-kx9kd" event={"ID":"fd497a67-79f7-4a1a-b0de-eb6fdcc524ae","Type":"ContainerDied","Data":"281f530ea9c8c6457ebf54e535b09e5e50a73d483f1bc522592749c7c42280bf"} Dec 08 09:22:21 crc kubenswrapper[4776]: I1208 09:22:21.538812 4776 scope.go:117] "RemoveContainer" containerID="1d1a164851d77a165e18b51c820ff4b17f201a02f4bdfec74fb329f03f7b767b" Dec 08 09:22:21 crc kubenswrapper[4776]: I1208 09:22:21.538814 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-kx9kd" Dec 08 09:22:21 crc kubenswrapper[4776]: I1208 09:22:21.543926 4776 generic.go:334] "Generic (PLEG): container finished" podID="dd8cf615-20fc-42d9-bb77-cdeebbfcdb64" containerID="cc9a7cc6e55b19ccf3c2d1dc9540054478bcf15926f146c619b2f7febda98012" exitCode=0 Dec 08 09:22:21 crc kubenswrapper[4776]: I1208 09:22:21.543987 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-wxkbx" event={"ID":"dd8cf615-20fc-42d9-bb77-cdeebbfcdb64","Type":"ContainerDied","Data":"cc9a7cc6e55b19ccf3c2d1dc9540054478bcf15926f146c619b2f7febda98012"} Dec 08 09:22:21 crc kubenswrapper[4776]: I1208 09:22:21.547277 4776 generic.go:334] "Generic (PLEG): container finished" podID="4811e2fe-5855-47ee-b742-ec6c481936a2" containerID="8c65a5eb0be9c3f321f7345766a15b2e8c3efebf6526552db0b83edf93856a8b" exitCode=0 Dec 08 09:22:21 crc kubenswrapper[4776]: I1208 09:22:21.547346 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b17b-account-create-update-2sxng" event={"ID":"4811e2fe-5855-47ee-b742-ec6c481936a2","Type":"ContainerDied","Data":"8c65a5eb0be9c3f321f7345766a15b2e8c3efebf6526552db0b83edf93856a8b"} Dec 08 09:22:21 crc kubenswrapper[4776]: I1208 09:22:21.549082 4776 generic.go:334] "Generic (PLEG): container finished" podID="54af9994-75b9-457a-8b67-5687e91d698a" containerID="bb58957decf1944799406833c849e43b523968c3344c72a1efe230f6ed07b9b5" exitCode=0 Dec 08 09:22:21 crc kubenswrapper[4776]: I1208 09:22:21.549160 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-732a-account-create-update-mm924" event={"ID":"54af9994-75b9-457a-8b67-5687e91d698a","Type":"ContainerDied","Data":"bb58957decf1944799406833c849e43b523968c3344c72a1efe230f6ed07b9b5"} Dec 08 09:22:21 crc kubenswrapper[4776]: I1208 09:22:21.553843 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9hw46" 
event={"ID":"4a920788-f8d6-4c42-84f6-d842d9bf9a17","Type":"ContainerStarted","Data":"e8b21522402e354426ffa34371a20fa99804cfef89f8cc8231177b02e71230f3"} Dec 08 09:22:21 crc kubenswrapper[4776]: I1208 09:22:21.588564 4776 scope.go:117] "RemoveContainer" containerID="cd7f76bf86baaaf8da852eee2fe0fbc3f3b36843c6569306fa8aafd612e4fe45" Dec 08 09:22:21 crc kubenswrapper[4776]: I1208 09:22:21.636286 4776 scope.go:117] "RemoveContainer" containerID="1d1a164851d77a165e18b51c820ff4b17f201a02f4bdfec74fb329f03f7b767b" Dec 08 09:22:21 crc kubenswrapper[4776]: E1208 09:22:21.645022 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d1a164851d77a165e18b51c820ff4b17f201a02f4bdfec74fb329f03f7b767b\": container with ID starting with 1d1a164851d77a165e18b51c820ff4b17f201a02f4bdfec74fb329f03f7b767b not found: ID does not exist" containerID="1d1a164851d77a165e18b51c820ff4b17f201a02f4bdfec74fb329f03f7b767b" Dec 08 09:22:21 crc kubenswrapper[4776]: I1208 09:22:21.645066 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d1a164851d77a165e18b51c820ff4b17f201a02f4bdfec74fb329f03f7b767b"} err="failed to get container status \"1d1a164851d77a165e18b51c820ff4b17f201a02f4bdfec74fb329f03f7b767b\": rpc error: code = NotFound desc = could not find container \"1d1a164851d77a165e18b51c820ff4b17f201a02f4bdfec74fb329f03f7b767b\": container with ID starting with 1d1a164851d77a165e18b51c820ff4b17f201a02f4bdfec74fb329f03f7b767b not found: ID does not exist" Dec 08 09:22:21 crc kubenswrapper[4776]: I1208 09:22:21.645092 4776 scope.go:117] "RemoveContainer" containerID="cd7f76bf86baaaf8da852eee2fe0fbc3f3b36843c6569306fa8aafd612e4fe45" Dec 08 09:22:21 crc kubenswrapper[4776]: E1208 09:22:21.647522 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"cd7f76bf86baaaf8da852eee2fe0fbc3f3b36843c6569306fa8aafd612e4fe45\": container with ID starting with cd7f76bf86baaaf8da852eee2fe0fbc3f3b36843c6569306fa8aafd612e4fe45 not found: ID does not exist" containerID="cd7f76bf86baaaf8da852eee2fe0fbc3f3b36843c6569306fa8aafd612e4fe45" Dec 08 09:22:21 crc kubenswrapper[4776]: I1208 09:22:21.647550 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd7f76bf86baaaf8da852eee2fe0fbc3f3b36843c6569306fa8aafd612e4fe45"} err="failed to get container status \"cd7f76bf86baaaf8da852eee2fe0fbc3f3b36843c6569306fa8aafd612e4fe45\": rpc error: code = NotFound desc = could not find container \"cd7f76bf86baaaf8da852eee2fe0fbc3f3b36843c6569306fa8aafd612e4fe45\": container with ID starting with cd7f76bf86baaaf8da852eee2fe0fbc3f3b36843c6569306fa8aafd612e4fe45 not found: ID does not exist" Dec 08 09:22:21 crc kubenswrapper[4776]: I1208 09:22:21.662850 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-kx9kd"] Dec 08 09:22:21 crc kubenswrapper[4776]: I1208 09:22:21.667433 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-kx9kd"] Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.171306 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-gjfw9" Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.261303 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmfg7\" (UniqueName: \"kubernetes.io/projected/a862b37a-f96b-495a-8d8e-b3640d2f0609-kube-api-access-xmfg7\") pod \"a862b37a-f96b-495a-8d8e-b3640d2f0609\" (UID: \"a862b37a-f96b-495a-8d8e-b3640d2f0609\") " Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.261712 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a862b37a-f96b-495a-8d8e-b3640d2f0609-operator-scripts\") pod \"a862b37a-f96b-495a-8d8e-b3640d2f0609\" (UID: \"a862b37a-f96b-495a-8d8e-b3640d2f0609\") " Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.262889 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a862b37a-f96b-495a-8d8e-b3640d2f0609-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a862b37a-f96b-495a-8d8e-b3640d2f0609" (UID: "a862b37a-f96b-495a-8d8e-b3640d2f0609"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.297250 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a862b37a-f96b-495a-8d8e-b3640d2f0609-kube-api-access-xmfg7" (OuterVolumeSpecName: "kube-api-access-xmfg7") pod "a862b37a-f96b-495a-8d8e-b3640d2f0609" (UID: "a862b37a-f96b-495a-8d8e-b3640d2f0609"). InnerVolumeSpecName "kube-api-access-xmfg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.364330 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a862b37a-f96b-495a-8d8e-b3640d2f0609-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.364363 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmfg7\" (UniqueName: \"kubernetes.io/projected/a862b37a-f96b-495a-8d8e-b3640d2f0609-kube-api-access-xmfg7\") on node \"crc\" DevicePath \"\"" Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.368372 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd497a67-79f7-4a1a-b0de-eb6fdcc524ae" path="/var/lib/kubelet/pods/fd497a67-79f7-4a1a-b0de-eb6fdcc524ae/volumes" Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.391418 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-wk8pc" Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.402986 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-wpgmk-config-h6865" Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.417502 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-a752-account-create-update-c65jn" Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.435724 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-hxwvw" Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.457229 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-f2b3-account-create-update-7jtlp" Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.465894 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnl2q\" (UniqueName: \"kubernetes.io/projected/3dbd4182-75a0-42dd-97c8-a1cb8fee96f2-kube-api-access-cnl2q\") pod \"3dbd4182-75a0-42dd-97c8-a1cb8fee96f2\" (UID: \"3dbd4182-75a0-42dd-97c8-a1cb8fee96f2\") " Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.465964 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnln4\" (UniqueName: \"kubernetes.io/projected/26a171b2-4d8f-4596-a541-74e514aa25af-kube-api-access-mnln4\") pod \"26a171b2-4d8f-4596-a541-74e514aa25af\" (UID: \"26a171b2-4d8f-4596-a541-74e514aa25af\") " Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.466003 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/26a171b2-4d8f-4596-a541-74e514aa25af-var-run\") pod \"26a171b2-4d8f-4596-a541-74e514aa25af\" (UID: \"26a171b2-4d8f-4596-a541-74e514aa25af\") " Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.466028 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de008089-a913-4a62-85e1-f0ec597514ab-operator-scripts\") pod \"de008089-a913-4a62-85e1-f0ec597514ab\" (UID: \"de008089-a913-4a62-85e1-f0ec597514ab\") " Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.466044 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/26a171b2-4d8f-4596-a541-74e514aa25af-additional-scripts\") pod \"26a171b2-4d8f-4596-a541-74e514aa25af\" (UID: \"26a171b2-4d8f-4596-a541-74e514aa25af\") " Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.466086 4776 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26a171b2-4d8f-4596-a541-74e514aa25af-scripts\") pod \"26a171b2-4d8f-4596-a541-74e514aa25af\" (UID: \"26a171b2-4d8f-4596-a541-74e514aa25af\") " Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.466106 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/26a171b2-4d8f-4596-a541-74e514aa25af-var-log-ovn\") pod \"26a171b2-4d8f-4596-a541-74e514aa25af\" (UID: \"26a171b2-4d8f-4596-a541-74e514aa25af\") " Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.466224 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3dbd4182-75a0-42dd-97c8-a1cb8fee96f2-operator-scripts\") pod \"3dbd4182-75a0-42dd-97c8-a1cb8fee96f2\" (UID: \"3dbd4182-75a0-42dd-97c8-a1cb8fee96f2\") " Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.466276 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvqkd\" (UniqueName: \"kubernetes.io/projected/de008089-a913-4a62-85e1-f0ec597514ab-kube-api-access-dvqkd\") pod \"de008089-a913-4a62-85e1-f0ec597514ab\" (UID: \"de008089-a913-4a62-85e1-f0ec597514ab\") " Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.466350 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/26a171b2-4d8f-4596-a541-74e514aa25af-var-run-ovn\") pod \"26a171b2-4d8f-4596-a541-74e514aa25af\" (UID: \"26a171b2-4d8f-4596-a541-74e514aa25af\") " Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.473629 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26a171b2-4d8f-4596-a541-74e514aa25af-scripts" (OuterVolumeSpecName: "scripts") pod "26a171b2-4d8f-4596-a541-74e514aa25af" (UID: 
"26a171b2-4d8f-4596-a541-74e514aa25af"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.474390 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dbd4182-75a0-42dd-97c8-a1cb8fee96f2-kube-api-access-cnl2q" (OuterVolumeSpecName: "kube-api-access-cnl2q") pod "3dbd4182-75a0-42dd-97c8-a1cb8fee96f2" (UID: "3dbd4182-75a0-42dd-97c8-a1cb8fee96f2"). InnerVolumeSpecName "kube-api-access-cnl2q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.474452 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/26a171b2-4d8f-4596-a541-74e514aa25af-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "26a171b2-4d8f-4596-a541-74e514aa25af" (UID: "26a171b2-4d8f-4596-a541-74e514aa25af"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.474681 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/26a171b2-4d8f-4596-a541-74e514aa25af-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "26a171b2-4d8f-4596-a541-74e514aa25af" (UID: "26a171b2-4d8f-4596-a541-74e514aa25af"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.474812 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dbd4182-75a0-42dd-97c8-a1cb8fee96f2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3dbd4182-75a0-42dd-97c8-a1cb8fee96f2" (UID: "3dbd4182-75a0-42dd-97c8-a1cb8fee96f2"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.475638 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/26a171b2-4d8f-4596-a541-74e514aa25af-var-run" (OuterVolumeSpecName: "var-run") pod "26a171b2-4d8f-4596-a541-74e514aa25af" (UID: "26a171b2-4d8f-4596-a541-74e514aa25af"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.475813 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26a171b2-4d8f-4596-a541-74e514aa25af-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "26a171b2-4d8f-4596-a541-74e514aa25af" (UID: "26a171b2-4d8f-4596-a541-74e514aa25af"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.475969 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de008089-a913-4a62-85e1-f0ec597514ab-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "de008089-a913-4a62-85e1-f0ec597514ab" (UID: "de008089-a913-4a62-85e1-f0ec597514ab"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.484588 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de008089-a913-4a62-85e1-f0ec597514ab-kube-api-access-dvqkd" (OuterVolumeSpecName: "kube-api-access-dvqkd") pod "de008089-a913-4a62-85e1-f0ec597514ab" (UID: "de008089-a913-4a62-85e1-f0ec597514ab"). InnerVolumeSpecName "kube-api-access-dvqkd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.495292 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26a171b2-4d8f-4596-a541-74e514aa25af-kube-api-access-mnln4" (OuterVolumeSpecName: "kube-api-access-mnln4") pod "26a171b2-4d8f-4596-a541-74e514aa25af" (UID: "26a171b2-4d8f-4596-a541-74e514aa25af"). InnerVolumeSpecName "kube-api-access-mnln4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.568056 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khkfg\" (UniqueName: \"kubernetes.io/projected/7ffd8d26-fc44-453f-ad6b-9bce2b83252e-kube-api-access-khkfg\") pod \"7ffd8d26-fc44-453f-ad6b-9bce2b83252e\" (UID: \"7ffd8d26-fc44-453f-ad6b-9bce2b83252e\") " Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.568186 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g99sf\" (UniqueName: \"kubernetes.io/projected/6677467f-5abc-4914-949d-bd6541aadeef-kube-api-access-g99sf\") pod \"6677467f-5abc-4914-949d-bd6541aadeef\" (UID: \"6677467f-5abc-4914-949d-bd6541aadeef\") " Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.568439 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6677467f-5abc-4914-949d-bd6541aadeef-operator-scripts\") pod \"6677467f-5abc-4914-949d-bd6541aadeef\" (UID: \"6677467f-5abc-4914-949d-bd6541aadeef\") " Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.568504 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ffd8d26-fc44-453f-ad6b-9bce2b83252e-operator-scripts\") pod \"7ffd8d26-fc44-453f-ad6b-9bce2b83252e\" (UID: \"7ffd8d26-fc44-453f-ad6b-9bce2b83252e\") " Dec 08 09:22:22 crc 
kubenswrapper[4776]: I1208 09:22:22.568977 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnl2q\" (UniqueName: \"kubernetes.io/projected/3dbd4182-75a0-42dd-97c8-a1cb8fee96f2-kube-api-access-cnl2q\") on node \"crc\" DevicePath \"\"" Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.568995 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnln4\" (UniqueName: \"kubernetes.io/projected/26a171b2-4d8f-4596-a541-74e514aa25af-kube-api-access-mnln4\") on node \"crc\" DevicePath \"\"" Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.569007 4776 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/26a171b2-4d8f-4596-a541-74e514aa25af-var-run\") on node \"crc\" DevicePath \"\"" Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.569019 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de008089-a913-4a62-85e1-f0ec597514ab-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.569027 4776 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/26a171b2-4d8f-4596-a541-74e514aa25af-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.569037 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26a171b2-4d8f-4596-a541-74e514aa25af-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.569044 4776 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/26a171b2-4d8f-4596-a541-74e514aa25af-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.569053 4776 reconciler_common.go:293] "Volume detached for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3dbd4182-75a0-42dd-97c8-a1cb8fee96f2-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.569061 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvqkd\" (UniqueName: \"kubernetes.io/projected/de008089-a913-4a62-85e1-f0ec597514ab-kube-api-access-dvqkd\") on node \"crc\" DevicePath \"\"" Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.569069 4776 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/26a171b2-4d8f-4596-a541-74e514aa25af-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.569141 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6677467f-5abc-4914-949d-bd6541aadeef-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6677467f-5abc-4914-949d-bd6541aadeef" (UID: "6677467f-5abc-4914-949d-bd6541aadeef"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.570821 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ffd8d26-fc44-453f-ad6b-9bce2b83252e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7ffd8d26-fc44-453f-ad6b-9bce2b83252e" (UID: "7ffd8d26-fc44-453f-ad6b-9bce2b83252e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.570938 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-a752-account-create-update-c65jn" event={"ID":"de008089-a913-4a62-85e1-f0ec597514ab","Type":"ContainerDied","Data":"97fd9b78f2d6c01581bb30734f632888082b29198c56289c6fbd841a11b49eaa"} Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.570986 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97fd9b78f2d6c01581bb30734f632888082b29198c56289c6fbd841a11b49eaa" Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.571054 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-a752-account-create-update-c65jn" Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.571343 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6677467f-5abc-4914-949d-bd6541aadeef-kube-api-access-g99sf" (OuterVolumeSpecName: "kube-api-access-g99sf") pod "6677467f-5abc-4914-949d-bd6541aadeef" (UID: "6677467f-5abc-4914-949d-bd6541aadeef"). InnerVolumeSpecName "kube-api-access-g99sf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.573161 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ffd8d26-fc44-453f-ad6b-9bce2b83252e-kube-api-access-khkfg" (OuterVolumeSpecName: "kube-api-access-khkfg") pod "7ffd8d26-fc44-453f-ad6b-9bce2b83252e" (UID: "7ffd8d26-fc44-453f-ad6b-9bce2b83252e"). InnerVolumeSpecName "kube-api-access-khkfg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.574600 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-gjfw9" event={"ID":"a862b37a-f96b-495a-8d8e-b3640d2f0609","Type":"ContainerDied","Data":"5a7210732579613a341a247210755bae8e6f7b834ba12c99ad34f0047633a41b"} Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.574642 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a7210732579613a341a247210755bae8e6f7b834ba12c99ad34f0047633a41b" Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.574616 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-gjfw9" Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.577853 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-wk8pc" Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.577864 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-wk8pc" event={"ID":"3dbd4182-75a0-42dd-97c8-a1cb8fee96f2","Type":"ContainerDied","Data":"a3b11a17a8abcb80bb560a7a50f634b03a774ecb9f7e55a05a5e95e987304498"} Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.579010 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3b11a17a8abcb80bb560a7a50f634b03a774ecb9f7e55a05a5e95e987304498" Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.579602 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-hxwvw" event={"ID":"7ffd8d26-fc44-453f-ad6b-9bce2b83252e","Type":"ContainerDied","Data":"cd745b7166e6fc9557b815b66ab3c6392ffc90146bd9157bf1b97feb47b116d6"} Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.579634 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd745b7166e6fc9557b815b66ab3c6392ffc90146bd9157bf1b97feb47b116d6" Dec 
08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.579658 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-hxwvw" Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.581294 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f2b3-account-create-update-7jtlp" Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.581301 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f2b3-account-create-update-7jtlp" event={"ID":"6677467f-5abc-4914-949d-bd6541aadeef","Type":"ContainerDied","Data":"792688683f6f7fa22b3fb361ce26d925ecdeff97ce70ba997049570f37827582"} Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.581354 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="792688683f6f7fa22b3fb361ce26d925ecdeff97ce70ba997049570f37827582" Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.582696 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wpgmk-config-h6865" event={"ID":"26a171b2-4d8f-4596-a541-74e514aa25af","Type":"ContainerDied","Data":"8477862713ae906d6d7c03752baccc32f4eb56635100a04512de1e126030f1af"} Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.582718 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8477862713ae906d6d7c03752baccc32f4eb56635100a04512de1e126030f1af" Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.582885 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-wpgmk-config-h6865" Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.672716 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khkfg\" (UniqueName: \"kubernetes.io/projected/7ffd8d26-fc44-453f-ad6b-9bce2b83252e-kube-api-access-khkfg\") on node \"crc\" DevicePath \"\"" Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.673109 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g99sf\" (UniqueName: \"kubernetes.io/projected/6677467f-5abc-4914-949d-bd6541aadeef-kube-api-access-g99sf\") on node \"crc\" DevicePath \"\"" Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.673124 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6677467f-5abc-4914-949d-bd6541aadeef-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.673136 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ffd8d26-fc44-453f-ad6b-9bce2b83252e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.770707 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-wpgmk-config-h6865"] Dec 08 09:22:22 crc kubenswrapper[4776]: I1208 09:22:22.790806 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-wpgmk-config-h6865"] Dec 08 09:22:23 crc kubenswrapper[4776]: I1208 09:22:23.198330 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-wxkbx" Dec 08 09:22:23 crc kubenswrapper[4776]: I1208 09:22:23.205606 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-732a-account-create-update-mm924" Dec 08 09:22:23 crc kubenswrapper[4776]: I1208 09:22:23.217242 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b17b-account-create-update-2sxng" Dec 08 09:22:23 crc kubenswrapper[4776]: I1208 09:22:23.293044 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd8cf615-20fc-42d9-bb77-cdeebbfcdb64-operator-scripts\") pod \"dd8cf615-20fc-42d9-bb77-cdeebbfcdb64\" (UID: \"dd8cf615-20fc-42d9-bb77-cdeebbfcdb64\") " Dec 08 09:22:23 crc kubenswrapper[4776]: I1208 09:22:23.293107 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzm5m\" (UniqueName: \"kubernetes.io/projected/54af9994-75b9-457a-8b67-5687e91d698a-kube-api-access-gzm5m\") pod \"54af9994-75b9-457a-8b67-5687e91d698a\" (UID: \"54af9994-75b9-457a-8b67-5687e91d698a\") " Dec 08 09:22:23 crc kubenswrapper[4776]: I1208 09:22:23.293151 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54af9994-75b9-457a-8b67-5687e91d698a-operator-scripts\") pod \"54af9994-75b9-457a-8b67-5687e91d698a\" (UID: \"54af9994-75b9-457a-8b67-5687e91d698a\") " Dec 08 09:22:23 crc kubenswrapper[4776]: I1208 09:22:23.293265 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4811e2fe-5855-47ee-b742-ec6c481936a2-operator-scripts\") pod \"4811e2fe-5855-47ee-b742-ec6c481936a2\" (UID: \"4811e2fe-5855-47ee-b742-ec6c481936a2\") " Dec 08 09:22:23 crc kubenswrapper[4776]: I1208 09:22:23.293324 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7sxsd\" (UniqueName: \"kubernetes.io/projected/dd8cf615-20fc-42d9-bb77-cdeebbfcdb64-kube-api-access-7sxsd\") 
pod \"dd8cf615-20fc-42d9-bb77-cdeebbfcdb64\" (UID: \"dd8cf615-20fc-42d9-bb77-cdeebbfcdb64\") " Dec 08 09:22:23 crc kubenswrapper[4776]: I1208 09:22:23.293452 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pxtn\" (UniqueName: \"kubernetes.io/projected/4811e2fe-5855-47ee-b742-ec6c481936a2-kube-api-access-5pxtn\") pod \"4811e2fe-5855-47ee-b742-ec6c481936a2\" (UID: \"4811e2fe-5855-47ee-b742-ec6c481936a2\") " Dec 08 09:22:23 crc kubenswrapper[4776]: I1208 09:22:23.293948 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54af9994-75b9-457a-8b67-5687e91d698a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "54af9994-75b9-457a-8b67-5687e91d698a" (UID: "54af9994-75b9-457a-8b67-5687e91d698a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:22:23 crc kubenswrapper[4776]: I1208 09:22:23.294130 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd8cf615-20fc-42d9-bb77-cdeebbfcdb64-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dd8cf615-20fc-42d9-bb77-cdeebbfcdb64" (UID: "dd8cf615-20fc-42d9-bb77-cdeebbfcdb64"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:22:23 crc kubenswrapper[4776]: I1208 09:22:23.294960 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4811e2fe-5855-47ee-b742-ec6c481936a2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4811e2fe-5855-47ee-b742-ec6c481936a2" (UID: "4811e2fe-5855-47ee-b742-ec6c481936a2"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:22:23 crc kubenswrapper[4776]: I1208 09:22:23.295395 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54af9994-75b9-457a-8b67-5687e91d698a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 09:22:23 crc kubenswrapper[4776]: I1208 09:22:23.295419 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4811e2fe-5855-47ee-b742-ec6c481936a2-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 09:22:23 crc kubenswrapper[4776]: I1208 09:22:23.295430 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd8cf615-20fc-42d9-bb77-cdeebbfcdb64-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 09:22:23 crc kubenswrapper[4776]: I1208 09:22:23.298690 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd8cf615-20fc-42d9-bb77-cdeebbfcdb64-kube-api-access-7sxsd" (OuterVolumeSpecName: "kube-api-access-7sxsd") pod "dd8cf615-20fc-42d9-bb77-cdeebbfcdb64" (UID: "dd8cf615-20fc-42d9-bb77-cdeebbfcdb64"). InnerVolumeSpecName "kube-api-access-7sxsd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:22:23 crc kubenswrapper[4776]: I1208 09:22:23.298726 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4811e2fe-5855-47ee-b742-ec6c481936a2-kube-api-access-5pxtn" (OuterVolumeSpecName: "kube-api-access-5pxtn") pod "4811e2fe-5855-47ee-b742-ec6c481936a2" (UID: "4811e2fe-5855-47ee-b742-ec6c481936a2"). InnerVolumeSpecName "kube-api-access-5pxtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:22:23 crc kubenswrapper[4776]: I1208 09:22:23.298744 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54af9994-75b9-457a-8b67-5687e91d698a-kube-api-access-gzm5m" (OuterVolumeSpecName: "kube-api-access-gzm5m") pod "54af9994-75b9-457a-8b67-5687e91d698a" (UID: "54af9994-75b9-457a-8b67-5687e91d698a"). InnerVolumeSpecName "kube-api-access-gzm5m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:22:23 crc kubenswrapper[4776]: I1208 09:22:23.397839 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7sxsd\" (UniqueName: \"kubernetes.io/projected/dd8cf615-20fc-42d9-bb77-cdeebbfcdb64-kube-api-access-7sxsd\") on node \"crc\" DevicePath \"\"" Dec 08 09:22:23 crc kubenswrapper[4776]: I1208 09:22:23.397874 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pxtn\" (UniqueName: \"kubernetes.io/projected/4811e2fe-5855-47ee-b742-ec6c481936a2-kube-api-access-5pxtn\") on node \"crc\" DevicePath \"\"" Dec 08 09:22:23 crc kubenswrapper[4776]: I1208 09:22:23.397883 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzm5m\" (UniqueName: \"kubernetes.io/projected/54af9994-75b9-457a-8b67-5687e91d698a-kube-api-access-gzm5m\") on node \"crc\" DevicePath \"\"" Dec 08 09:22:23 crc kubenswrapper[4776]: I1208 09:22:23.597323 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"95be142a-2a8f-4f5c-97e0-2e64e108fb8b","Type":"ContainerStarted","Data":"5621cdd0772f1cbc946840a3ec6f21098b6bb05a1d67ef93eebe64b8846a5c24"} Dec 08 09:22:23 crc kubenswrapper[4776]: I1208 09:22:23.598850 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-732a-account-create-update-mm924" 
event={"ID":"54af9994-75b9-457a-8b67-5687e91d698a","Type":"ContainerDied","Data":"6a98c57e58170fe58d003496e7c6e8520a655de7642714273b485ac6a1fc323e"} Dec 08 09:22:23 crc kubenswrapper[4776]: I1208 09:22:23.598876 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-732a-account-create-update-mm924" Dec 08 09:22:23 crc kubenswrapper[4776]: I1208 09:22:23.598886 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a98c57e58170fe58d003496e7c6e8520a655de7642714273b485ac6a1fc323e" Dec 08 09:22:23 crc kubenswrapper[4776]: I1208 09:22:23.601195 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-wxkbx" event={"ID":"dd8cf615-20fc-42d9-bb77-cdeebbfcdb64","Type":"ContainerDied","Data":"9d2f63d7015922c540c5c059f3dd3be83388a6acd3f81749036a0fe11f4030cc"} Dec 08 09:22:23 crc kubenswrapper[4776]: I1208 09:22:23.601222 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d2f63d7015922c540c5c059f3dd3be83388a6acd3f81749036a0fe11f4030cc" Dec 08 09:22:23 crc kubenswrapper[4776]: I1208 09:22:23.601268 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-wxkbx" Dec 08 09:22:23 crc kubenswrapper[4776]: I1208 09:22:23.609347 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b17b-account-create-update-2sxng" event={"ID":"4811e2fe-5855-47ee-b742-ec6c481936a2","Type":"ContainerDied","Data":"2d235beab8073fce81e692c23a21de61f194da25f7f1a50587470046bb56d48d"} Dec 08 09:22:23 crc kubenswrapper[4776]: I1208 09:22:23.609589 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d235beab8073fce81e692c23a21de61f194da25f7f1a50587470046bb56d48d" Dec 08 09:22:23 crc kubenswrapper[4776]: I1208 09:22:23.609658 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-b17b-account-create-update-2sxng" Dec 08 09:22:24 crc kubenswrapper[4776]: I1208 09:22:24.359736 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26a171b2-4d8f-4596-a541-74e514aa25af" path="/var/lib/kubelet/pods/26a171b2-4d8f-4596-a541-74e514aa25af/volumes" Dec 08 09:22:25 crc kubenswrapper[4776]: I1208 09:22:25.169509 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-wpgmk" Dec 08 09:22:26 crc kubenswrapper[4776]: I1208 09:22:26.898506 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cb640491-a8e7-4f8d-b4bb-1d0124f5727f-etc-swift\") pod \"swift-storage-0\" (UID: \"cb640491-a8e7-4f8d-b4bb-1d0124f5727f\") " pod="openstack/swift-storage-0" Dec 08 09:22:26 crc kubenswrapper[4776]: E1208 09:22:26.898783 4776 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 08 09:22:26 crc kubenswrapper[4776]: E1208 09:22:26.898803 4776 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 08 09:22:26 crc kubenswrapper[4776]: E1208 09:22:26.898850 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cb640491-a8e7-4f8d-b4bb-1d0124f5727f-etc-swift podName:cb640491-a8e7-4f8d-b4bb-1d0124f5727f nodeName:}" failed. No retries permitted until 2025-12-08 09:22:42.898833253 +0000 UTC m=+1439.162058275 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/cb640491-a8e7-4f8d-b4bb-1d0124f5727f-etc-swift") pod "swift-storage-0" (UID: "cb640491-a8e7-4f8d-b4bb-1d0124f5727f") : configmap "swift-ring-files" not found Dec 08 09:22:27 crc kubenswrapper[4776]: I1208 09:22:27.670209 4776 generic.go:334] "Generic (PLEG): container finished" podID="0436afba-d4b2-47d8-ac4d-c621e029333d" containerID="560cd3c0ea1c2892fc88c94dc8d330a36bceef5699defa1744ac58c98215b725" exitCode=0 Dec 08 09:22:27 crc kubenswrapper[4776]: I1208 09:22:27.670306 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mmp8z" event={"ID":"0436afba-d4b2-47d8-ac4d-c621e029333d","Type":"ContainerDied","Data":"560cd3c0ea1c2892fc88c94dc8d330a36bceef5699defa1744ac58c98215b725"} Dec 08 09:22:27 crc kubenswrapper[4776]: I1208 09:22:27.682921 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-tz8ks" event={"ID":"1d1decbe-db5e-4910-9604-aca62ec47099","Type":"ContainerStarted","Data":"945643d714eb1ac8ac03804db9a4c0f27171a18f9c47fd32e202391e0f5fee43"} Dec 08 09:22:27 crc kubenswrapper[4776]: I1208 09:22:27.721649 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-tz8ks" podStartSLOduration=4.004949683 podStartE2EDuration="10.72163225s" podCreationTimestamp="2025-12-08 09:22:17 +0000 UTC" firstStartedPulling="2025-12-08 09:22:20.155146752 +0000 UTC m=+1416.418371774" lastFinishedPulling="2025-12-08 09:22:26.871829319 +0000 UTC m=+1423.135054341" observedRunningTime="2025-12-08 09:22:27.713217694 +0000 UTC m=+1423.976442716" watchObservedRunningTime="2025-12-08 09:22:27.72163225 +0000 UTC m=+1423.984857272" Dec 08 09:22:28 crc kubenswrapper[4776]: I1208 09:22:28.702596 4776 generic.go:334] "Generic (PLEG): container finished" podID="95be142a-2a8f-4f5c-97e0-2e64e108fb8b" containerID="5621cdd0772f1cbc946840a3ec6f21098b6bb05a1d67ef93eebe64b8846a5c24" 
exitCode=0 Dec 08 09:22:28 crc kubenswrapper[4776]: I1208 09:22:28.702681 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"95be142a-2a8f-4f5c-97e0-2e64e108fb8b","Type":"ContainerDied","Data":"5621cdd0772f1cbc946840a3ec6f21098b6bb05a1d67ef93eebe64b8846a5c24"} Dec 08 09:22:30 crc kubenswrapper[4776]: E1208 09:22:30.444733 4776 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d1decbe_db5e_4910_9604_aca62ec47099.slice/crio-conmon-945643d714eb1ac8ac03804db9a4c0f27171a18f9c47fd32e202391e0f5fee43.scope\": RecentStats: unable to find data in memory cache]" Dec 08 09:22:30 crc kubenswrapper[4776]: I1208 09:22:30.725718 4776 generic.go:334] "Generic (PLEG): container finished" podID="1d1decbe-db5e-4910-9604-aca62ec47099" containerID="945643d714eb1ac8ac03804db9a4c0f27171a18f9c47fd32e202391e0f5fee43" exitCode=0 Dec 08 09:22:30 crc kubenswrapper[4776]: I1208 09:22:30.725769 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-tz8ks" event={"ID":"1d1decbe-db5e-4910-9604-aca62ec47099","Type":"ContainerDied","Data":"945643d714eb1ac8ac03804db9a4c0f27171a18f9c47fd32e202391e0f5fee43"} Dec 08 09:22:35 crc kubenswrapper[4776]: I1208 09:22:35.800032 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mmp8z" event={"ID":"0436afba-d4b2-47d8-ac4d-c621e029333d","Type":"ContainerDied","Data":"589cb69edd0cc7bd908a1088128edb1e7acd73e7f75a8f700aa5d167d6ff36fb"} Dec 08 09:22:35 crc kubenswrapper[4776]: I1208 09:22:35.800520 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="589cb69edd0cc7bd908a1088128edb1e7acd73e7f75a8f700aa5d167d6ff36fb" Dec 08 09:22:35 crc kubenswrapper[4776]: I1208 09:22:35.804250 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-tz8ks" 
event={"ID":"1d1decbe-db5e-4910-9604-aca62ec47099","Type":"ContainerDied","Data":"6cc128b64f7b80a13f658b124395a4491001528aa60f0bd48e33ca3d824e36b3"} Dec 08 09:22:35 crc kubenswrapper[4776]: I1208 09:22:35.804294 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6cc128b64f7b80a13f658b124395a4491001528aa60f0bd48e33ca3d824e36b3" Dec 08 09:22:35 crc kubenswrapper[4776]: I1208 09:22:35.921363 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-mmp8z" Dec 08 09:22:35 crc kubenswrapper[4776]: I1208 09:22:35.938440 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-tz8ks" Dec 08 09:22:36 crc kubenswrapper[4776]: I1208 09:22:36.050219 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0436afba-d4b2-47d8-ac4d-c621e029333d-dispersionconf\") pod \"0436afba-d4b2-47d8-ac4d-c621e029333d\" (UID: \"0436afba-d4b2-47d8-ac4d-c621e029333d\") " Dec 08 09:22:36 crc kubenswrapper[4776]: I1208 09:22:36.050272 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0436afba-d4b2-47d8-ac4d-c621e029333d-scripts\") pod \"0436afba-d4b2-47d8-ac4d-c621e029333d\" (UID: \"0436afba-d4b2-47d8-ac4d-c621e029333d\") " Dec 08 09:22:36 crc kubenswrapper[4776]: I1208 09:22:36.050320 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d1decbe-db5e-4910-9604-aca62ec47099-combined-ca-bundle\") pod \"1d1decbe-db5e-4910-9604-aca62ec47099\" (UID: \"1d1decbe-db5e-4910-9604-aca62ec47099\") " Dec 08 09:22:36 crc kubenswrapper[4776]: I1208 09:22:36.050386 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/0436afba-d4b2-47d8-ac4d-c621e029333d-swiftconf\") pod \"0436afba-d4b2-47d8-ac4d-c621e029333d\" (UID: \"0436afba-d4b2-47d8-ac4d-c621e029333d\") " Dec 08 09:22:36 crc kubenswrapper[4776]: I1208 09:22:36.051040 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0436afba-d4b2-47d8-ac4d-c621e029333d-ring-data-devices\") pod \"0436afba-d4b2-47d8-ac4d-c621e029333d\" (UID: \"0436afba-d4b2-47d8-ac4d-c621e029333d\") " Dec 08 09:22:36 crc kubenswrapper[4776]: I1208 09:22:36.051093 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d1decbe-db5e-4910-9604-aca62ec47099-config-data\") pod \"1d1decbe-db5e-4910-9604-aca62ec47099\" (UID: \"1d1decbe-db5e-4910-9604-aca62ec47099\") " Dec 08 09:22:36 crc kubenswrapper[4776]: I1208 09:22:36.051123 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqmh7\" (UniqueName: \"kubernetes.io/projected/0436afba-d4b2-47d8-ac4d-c621e029333d-kube-api-access-qqmh7\") pod \"0436afba-d4b2-47d8-ac4d-c621e029333d\" (UID: \"0436afba-d4b2-47d8-ac4d-c621e029333d\") " Dec 08 09:22:36 crc kubenswrapper[4776]: I1208 09:22:36.051211 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0436afba-d4b2-47d8-ac4d-c621e029333d-etc-swift\") pod \"0436afba-d4b2-47d8-ac4d-c621e029333d\" (UID: \"0436afba-d4b2-47d8-ac4d-c621e029333d\") " Dec 08 09:22:36 crc kubenswrapper[4776]: I1208 09:22:36.051249 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0436afba-d4b2-47d8-ac4d-c621e029333d-combined-ca-bundle\") pod \"0436afba-d4b2-47d8-ac4d-c621e029333d\" (UID: \"0436afba-d4b2-47d8-ac4d-c621e029333d\") " Dec 08 09:22:36 crc kubenswrapper[4776]: I1208 
09:22:36.051357 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d68gm\" (UniqueName: \"kubernetes.io/projected/1d1decbe-db5e-4910-9604-aca62ec47099-kube-api-access-d68gm\") pod \"1d1decbe-db5e-4910-9604-aca62ec47099\" (UID: \"1d1decbe-db5e-4910-9604-aca62ec47099\") " Dec 08 09:22:36 crc kubenswrapper[4776]: I1208 09:22:36.052232 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0436afba-d4b2-47d8-ac4d-c621e029333d-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "0436afba-d4b2-47d8-ac4d-c621e029333d" (UID: "0436afba-d4b2-47d8-ac4d-c621e029333d"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:22:36 crc kubenswrapper[4776]: I1208 09:22:36.052725 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0436afba-d4b2-47d8-ac4d-c621e029333d-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "0436afba-d4b2-47d8-ac4d-c621e029333d" (UID: "0436afba-d4b2-47d8-ac4d-c621e029333d"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:22:36 crc kubenswrapper[4776]: I1208 09:22:36.055884 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d1decbe-db5e-4910-9604-aca62ec47099-kube-api-access-d68gm" (OuterVolumeSpecName: "kube-api-access-d68gm") pod "1d1decbe-db5e-4910-9604-aca62ec47099" (UID: "1d1decbe-db5e-4910-9604-aca62ec47099"). InnerVolumeSpecName "kube-api-access-d68gm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:22:36 crc kubenswrapper[4776]: I1208 09:22:36.056004 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0436afba-d4b2-47d8-ac4d-c621e029333d-kube-api-access-qqmh7" (OuterVolumeSpecName: "kube-api-access-qqmh7") pod "0436afba-d4b2-47d8-ac4d-c621e029333d" (UID: "0436afba-d4b2-47d8-ac4d-c621e029333d"). InnerVolumeSpecName "kube-api-access-qqmh7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:22:36 crc kubenswrapper[4776]: I1208 09:22:36.060041 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0436afba-d4b2-47d8-ac4d-c621e029333d-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "0436afba-d4b2-47d8-ac4d-c621e029333d" (UID: "0436afba-d4b2-47d8-ac4d-c621e029333d"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:22:36 crc kubenswrapper[4776]: I1208 09:22:36.083563 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0436afba-d4b2-47d8-ac4d-c621e029333d-scripts" (OuterVolumeSpecName: "scripts") pod "0436afba-d4b2-47d8-ac4d-c621e029333d" (UID: "0436afba-d4b2-47d8-ac4d-c621e029333d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:22:36 crc kubenswrapper[4776]: I1208 09:22:36.085367 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0436afba-d4b2-47d8-ac4d-c621e029333d-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "0436afba-d4b2-47d8-ac4d-c621e029333d" (UID: "0436afba-d4b2-47d8-ac4d-c621e029333d"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:22:36 crc kubenswrapper[4776]: I1208 09:22:36.101401 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d1decbe-db5e-4910-9604-aca62ec47099-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d1decbe-db5e-4910-9604-aca62ec47099" (UID: "1d1decbe-db5e-4910-9604-aca62ec47099"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:22:36 crc kubenswrapper[4776]: I1208 09:22:36.122498 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0436afba-d4b2-47d8-ac4d-c621e029333d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0436afba-d4b2-47d8-ac4d-c621e029333d" (UID: "0436afba-d4b2-47d8-ac4d-c621e029333d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:22:36 crc kubenswrapper[4776]: I1208 09:22:36.134469 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d1decbe-db5e-4910-9604-aca62ec47099-config-data" (OuterVolumeSpecName: "config-data") pod "1d1decbe-db5e-4910-9604-aca62ec47099" (UID: "1d1decbe-db5e-4910-9604-aca62ec47099"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:22:36 crc kubenswrapper[4776]: I1208 09:22:36.153748 4776 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0436afba-d4b2-47d8-ac4d-c621e029333d-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 08 09:22:36 crc kubenswrapper[4776]: I1208 09:22:36.153801 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0436afba-d4b2-47d8-ac4d-c621e029333d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:22:36 crc kubenswrapper[4776]: I1208 09:22:36.153821 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d68gm\" (UniqueName: \"kubernetes.io/projected/1d1decbe-db5e-4910-9604-aca62ec47099-kube-api-access-d68gm\") on node \"crc\" DevicePath \"\"" Dec 08 09:22:36 crc kubenswrapper[4776]: I1208 09:22:36.153838 4776 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0436afba-d4b2-47d8-ac4d-c621e029333d-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 08 09:22:36 crc kubenswrapper[4776]: I1208 09:22:36.153855 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0436afba-d4b2-47d8-ac4d-c621e029333d-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 09:22:36 crc kubenswrapper[4776]: I1208 09:22:36.153873 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d1decbe-db5e-4910-9604-aca62ec47099-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:22:36 crc kubenswrapper[4776]: I1208 09:22:36.153889 4776 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0436afba-d4b2-47d8-ac4d-c621e029333d-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 08 09:22:36 crc kubenswrapper[4776]: I1208 09:22:36.153922 4776 
reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0436afba-d4b2-47d8-ac4d-c621e029333d-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 08 09:22:36 crc kubenswrapper[4776]: I1208 09:22:36.153939 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d1decbe-db5e-4910-9604-aca62ec47099-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 09:22:36 crc kubenswrapper[4776]: I1208 09:22:36.153957 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqmh7\" (UniqueName: \"kubernetes.io/projected/0436afba-d4b2-47d8-ac4d-c621e029333d-kube-api-access-qqmh7\") on node \"crc\" DevicePath \"\"" Dec 08 09:22:36 crc kubenswrapper[4776]: I1208 09:22:36.815280 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"95be142a-2a8f-4f5c-97e0-2e64e108fb8b","Type":"ContainerStarted","Data":"c08c115a57c09e1598489f2bd99ba740ad261b3c71a1f01401ac59b28980cc0c"} Dec 08 09:22:36 crc kubenswrapper[4776]: I1208 09:22:36.817426 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-tz8ks" Dec 08 09:22:36 crc kubenswrapper[4776]: I1208 09:22:36.817449 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-mmp8z" Dec 08 09:22:36 crc kubenswrapper[4776]: I1208 09:22:36.817409 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9hw46" event={"ID":"4a920788-f8d6-4c42-84f6-d842d9bf9a17","Type":"ContainerStarted","Data":"4f24f8968ed17fa81f6f8869195d3cc6621d449566b91c6fe0cb95b1375dcc9d"} Dec 08 09:22:36 crc kubenswrapper[4776]: I1208 09:22:36.833124 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-9hw46" podStartSLOduration=3.612310861 podStartE2EDuration="18.833102322s" podCreationTimestamp="2025-12-08 09:22:18 +0000 UTC" firstStartedPulling="2025-12-08 09:22:20.556878112 +0000 UTC m=+1416.820103134" lastFinishedPulling="2025-12-08 09:22:35.777669573 +0000 UTC m=+1432.040894595" observedRunningTime="2025-12-08 09:22:36.832072434 +0000 UTC m=+1433.095297486" watchObservedRunningTime="2025-12-08 09:22:36.833102322 +0000 UTC m=+1433.096327344" Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.230744 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-7nf99"] Dec 08 09:22:37 crc kubenswrapper[4776]: E1208 09:22:37.231237 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd497a67-79f7-4a1a-b0de-eb6fdcc524ae" containerName="init" Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.231259 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd497a67-79f7-4a1a-b0de-eb6fdcc524ae" containerName="init" Dec 08 09:22:37 crc kubenswrapper[4776]: E1208 09:22:37.231273 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6677467f-5abc-4914-949d-bd6541aadeef" containerName="mariadb-account-create-update" Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.231283 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="6677467f-5abc-4914-949d-bd6541aadeef" containerName="mariadb-account-create-update" Dec 08 09:22:37 crc kubenswrapper[4776]: 
E1208 09:22:37.231296 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4811e2fe-5855-47ee-b742-ec6c481936a2" containerName="mariadb-account-create-update" Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.231304 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="4811e2fe-5855-47ee-b742-ec6c481936a2" containerName="mariadb-account-create-update" Dec 08 09:22:37 crc kubenswrapper[4776]: E1208 09:22:37.231317 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d1decbe-db5e-4910-9604-aca62ec47099" containerName="keystone-db-sync" Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.231326 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d1decbe-db5e-4910-9604-aca62ec47099" containerName="keystone-db-sync" Dec 08 09:22:37 crc kubenswrapper[4776]: E1208 09:22:37.231336 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd8cf615-20fc-42d9-bb77-cdeebbfcdb64" containerName="mariadb-database-create" Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.231344 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd8cf615-20fc-42d9-bb77-cdeebbfcdb64" containerName="mariadb-database-create" Dec 08 09:22:37 crc kubenswrapper[4776]: E1208 09:22:37.231357 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26a171b2-4d8f-4596-a541-74e514aa25af" containerName="ovn-config" Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.231364 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="26a171b2-4d8f-4596-a541-74e514aa25af" containerName="ovn-config" Dec 08 09:22:37 crc kubenswrapper[4776]: E1208 09:22:37.231378 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de008089-a913-4a62-85e1-f0ec597514ab" containerName="mariadb-account-create-update" Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.231386 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="de008089-a913-4a62-85e1-f0ec597514ab" 
containerName="mariadb-account-create-update"
Dec 08 09:22:37 crc kubenswrapper[4776]: E1208 09:22:37.231400 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0436afba-d4b2-47d8-ac4d-c621e029333d" containerName="swift-ring-rebalance"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.231408 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="0436afba-d4b2-47d8-ac4d-c621e029333d" containerName="swift-ring-rebalance"
Dec 08 09:22:37 crc kubenswrapper[4776]: E1208 09:22:37.231421 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ffd8d26-fc44-453f-ad6b-9bce2b83252e" containerName="mariadb-database-create"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.231428 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ffd8d26-fc44-453f-ad6b-9bce2b83252e" containerName="mariadb-database-create"
Dec 08 09:22:37 crc kubenswrapper[4776]: E1208 09:22:37.231441 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54af9994-75b9-457a-8b67-5687e91d698a" containerName="mariadb-account-create-update"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.231449 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="54af9994-75b9-457a-8b67-5687e91d698a" containerName="mariadb-account-create-update"
Dec 08 09:22:37 crc kubenswrapper[4776]: E1208 09:22:37.231472 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dbd4182-75a0-42dd-97c8-a1cb8fee96f2" containerName="mariadb-database-create"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.231479 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dbd4182-75a0-42dd-97c8-a1cb8fee96f2" containerName="mariadb-database-create"
Dec 08 09:22:37 crc kubenswrapper[4776]: E1208 09:22:37.231490 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd497a67-79f7-4a1a-b0de-eb6fdcc524ae" containerName="dnsmasq-dns"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.231498 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd497a67-79f7-4a1a-b0de-eb6fdcc524ae" containerName="dnsmasq-dns"
Dec 08 09:22:37 crc kubenswrapper[4776]: E1208 09:22:37.231509 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a862b37a-f96b-495a-8d8e-b3640d2f0609" containerName="mariadb-database-create"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.231516 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="a862b37a-f96b-495a-8d8e-b3640d2f0609" containerName="mariadb-database-create"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.231752 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="de008089-a913-4a62-85e1-f0ec597514ab" containerName="mariadb-account-create-update"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.231778 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ffd8d26-fc44-453f-ad6b-9bce2b83252e" containerName="mariadb-database-create"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.231789 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd8cf615-20fc-42d9-bb77-cdeebbfcdb64" containerName="mariadb-database-create"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.231807 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="54af9994-75b9-457a-8b67-5687e91d698a" containerName="mariadb-account-create-update"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.231819 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="a862b37a-f96b-495a-8d8e-b3640d2f0609" containerName="mariadb-database-create"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.231833 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d1decbe-db5e-4910-9604-aca62ec47099" containerName="keystone-db-sync"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.231844 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="4811e2fe-5855-47ee-b742-ec6c481936a2" containerName="mariadb-account-create-update"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.231856 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="0436afba-d4b2-47d8-ac4d-c621e029333d" containerName="swift-ring-rebalance"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.231868 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dbd4182-75a0-42dd-97c8-a1cb8fee96f2" containerName="mariadb-database-create"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.231883 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="6677467f-5abc-4914-949d-bd6541aadeef" containerName="mariadb-account-create-update"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.231895 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="26a171b2-4d8f-4596-a541-74e514aa25af" containerName="ovn-config"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.231913 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd497a67-79f7-4a1a-b0de-eb6fdcc524ae" containerName="dnsmasq-dns"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.233241 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f877ddd87-7nf99"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.256854 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-7nf99"]
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.303784 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-v4nch"]
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.314960 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-v4nch"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.321221 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.321486 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.327763 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.329864 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.330266 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-wz4bj"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.393305 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-v4nch"]
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.445365 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-zhhw6"]
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.446889 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-zhhw6"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.455938 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-srzvg"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.459666 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.477785 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kck9r\" (UniqueName: \"kubernetes.io/projected/44d849dd-5372-4dd4-a691-036dbb925fcd-kube-api-access-kck9r\") pod \"keystone-bootstrap-v4nch\" (UID: \"44d849dd-5372-4dd4-a691-036dbb925fcd\") " pod="openstack/keystone-bootstrap-v4nch"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.477844 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44d849dd-5372-4dd4-a691-036dbb925fcd-combined-ca-bundle\") pod \"keystone-bootstrap-v4nch\" (UID: \"44d849dd-5372-4dd4-a691-036dbb925fcd\") " pod="openstack/keystone-bootstrap-v4nch"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.477949 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb67a5c8-c43c-467e-963e-85f3789ca32a-config\") pod \"dnsmasq-dns-f877ddd87-7nf99\" (UID: \"eb67a5c8-c43c-467e-963e-85f3789ca32a\") " pod="openstack/dnsmasq-dns-f877ddd87-7nf99"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.478098 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/44d849dd-5372-4dd4-a691-036dbb925fcd-fernet-keys\") pod \"keystone-bootstrap-v4nch\" (UID: \"44d849dd-5372-4dd4-a691-036dbb925fcd\") " pod="openstack/keystone-bootstrap-v4nch"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.478129 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/44d849dd-5372-4dd4-a691-036dbb925fcd-credential-keys\") pod \"keystone-bootstrap-v4nch\" (UID: \"44d849dd-5372-4dd4-a691-036dbb925fcd\") " pod="openstack/keystone-bootstrap-v4nch"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.478150 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb67a5c8-c43c-467e-963e-85f3789ca32a-ovsdbserver-nb\") pod \"dnsmasq-dns-f877ddd87-7nf99\" (UID: \"eb67a5c8-c43c-467e-963e-85f3789ca32a\") " pod="openstack/dnsmasq-dns-f877ddd87-7nf99"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.478291 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb67a5c8-c43c-467e-963e-85f3789ca32a-ovsdbserver-sb\") pod \"dnsmasq-dns-f877ddd87-7nf99\" (UID: \"eb67a5c8-c43c-467e-963e-85f3789ca32a\") " pod="openstack/dnsmasq-dns-f877ddd87-7nf99"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.478315 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44d849dd-5372-4dd4-a691-036dbb925fcd-scripts\") pod \"keystone-bootstrap-v4nch\" (UID: \"44d849dd-5372-4dd4-a691-036dbb925fcd\") " pod="openstack/keystone-bootstrap-v4nch"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.478334 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v78d\" (UniqueName: \"kubernetes.io/projected/eb67a5c8-c43c-467e-963e-85f3789ca32a-kube-api-access-9v78d\") pod \"dnsmasq-dns-f877ddd87-7nf99\" (UID: \"eb67a5c8-c43c-467e-963e-85f3789ca32a\") " pod="openstack/dnsmasq-dns-f877ddd87-7nf99"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.478401 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44d849dd-5372-4dd4-a691-036dbb925fcd-config-data\") pod \"keystone-bootstrap-v4nch\" (UID: \"44d849dd-5372-4dd4-a691-036dbb925fcd\") " pod="openstack/keystone-bootstrap-v4nch"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.478423 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb67a5c8-c43c-467e-963e-85f3789ca32a-dns-svc\") pod \"dnsmasq-dns-f877ddd87-7nf99\" (UID: \"eb67a5c8-c43c-467e-963e-85f3789ca32a\") " pod="openstack/dnsmasq-dns-f877ddd87-7nf99"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.493013 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-zhhw6"]
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.505203 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-xksjb"]
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.506917 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-xksjb"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.519547 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-fvv68"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.519672 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.520241 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.565254 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-xksjb"]
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.579892 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/44d849dd-5372-4dd4-a691-036dbb925fcd-credential-keys\") pod \"keystone-bootstrap-v4nch\" (UID: \"44d849dd-5372-4dd4-a691-036dbb925fcd\") " pod="openstack/keystone-bootstrap-v4nch"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.579933 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb67a5c8-c43c-467e-963e-85f3789ca32a-ovsdbserver-nb\") pod \"dnsmasq-dns-f877ddd87-7nf99\" (UID: \"eb67a5c8-c43c-467e-963e-85f3789ca32a\") " pod="openstack/dnsmasq-dns-f877ddd87-7nf99"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.579973 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6d7c64ff-eec0-48d3-bba8-724158787096-etc-machine-id\") pod \"cinder-db-sync-xksjb\" (UID: \"6d7c64ff-eec0-48d3-bba8-724158787096\") " pod="openstack/cinder-db-sync-xksjb"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.580003 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb67a5c8-c43c-467e-963e-85f3789ca32a-ovsdbserver-sb\") pod \"dnsmasq-dns-f877ddd87-7nf99\" (UID: \"eb67a5c8-c43c-467e-963e-85f3789ca32a\") " pod="openstack/dnsmasq-dns-f877ddd87-7nf99"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.580022 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44d849dd-5372-4dd4-a691-036dbb925fcd-scripts\") pod \"keystone-bootstrap-v4nch\" (UID: \"44d849dd-5372-4dd4-a691-036dbb925fcd\") " pod="openstack/keystone-bootstrap-v4nch"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.580043 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v78d\" (UniqueName: \"kubernetes.io/projected/eb67a5c8-c43c-467e-963e-85f3789ca32a-kube-api-access-9v78d\") pod \"dnsmasq-dns-f877ddd87-7nf99\" (UID: \"eb67a5c8-c43c-467e-963e-85f3789ca32a\") " pod="openstack/dnsmasq-dns-f877ddd87-7nf99"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.580067 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8z6p\" (UniqueName: \"kubernetes.io/projected/6d7c64ff-eec0-48d3-bba8-724158787096-kube-api-access-x8z6p\") pod \"cinder-db-sync-xksjb\" (UID: \"6d7c64ff-eec0-48d3-bba8-724158787096\") " pod="openstack/cinder-db-sync-xksjb"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.580096 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6d7c64ff-eec0-48d3-bba8-724158787096-db-sync-config-data\") pod \"cinder-db-sync-xksjb\" (UID: \"6d7c64ff-eec0-48d3-bba8-724158787096\") " pod="openstack/cinder-db-sync-xksjb"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.580142 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44d849dd-5372-4dd4-a691-036dbb925fcd-config-data\") pod \"keystone-bootstrap-v4nch\" (UID: \"44d849dd-5372-4dd4-a691-036dbb925fcd\") " pod="openstack/keystone-bootstrap-v4nch"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.580158 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d7c64ff-eec0-48d3-bba8-724158787096-config-data\") pod \"cinder-db-sync-xksjb\" (UID: \"6d7c64ff-eec0-48d3-bba8-724158787096\") " pod="openstack/cinder-db-sync-xksjb"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.580189 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb67a5c8-c43c-467e-963e-85f3789ca32a-dns-svc\") pod \"dnsmasq-dns-f877ddd87-7nf99\" (UID: \"eb67a5c8-c43c-467e-963e-85f3789ca32a\") " pod="openstack/dnsmasq-dns-f877ddd87-7nf99"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.580224 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kck9r\" (UniqueName: \"kubernetes.io/projected/44d849dd-5372-4dd4-a691-036dbb925fcd-kube-api-access-kck9r\") pod \"keystone-bootstrap-v4nch\" (UID: \"44d849dd-5372-4dd4-a691-036dbb925fcd\") " pod="openstack/keystone-bootstrap-v4nch"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.580248 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d7c64ff-eec0-48d3-bba8-724158787096-combined-ca-bundle\") pod \"cinder-db-sync-xksjb\" (UID: \"6d7c64ff-eec0-48d3-bba8-724158787096\") " pod="openstack/cinder-db-sync-xksjb"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.580267 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4a95eb4-92c1-4eff-940b-37f74dd3dc18-config-data\") pod \"heat-db-sync-zhhw6\" (UID: \"f4a95eb4-92c1-4eff-940b-37f74dd3dc18\") " pod="openstack/heat-db-sync-zhhw6"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.580291 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44d849dd-5372-4dd4-a691-036dbb925fcd-combined-ca-bundle\") pod \"keystone-bootstrap-v4nch\" (UID: \"44d849dd-5372-4dd4-a691-036dbb925fcd\") " pod="openstack/keystone-bootstrap-v4nch"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.580309 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txtz4\" (UniqueName: \"kubernetes.io/projected/f4a95eb4-92c1-4eff-940b-37f74dd3dc18-kube-api-access-txtz4\") pod \"heat-db-sync-zhhw6\" (UID: \"f4a95eb4-92c1-4eff-940b-37f74dd3dc18\") " pod="openstack/heat-db-sync-zhhw6"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.580325 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb67a5c8-c43c-467e-963e-85f3789ca32a-config\") pod \"dnsmasq-dns-f877ddd87-7nf99\" (UID: \"eb67a5c8-c43c-467e-963e-85f3789ca32a\") " pod="openstack/dnsmasq-dns-f877ddd87-7nf99"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.580347 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4a95eb4-92c1-4eff-940b-37f74dd3dc18-combined-ca-bundle\") pod \"heat-db-sync-zhhw6\" (UID: \"f4a95eb4-92c1-4eff-940b-37f74dd3dc18\") " pod="openstack/heat-db-sync-zhhw6"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.580408 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d7c64ff-eec0-48d3-bba8-724158787096-scripts\") pod \"cinder-db-sync-xksjb\" (UID: \"6d7c64ff-eec0-48d3-bba8-724158787096\") " pod="openstack/cinder-db-sync-xksjb"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.580427 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/44d849dd-5372-4dd4-a691-036dbb925fcd-fernet-keys\") pod \"keystone-bootstrap-v4nch\" (UID: \"44d849dd-5372-4dd4-a691-036dbb925fcd\") " pod="openstack/keystone-bootstrap-v4nch"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.583904 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb67a5c8-c43c-467e-963e-85f3789ca32a-ovsdbserver-nb\") pod \"dnsmasq-dns-f877ddd87-7nf99\" (UID: \"eb67a5c8-c43c-467e-963e-85f3789ca32a\") " pod="openstack/dnsmasq-dns-f877ddd87-7nf99"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.584244 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb67a5c8-c43c-467e-963e-85f3789ca32a-config\") pod \"dnsmasq-dns-f877ddd87-7nf99\" (UID: \"eb67a5c8-c43c-467e-963e-85f3789ca32a\") " pod="openstack/dnsmasq-dns-f877ddd87-7nf99"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.584561 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb67a5c8-c43c-467e-963e-85f3789ca32a-ovsdbserver-sb\") pod \"dnsmasq-dns-f877ddd87-7nf99\" (UID: \"eb67a5c8-c43c-467e-963e-85f3789ca32a\") " pod="openstack/dnsmasq-dns-f877ddd87-7nf99"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.585618 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb67a5c8-c43c-467e-963e-85f3789ca32a-dns-svc\") pod \"dnsmasq-dns-f877ddd87-7nf99\" (UID: \"eb67a5c8-c43c-467e-963e-85f3789ca32a\") " pod="openstack/dnsmasq-dns-f877ddd87-7nf99"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.590302 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44d849dd-5372-4dd4-a691-036dbb925fcd-scripts\") pod \"keystone-bootstrap-v4nch\" (UID: \"44d849dd-5372-4dd4-a691-036dbb925fcd\") " pod="openstack/keystone-bootstrap-v4nch"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.590755 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44d849dd-5372-4dd4-a691-036dbb925fcd-combined-ca-bundle\") pod \"keystone-bootstrap-v4nch\" (UID: \"44d849dd-5372-4dd4-a691-036dbb925fcd\") " pod="openstack/keystone-bootstrap-v4nch"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.595614 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/44d849dd-5372-4dd4-a691-036dbb925fcd-credential-keys\") pod \"keystone-bootstrap-v4nch\" (UID: \"44d849dd-5372-4dd4-a691-036dbb925fcd\") " pod="openstack/keystone-bootstrap-v4nch"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.596899 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44d849dd-5372-4dd4-a691-036dbb925fcd-config-data\") pod \"keystone-bootstrap-v4nch\" (UID: \"44d849dd-5372-4dd4-a691-036dbb925fcd\") " pod="openstack/keystone-bootstrap-v4nch"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.620730 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kck9r\" (UniqueName: \"kubernetes.io/projected/44d849dd-5372-4dd4-a691-036dbb925fcd-kube-api-access-kck9r\") pod \"keystone-bootstrap-v4nch\" (UID: \"44d849dd-5372-4dd4-a691-036dbb925fcd\") " pod="openstack/keystone-bootstrap-v4nch"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.622462 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-8tjn6"]
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.625749 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/44d849dd-5372-4dd4-a691-036dbb925fcd-fernet-keys\") pod \"keystone-bootstrap-v4nch\" (UID: \"44d849dd-5372-4dd4-a691-036dbb925fcd\") " pod="openstack/keystone-bootstrap-v4nch"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.626537 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-8tjn6"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.633693 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.633955 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.634122 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-v5jm7"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.664604 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-v4nch"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.664904 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-8tjn6"]
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.676060 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v78d\" (UniqueName: \"kubernetes.io/projected/eb67a5c8-c43c-467e-963e-85f3789ca32a-kube-api-access-9v78d\") pod \"dnsmasq-dns-f877ddd87-7nf99\" (UID: \"eb67a5c8-c43c-467e-963e-85f3789ca32a\") " pod="openstack/dnsmasq-dns-f877ddd87-7nf99"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.686240 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6d7c64ff-eec0-48d3-bba8-724158787096-db-sync-config-data\") pod \"cinder-db-sync-xksjb\" (UID: \"6d7c64ff-eec0-48d3-bba8-724158787096\") " pod="openstack/cinder-db-sync-xksjb"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.686556 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d7c64ff-eec0-48d3-bba8-724158787096-config-data\") pod \"cinder-db-sync-xksjb\" (UID: \"6d7c64ff-eec0-48d3-bba8-724158787096\") " pod="openstack/cinder-db-sync-xksjb"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.686657 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d7c64ff-eec0-48d3-bba8-724158787096-combined-ca-bundle\") pod \"cinder-db-sync-xksjb\" (UID: \"6d7c64ff-eec0-48d3-bba8-724158787096\") " pod="openstack/cinder-db-sync-xksjb"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.686743 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4a95eb4-92c1-4eff-940b-37f74dd3dc18-config-data\") pod \"heat-db-sync-zhhw6\" (UID: \"f4a95eb4-92c1-4eff-940b-37f74dd3dc18\") " pod="openstack/heat-db-sync-zhhw6"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.686818 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txtz4\" (UniqueName: \"kubernetes.io/projected/f4a95eb4-92c1-4eff-940b-37f74dd3dc18-kube-api-access-txtz4\") pod \"heat-db-sync-zhhw6\" (UID: \"f4a95eb4-92c1-4eff-940b-37f74dd3dc18\") " pod="openstack/heat-db-sync-zhhw6"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.686889 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4a95eb4-92c1-4eff-940b-37f74dd3dc18-combined-ca-bundle\") pod \"heat-db-sync-zhhw6\" (UID: \"f4a95eb4-92c1-4eff-940b-37f74dd3dc18\") " pod="openstack/heat-db-sync-zhhw6"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.686993 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d7c64ff-eec0-48d3-bba8-724158787096-scripts\") pod \"cinder-db-sync-xksjb\" (UID: \"6d7c64ff-eec0-48d3-bba8-724158787096\") " pod="openstack/cinder-db-sync-xksjb"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.687084 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6d7c64ff-eec0-48d3-bba8-724158787096-etc-machine-id\") pod \"cinder-db-sync-xksjb\" (UID: \"6d7c64ff-eec0-48d3-bba8-724158787096\") " pod="openstack/cinder-db-sync-xksjb"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.687226 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8z6p\" (UniqueName: \"kubernetes.io/projected/6d7c64ff-eec0-48d3-bba8-724158787096-kube-api-access-x8z6p\") pod \"cinder-db-sync-xksjb\" (UID: \"6d7c64ff-eec0-48d3-bba8-724158787096\") " pod="openstack/cinder-db-sync-xksjb"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.700675 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6d7c64ff-eec0-48d3-bba8-724158787096-db-sync-config-data\") pod \"cinder-db-sync-xksjb\" (UID: \"6d7c64ff-eec0-48d3-bba8-724158787096\") " pod="openstack/cinder-db-sync-xksjb"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.705143 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6d7c64ff-eec0-48d3-bba8-724158787096-etc-machine-id\") pod \"cinder-db-sync-xksjb\" (UID: \"6d7c64ff-eec0-48d3-bba8-724158787096\") " pod="openstack/cinder-db-sync-xksjb"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.721876 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d7c64ff-eec0-48d3-bba8-724158787096-scripts\") pod \"cinder-db-sync-xksjb\" (UID: \"6d7c64ff-eec0-48d3-bba8-724158787096\") " pod="openstack/cinder-db-sync-xksjb"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.735598 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d7c64ff-eec0-48d3-bba8-724158787096-config-data\") pod \"cinder-db-sync-xksjb\" (UID: \"6d7c64ff-eec0-48d3-bba8-724158787096\") " pod="openstack/cinder-db-sync-xksjb"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.735828 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-626nj"]
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.737640 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-626nj"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.738924 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d7c64ff-eec0-48d3-bba8-724158787096-combined-ca-bundle\") pod \"cinder-db-sync-xksjb\" (UID: \"6d7c64ff-eec0-48d3-bba8-724158787096\") " pod="openstack/cinder-db-sync-xksjb"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.747468 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.749480 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-tjbl6"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.750425 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txtz4\" (UniqueName: \"kubernetes.io/projected/f4a95eb4-92c1-4eff-940b-37f74dd3dc18-kube-api-access-txtz4\") pod \"heat-db-sync-zhhw6\" (UID: \"f4a95eb4-92c1-4eff-940b-37f74dd3dc18\") " pod="openstack/heat-db-sync-zhhw6"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.750755 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4a95eb4-92c1-4eff-940b-37f74dd3dc18-combined-ca-bundle\") pod \"heat-db-sync-zhhw6\" (UID: \"f4a95eb4-92c1-4eff-940b-37f74dd3dc18\") " pod="openstack/heat-db-sync-zhhw6"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.755725 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8z6p\" (UniqueName: \"kubernetes.io/projected/6d7c64ff-eec0-48d3-bba8-724158787096-kube-api-access-x8z6p\") pod \"cinder-db-sync-xksjb\" (UID: \"6d7c64ff-eec0-48d3-bba8-724158787096\") " pod="openstack/cinder-db-sync-xksjb"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.756218 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4a95eb4-92c1-4eff-940b-37f74dd3dc18-config-data\") pod \"heat-db-sync-zhhw6\" (UID: \"f4a95eb4-92c1-4eff-940b-37f74dd3dc18\") " pod="openstack/heat-db-sync-zhhw6"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.772866 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-zhhw6"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.793526 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9d1e39a-4040-4f14-819f-f41e85a35143-combined-ca-bundle\") pod \"neutron-db-sync-8tjn6\" (UID: \"f9d1e39a-4040-4f14-819f-f41e85a35143\") " pod="openstack/neutron-db-sync-8tjn6"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.793597 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48vll\" (UniqueName: \"kubernetes.io/projected/f9d1e39a-4040-4f14-819f-f41e85a35143-kube-api-access-48vll\") pod \"neutron-db-sync-8tjn6\" (UID: \"f9d1e39a-4040-4f14-819f-f41e85a35143\") " pod="openstack/neutron-db-sync-8tjn6"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.793662 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f9d1e39a-4040-4f14-819f-f41e85a35143-config\") pod \"neutron-db-sync-8tjn6\" (UID: \"f9d1e39a-4040-4f14-819f-f41e85a35143\") " pod="openstack/neutron-db-sync-8tjn6"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.831940 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-2s7n9"]
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.842147 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-2s7n9"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.848840 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-64vhk"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.849028 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.849428 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-xksjb"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.855990 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f877ddd87-7nf99"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.856583 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.866233 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-2s7n9"]
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.909784 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f9d1e39a-4040-4f14-819f-f41e85a35143-config\") pod \"neutron-db-sync-8tjn6\" (UID: \"f9d1e39a-4040-4f14-819f-f41e85a35143\") " pod="openstack/neutron-db-sync-8tjn6"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.909872 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wpcz\" (UniqueName: \"kubernetes.io/projected/7c962dc3-3c64-4b5d-a740-a790a5fa10f9-kube-api-access-2wpcz\") pod \"barbican-db-sync-626nj\" (UID: \"7c962dc3-3c64-4b5d-a740-a790a5fa10f9\") " pod="openstack/barbican-db-sync-626nj"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.909942 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c962dc3-3c64-4b5d-a740-a790a5fa10f9-combined-ca-bundle\") pod \"barbican-db-sync-626nj\" (UID: \"7c962dc3-3c64-4b5d-a740-a790a5fa10f9\") " pod="openstack/barbican-db-sync-626nj"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.909982 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7c962dc3-3c64-4b5d-a740-a790a5fa10f9-db-sync-config-data\") pod \"barbican-db-sync-626nj\" (UID: \"7c962dc3-3c64-4b5d-a740-a790a5fa10f9\") " pod="openstack/barbican-db-sync-626nj"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.910315 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9d1e39a-4040-4f14-819f-f41e85a35143-combined-ca-bundle\") pod \"neutron-db-sync-8tjn6\" (UID: \"f9d1e39a-4040-4f14-819f-f41e85a35143\") " pod="openstack/neutron-db-sync-8tjn6"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.910422 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48vll\" (UniqueName: \"kubernetes.io/projected/f9d1e39a-4040-4f14-819f-f41e85a35143-kube-api-access-48vll\") pod \"neutron-db-sync-8tjn6\" (UID: \"f9d1e39a-4040-4f14-819f-f41e85a35143\") " pod="openstack/neutron-db-sync-8tjn6"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.927371 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f9d1e39a-4040-4f14-819f-f41e85a35143-config\") pod \"neutron-db-sync-8tjn6\" (UID: \"f9d1e39a-4040-4f14-819f-f41e85a35143\") " pod="openstack/neutron-db-sync-8tjn6"
Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.943457 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-626nj"]
Dec 08
09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.955798 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9d1e39a-4040-4f14-819f-f41e85a35143-combined-ca-bundle\") pod \"neutron-db-sync-8tjn6\" (UID: \"f9d1e39a-4040-4f14-819f-f41e85a35143\") " pod="openstack/neutron-db-sync-8tjn6" Dec 08 09:22:37 crc kubenswrapper[4776]: I1208 09:22:37.983238 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-7nf99"] Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.012542 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7c962dc3-3c64-4b5d-a740-a790a5fa10f9-db-sync-config-data\") pod \"barbican-db-sync-626nj\" (UID: \"7c962dc3-3c64-4b5d-a740-a790a5fa10f9\") " pod="openstack/barbican-db-sync-626nj" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.012595 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9dff1e28-5d80-48af-b348-cfd6080d3e37-logs\") pod \"placement-db-sync-2s7n9\" (UID: \"9dff1e28-5d80-48af-b348-cfd6080d3e37\") " pod="openstack/placement-db-sync-2s7n9" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.012628 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwnsq\" (UniqueName: \"kubernetes.io/projected/9dff1e28-5d80-48af-b348-cfd6080d3e37-kube-api-access-vwnsq\") pod \"placement-db-sync-2s7n9\" (UID: \"9dff1e28-5d80-48af-b348-cfd6080d3e37\") " pod="openstack/placement-db-sync-2s7n9" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.012651 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dff1e28-5d80-48af-b348-cfd6080d3e37-scripts\") pod \"placement-db-sync-2s7n9\" (UID: 
\"9dff1e28-5d80-48af-b348-cfd6080d3e37\") " pod="openstack/placement-db-sync-2s7n9" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.012718 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dff1e28-5d80-48af-b348-cfd6080d3e37-combined-ca-bundle\") pod \"placement-db-sync-2s7n9\" (UID: \"9dff1e28-5d80-48af-b348-cfd6080d3e37\") " pod="openstack/placement-db-sync-2s7n9" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.012789 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dff1e28-5d80-48af-b348-cfd6080d3e37-config-data\") pod \"placement-db-sync-2s7n9\" (UID: \"9dff1e28-5d80-48af-b348-cfd6080d3e37\") " pod="openstack/placement-db-sync-2s7n9" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.012879 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wpcz\" (UniqueName: \"kubernetes.io/projected/7c962dc3-3c64-4b5d-a740-a790a5fa10f9-kube-api-access-2wpcz\") pod \"barbican-db-sync-626nj\" (UID: \"7c962dc3-3c64-4b5d-a740-a790a5fa10f9\") " pod="openstack/barbican-db-sync-626nj" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.012919 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c962dc3-3c64-4b5d-a740-a790a5fa10f9-combined-ca-bundle\") pod \"barbican-db-sync-626nj\" (UID: \"7c962dc3-3c64-4b5d-a740-a790a5fa10f9\") " pod="openstack/barbican-db-sync-626nj" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.013945 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-4zwkd"] Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.015720 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68dcc9cf6f-4zwkd" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.021306 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7c962dc3-3c64-4b5d-a740-a790a5fa10f9-db-sync-config-data\") pod \"barbican-db-sync-626nj\" (UID: \"7c962dc3-3c64-4b5d-a740-a790a5fa10f9\") " pod="openstack/barbican-db-sync-626nj" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.021978 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c962dc3-3c64-4b5d-a740-a790a5fa10f9-combined-ca-bundle\") pod \"barbican-db-sync-626nj\" (UID: \"7c962dc3-3c64-4b5d-a740-a790a5fa10f9\") " pod="openstack/barbican-db-sync-626nj" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.042622 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wpcz\" (UniqueName: \"kubernetes.io/projected/7c962dc3-3c64-4b5d-a740-a790a5fa10f9-kube-api-access-2wpcz\") pod \"barbican-db-sync-626nj\" (UID: \"7c962dc3-3c64-4b5d-a740-a790a5fa10f9\") " pod="openstack/barbican-db-sync-626nj" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.056233 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-4zwkd"] Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.063684 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48vll\" (UniqueName: \"kubernetes.io/projected/f9d1e39a-4040-4f14-819f-f41e85a35143-kube-api-access-48vll\") pod \"neutron-db-sync-8tjn6\" (UID: \"f9d1e39a-4040-4f14-819f-f41e85a35143\") " pod="openstack/neutron-db-sync-8tjn6" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.116381 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/0918e3b0-3fba-4bd7-b0fc-dae247fd9417-ovsdbserver-nb\") pod \"dnsmasq-dns-68dcc9cf6f-4zwkd\" (UID: \"0918e3b0-3fba-4bd7-b0fc-dae247fd9417\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-4zwkd" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.116476 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0918e3b0-3fba-4bd7-b0fc-dae247fd9417-config\") pod \"dnsmasq-dns-68dcc9cf6f-4zwkd\" (UID: \"0918e3b0-3fba-4bd7-b0fc-dae247fd9417\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-4zwkd" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.116508 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9dff1e28-5d80-48af-b348-cfd6080d3e37-logs\") pod \"placement-db-sync-2s7n9\" (UID: \"9dff1e28-5d80-48af-b348-cfd6080d3e37\") " pod="openstack/placement-db-sync-2s7n9" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.116531 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0918e3b0-3fba-4bd7-b0fc-dae247fd9417-dns-svc\") pod \"dnsmasq-dns-68dcc9cf6f-4zwkd\" (UID: \"0918e3b0-3fba-4bd7-b0fc-dae247fd9417\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-4zwkd" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.116550 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsjk6\" (UniqueName: \"kubernetes.io/projected/0918e3b0-3fba-4bd7-b0fc-dae247fd9417-kube-api-access-lsjk6\") pod \"dnsmasq-dns-68dcc9cf6f-4zwkd\" (UID: \"0918e3b0-3fba-4bd7-b0fc-dae247fd9417\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-4zwkd" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.116570 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwnsq\" (UniqueName: 
\"kubernetes.io/projected/9dff1e28-5d80-48af-b348-cfd6080d3e37-kube-api-access-vwnsq\") pod \"placement-db-sync-2s7n9\" (UID: \"9dff1e28-5d80-48af-b348-cfd6080d3e37\") " pod="openstack/placement-db-sync-2s7n9" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.116588 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dff1e28-5d80-48af-b348-cfd6080d3e37-scripts\") pod \"placement-db-sync-2s7n9\" (UID: \"9dff1e28-5d80-48af-b348-cfd6080d3e37\") " pod="openstack/placement-db-sync-2s7n9" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.116628 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0918e3b0-3fba-4bd7-b0fc-dae247fd9417-ovsdbserver-sb\") pod \"dnsmasq-dns-68dcc9cf6f-4zwkd\" (UID: \"0918e3b0-3fba-4bd7-b0fc-dae247fd9417\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-4zwkd" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.116669 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dff1e28-5d80-48af-b348-cfd6080d3e37-combined-ca-bundle\") pod \"placement-db-sync-2s7n9\" (UID: \"9dff1e28-5d80-48af-b348-cfd6080d3e37\") " pod="openstack/placement-db-sync-2s7n9" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.116721 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dff1e28-5d80-48af-b348-cfd6080d3e37-config-data\") pod \"placement-db-sync-2s7n9\" (UID: \"9dff1e28-5d80-48af-b348-cfd6080d3e37\") " pod="openstack/placement-db-sync-2s7n9" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.117706 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9dff1e28-5d80-48af-b348-cfd6080d3e37-logs\") pod \"placement-db-sync-2s7n9\" 
(UID: \"9dff1e28-5d80-48af-b348-cfd6080d3e37\") " pod="openstack/placement-db-sync-2s7n9" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.120649 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dff1e28-5d80-48af-b348-cfd6080d3e37-scripts\") pod \"placement-db-sync-2s7n9\" (UID: \"9dff1e28-5d80-48af-b348-cfd6080d3e37\") " pod="openstack/placement-db-sync-2s7n9" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.127546 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dff1e28-5d80-48af-b348-cfd6080d3e37-combined-ca-bundle\") pod \"placement-db-sync-2s7n9\" (UID: \"9dff1e28-5d80-48af-b348-cfd6080d3e37\") " pod="openstack/placement-db-sync-2s7n9" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.130785 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dff1e28-5d80-48af-b348-cfd6080d3e37-config-data\") pod \"placement-db-sync-2s7n9\" (UID: \"9dff1e28-5d80-48af-b348-cfd6080d3e37\") " pod="openstack/placement-db-sync-2s7n9" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.131495 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.137010 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwnsq\" (UniqueName: \"kubernetes.io/projected/9dff1e28-5d80-48af-b348-cfd6080d3e37-kube-api-access-vwnsq\") pod \"placement-db-sync-2s7n9\" (UID: \"9dff1e28-5d80-48af-b348-cfd6080d3e37\") " pod="openstack/placement-db-sync-2s7n9" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.138746 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.140817 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.142532 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.147428 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.220263 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0918e3b0-3fba-4bd7-b0fc-dae247fd9417-ovsdbserver-nb\") pod \"dnsmasq-dns-68dcc9cf6f-4zwkd\" (UID: \"0918e3b0-3fba-4bd7-b0fc-dae247fd9417\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-4zwkd" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.220425 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0918e3b0-3fba-4bd7-b0fc-dae247fd9417-config\") pod \"dnsmasq-dns-68dcc9cf6f-4zwkd\" (UID: \"0918e3b0-3fba-4bd7-b0fc-dae247fd9417\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-4zwkd" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.220496 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0918e3b0-3fba-4bd7-b0fc-dae247fd9417-dns-svc\") pod \"dnsmasq-dns-68dcc9cf6f-4zwkd\" (UID: \"0918e3b0-3fba-4bd7-b0fc-dae247fd9417\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-4zwkd" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.220516 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsjk6\" (UniqueName: \"kubernetes.io/projected/0918e3b0-3fba-4bd7-b0fc-dae247fd9417-kube-api-access-lsjk6\") pod \"dnsmasq-dns-68dcc9cf6f-4zwkd\" (UID: 
\"0918e3b0-3fba-4bd7-b0fc-dae247fd9417\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-4zwkd" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.220598 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0918e3b0-3fba-4bd7-b0fc-dae247fd9417-ovsdbserver-sb\") pod \"dnsmasq-dns-68dcc9cf6f-4zwkd\" (UID: \"0918e3b0-3fba-4bd7-b0fc-dae247fd9417\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-4zwkd" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.221924 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0918e3b0-3fba-4bd7-b0fc-dae247fd9417-config\") pod \"dnsmasq-dns-68dcc9cf6f-4zwkd\" (UID: \"0918e3b0-3fba-4bd7-b0fc-dae247fd9417\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-4zwkd" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.222147 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0918e3b0-3fba-4bd7-b0fc-dae247fd9417-ovsdbserver-nb\") pod \"dnsmasq-dns-68dcc9cf6f-4zwkd\" (UID: \"0918e3b0-3fba-4bd7-b0fc-dae247fd9417\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-4zwkd" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.222262 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0918e3b0-3fba-4bd7-b0fc-dae247fd9417-ovsdbserver-sb\") pod \"dnsmasq-dns-68dcc9cf6f-4zwkd\" (UID: \"0918e3b0-3fba-4bd7-b0fc-dae247fd9417\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-4zwkd" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.222456 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0918e3b0-3fba-4bd7-b0fc-dae247fd9417-dns-svc\") pod \"dnsmasq-dns-68dcc9cf6f-4zwkd\" (UID: \"0918e3b0-3fba-4bd7-b0fc-dae247fd9417\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-4zwkd" Dec 08 09:22:38 crc 
kubenswrapper[4776]: I1208 09:22:38.236595 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsjk6\" (UniqueName: \"kubernetes.io/projected/0918e3b0-3fba-4bd7-b0fc-dae247fd9417-kube-api-access-lsjk6\") pod \"dnsmasq-dns-68dcc9cf6f-4zwkd\" (UID: \"0918e3b0-3fba-4bd7-b0fc-dae247fd9417\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-4zwkd" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.322359 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fab03865-02a6-4cd2-bf78-22ed25534301-config-data\") pod \"ceilometer-0\" (UID: \"fab03865-02a6-4cd2-bf78-22ed25534301\") " pod="openstack/ceilometer-0" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.322432 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fab03865-02a6-4cd2-bf78-22ed25534301-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fab03865-02a6-4cd2-bf78-22ed25534301\") " pod="openstack/ceilometer-0" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.322455 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnvjn\" (UniqueName: \"kubernetes.io/projected/fab03865-02a6-4cd2-bf78-22ed25534301-kube-api-access-wnvjn\") pod \"ceilometer-0\" (UID: \"fab03865-02a6-4cd2-bf78-22ed25534301\") " pod="openstack/ceilometer-0" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.322493 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fab03865-02a6-4cd2-bf78-22ed25534301-log-httpd\") pod \"ceilometer-0\" (UID: \"fab03865-02a6-4cd2-bf78-22ed25534301\") " pod="openstack/ceilometer-0" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.322509 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fab03865-02a6-4cd2-bf78-22ed25534301-run-httpd\") pod \"ceilometer-0\" (UID: \"fab03865-02a6-4cd2-bf78-22ed25534301\") " pod="openstack/ceilometer-0" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.322545 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fab03865-02a6-4cd2-bf78-22ed25534301-scripts\") pod \"ceilometer-0\" (UID: \"fab03865-02a6-4cd2-bf78-22ed25534301\") " pod="openstack/ceilometer-0" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.322571 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fab03865-02a6-4cd2-bf78-22ed25534301-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fab03865-02a6-4cd2-bf78-22ed25534301\") " pod="openstack/ceilometer-0" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.328958 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-8tjn6" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.344299 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-626nj" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.355783 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-2s7n9" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.385285 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68dcc9cf6f-4zwkd" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.431930 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fab03865-02a6-4cd2-bf78-22ed25534301-config-data\") pod \"ceilometer-0\" (UID: \"fab03865-02a6-4cd2-bf78-22ed25534301\") " pod="openstack/ceilometer-0" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.433090 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fab03865-02a6-4cd2-bf78-22ed25534301-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fab03865-02a6-4cd2-bf78-22ed25534301\") " pod="openstack/ceilometer-0" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.433129 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnvjn\" (UniqueName: \"kubernetes.io/projected/fab03865-02a6-4cd2-bf78-22ed25534301-kube-api-access-wnvjn\") pod \"ceilometer-0\" (UID: \"fab03865-02a6-4cd2-bf78-22ed25534301\") " pod="openstack/ceilometer-0" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.433224 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fab03865-02a6-4cd2-bf78-22ed25534301-log-httpd\") pod \"ceilometer-0\" (UID: \"fab03865-02a6-4cd2-bf78-22ed25534301\") " pod="openstack/ceilometer-0" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.435533 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fab03865-02a6-4cd2-bf78-22ed25534301-run-httpd\") pod \"ceilometer-0\" (UID: \"fab03865-02a6-4cd2-bf78-22ed25534301\") " pod="openstack/ceilometer-0" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.433684 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/fab03865-02a6-4cd2-bf78-22ed25534301-log-httpd\") pod \"ceilometer-0\" (UID: \"fab03865-02a6-4cd2-bf78-22ed25534301\") " pod="openstack/ceilometer-0" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.436191 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fab03865-02a6-4cd2-bf78-22ed25534301-run-httpd\") pod \"ceilometer-0\" (UID: \"fab03865-02a6-4cd2-bf78-22ed25534301\") " pod="openstack/ceilometer-0" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.436394 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fab03865-02a6-4cd2-bf78-22ed25534301-scripts\") pod \"ceilometer-0\" (UID: \"fab03865-02a6-4cd2-bf78-22ed25534301\") " pod="openstack/ceilometer-0" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.436466 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fab03865-02a6-4cd2-bf78-22ed25534301-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fab03865-02a6-4cd2-bf78-22ed25534301\") " pod="openstack/ceilometer-0" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.437364 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fab03865-02a6-4cd2-bf78-22ed25534301-config-data\") pod \"ceilometer-0\" (UID: \"fab03865-02a6-4cd2-bf78-22ed25534301\") " pod="openstack/ceilometer-0" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.438297 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fab03865-02a6-4cd2-bf78-22ed25534301-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fab03865-02a6-4cd2-bf78-22ed25534301\") " pod="openstack/ceilometer-0" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.439057 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fab03865-02a6-4cd2-bf78-22ed25534301-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fab03865-02a6-4cd2-bf78-22ed25534301\") " pod="openstack/ceilometer-0" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.449041 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-v4nch"] Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.459008 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fab03865-02a6-4cd2-bf78-22ed25534301-scripts\") pod \"ceilometer-0\" (UID: \"fab03865-02a6-4cd2-bf78-22ed25534301\") " pod="openstack/ceilometer-0" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.459030 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnvjn\" (UniqueName: \"kubernetes.io/projected/fab03865-02a6-4cd2-bf78-22ed25534301-kube-api-access-wnvjn\") pod \"ceilometer-0\" (UID: \"fab03865-02a6-4cd2-bf78-22ed25534301\") " pod="openstack/ceilometer-0" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.600302 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-zhhw6"] Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.695633 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.745216 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-7nf99"] Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.756728 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-xksjb"] Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.962043 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f877ddd87-7nf99" event={"ID":"eb67a5c8-c43c-467e-963e-85f3789ca32a","Type":"ContainerStarted","Data":"950d941134697f1c582ce491613070300a9901cd28bb933008be0d2eaa3ac568"} Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.964765 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-xksjb" event={"ID":"6d7c64ff-eec0-48d3-bba8-724158787096","Type":"ContainerStarted","Data":"45e9d04376a91e26f6d2b35bbc0678ed858e8ddf53e97216124bc6349d102798"} Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.970987 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-v4nch" event={"ID":"44d849dd-5372-4dd4-a691-036dbb925fcd","Type":"ContainerStarted","Data":"93494ecd364c05c66216b5c2298f9028833cfe07f08c157b9c0ea2623d43351d"} Dec 08 09:22:38 crc kubenswrapper[4776]: I1208 09:22:38.978026 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-zhhw6" event={"ID":"f4a95eb4-92c1-4eff-940b-37f74dd3dc18","Type":"ContainerStarted","Data":"fa17e8c3df5cbab401a0e22eb3a2961ccec569ee0742f4db79265ae8496978a0"} Dec 08 09:22:39 crc kubenswrapper[4776]: I1208 09:22:39.059960 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-4zwkd"] Dec 08 09:22:39 crc kubenswrapper[4776]: I1208 09:22:39.255234 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-2s7n9"] Dec 08 09:22:39 crc kubenswrapper[4776]: I1208 09:22:39.289991 
4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-626nj"]
Dec 08 09:22:39 crc kubenswrapper[4776]: I1208 09:22:39.309472 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-8tjn6"]
Dec 08 09:22:39 crc kubenswrapper[4776]: I1208 09:22:39.531595 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 08 09:22:39 crc kubenswrapper[4776]: W1208 09:22:39.541302 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfab03865_02a6_4cd2_bf78_22ed25534301.slice/crio-49d1730bcbf42067e54950304c081b37ef66fc485287100a48ff4fb4b23e7e86 WatchSource:0}: Error finding container 49d1730bcbf42067e54950304c081b37ef66fc485287100a48ff4fb4b23e7e86: Status 404 returned error can't find the container with id 49d1730bcbf42067e54950304c081b37ef66fc485287100a48ff4fb4b23e7e86
Dec 08 09:22:39 crc kubenswrapper[4776]: I1208 09:22:39.992559 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"95be142a-2a8f-4f5c-97e0-2e64e108fb8b","Type":"ContainerStarted","Data":"7d9ce438d457bfe8104b10f4d8e07e5a06ff9eb1f91bfe4e85b7934627cabe5f"}
Dec 08 09:22:39 crc kubenswrapper[4776]: I1208 09:22:39.993051 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"95be142a-2a8f-4f5c-97e0-2e64e108fb8b","Type":"ContainerStarted","Data":"e7d2fed5e02b835d120ae0716fac2120ccc4c640a024f8cb59b77265be1257e6"}
Dec 08 09:22:39 crc kubenswrapper[4776]: I1208 09:22:39.994817 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-626nj" event={"ID":"7c962dc3-3c64-4b5d-a740-a790a5fa10f9","Type":"ContainerStarted","Data":"373778421c4908c11862aee506bb80d21b15e77f3a1d9cf5cb226c594ebe243c"}
Dec 08 09:22:39 crc kubenswrapper[4776]: I1208 09:22:39.997700 4776 generic.go:334] "Generic (PLEG): container finished" podID="eb67a5c8-c43c-467e-963e-85f3789ca32a" containerID="c1c9d838a4716e8dbc2b517b0077f7f7f2d558e4331d87c5f1273c015ade2893" exitCode=0
Dec 08 09:22:39 crc kubenswrapper[4776]: I1208 09:22:39.997803 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f877ddd87-7nf99" event={"ID":"eb67a5c8-c43c-467e-963e-85f3789ca32a","Type":"ContainerDied","Data":"c1c9d838a4716e8dbc2b517b0077f7f7f2d558e4331d87c5f1273c015ade2893"}
Dec 08 09:22:40 crc kubenswrapper[4776]: I1208 09:22:40.000147 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-8tjn6" event={"ID":"f9d1e39a-4040-4f14-819f-f41e85a35143","Type":"ContainerStarted","Data":"ca6d12594b8c2f492e325734dda23a6c1d318ec30ab337e92d5788e763e8d5c8"}
Dec 08 09:22:40 crc kubenswrapper[4776]: I1208 09:22:40.000185 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-8tjn6" event={"ID":"f9d1e39a-4040-4f14-819f-f41e85a35143","Type":"ContainerStarted","Data":"168317acfb4a72c1f6a74c15c009c9be4ce562b7fe59bd60de9663b322e4c083"}
Dec 08 09:22:40 crc kubenswrapper[4776]: I1208 09:22:40.003802 4776 generic.go:334] "Generic (PLEG): container finished" podID="0918e3b0-3fba-4bd7-b0fc-dae247fd9417" containerID="fb905a93aa19d41b54b74fc9b9442af2c9871a86fbd701ef2976329e07c82896" exitCode=0
Dec 08 09:22:40 crc kubenswrapper[4776]: I1208 09:22:40.003842 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68dcc9cf6f-4zwkd" event={"ID":"0918e3b0-3fba-4bd7-b0fc-dae247fd9417","Type":"ContainerDied","Data":"fb905a93aa19d41b54b74fc9b9442af2c9871a86fbd701ef2976329e07c82896"}
Dec 08 09:22:40 crc kubenswrapper[4776]: I1208 09:22:40.003859 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68dcc9cf6f-4zwkd" event={"ID":"0918e3b0-3fba-4bd7-b0fc-dae247fd9417","Type":"ContainerStarted","Data":"bd2571fae1e8eb5cd93f7d4c94ce91e774f8375814ebf2e1ed80fa11e449c2b0"}
Dec 08 09:22:40 crc kubenswrapper[4776]: I1208 09:22:40.008083 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-v4nch" event={"ID":"44d849dd-5372-4dd4-a691-036dbb925fcd","Type":"ContainerStarted","Data":"5eceb729a54e6a551f0a4e42d035d262d6b36f74757166a5eac6738b8de66051"}
Dec 08 09:22:40 crc kubenswrapper[4776]: I1208 09:22:40.014430 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fab03865-02a6-4cd2-bf78-22ed25534301","Type":"ContainerStarted","Data":"49d1730bcbf42067e54950304c081b37ef66fc485287100a48ff4fb4b23e7e86"}
Dec 08 09:22:40 crc kubenswrapper[4776]: I1208 09:22:40.025248 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2s7n9" event={"ID":"9dff1e28-5d80-48af-b348-cfd6080d3e37","Type":"ContainerStarted","Data":"3b7e52de8edb82ea85cdf139c7c9a8d2f7576c948f158819b9375584095b4669"}
Dec 08 09:22:40 crc kubenswrapper[4776]: I1208 09:22:40.033934 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=22.033910003 podStartE2EDuration="22.033910003s" podCreationTimestamp="2025-12-08 09:22:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:22:40.01961927 +0000 UTC m=+1436.282844282" watchObservedRunningTime="2025-12-08 09:22:40.033910003 +0000 UTC m=+1436.297135025"
Dec 08 09:22:40 crc kubenswrapper[4776]: I1208 09:22:40.048739 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-8tjn6" podStartSLOduration=3.04872508 podStartE2EDuration="3.04872508s" podCreationTimestamp="2025-12-08 09:22:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:22:40.039079052 +0000 UTC m=+1436.302304074" watchObservedRunningTime="2025-12-08 09:22:40.04872508 +0000 UTC m=+1436.311950102"
Dec 08 09:22:40 crc kubenswrapper[4776]: I1208 09:22:40.088600 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-v4nch" podStartSLOduration=3.08858147 podStartE2EDuration="3.08858147s" podCreationTimestamp="2025-12-08 09:22:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:22:40.080086072 +0000 UTC m=+1436.343311114" watchObservedRunningTime="2025-12-08 09:22:40.08858147 +0000 UTC m=+1436.351806492"
Dec 08 09:22:40 crc kubenswrapper[4776]: I1208 09:22:40.666928 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 08 09:22:40 crc kubenswrapper[4776]: I1208 09:22:40.789669 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f877ddd87-7nf99"
Dec 08 09:22:40 crc kubenswrapper[4776]: I1208 09:22:40.950079 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb67a5c8-c43c-467e-963e-85f3789ca32a-ovsdbserver-sb\") pod \"eb67a5c8-c43c-467e-963e-85f3789ca32a\" (UID: \"eb67a5c8-c43c-467e-963e-85f3789ca32a\") "
Dec 08 09:22:40 crc kubenswrapper[4776]: I1208 09:22:40.950233 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9v78d\" (UniqueName: \"kubernetes.io/projected/eb67a5c8-c43c-467e-963e-85f3789ca32a-kube-api-access-9v78d\") pod \"eb67a5c8-c43c-467e-963e-85f3789ca32a\" (UID: \"eb67a5c8-c43c-467e-963e-85f3789ca32a\") "
Dec 08 09:22:40 crc kubenswrapper[4776]: I1208 09:22:40.950365 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb67a5c8-c43c-467e-963e-85f3789ca32a-dns-svc\") pod \"eb67a5c8-c43c-467e-963e-85f3789ca32a\" (UID: \"eb67a5c8-c43c-467e-963e-85f3789ca32a\") "
Dec 08 09:22:40 crc kubenswrapper[4776]: I1208 09:22:40.950407 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb67a5c8-c43c-467e-963e-85f3789ca32a-config\") pod \"eb67a5c8-c43c-467e-963e-85f3789ca32a\" (UID: \"eb67a5c8-c43c-467e-963e-85f3789ca32a\") "
Dec 08 09:22:40 crc kubenswrapper[4776]: I1208 09:22:40.950531 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb67a5c8-c43c-467e-963e-85f3789ca32a-ovsdbserver-nb\") pod \"eb67a5c8-c43c-467e-963e-85f3789ca32a\" (UID: \"eb67a5c8-c43c-467e-963e-85f3789ca32a\") "
Dec 08 09:22:40 crc kubenswrapper[4776]: I1208 09:22:40.965835 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb67a5c8-c43c-467e-963e-85f3789ca32a-kube-api-access-9v78d" (OuterVolumeSpecName: "kube-api-access-9v78d") pod "eb67a5c8-c43c-467e-963e-85f3789ca32a" (UID: "eb67a5c8-c43c-467e-963e-85f3789ca32a"). InnerVolumeSpecName "kube-api-access-9v78d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 08 09:22:40 crc kubenswrapper[4776]: I1208 09:22:40.980290 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb67a5c8-c43c-467e-963e-85f3789ca32a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "eb67a5c8-c43c-467e-963e-85f3789ca32a" (UID: "eb67a5c8-c43c-467e-963e-85f3789ca32a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 08 09:22:40 crc kubenswrapper[4776]: I1208 09:22:40.996875 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb67a5c8-c43c-467e-963e-85f3789ca32a-config" (OuterVolumeSpecName: "config") pod "eb67a5c8-c43c-467e-963e-85f3789ca32a" (UID: "eb67a5c8-c43c-467e-963e-85f3789ca32a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 08 09:22:41 crc kubenswrapper[4776]: I1208 09:22:41.022718 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb67a5c8-c43c-467e-963e-85f3789ca32a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "eb67a5c8-c43c-467e-963e-85f3789ca32a" (UID: "eb67a5c8-c43c-467e-963e-85f3789ca32a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 08 09:22:41 crc kubenswrapper[4776]: I1208 09:22:41.032429 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb67a5c8-c43c-467e-963e-85f3789ca32a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "eb67a5c8-c43c-467e-963e-85f3789ca32a" (UID: "eb67a5c8-c43c-467e-963e-85f3789ca32a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 08 09:22:41 crc kubenswrapper[4776]: I1208 09:22:41.046450 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68dcc9cf6f-4zwkd" event={"ID":"0918e3b0-3fba-4bd7-b0fc-dae247fd9417","Type":"ContainerStarted","Data":"80c72e2eaf16d04fcdc06df7263b5d1c033cb5c847b653aa35f5e103863a0442"}
Dec 08 09:22:41 crc kubenswrapper[4776]: I1208 09:22:41.049029 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-68dcc9cf6f-4zwkd"
Dec 08 09:22:41 crc kubenswrapper[4776]: I1208 09:22:41.058026 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9v78d\" (UniqueName: \"kubernetes.io/projected/eb67a5c8-c43c-467e-963e-85f3789ca32a-kube-api-access-9v78d\") on node \"crc\" DevicePath \"\""
Dec 08 09:22:41 crc kubenswrapper[4776]: I1208 09:22:41.058058 4776 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb67a5c8-c43c-467e-963e-85f3789ca32a-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 08 09:22:41 crc kubenswrapper[4776]: I1208 09:22:41.058070 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb67a5c8-c43c-467e-963e-85f3789ca32a-config\") on node \"crc\" DevicePath \"\""
Dec 08 09:22:41 crc kubenswrapper[4776]: I1208 09:22:41.058084 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb67a5c8-c43c-467e-963e-85f3789ca32a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 08 09:22:41 crc kubenswrapper[4776]: I1208 09:22:41.059248 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb67a5c8-c43c-467e-963e-85f3789ca32a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 08 09:22:41 crc kubenswrapper[4776]: I1208 09:22:41.079962 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f877ddd87-7nf99"
Dec 08 09:22:41 crc kubenswrapper[4776]: I1208 09:22:41.086667 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f877ddd87-7nf99" event={"ID":"eb67a5c8-c43c-467e-963e-85f3789ca32a","Type":"ContainerDied","Data":"950d941134697f1c582ce491613070300a9901cd28bb933008be0d2eaa3ac568"}
Dec 08 09:22:41 crc kubenswrapper[4776]: I1208 09:22:41.086715 4776 scope.go:117] "RemoveContainer" containerID="c1c9d838a4716e8dbc2b517b0077f7f7f2d558e4331d87c5f1273c015ade2893"
Dec 08 09:22:41 crc kubenswrapper[4776]: I1208 09:22:41.116935 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-68dcc9cf6f-4zwkd" podStartSLOduration=4.116913242 podStartE2EDuration="4.116913242s" podCreationTimestamp="2025-12-08 09:22:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:22:41.114535068 +0000 UTC m=+1437.377760090" watchObservedRunningTime="2025-12-08 09:22:41.116913242 +0000 UTC m=+1437.380138254"
Dec 08 09:22:41 crc kubenswrapper[4776]: I1208 09:22:41.192097 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-7nf99"]
Dec 08 09:22:41 crc kubenswrapper[4776]: I1208 09:22:41.311702 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-7nf99"]
Dec 08 09:22:41 crc kubenswrapper[4776]: E1208 09:22:41.520663 4776 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb67a5c8_c43c_467e_963e_85f3789ca32a.slice/crio-950d941134697f1c582ce491613070300a9901cd28bb933008be0d2eaa3ac568\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb67a5c8_c43c_467e_963e_85f3789ca32a.slice\": RecentStats: unable to find data in memory cache]"
Dec 08 09:22:42 crc kubenswrapper[4776]: I1208 09:22:42.356339 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb67a5c8-c43c-467e-963e-85f3789ca32a" path="/var/lib/kubelet/pods/eb67a5c8-c43c-467e-963e-85f3789ca32a/volumes"
Dec 08 09:22:42 crc kubenswrapper[4776]: I1208 09:22:42.936678 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cb640491-a8e7-4f8d-b4bb-1d0124f5727f-etc-swift\") pod \"swift-storage-0\" (UID: \"cb640491-a8e7-4f8d-b4bb-1d0124f5727f\") " pod="openstack/swift-storage-0"
Dec 08 09:22:42 crc kubenswrapper[4776]: I1208 09:22:42.946831 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cb640491-a8e7-4f8d-b4bb-1d0124f5727f-etc-swift\") pod \"swift-storage-0\" (UID: \"cb640491-a8e7-4f8d-b4bb-1d0124f5727f\") " pod="openstack/swift-storage-0"
Dec 08 09:22:43 crc kubenswrapper[4776]: I1208 09:22:43.156777 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Dec 08 09:22:43 crc kubenswrapper[4776]: I1208 09:22:43.823486 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Dec 08 09:22:44 crc kubenswrapper[4776]: I1208 09:22:44.112861 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cb640491-a8e7-4f8d-b4bb-1d0124f5727f","Type":"ContainerStarted","Data":"107c0cf192ad38e0e720d82964b4d5882543a4c856acbd3ff0e95216d66accb2"}
Dec 08 09:22:44 crc kubenswrapper[4776]: I1208 09:22:44.185287 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0"
Dec 08 09:22:47 crc kubenswrapper[4776]: I1208 09:22:47.162498 4776 generic.go:334] "Generic (PLEG): container finished" podID="44d849dd-5372-4dd4-a691-036dbb925fcd" containerID="5eceb729a54e6a551f0a4e42d035d262d6b36f74757166a5eac6738b8de66051" exitCode=0
Dec 08 09:22:47 crc kubenswrapper[4776]: I1208 09:22:47.162640 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-v4nch" event={"ID":"44d849dd-5372-4dd4-a691-036dbb925fcd","Type":"ContainerDied","Data":"5eceb729a54e6a551f0a4e42d035d262d6b36f74757166a5eac6738b8de66051"}
Dec 08 09:22:48 crc kubenswrapper[4776]: I1208 09:22:48.387466 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-68dcc9cf6f-4zwkd"
Dec 08 09:22:48 crc kubenswrapper[4776]: I1208 09:22:48.483793 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-5ss7n"]
Dec 08 09:22:48 crc kubenswrapper[4776]: I1208 09:22:48.484044 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-5ss7n" podUID="8054440d-20b3-498d-80a6-da7ee23c9864" containerName="dnsmasq-dns" containerID="cri-o://8988c6790b9c2597ec460d682361880bb2dc95081eec3e20e7d055bdf75ec56f" gracePeriod=10
Dec 08 09:22:49 crc kubenswrapper[4776]: I1208 09:22:49.186859 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0"
Dec 08 09:22:49 crc kubenswrapper[4776]: I1208 09:22:49.187349 4776 generic.go:334] "Generic (PLEG): container finished" podID="8054440d-20b3-498d-80a6-da7ee23c9864" containerID="8988c6790b9c2597ec460d682361880bb2dc95081eec3e20e7d055bdf75ec56f" exitCode=0
Dec 08 09:22:49 crc kubenswrapper[4776]: I1208 09:22:49.187399 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-5ss7n" event={"ID":"8054440d-20b3-498d-80a6-da7ee23c9864","Type":"ContainerDied","Data":"8988c6790b9c2597ec460d682361880bb2dc95081eec3e20e7d055bdf75ec56f"}
Dec 08 09:22:49 crc kubenswrapper[4776]: I1208 09:22:49.196683 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0"
Dec 08 09:22:50 crc kubenswrapper[4776]: I1208 09:22:50.199482 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0"
Dec 08 09:22:50 crc kubenswrapper[4776]: I1208 09:22:50.270765 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-5ss7n" podUID="8054440d-20b3-498d-80a6-da7ee23c9864" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.157:5353: connect: connection refused"
Dec 08 09:22:55 crc kubenswrapper[4776]: I1208 09:22:55.270446 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-5ss7n" podUID="8054440d-20b3-498d-80a6-da7ee23c9864" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.157:5353: connect: connection refused"
Dec 08 09:22:58 crc kubenswrapper[4776]: I1208 09:22:58.601928 4776 generic.go:334] "Generic (PLEG): container finished" podID="4a920788-f8d6-4c42-84f6-d842d9bf9a17" containerID="4f24f8968ed17fa81f6f8869195d3cc6621d449566b91c6fe0cb95b1375dcc9d" exitCode=0
Dec 08 09:22:58 crc kubenswrapper[4776]: I1208 09:22:58.602008 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9hw46" event={"ID":"4a920788-f8d6-4c42-84f6-d842d9bf9a17","Type":"ContainerDied","Data":"4f24f8968ed17fa81f6f8869195d3cc6621d449566b91c6fe0cb95b1375dcc9d"}
Dec 08 09:23:00 crc kubenswrapper[4776]: I1208 09:23:00.269654 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-5ss7n" podUID="8054440d-20b3-498d-80a6-da7ee23c9864" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.157:5353: connect: connection refused"
Dec 08 09:23:00 crc kubenswrapper[4776]: I1208 09:23:00.270043 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-5ss7n"
Dec 08 09:23:00 crc kubenswrapper[4776]: E1208 09:23:00.809798 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified"
Dec 08 09:23:00 crc kubenswrapper[4776]: E1208 09:23:00.809957 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x8z6p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-xksjb_openstack(6d7c64ff-eec0-48d3-bba8-724158787096): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 08 09:23:00 crc kubenswrapper[4776]: E1208 09:23:00.811738 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-xksjb" podUID="6d7c64ff-eec0-48d3-bba8-724158787096"
Dec 08 09:23:00 crc kubenswrapper[4776]: I1208 09:23:00.848326 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-v4nch"
Dec 08 09:23:01 crc kubenswrapper[4776]: I1208 09:23:01.018532 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/44d849dd-5372-4dd4-a691-036dbb925fcd-fernet-keys\") pod \"44d849dd-5372-4dd4-a691-036dbb925fcd\" (UID: \"44d849dd-5372-4dd4-a691-036dbb925fcd\") "
Dec 08 09:23:01 crc kubenswrapper[4776]: I1208 09:23:01.018594 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44d849dd-5372-4dd4-a691-036dbb925fcd-scripts\") pod \"44d849dd-5372-4dd4-a691-036dbb925fcd\" (UID: \"44d849dd-5372-4dd4-a691-036dbb925fcd\") "
Dec 08 09:23:01 crc kubenswrapper[4776]: I1208 09:23:01.018698 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44d849dd-5372-4dd4-a691-036dbb925fcd-config-data\") pod \"44d849dd-5372-4dd4-a691-036dbb925fcd\" (UID: \"44d849dd-5372-4dd4-a691-036dbb925fcd\") "
Dec 08 09:23:01 crc kubenswrapper[4776]: I1208 09:23:01.018790 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/44d849dd-5372-4dd4-a691-036dbb925fcd-credential-keys\") pod \"44d849dd-5372-4dd4-a691-036dbb925fcd\" (UID: \"44d849dd-5372-4dd4-a691-036dbb925fcd\") "
Dec 08 09:23:01 crc kubenswrapper[4776]: I1208 09:23:01.018871 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kck9r\" (UniqueName: \"kubernetes.io/projected/44d849dd-5372-4dd4-a691-036dbb925fcd-kube-api-access-kck9r\") pod \"44d849dd-5372-4dd4-a691-036dbb925fcd\" (UID: \"44d849dd-5372-4dd4-a691-036dbb925fcd\") "
Dec 08 09:23:01 crc kubenswrapper[4776]: I1208 09:23:01.018955 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44d849dd-5372-4dd4-a691-036dbb925fcd-combined-ca-bundle\") pod \"44d849dd-5372-4dd4-a691-036dbb925fcd\" (UID: \"44d849dd-5372-4dd4-a691-036dbb925fcd\") "
Dec 08 09:23:01 crc kubenswrapper[4776]: I1208 09:23:01.026335 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44d849dd-5372-4dd4-a691-036dbb925fcd-kube-api-access-kck9r" (OuterVolumeSpecName: "kube-api-access-kck9r") pod "44d849dd-5372-4dd4-a691-036dbb925fcd" (UID: "44d849dd-5372-4dd4-a691-036dbb925fcd"). InnerVolumeSpecName "kube-api-access-kck9r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 08 09:23:01 crc kubenswrapper[4776]: I1208 09:23:01.026432 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44d849dd-5372-4dd4-a691-036dbb925fcd-scripts" (OuterVolumeSpecName: "scripts") pod "44d849dd-5372-4dd4-a691-036dbb925fcd" (UID: "44d849dd-5372-4dd4-a691-036dbb925fcd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 08 09:23:01 crc kubenswrapper[4776]: I1208 09:23:01.033540 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44d849dd-5372-4dd4-a691-036dbb925fcd-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "44d849dd-5372-4dd4-a691-036dbb925fcd" (UID: "44d849dd-5372-4dd4-a691-036dbb925fcd"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 08 09:23:01 crc kubenswrapper[4776]: I1208 09:23:01.034734 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44d849dd-5372-4dd4-a691-036dbb925fcd-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "44d849dd-5372-4dd4-a691-036dbb925fcd" (UID: "44d849dd-5372-4dd4-a691-036dbb925fcd"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 08 09:23:01 crc kubenswrapper[4776]: I1208 09:23:01.055726 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44d849dd-5372-4dd4-a691-036dbb925fcd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "44d849dd-5372-4dd4-a691-036dbb925fcd" (UID: "44d849dd-5372-4dd4-a691-036dbb925fcd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 08 09:23:01 crc kubenswrapper[4776]: I1208 09:23:01.056152 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44d849dd-5372-4dd4-a691-036dbb925fcd-config-data" (OuterVolumeSpecName: "config-data") pod "44d849dd-5372-4dd4-a691-036dbb925fcd" (UID: "44d849dd-5372-4dd4-a691-036dbb925fcd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 08 09:23:01 crc kubenswrapper[4776]: I1208 09:23:01.121733 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44d849dd-5372-4dd4-a691-036dbb925fcd-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 08 09:23:01 crc kubenswrapper[4776]: I1208 09:23:01.122070 4776 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/44d849dd-5372-4dd4-a691-036dbb925fcd-fernet-keys\") on node \"crc\" DevicePath \"\""
Dec 08 09:23:01 crc kubenswrapper[4776]: I1208 09:23:01.122081 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44d849dd-5372-4dd4-a691-036dbb925fcd-scripts\") on node \"crc\" DevicePath \"\""
Dec 08 09:23:01 crc kubenswrapper[4776]: I1208 09:23:01.122090 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44d849dd-5372-4dd4-a691-036dbb925fcd-config-data\") on node \"crc\" DevicePath \"\""
Dec 08 09:23:01 crc kubenswrapper[4776]: I1208 09:23:01.122103 4776 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/44d849dd-5372-4dd4-a691-036dbb925fcd-credential-keys\") on node \"crc\" DevicePath \"\""
Dec 08 09:23:01 crc kubenswrapper[4776]: I1208 09:23:01.122113 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kck9r\" (UniqueName: \"kubernetes.io/projected/44d849dd-5372-4dd4-a691-036dbb925fcd-kube-api-access-kck9r\") on node \"crc\" DevicePath \"\""
Dec 08 09:23:01 crc kubenswrapper[4776]: I1208 09:23:01.646677 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-v4nch"
Dec 08 09:23:01 crc kubenswrapper[4776]: I1208 09:23:01.660438 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-v4nch" event={"ID":"44d849dd-5372-4dd4-a691-036dbb925fcd","Type":"ContainerDied","Data":"93494ecd364c05c66216b5c2298f9028833cfe07f08c157b9c0ea2623d43351d"}
Dec 08 09:23:01 crc kubenswrapper[4776]: I1208 09:23:01.660487 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93494ecd364c05c66216b5c2298f9028833cfe07f08c157b9c0ea2623d43351d"
Dec 08 09:23:01 crc kubenswrapper[4776]: E1208 09:23:01.661039 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-xksjb" podUID="6d7c64ff-eec0-48d3-bba8-724158787096"
Dec 08 09:23:01 crc kubenswrapper[4776]: I1208 09:23:01.931460 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-v4nch"]
Dec 08 09:23:01 crc kubenswrapper[4776]: I1208 09:23:01.964153 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-v4nch"]
Dec 08 09:23:02 crc kubenswrapper[4776]: I1208 09:23:02.020715 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-ns7rc"]
Dec 08 09:23:02 crc kubenswrapper[4776]: E1208 09:23:02.021248 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44d849dd-5372-4dd4-a691-036dbb925fcd" containerName="keystone-bootstrap"
Dec 08 09:23:02 crc kubenswrapper[4776]: I1208 09:23:02.021270 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="44d849dd-5372-4dd4-a691-036dbb925fcd" containerName="keystone-bootstrap"
Dec 08 09:23:02 crc kubenswrapper[4776]: E1208 09:23:02.021291 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb67a5c8-c43c-467e-963e-85f3789ca32a" containerName="init"
Dec 08 09:23:02 crc kubenswrapper[4776]: I1208 09:23:02.021299 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb67a5c8-c43c-467e-963e-85f3789ca32a" containerName="init"
Dec 08 09:23:02 crc kubenswrapper[4776]: I1208 09:23:02.021548 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="44d849dd-5372-4dd4-a691-036dbb925fcd" containerName="keystone-bootstrap"
Dec 08 09:23:02 crc kubenswrapper[4776]: I1208 09:23:02.021576 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb67a5c8-c43c-467e-963e-85f3789ca32a" containerName="init"
Dec 08 09:23:02 crc kubenswrapper[4776]: I1208 09:23:02.022382 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ns7rc"
Dec 08 09:23:02 crc kubenswrapper[4776]: I1208 09:23:02.025045 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Dec 08 09:23:02 crc kubenswrapper[4776]: I1208 09:23:02.025845 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Dec 08 09:23:02 crc kubenswrapper[4776]: I1208 09:23:02.025990 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Dec 08 09:23:02 crc kubenswrapper[4776]: I1208 09:23:02.026229 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-wz4bj"
Dec 08 09:23:02 crc kubenswrapper[4776]: I1208 09:23:02.026367 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Dec 08 09:23:02 crc kubenswrapper[4776]: I1208 09:23:02.033749 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ns7rc"]
Dec 08 09:23:02 crc kubenswrapper[4776]: I1208 09:23:02.148264 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/913d2881-9323-4503-b364-05de889fd095-combined-ca-bundle\") pod \"keystone-bootstrap-ns7rc\" (UID: \"913d2881-9323-4503-b364-05de889fd095\") " pod="openstack/keystone-bootstrap-ns7rc"
Dec 08 09:23:02 crc kubenswrapper[4776]: I1208 09:23:02.148327 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/913d2881-9323-4503-b364-05de889fd095-fernet-keys\") pod \"keystone-bootstrap-ns7rc\" (UID: \"913d2881-9323-4503-b364-05de889fd095\") " pod="openstack/keystone-bootstrap-ns7rc"
Dec 08 09:23:02 crc kubenswrapper[4776]: I1208 09:23:02.148528 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fq2d\" (UniqueName: \"kubernetes.io/projected/913d2881-9323-4503-b364-05de889fd095-kube-api-access-8fq2d\") pod \"keystone-bootstrap-ns7rc\" (UID: \"913d2881-9323-4503-b364-05de889fd095\") " pod="openstack/keystone-bootstrap-ns7rc"
Dec 08 09:23:02 crc kubenswrapper[4776]: I1208 09:23:02.148898 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/913d2881-9323-4503-b364-05de889fd095-config-data\") pod \"keystone-bootstrap-ns7rc\" (UID: \"913d2881-9323-4503-b364-05de889fd095\") " pod="openstack/keystone-bootstrap-ns7rc"
Dec 08 09:23:02 crc kubenswrapper[4776]: I1208 09:23:02.149024 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/913d2881-9323-4503-b364-05de889fd095-scripts\") pod \"keystone-bootstrap-ns7rc\" (UID: \"913d2881-9323-4503-b364-05de889fd095\") " pod="openstack/keystone-bootstrap-ns7rc"
Dec 08 09:23:02 crc kubenswrapper[4776]: I1208 09:23:02.149191 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/913d2881-9323-4503-b364-05de889fd095-credential-keys\") pod \"keystone-bootstrap-ns7rc\" (UID: \"913d2881-9323-4503-b364-05de889fd095\") " pod="openstack/keystone-bootstrap-ns7rc"
Dec 08 09:23:02 crc kubenswrapper[4776]: I1208 09:23:02.250811 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/913d2881-9323-4503-b364-05de889fd095-config-data\") pod \"keystone-bootstrap-ns7rc\" (UID: \"913d2881-9323-4503-b364-05de889fd095\") " pod="openstack/keystone-bootstrap-ns7rc"
Dec 08 09:23:02 crc kubenswrapper[4776]: I1208 09:23:02.250917 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/913d2881-9323-4503-b364-05de889fd095-scripts\") pod \"keystone-bootstrap-ns7rc\" (UID: \"913d2881-9323-4503-b364-05de889fd095\") " pod="openstack/keystone-bootstrap-ns7rc"
Dec 08 09:23:02 crc kubenswrapper[4776]: I1208 09:23:02.250968 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/913d2881-9323-4503-b364-05de889fd095-credential-keys\") pod \"keystone-bootstrap-ns7rc\" (UID: \"913d2881-9323-4503-b364-05de889fd095\") " pod="openstack/keystone-bootstrap-ns7rc"
Dec 08 09:23:02 crc kubenswrapper[4776]: I1208 09:23:02.251001 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/913d2881-9323-4503-b364-05de889fd095-combined-ca-bundle\") pod \"keystone-bootstrap-ns7rc\" (UID: \"913d2881-9323-4503-b364-05de889fd095\") " pod="openstack/keystone-bootstrap-ns7rc"
Dec 08 09:23:02 crc kubenswrapper[4776]: I1208 09:23:02.251029 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/913d2881-9323-4503-b364-05de889fd095-fernet-keys\") pod \"keystone-bootstrap-ns7rc\" (UID: 
\"913d2881-9323-4503-b364-05de889fd095\") " pod="openstack/keystone-bootstrap-ns7rc" Dec 08 09:23:02 crc kubenswrapper[4776]: I1208 09:23:02.251055 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fq2d\" (UniqueName: \"kubernetes.io/projected/913d2881-9323-4503-b364-05de889fd095-kube-api-access-8fq2d\") pod \"keystone-bootstrap-ns7rc\" (UID: \"913d2881-9323-4503-b364-05de889fd095\") " pod="openstack/keystone-bootstrap-ns7rc" Dec 08 09:23:02 crc kubenswrapper[4776]: I1208 09:23:02.257802 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/913d2881-9323-4503-b364-05de889fd095-credential-keys\") pod \"keystone-bootstrap-ns7rc\" (UID: \"913d2881-9323-4503-b364-05de889fd095\") " pod="openstack/keystone-bootstrap-ns7rc" Dec 08 09:23:02 crc kubenswrapper[4776]: I1208 09:23:02.257929 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/913d2881-9323-4503-b364-05de889fd095-combined-ca-bundle\") pod \"keystone-bootstrap-ns7rc\" (UID: \"913d2881-9323-4503-b364-05de889fd095\") " pod="openstack/keystone-bootstrap-ns7rc" Dec 08 09:23:02 crc kubenswrapper[4776]: I1208 09:23:02.258746 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/913d2881-9323-4503-b364-05de889fd095-config-data\") pod \"keystone-bootstrap-ns7rc\" (UID: \"913d2881-9323-4503-b364-05de889fd095\") " pod="openstack/keystone-bootstrap-ns7rc" Dec 08 09:23:02 crc kubenswrapper[4776]: I1208 09:23:02.264485 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/913d2881-9323-4503-b364-05de889fd095-scripts\") pod \"keystone-bootstrap-ns7rc\" (UID: \"913d2881-9323-4503-b364-05de889fd095\") " pod="openstack/keystone-bootstrap-ns7rc" Dec 08 09:23:02 crc kubenswrapper[4776]: I1208 
09:23:02.267806 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/913d2881-9323-4503-b364-05de889fd095-fernet-keys\") pod \"keystone-bootstrap-ns7rc\" (UID: \"913d2881-9323-4503-b364-05de889fd095\") " pod="openstack/keystone-bootstrap-ns7rc" Dec 08 09:23:02 crc kubenswrapper[4776]: I1208 09:23:02.278308 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fq2d\" (UniqueName: \"kubernetes.io/projected/913d2881-9323-4503-b364-05de889fd095-kube-api-access-8fq2d\") pod \"keystone-bootstrap-ns7rc\" (UID: \"913d2881-9323-4503-b364-05de889fd095\") " pod="openstack/keystone-bootstrap-ns7rc" Dec 08 09:23:02 crc kubenswrapper[4776]: I1208 09:23:02.338351 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ns7rc" Dec 08 09:23:02 crc kubenswrapper[4776]: I1208 09:23:02.354505 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44d849dd-5372-4dd4-a691-036dbb925fcd" path="/var/lib/kubelet/pods/44d849dd-5372-4dd4-a691-036dbb925fcd/volumes" Dec 08 09:23:05 crc kubenswrapper[4776]: I1208 09:23:05.691707 4776 generic.go:334] "Generic (PLEG): container finished" podID="f9d1e39a-4040-4f14-819f-f41e85a35143" containerID="ca6d12594b8c2f492e325734dda23a6c1d318ec30ab337e92d5788e763e8d5c8" exitCode=0 Dec 08 09:23:05 crc kubenswrapper[4776]: I1208 09:23:05.691805 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-8tjn6" event={"ID":"f9d1e39a-4040-4f14-819f-f41e85a35143","Type":"ContainerDied","Data":"ca6d12594b8c2f492e325734dda23a6c1d318ec30ab337e92d5788e763e8d5c8"} Dec 08 09:23:07 crc kubenswrapper[4776]: E1208 09:23:07.765715 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Dec 08 09:23:07 crc 
kubenswrapper[4776]: E1208 09:23:07.766463 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vwnsq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Std
in:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-2s7n9_openstack(9dff1e28-5d80-48af-b348-cfd6080d3e37): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 08 09:23:07 crc kubenswrapper[4776]: E1208 09:23:07.768629 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-2s7n9" podUID="9dff1e28-5d80-48af-b348-cfd6080d3e37" Dec 08 09:23:07 crc kubenswrapper[4776]: I1208 09:23:07.866929 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-9hw46" Dec 08 09:23:07 crc kubenswrapper[4776]: I1208 09:23:07.877478 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-5ss7n" Dec 08 09:23:07 crc kubenswrapper[4776]: I1208 09:23:07.973453 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sck9g\" (UniqueName: \"kubernetes.io/projected/4a920788-f8d6-4c42-84f6-d842d9bf9a17-kube-api-access-sck9g\") pod \"4a920788-f8d6-4c42-84f6-d842d9bf9a17\" (UID: \"4a920788-f8d6-4c42-84f6-d842d9bf9a17\") " Dec 08 09:23:07 crc kubenswrapper[4776]: I1208 09:23:07.973537 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8054440d-20b3-498d-80a6-da7ee23c9864-ovsdbserver-sb\") pod \"8054440d-20b3-498d-80a6-da7ee23c9864\" (UID: \"8054440d-20b3-498d-80a6-da7ee23c9864\") " Dec 08 09:23:07 crc kubenswrapper[4776]: I1208 09:23:07.973595 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a920788-f8d6-4c42-84f6-d842d9bf9a17-combined-ca-bundle\") pod \"4a920788-f8d6-4c42-84f6-d842d9bf9a17\" (UID: \"4a920788-f8d6-4c42-84f6-d842d9bf9a17\") " Dec 08 09:23:07 crc kubenswrapper[4776]: I1208 09:23:07.973682 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4a920788-f8d6-4c42-84f6-d842d9bf9a17-db-sync-config-data\") pod \"4a920788-f8d6-4c42-84f6-d842d9bf9a17\" (UID: \"4a920788-f8d6-4c42-84f6-d842d9bf9a17\") " Dec 08 09:23:07 crc kubenswrapper[4776]: I1208 09:23:07.973760 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cq95\" (UniqueName: \"kubernetes.io/projected/8054440d-20b3-498d-80a6-da7ee23c9864-kube-api-access-9cq95\") pod \"8054440d-20b3-498d-80a6-da7ee23c9864\" (UID: \"8054440d-20b3-498d-80a6-da7ee23c9864\") " Dec 08 09:23:07 crc kubenswrapper[4776]: I1208 09:23:07.976396 4776 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8054440d-20b3-498d-80a6-da7ee23c9864-config\") pod \"8054440d-20b3-498d-80a6-da7ee23c9864\" (UID: \"8054440d-20b3-498d-80a6-da7ee23c9864\") " Dec 08 09:23:07 crc kubenswrapper[4776]: I1208 09:23:07.976455 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8054440d-20b3-498d-80a6-da7ee23c9864-ovsdbserver-nb\") pod \"8054440d-20b3-498d-80a6-da7ee23c9864\" (UID: \"8054440d-20b3-498d-80a6-da7ee23c9864\") " Dec 08 09:23:07 crc kubenswrapper[4776]: I1208 09:23:07.976473 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a920788-f8d6-4c42-84f6-d842d9bf9a17-config-data\") pod \"4a920788-f8d6-4c42-84f6-d842d9bf9a17\" (UID: \"4a920788-f8d6-4c42-84f6-d842d9bf9a17\") " Dec 08 09:23:07 crc kubenswrapper[4776]: I1208 09:23:07.976494 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8054440d-20b3-498d-80a6-da7ee23c9864-dns-svc\") pod \"8054440d-20b3-498d-80a6-da7ee23c9864\" (UID: \"8054440d-20b3-498d-80a6-da7ee23c9864\") " Dec 08 09:23:07 crc kubenswrapper[4776]: I1208 09:23:07.982952 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a920788-f8d6-4c42-84f6-d842d9bf9a17-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "4a920788-f8d6-4c42-84f6-d842d9bf9a17" (UID: "4a920788-f8d6-4c42-84f6-d842d9bf9a17"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:23:07 crc kubenswrapper[4776]: I1208 09:23:07.988425 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8054440d-20b3-498d-80a6-da7ee23c9864-kube-api-access-9cq95" (OuterVolumeSpecName: "kube-api-access-9cq95") pod "8054440d-20b3-498d-80a6-da7ee23c9864" (UID: "8054440d-20b3-498d-80a6-da7ee23c9864"). InnerVolumeSpecName "kube-api-access-9cq95". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:23:07 crc kubenswrapper[4776]: I1208 09:23:07.993767 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a920788-f8d6-4c42-84f6-d842d9bf9a17-kube-api-access-sck9g" (OuterVolumeSpecName: "kube-api-access-sck9g") pod "4a920788-f8d6-4c42-84f6-d842d9bf9a17" (UID: "4a920788-f8d6-4c42-84f6-d842d9bf9a17"). InnerVolumeSpecName "kube-api-access-sck9g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:23:08 crc kubenswrapper[4776]: I1208 09:23:08.030027 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8054440d-20b3-498d-80a6-da7ee23c9864-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8054440d-20b3-498d-80a6-da7ee23c9864" (UID: "8054440d-20b3-498d-80a6-da7ee23c9864"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:23:08 crc kubenswrapper[4776]: I1208 09:23:08.033069 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a920788-f8d6-4c42-84f6-d842d9bf9a17-config-data" (OuterVolumeSpecName: "config-data") pod "4a920788-f8d6-4c42-84f6-d842d9bf9a17" (UID: "4a920788-f8d6-4c42-84f6-d842d9bf9a17"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:23:08 crc kubenswrapper[4776]: I1208 09:23:08.034168 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a920788-f8d6-4c42-84f6-d842d9bf9a17-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a920788-f8d6-4c42-84f6-d842d9bf9a17" (UID: "4a920788-f8d6-4c42-84f6-d842d9bf9a17"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:23:08 crc kubenswrapper[4776]: I1208 09:23:08.051508 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8054440d-20b3-498d-80a6-da7ee23c9864-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8054440d-20b3-498d-80a6-da7ee23c9864" (UID: "8054440d-20b3-498d-80a6-da7ee23c9864"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:23:08 crc kubenswrapper[4776]: I1208 09:23:08.052998 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8054440d-20b3-498d-80a6-da7ee23c9864-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8054440d-20b3-498d-80a6-da7ee23c9864" (UID: "8054440d-20b3-498d-80a6-da7ee23c9864"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:23:08 crc kubenswrapper[4776]: I1208 09:23:08.057503 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8054440d-20b3-498d-80a6-da7ee23c9864-config" (OuterVolumeSpecName: "config") pod "8054440d-20b3-498d-80a6-da7ee23c9864" (UID: "8054440d-20b3-498d-80a6-da7ee23c9864"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:23:08 crc kubenswrapper[4776]: I1208 09:23:08.078665 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8054440d-20b3-498d-80a6-da7ee23c9864-config\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:08 crc kubenswrapper[4776]: I1208 09:23:08.078694 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8054440d-20b3-498d-80a6-da7ee23c9864-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:08 crc kubenswrapper[4776]: I1208 09:23:08.078705 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a920788-f8d6-4c42-84f6-d842d9bf9a17-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:08 crc kubenswrapper[4776]: I1208 09:23:08.078714 4776 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8054440d-20b3-498d-80a6-da7ee23c9864-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:08 crc kubenswrapper[4776]: I1208 09:23:08.078724 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sck9g\" (UniqueName: \"kubernetes.io/projected/4a920788-f8d6-4c42-84f6-d842d9bf9a17-kube-api-access-sck9g\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:08 crc kubenswrapper[4776]: I1208 09:23:08.078733 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8054440d-20b3-498d-80a6-da7ee23c9864-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:08 crc kubenswrapper[4776]: I1208 09:23:08.078741 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a920788-f8d6-4c42-84f6-d842d9bf9a17-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:08 crc kubenswrapper[4776]: I1208 09:23:08.078752 4776 
reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4a920788-f8d6-4c42-84f6-d842d9bf9a17-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:08 crc kubenswrapper[4776]: I1208 09:23:08.078761 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cq95\" (UniqueName: \"kubernetes.io/projected/8054440d-20b3-498d-80a6-da7ee23c9864-kube-api-access-9cq95\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:08 crc kubenswrapper[4776]: E1208 09:23:08.243521 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Dec 08 09:23:08 crc kubenswrapper[4776]: E1208 09:23:08.243740 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n586hb5h65fh5dch678h694hcfh5c4hb7h586h5f7h7fh67dh54fhddh697hcfh647h9dh5fch646h6hcfh557hf9h67fh65h5b4h688h5cfh68bh56q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wnvjn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(fab03865-02a6-4cd2-bf78-22ed25534301): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 08 09:23:08 crc kubenswrapper[4776]: I1208 09:23:08.247614 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-8tjn6" Dec 08 09:23:08 crc kubenswrapper[4776]: I1208 09:23:08.383294 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f9d1e39a-4040-4f14-819f-f41e85a35143-config\") pod \"f9d1e39a-4040-4f14-819f-f41e85a35143\" (UID: \"f9d1e39a-4040-4f14-819f-f41e85a35143\") " Dec 08 09:23:08 crc kubenswrapper[4776]: I1208 09:23:08.383377 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9d1e39a-4040-4f14-819f-f41e85a35143-combined-ca-bundle\") pod \"f9d1e39a-4040-4f14-819f-f41e85a35143\" (UID: \"f9d1e39a-4040-4f14-819f-f41e85a35143\") " Dec 08 09:23:08 crc kubenswrapper[4776]: I1208 09:23:08.383564 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48vll\" (UniqueName: \"kubernetes.io/projected/f9d1e39a-4040-4f14-819f-f41e85a35143-kube-api-access-48vll\") pod \"f9d1e39a-4040-4f14-819f-f41e85a35143\" (UID: \"f9d1e39a-4040-4f14-819f-f41e85a35143\") " Dec 08 09:23:08 crc kubenswrapper[4776]: I1208 09:23:08.387273 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9d1e39a-4040-4f14-819f-f41e85a35143-kube-api-access-48vll" (OuterVolumeSpecName: "kube-api-access-48vll") pod "f9d1e39a-4040-4f14-819f-f41e85a35143" (UID: "f9d1e39a-4040-4f14-819f-f41e85a35143"). InnerVolumeSpecName "kube-api-access-48vll". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:23:08 crc kubenswrapper[4776]: I1208 09:23:08.410446 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9d1e39a-4040-4f14-819f-f41e85a35143-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f9d1e39a-4040-4f14-819f-f41e85a35143" (UID: "f9d1e39a-4040-4f14-819f-f41e85a35143"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:23:08 crc kubenswrapper[4776]: I1208 09:23:08.413627 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9d1e39a-4040-4f14-819f-f41e85a35143-config" (OuterVolumeSpecName: "config") pod "f9d1e39a-4040-4f14-819f-f41e85a35143" (UID: "f9d1e39a-4040-4f14-819f-f41e85a35143"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:23:08 crc kubenswrapper[4776]: I1208 09:23:08.486558 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f9d1e39a-4040-4f14-819f-f41e85a35143-config\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:08 crc kubenswrapper[4776]: I1208 09:23:08.486589 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9d1e39a-4040-4f14-819f-f41e85a35143-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:08 crc kubenswrapper[4776]: I1208 09:23:08.486599 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48vll\" (UniqueName: \"kubernetes.io/projected/f9d1e39a-4040-4f14-819f-f41e85a35143-kube-api-access-48vll\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:08 crc kubenswrapper[4776]: E1208 09:23:08.706044 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Dec 08 09:23:08 crc kubenswrapper[4776]: E1208 09:23:08.706245 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2wpcz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-626nj_openstack(7c962dc3-3c64-4b5d-a740-a790a5fa10f9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 08 09:23:08 crc kubenswrapper[4776]: E1208 09:23:08.707863 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-626nj" 
podUID="7c962dc3-3c64-4b5d-a740-a790a5fa10f9" Dec 08 09:23:08 crc kubenswrapper[4776]: I1208 09:23:08.752989 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-5ss7n" event={"ID":"8054440d-20b3-498d-80a6-da7ee23c9864","Type":"ContainerDied","Data":"dd44f9d3151110b78963bfee56ccd62294aee5568dca6a211d85a23a80d6b9a9"} Dec 08 09:23:08 crc kubenswrapper[4776]: I1208 09:23:08.753055 4776 scope.go:117] "RemoveContainer" containerID="8988c6790b9c2597ec460d682361880bb2dc95081eec3e20e7d055bdf75ec56f" Dec 08 09:23:08 crc kubenswrapper[4776]: I1208 09:23:08.753047 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-5ss7n" Dec 08 09:23:08 crc kubenswrapper[4776]: I1208 09:23:08.755676 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-8tjn6" event={"ID":"f9d1e39a-4040-4f14-819f-f41e85a35143","Type":"ContainerDied","Data":"168317acfb4a72c1f6a74c15c009c9be4ce562b7fe59bd60de9663b322e4c083"} Dec 08 09:23:08 crc kubenswrapper[4776]: I1208 09:23:08.755809 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="168317acfb4a72c1f6a74c15c009c9be4ce562b7fe59bd60de9663b322e4c083" Dec 08 09:23:08 crc kubenswrapper[4776]: I1208 09:23:08.755901 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-8tjn6" Dec 08 09:23:08 crc kubenswrapper[4776]: I1208 09:23:08.762000 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9hw46" event={"ID":"4a920788-f8d6-4c42-84f6-d842d9bf9a17","Type":"ContainerDied","Data":"e8b21522402e354426ffa34371a20fa99804cfef89f8cc8231177b02e71230f3"} Dec 08 09:23:08 crc kubenswrapper[4776]: I1208 09:23:08.762049 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8b21522402e354426ffa34371a20fa99804cfef89f8cc8231177b02e71230f3" Dec 08 09:23:08 crc kubenswrapper[4776]: I1208 09:23:08.762648 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-9hw46" Dec 08 09:23:08 crc kubenswrapper[4776]: E1208 09:23:08.764483 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-2s7n9" podUID="9dff1e28-5d80-48af-b348-cfd6080d3e37" Dec 08 09:23:08 crc kubenswrapper[4776]: E1208 09:23:08.764742 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-626nj" podUID="7c962dc3-3c64-4b5d-a740-a790a5fa10f9" Dec 08 09:23:08 crc kubenswrapper[4776]: I1208 09:23:08.828027 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-5ss7n"] Dec 08 09:23:08 crc kubenswrapper[4776]: I1208 09:23:08.836820 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-5ss7n"] Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.005373 4776 scope.go:117] "RemoveContainer" 
containerID="24269e54387f3d988a0353b4b6e6be70c886f9a8f303c9ff71cc0dd5ff8ac2ed" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.327166 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f84976bdf-n6xl6"] Dec 08 09:23:09 crc kubenswrapper[4776]: E1208 09:23:09.328407 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8054440d-20b3-498d-80a6-da7ee23c9864" containerName="dnsmasq-dns" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.328443 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="8054440d-20b3-498d-80a6-da7ee23c9864" containerName="dnsmasq-dns" Dec 08 09:23:09 crc kubenswrapper[4776]: E1208 09:23:09.328460 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8054440d-20b3-498d-80a6-da7ee23c9864" containerName="init" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.328468 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="8054440d-20b3-498d-80a6-da7ee23c9864" containerName="init" Dec 08 09:23:09 crc kubenswrapper[4776]: E1208 09:23:09.328493 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a920788-f8d6-4c42-84f6-d842d9bf9a17" containerName="glance-db-sync" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.328499 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a920788-f8d6-4c42-84f6-d842d9bf9a17" containerName="glance-db-sync" Dec 08 09:23:09 crc kubenswrapper[4776]: E1208 09:23:09.328530 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9d1e39a-4040-4f14-819f-f41e85a35143" containerName="neutron-db-sync" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.328536 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9d1e39a-4040-4f14-819f-f41e85a35143" containerName="neutron-db-sync" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.328769 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="8054440d-20b3-498d-80a6-da7ee23c9864" containerName="dnsmasq-dns" Dec 
08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.328782 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a920788-f8d6-4c42-84f6-d842d9bf9a17" containerName="glance-db-sync" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.328801 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9d1e39a-4040-4f14-819f-f41e85a35143" containerName="neutron-db-sync" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.330029 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f84976bdf-n6xl6" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.367635 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f84976bdf-n6xl6"] Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.412918 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ea69d170-4f2e-4ce9-a7c9-46a448f33b3c-ovsdbserver-nb\") pod \"dnsmasq-dns-f84976bdf-n6xl6\" (UID: \"ea69d170-4f2e-4ce9-a7c9-46a448f33b3c\") " pod="openstack/dnsmasq-dns-f84976bdf-n6xl6" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.412956 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ea69d170-4f2e-4ce9-a7c9-46a448f33b3c-ovsdbserver-sb\") pod \"dnsmasq-dns-f84976bdf-n6xl6\" (UID: \"ea69d170-4f2e-4ce9-a7c9-46a448f33b3c\") " pod="openstack/dnsmasq-dns-f84976bdf-n6xl6" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.413038 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea69d170-4f2e-4ce9-a7c9-46a448f33b3c-config\") pod \"dnsmasq-dns-f84976bdf-n6xl6\" (UID: \"ea69d170-4f2e-4ce9-a7c9-46a448f33b3c\") " pod="openstack/dnsmasq-dns-f84976bdf-n6xl6" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 
09:23:09.413095 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl4hv\" (UniqueName: \"kubernetes.io/projected/ea69d170-4f2e-4ce9-a7c9-46a448f33b3c-kube-api-access-zl4hv\") pod \"dnsmasq-dns-f84976bdf-n6xl6\" (UID: \"ea69d170-4f2e-4ce9-a7c9-46a448f33b3c\") " pod="openstack/dnsmasq-dns-f84976bdf-n6xl6" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.413123 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea69d170-4f2e-4ce9-a7c9-46a448f33b3c-dns-svc\") pod \"dnsmasq-dns-f84976bdf-n6xl6\" (UID: \"ea69d170-4f2e-4ce9-a7c9-46a448f33b3c\") " pod="openstack/dnsmasq-dns-f84976bdf-n6xl6" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.480389 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f84976bdf-n6xl6"] Dec 08 09:23:09 crc kubenswrapper[4776]: E1208 09:23:09.481580 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-zl4hv ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-f84976bdf-n6xl6" podUID="ea69d170-4f2e-4ce9-a7c9-46a448f33b3c" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.514574 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea69d170-4f2e-4ce9-a7c9-46a448f33b3c-dns-svc\") pod \"dnsmasq-dns-f84976bdf-n6xl6\" (UID: \"ea69d170-4f2e-4ce9-a7c9-46a448f33b3c\") " pod="openstack/dnsmasq-dns-f84976bdf-n6xl6" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.514797 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ea69d170-4f2e-4ce9-a7c9-46a448f33b3c-ovsdbserver-nb\") pod \"dnsmasq-dns-f84976bdf-n6xl6\" (UID: 
\"ea69d170-4f2e-4ce9-a7c9-46a448f33b3c\") " pod="openstack/dnsmasq-dns-f84976bdf-n6xl6" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.514822 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ea69d170-4f2e-4ce9-a7c9-46a448f33b3c-ovsdbserver-sb\") pod \"dnsmasq-dns-f84976bdf-n6xl6\" (UID: \"ea69d170-4f2e-4ce9-a7c9-46a448f33b3c\") " pod="openstack/dnsmasq-dns-f84976bdf-n6xl6" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.514897 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea69d170-4f2e-4ce9-a7c9-46a448f33b3c-config\") pod \"dnsmasq-dns-f84976bdf-n6xl6\" (UID: \"ea69d170-4f2e-4ce9-a7c9-46a448f33b3c\") " pod="openstack/dnsmasq-dns-f84976bdf-n6xl6" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.514977 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl4hv\" (UniqueName: \"kubernetes.io/projected/ea69d170-4f2e-4ce9-a7c9-46a448f33b3c-kube-api-access-zl4hv\") pod \"dnsmasq-dns-f84976bdf-n6xl6\" (UID: \"ea69d170-4f2e-4ce9-a7c9-46a448f33b3c\") " pod="openstack/dnsmasq-dns-f84976bdf-n6xl6" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.516318 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea69d170-4f2e-4ce9-a7c9-46a448f33b3c-dns-svc\") pod \"dnsmasq-dns-f84976bdf-n6xl6\" (UID: \"ea69d170-4f2e-4ce9-a7c9-46a448f33b3c\") " pod="openstack/dnsmasq-dns-f84976bdf-n6xl6" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.517188 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ea69d170-4f2e-4ce9-a7c9-46a448f33b3c-ovsdbserver-sb\") pod \"dnsmasq-dns-f84976bdf-n6xl6\" (UID: \"ea69d170-4f2e-4ce9-a7c9-46a448f33b3c\") " pod="openstack/dnsmasq-dns-f84976bdf-n6xl6" Dec 08 
09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.518311 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ea69d170-4f2e-4ce9-a7c9-46a448f33b3c-ovsdbserver-nb\") pod \"dnsmasq-dns-f84976bdf-n6xl6\" (UID: \"ea69d170-4f2e-4ce9-a7c9-46a448f33b3c\") " pod="openstack/dnsmasq-dns-f84976bdf-n6xl6" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.524260 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-fb745b69-9lrkl"] Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.527759 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea69d170-4f2e-4ce9-a7c9-46a448f33b3c-config\") pod \"dnsmasq-dns-f84976bdf-n6xl6\" (UID: \"ea69d170-4f2e-4ce9-a7c9-46a448f33b3c\") " pod="openstack/dnsmasq-dns-f84976bdf-n6xl6" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.531874 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fb745b69-9lrkl" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.549093 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fb745b69-9lrkl"] Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.550135 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl4hv\" (UniqueName: \"kubernetes.io/projected/ea69d170-4f2e-4ce9-a7c9-46a448f33b3c-kube-api-access-zl4hv\") pod \"dnsmasq-dns-f84976bdf-n6xl6\" (UID: \"ea69d170-4f2e-4ce9-a7c9-46a448f33b3c\") " pod="openstack/dnsmasq-dns-f84976bdf-n6xl6" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.611432 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ns7rc"] Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.616772 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v898k\" (UniqueName: \"kubernetes.io/projected/001c9704-cf27-4a31-8a61-3e5ce2e272eb-kube-api-access-v898k\") pod \"dnsmasq-dns-fb745b69-9lrkl\" (UID: \"001c9704-cf27-4a31-8a61-3e5ce2e272eb\") " pod="openstack/dnsmasq-dns-fb745b69-9lrkl" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.618035 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/001c9704-cf27-4a31-8a61-3e5ce2e272eb-config\") pod \"dnsmasq-dns-fb745b69-9lrkl\" (UID: \"001c9704-cf27-4a31-8a61-3e5ce2e272eb\") " pod="openstack/dnsmasq-dns-fb745b69-9lrkl" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.618087 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/001c9704-cf27-4a31-8a61-3e5ce2e272eb-ovsdbserver-nb\") pod \"dnsmasq-dns-fb745b69-9lrkl\" (UID: \"001c9704-cf27-4a31-8a61-3e5ce2e272eb\") " pod="openstack/dnsmasq-dns-fb745b69-9lrkl" Dec 08 
09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.618141 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/001c9704-cf27-4a31-8a61-3e5ce2e272eb-ovsdbserver-sb\") pod \"dnsmasq-dns-fb745b69-9lrkl\" (UID: \"001c9704-cf27-4a31-8a61-3e5ce2e272eb\") " pod="openstack/dnsmasq-dns-fb745b69-9lrkl" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.618280 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/001c9704-cf27-4a31-8a61-3e5ce2e272eb-dns-svc\") pod \"dnsmasq-dns-fb745b69-9lrkl\" (UID: \"001c9704-cf27-4a31-8a61-3e5ce2e272eb\") " pod="openstack/dnsmasq-dns-fb745b69-9lrkl" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.710825 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-775f79cd-lq4qd"] Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.713841 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-775f79cd-lq4qd" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.719565 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.719754 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.719878 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.719993 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-v5jm7" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.720485 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/001c9704-cf27-4a31-8a61-3e5ce2e272eb-dns-svc\") pod \"dnsmasq-dns-fb745b69-9lrkl\" (UID: \"001c9704-cf27-4a31-8a61-3e5ce2e272eb\") " pod="openstack/dnsmasq-dns-fb745b69-9lrkl" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.720565 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v898k\" (UniqueName: \"kubernetes.io/projected/001c9704-cf27-4a31-8a61-3e5ce2e272eb-kube-api-access-v898k\") pod \"dnsmasq-dns-fb745b69-9lrkl\" (UID: \"001c9704-cf27-4a31-8a61-3e5ce2e272eb\") " pod="openstack/dnsmasq-dns-fb745b69-9lrkl" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.720586 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/001c9704-cf27-4a31-8a61-3e5ce2e272eb-config\") pod \"dnsmasq-dns-fb745b69-9lrkl\" (UID: \"001c9704-cf27-4a31-8a61-3e5ce2e272eb\") " pod="openstack/dnsmasq-dns-fb745b69-9lrkl" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.720633 4776 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/001c9704-cf27-4a31-8a61-3e5ce2e272eb-ovsdbserver-nb\") pod \"dnsmasq-dns-fb745b69-9lrkl\" (UID: \"001c9704-cf27-4a31-8a61-3e5ce2e272eb\") " pod="openstack/dnsmasq-dns-fb745b69-9lrkl" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.720684 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/001c9704-cf27-4a31-8a61-3e5ce2e272eb-ovsdbserver-sb\") pod \"dnsmasq-dns-fb745b69-9lrkl\" (UID: \"001c9704-cf27-4a31-8a61-3e5ce2e272eb\") " pod="openstack/dnsmasq-dns-fb745b69-9lrkl" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.721531 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/001c9704-cf27-4a31-8a61-3e5ce2e272eb-ovsdbserver-sb\") pod \"dnsmasq-dns-fb745b69-9lrkl\" (UID: \"001c9704-cf27-4a31-8a61-3e5ce2e272eb\") " pod="openstack/dnsmasq-dns-fb745b69-9lrkl" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.721707 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/001c9704-cf27-4a31-8a61-3e5ce2e272eb-dns-svc\") pod \"dnsmasq-dns-fb745b69-9lrkl\" (UID: \"001c9704-cf27-4a31-8a61-3e5ce2e272eb\") " pod="openstack/dnsmasq-dns-fb745b69-9lrkl" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.722078 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/001c9704-cf27-4a31-8a61-3e5ce2e272eb-config\") pod \"dnsmasq-dns-fb745b69-9lrkl\" (UID: \"001c9704-cf27-4a31-8a61-3e5ce2e272eb\") " pod="openstack/dnsmasq-dns-fb745b69-9lrkl" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.722475 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/001c9704-cf27-4a31-8a61-3e5ce2e272eb-ovsdbserver-nb\") pod 
\"dnsmasq-dns-fb745b69-9lrkl\" (UID: \"001c9704-cf27-4a31-8a61-3e5ce2e272eb\") " pod="openstack/dnsmasq-dns-fb745b69-9lrkl" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.725896 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-775f79cd-lq4qd"] Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.754007 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v898k\" (UniqueName: \"kubernetes.io/projected/001c9704-cf27-4a31-8a61-3e5ce2e272eb-kube-api-access-v898k\") pod \"dnsmasq-dns-fb745b69-9lrkl\" (UID: \"001c9704-cf27-4a31-8a61-3e5ce2e272eb\") " pod="openstack/dnsmasq-dns-fb745b69-9lrkl" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.783889 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-zhhw6" event={"ID":"f4a95eb4-92c1-4eff-940b-37f74dd3dc18","Type":"ContainerStarted","Data":"8833fa7626a819ff1247acadca4b9b2a9355a6d206fca9d0ed67d2f0cae0998b"} Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.800778 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ns7rc" event={"ID":"913d2881-9323-4503-b364-05de889fd095","Type":"ContainerStarted","Data":"44926e60f286229e0328f2084263b89c80c36aa4f67695a2f6e07e07667a89cc"} Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.803034 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-zhhw6" podStartSLOduration=2.470824615 podStartE2EDuration="32.803020675s" podCreationTimestamp="2025-12-08 09:22:37 +0000 UTC" firstStartedPulling="2025-12-08 09:22:38.641970486 +0000 UTC m=+1434.905195508" lastFinishedPulling="2025-12-08 09:23:08.974166536 +0000 UTC m=+1465.237391568" observedRunningTime="2025-12-08 09:23:09.80059862 +0000 UTC m=+1466.063823642" watchObservedRunningTime="2025-12-08 09:23:09.803020675 +0000 UTC m=+1466.066245697" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.821914 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/84196301-9fa2-4acb-9a49-d87fdb571dfe-ovndb-tls-certs\") pod \"neutron-775f79cd-lq4qd\" (UID: \"84196301-9fa2-4acb-9a49-d87fdb571dfe\") " pod="openstack/neutron-775f79cd-lq4qd" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.822266 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/84196301-9fa2-4acb-9a49-d87fdb571dfe-config\") pod \"neutron-775f79cd-lq4qd\" (UID: \"84196301-9fa2-4acb-9a49-d87fdb571dfe\") " pod="openstack/neutron-775f79cd-lq4qd" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.822370 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84196301-9fa2-4acb-9a49-d87fdb571dfe-combined-ca-bundle\") pod \"neutron-775f79cd-lq4qd\" (UID: \"84196301-9fa2-4acb-9a49-d87fdb571dfe\") " pod="openstack/neutron-775f79cd-lq4qd" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.822531 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wgjl\" (UniqueName: \"kubernetes.io/projected/84196301-9fa2-4acb-9a49-d87fdb571dfe-kube-api-access-9wgjl\") pod \"neutron-775f79cd-lq4qd\" (UID: \"84196301-9fa2-4acb-9a49-d87fdb571dfe\") " pod="openstack/neutron-775f79cd-lq4qd" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.822641 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/84196301-9fa2-4acb-9a49-d87fdb571dfe-httpd-config\") pod \"neutron-775f79cd-lq4qd\" (UID: \"84196301-9fa2-4acb-9a49-d87fdb571dfe\") " pod="openstack/neutron-775f79cd-lq4qd" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.826196 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f84976bdf-n6xl6" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.826292 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cb640491-a8e7-4f8d-b4bb-1d0124f5727f","Type":"ContainerStarted","Data":"9f34db37fc55054f483f214f8f7cb9c91f099c36eca8869d1ab01354824210e6"} Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.877640 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fb745b69-9lrkl" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.908522 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f84976bdf-n6xl6" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.923945 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/84196301-9fa2-4acb-9a49-d87fdb571dfe-ovndb-tls-certs\") pod \"neutron-775f79cd-lq4qd\" (UID: \"84196301-9fa2-4acb-9a49-d87fdb571dfe\") " pod="openstack/neutron-775f79cd-lq4qd" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.924206 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/84196301-9fa2-4acb-9a49-d87fdb571dfe-config\") pod \"neutron-775f79cd-lq4qd\" (UID: \"84196301-9fa2-4acb-9a49-d87fdb571dfe\") " pod="openstack/neutron-775f79cd-lq4qd" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.924360 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84196301-9fa2-4acb-9a49-d87fdb571dfe-combined-ca-bundle\") pod \"neutron-775f79cd-lq4qd\" (UID: \"84196301-9fa2-4acb-9a49-d87fdb571dfe\") " pod="openstack/neutron-775f79cd-lq4qd" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.924524 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-9wgjl\" (UniqueName: \"kubernetes.io/projected/84196301-9fa2-4acb-9a49-d87fdb571dfe-kube-api-access-9wgjl\") pod \"neutron-775f79cd-lq4qd\" (UID: \"84196301-9fa2-4acb-9a49-d87fdb571dfe\") " pod="openstack/neutron-775f79cd-lq4qd" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.924649 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/84196301-9fa2-4acb-9a49-d87fdb571dfe-httpd-config\") pod \"neutron-775f79cd-lq4qd\" (UID: \"84196301-9fa2-4acb-9a49-d87fdb571dfe\") " pod="openstack/neutron-775f79cd-lq4qd" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.931922 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/84196301-9fa2-4acb-9a49-d87fdb571dfe-ovndb-tls-certs\") pod \"neutron-775f79cd-lq4qd\" (UID: \"84196301-9fa2-4acb-9a49-d87fdb571dfe\") " pod="openstack/neutron-775f79cd-lq4qd" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.934889 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/84196301-9fa2-4acb-9a49-d87fdb571dfe-config\") pod \"neutron-775f79cd-lq4qd\" (UID: \"84196301-9fa2-4acb-9a49-d87fdb571dfe\") " pod="openstack/neutron-775f79cd-lq4qd" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.935518 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/84196301-9fa2-4acb-9a49-d87fdb571dfe-httpd-config\") pod \"neutron-775f79cd-lq4qd\" (UID: \"84196301-9fa2-4acb-9a49-d87fdb571dfe\") " pod="openstack/neutron-775f79cd-lq4qd" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.937359 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84196301-9fa2-4acb-9a49-d87fdb571dfe-combined-ca-bundle\") pod \"neutron-775f79cd-lq4qd\" (UID: 
\"84196301-9fa2-4acb-9a49-d87fdb571dfe\") " pod="openstack/neutron-775f79cd-lq4qd" Dec 08 09:23:09 crc kubenswrapper[4776]: I1208 09:23:09.950295 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wgjl\" (UniqueName: \"kubernetes.io/projected/84196301-9fa2-4acb-9a49-d87fdb571dfe-kube-api-access-9wgjl\") pod \"neutron-775f79cd-lq4qd\" (UID: \"84196301-9fa2-4acb-9a49-d87fdb571dfe\") " pod="openstack/neutron-775f79cd-lq4qd" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.029882 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ea69d170-4f2e-4ce9-a7c9-46a448f33b3c-ovsdbserver-sb\") pod \"ea69d170-4f2e-4ce9-a7c9-46a448f33b3c\" (UID: \"ea69d170-4f2e-4ce9-a7c9-46a448f33b3c\") " Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.029920 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea69d170-4f2e-4ce9-a7c9-46a448f33b3c-dns-svc\") pod \"ea69d170-4f2e-4ce9-a7c9-46a448f33b3c\" (UID: \"ea69d170-4f2e-4ce9-a7c9-46a448f33b3c\") " Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.030004 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ea69d170-4f2e-4ce9-a7c9-46a448f33b3c-ovsdbserver-nb\") pod \"ea69d170-4f2e-4ce9-a7c9-46a448f33b3c\" (UID: \"ea69d170-4f2e-4ce9-a7c9-46a448f33b3c\") " Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.030137 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zl4hv\" (UniqueName: \"kubernetes.io/projected/ea69d170-4f2e-4ce9-a7c9-46a448f33b3c-kube-api-access-zl4hv\") pod \"ea69d170-4f2e-4ce9-a7c9-46a448f33b3c\" (UID: \"ea69d170-4f2e-4ce9-a7c9-46a448f33b3c\") " Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.030213 4776 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea69d170-4f2e-4ce9-a7c9-46a448f33b3c-config\") pod \"ea69d170-4f2e-4ce9-a7c9-46a448f33b3c\" (UID: \"ea69d170-4f2e-4ce9-a7c9-46a448f33b3c\") " Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.030924 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea69d170-4f2e-4ce9-a7c9-46a448f33b3c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ea69d170-4f2e-4ce9-a7c9-46a448f33b3c" (UID: "ea69d170-4f2e-4ce9-a7c9-46a448f33b3c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.031596 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea69d170-4f2e-4ce9-a7c9-46a448f33b3c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ea69d170-4f2e-4ce9-a7c9-46a448f33b3c" (UID: "ea69d170-4f2e-4ce9-a7c9-46a448f33b3c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.031648 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea69d170-4f2e-4ce9-a7c9-46a448f33b3c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ea69d170-4f2e-4ce9-a7c9-46a448f33b3c" (UID: "ea69d170-4f2e-4ce9-a7c9-46a448f33b3c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.031959 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea69d170-4f2e-4ce9-a7c9-46a448f33b3c-config" (OuterVolumeSpecName: "config") pod "ea69d170-4f2e-4ce9-a7c9-46a448f33b3c" (UID: "ea69d170-4f2e-4ce9-a7c9-46a448f33b3c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.040327 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea69d170-4f2e-4ce9-a7c9-46a448f33b3c-kube-api-access-zl4hv" (OuterVolumeSpecName: "kube-api-access-zl4hv") pod "ea69d170-4f2e-4ce9-a7c9-46a448f33b3c" (UID: "ea69d170-4f2e-4ce9-a7c9-46a448f33b3c"). InnerVolumeSpecName "kube-api-access-zl4hv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.133376 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zl4hv\" (UniqueName: \"kubernetes.io/projected/ea69d170-4f2e-4ce9-a7c9-46a448f33b3c-kube-api-access-zl4hv\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.133628 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea69d170-4f2e-4ce9-a7c9-46a448f33b3c-config\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.133639 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ea69d170-4f2e-4ce9-a7c9-46a448f33b3c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.133648 4776 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea69d170-4f2e-4ce9-a7c9-46a448f33b3c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.133655 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ea69d170-4f2e-4ce9-a7c9-46a448f33b3c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.205718 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-775f79cd-lq4qd" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.227187 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.229190 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.232420 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.232574 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.232747 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-hszn2" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.241673 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.272407 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-5ss7n" podUID="8054440d-20b3-498d-80a6-da7ee23c9864" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.157:5353: i/o timeout" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.337287 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6047fbdd-490d-4da5-ac61-58c28c7aa66c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6047fbdd-490d-4da5-ac61-58c28c7aa66c\") " pod="openstack/glance-default-external-api-0" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.337386 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6047fbdd-490d-4da5-ac61-58c28c7aa66c-config-data\") pod \"glance-default-external-api-0\" (UID: \"6047fbdd-490d-4da5-ac61-58c28c7aa66c\") " pod="openstack/glance-default-external-api-0" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.337408 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6047fbdd-490d-4da5-ac61-58c28c7aa66c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6047fbdd-490d-4da5-ac61-58c28c7aa66c\") " pod="openstack/glance-default-external-api-0" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.337461 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"6047fbdd-490d-4da5-ac61-58c28c7aa66c\") " pod="openstack/glance-default-external-api-0" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.338105 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6047fbdd-490d-4da5-ac61-58c28c7aa66c-scripts\") pod \"glance-default-external-api-0\" (UID: \"6047fbdd-490d-4da5-ac61-58c28c7aa66c\") " pod="openstack/glance-default-external-api-0" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.338181 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6047fbdd-490d-4da5-ac61-58c28c7aa66c-logs\") pod \"glance-default-external-api-0\" (UID: \"6047fbdd-490d-4da5-ac61-58c28c7aa66c\") " pod="openstack/glance-default-external-api-0" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.338286 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lx875\" (UniqueName: 
\"kubernetes.io/projected/6047fbdd-490d-4da5-ac61-58c28c7aa66c-kube-api-access-lx875\") pod \"glance-default-external-api-0\" (UID: \"6047fbdd-490d-4da5-ac61-58c28c7aa66c\") " pod="openstack/glance-default-external-api-0" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.385379 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8054440d-20b3-498d-80a6-da7ee23c9864" path="/var/lib/kubelet/pods/8054440d-20b3-498d-80a6-da7ee23c9864/volumes" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.455765 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6047fbdd-490d-4da5-ac61-58c28c7aa66c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6047fbdd-490d-4da5-ac61-58c28c7aa66c\") " pod="openstack/glance-default-external-api-0" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.455815 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6047fbdd-490d-4da5-ac61-58c28c7aa66c-config-data\") pod \"glance-default-external-api-0\" (UID: \"6047fbdd-490d-4da5-ac61-58c28c7aa66c\") " pod="openstack/glance-default-external-api-0" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.455830 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6047fbdd-490d-4da5-ac61-58c28c7aa66c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6047fbdd-490d-4da5-ac61-58c28c7aa66c\") " pod="openstack/glance-default-external-api-0" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.455853 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"6047fbdd-490d-4da5-ac61-58c28c7aa66c\") " pod="openstack/glance-default-external-api-0" 
Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.455895 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6047fbdd-490d-4da5-ac61-58c28c7aa66c-scripts\") pod \"glance-default-external-api-0\" (UID: \"6047fbdd-490d-4da5-ac61-58c28c7aa66c\") " pod="openstack/glance-default-external-api-0" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.455944 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6047fbdd-490d-4da5-ac61-58c28c7aa66c-logs\") pod \"glance-default-external-api-0\" (UID: \"6047fbdd-490d-4da5-ac61-58c28c7aa66c\") " pod="openstack/glance-default-external-api-0" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.455989 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lx875\" (UniqueName: \"kubernetes.io/projected/6047fbdd-490d-4da5-ac61-58c28c7aa66c-kube-api-access-lx875\") pod \"glance-default-external-api-0\" (UID: \"6047fbdd-490d-4da5-ac61-58c28c7aa66c\") " pod="openstack/glance-default-external-api-0" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.457756 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6047fbdd-490d-4da5-ac61-58c28c7aa66c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6047fbdd-490d-4da5-ac61-58c28c7aa66c\") " pod="openstack/glance-default-external-api-0" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.464458 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"6047fbdd-490d-4da5-ac61-58c28c7aa66c\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.465510 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6047fbdd-490d-4da5-ac61-58c28c7aa66c-logs\") pod \"glance-default-external-api-0\" (UID: \"6047fbdd-490d-4da5-ac61-58c28c7aa66c\") " pod="openstack/glance-default-external-api-0" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.483140 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6047fbdd-490d-4da5-ac61-58c28c7aa66c-scripts\") pod \"glance-default-external-api-0\" (UID: \"6047fbdd-490d-4da5-ac61-58c28c7aa66c\") " pod="openstack/glance-default-external-api-0" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.492607 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6047fbdd-490d-4da5-ac61-58c28c7aa66c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6047fbdd-490d-4da5-ac61-58c28c7aa66c\") " pod="openstack/glance-default-external-api-0" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.499312 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lx875\" (UniqueName: \"kubernetes.io/projected/6047fbdd-490d-4da5-ac61-58c28c7aa66c-kube-api-access-lx875\") pod \"glance-default-external-api-0\" (UID: \"6047fbdd-490d-4da5-ac61-58c28c7aa66c\") " pod="openstack/glance-default-external-api-0" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.513463 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6047fbdd-490d-4da5-ac61-58c28c7aa66c-config-data\") pod \"glance-default-external-api-0\" (UID: \"6047fbdd-490d-4da5-ac61-58c28c7aa66c\") " pod="openstack/glance-default-external-api-0" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.526489 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"6047fbdd-490d-4da5-ac61-58c28c7aa66c\") " pod="openstack/glance-default-external-api-0" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.551220 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.553002 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.569905 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.579105 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.609236 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.666766 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fb745b69-9lrkl"] Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.667140 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f46e578a-f471-4430-b7a1-095d9d295e2f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f46e578a-f471-4430-b7a1-095d9d295e2f\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.667192 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"f46e578a-f471-4430-b7a1-095d9d295e2f\") " 
pod="openstack/glance-default-internal-api-0" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.667301 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f46e578a-f471-4430-b7a1-095d9d295e2f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f46e578a-f471-4430-b7a1-095d9d295e2f\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.667347 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fnxm\" (UniqueName: \"kubernetes.io/projected/f46e578a-f471-4430-b7a1-095d9d295e2f-kube-api-access-9fnxm\") pod \"glance-default-internal-api-0\" (UID: \"f46e578a-f471-4430-b7a1-095d9d295e2f\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.667400 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f46e578a-f471-4430-b7a1-095d9d295e2f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f46e578a-f471-4430-b7a1-095d9d295e2f\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.667414 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f46e578a-f471-4430-b7a1-095d9d295e2f-logs\") pod \"glance-default-internal-api-0\" (UID: \"f46e578a-f471-4430-b7a1-095d9d295e2f\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.667464 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f46e578a-f471-4430-b7a1-095d9d295e2f-scripts\") pod \"glance-default-internal-api-0\" (UID: 
\"f46e578a-f471-4430-b7a1-095d9d295e2f\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.770473 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fnxm\" (UniqueName: \"kubernetes.io/projected/f46e578a-f471-4430-b7a1-095d9d295e2f-kube-api-access-9fnxm\") pod \"glance-default-internal-api-0\" (UID: \"f46e578a-f471-4430-b7a1-095d9d295e2f\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.770548 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f46e578a-f471-4430-b7a1-095d9d295e2f-logs\") pod \"glance-default-internal-api-0\" (UID: \"f46e578a-f471-4430-b7a1-095d9d295e2f\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.770571 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f46e578a-f471-4430-b7a1-095d9d295e2f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f46e578a-f471-4430-b7a1-095d9d295e2f\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.770612 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f46e578a-f471-4430-b7a1-095d9d295e2f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f46e578a-f471-4430-b7a1-095d9d295e2f\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.770658 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f46e578a-f471-4430-b7a1-095d9d295e2f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f46e578a-f471-4430-b7a1-095d9d295e2f\") " 
pod="openstack/glance-default-internal-api-0" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.770687 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"f46e578a-f471-4430-b7a1-095d9d295e2f\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.770761 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f46e578a-f471-4430-b7a1-095d9d295e2f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f46e578a-f471-4430-b7a1-095d9d295e2f\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.771287 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f46e578a-f471-4430-b7a1-095d9d295e2f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f46e578a-f471-4430-b7a1-095d9d295e2f\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.779990 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"f46e578a-f471-4430-b7a1-095d9d295e2f\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.780110 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f46e578a-f471-4430-b7a1-095d9d295e2f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f46e578a-f471-4430-b7a1-095d9d295e2f\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 
09:23:10.780185 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f46e578a-f471-4430-b7a1-095d9d295e2f-logs\") pod \"glance-default-internal-api-0\" (UID: \"f46e578a-f471-4430-b7a1-095d9d295e2f\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.783891 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f46e578a-f471-4430-b7a1-095d9d295e2f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f46e578a-f471-4430-b7a1-095d9d295e2f\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.794585 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f46e578a-f471-4430-b7a1-095d9d295e2f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f46e578a-f471-4430-b7a1-095d9d295e2f\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.811823 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fnxm\" (UniqueName: \"kubernetes.io/projected/f46e578a-f471-4430-b7a1-095d9d295e2f-kube-api-access-9fnxm\") pod \"glance-default-internal-api-0\" (UID: \"f46e578a-f471-4430-b7a1-095d9d295e2f\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.875981 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ns7rc" event={"ID":"913d2881-9323-4503-b364-05de889fd095","Type":"ContainerStarted","Data":"e1e837bebb9bb5b35dabc32043d3992098ffa114681c7c6d190559d35cb0ab70"} Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.893332 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"f46e578a-f471-4430-b7a1-095d9d295e2f\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.914140 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cb640491-a8e7-4f8d-b4bb-1d0124f5727f","Type":"ContainerStarted","Data":"4e8721bc80b5a348c170f5fd7935c7b093dffbb50aaede539b470601f46aaacd"} Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.914198 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cb640491-a8e7-4f8d-b4bb-1d0124f5727f","Type":"ContainerStarted","Data":"d7682a31be88e63041fb915f0d35f97f8cd12e3ea33a87346f5a5acb750bc72d"} Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.916001 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f84976bdf-n6xl6" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.916350 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-ns7rc" podStartSLOduration=9.916312785 podStartE2EDuration="9.916312785s" podCreationTimestamp="2025-12-08 09:23:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:23:10.904527819 +0000 UTC m=+1467.167752841" watchObservedRunningTime="2025-12-08 09:23:10.916312785 +0000 UTC m=+1467.179537807" Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.916936 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fb745b69-9lrkl" event={"ID":"001c9704-cf27-4a31-8a61-3e5ce2e272eb","Type":"ContainerStarted","Data":"81720cb0fb50ed1af9365af7b03959b40b1482af143ff9ec6760350a32cde2cd"} Dec 08 09:23:10 crc kubenswrapper[4776]: I1208 09:23:10.921435 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 08 09:23:11 crc kubenswrapper[4776]: I1208 09:23:11.025525 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-775f79cd-lq4qd"] Dec 08 09:23:11 crc kubenswrapper[4776]: I1208 09:23:11.057473 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f84976bdf-n6xl6"] Dec 08 09:23:11 crc kubenswrapper[4776]: I1208 09:23:11.073279 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f84976bdf-n6xl6"] Dec 08 09:23:12 crc kubenswrapper[4776]: I1208 09:23:12.354920 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea69d170-4f2e-4ce9-a7c9-46a448f33b3c" path="/var/lib/kubelet/pods/ea69d170-4f2e-4ce9-a7c9-46a448f33b3c/volumes" Dec 08 09:23:12 crc kubenswrapper[4776]: W1208 09:23:12.553849 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84196301_9fa2_4acb_9a49_d87fdb571dfe.slice/crio-64fc18eda76c24ac93cec50dfdf9324c0d1be82adbe44a3cfcf13acd8a740bb6 WatchSource:0}: Error finding container 64fc18eda76c24ac93cec50dfdf9324c0d1be82adbe44a3cfcf13acd8a740bb6: Status 404 returned error can't find the container with id 64fc18eda76c24ac93cec50dfdf9324c0d1be82adbe44a3cfcf13acd8a740bb6 Dec 08 09:23:12 crc kubenswrapper[4776]: I1208 09:23:12.935986 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-775f79cd-lq4qd" event={"ID":"84196301-9fa2-4acb-9a49-d87fdb571dfe","Type":"ContainerStarted","Data":"64fc18eda76c24ac93cec50dfdf9324c0d1be82adbe44a3cfcf13acd8a740bb6"} Dec 08 09:23:13 crc kubenswrapper[4776]: I1208 09:23:13.182971 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 08 09:23:13 crc kubenswrapper[4776]: W1208 09:23:13.185791 4776 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6047fbdd_490d_4da5_ac61_58c28c7aa66c.slice/crio-393fbc9ce23f3da8f4942b4c971b1b8098af680ee665fda9cca9120ff563c87c WatchSource:0}: Error finding container 393fbc9ce23f3da8f4942b4c971b1b8098af680ee665fda9cca9120ff563c87c: Status 404 returned error can't find the container with id 393fbc9ce23f3da8f4942b4c971b1b8098af680ee665fda9cca9120ff563c87c Dec 08 09:23:13 crc kubenswrapper[4776]: I1208 09:23:13.335341 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 08 09:23:13 crc kubenswrapper[4776]: W1208 09:23:13.342779 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf46e578a_f471_4430_b7a1_095d9d295e2f.slice/crio-b70d0b1d2c70f2ac5787fb538ec58490637d8ff7ab5a04cfbe8070d26f154b04 WatchSource:0}: Error finding container b70d0b1d2c70f2ac5787fb538ec58490637d8ff7ab5a04cfbe8070d26f154b04: Status 404 returned error can't find the container with id b70d0b1d2c70f2ac5787fb538ec58490637d8ff7ab5a04cfbe8070d26f154b04 Dec 08 09:23:13 crc kubenswrapper[4776]: I1208 09:23:13.956129 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f46e578a-f471-4430-b7a1-095d9d295e2f","Type":"ContainerStarted","Data":"b70d0b1d2c70f2ac5787fb538ec58490637d8ff7ab5a04cfbe8070d26f154b04"} Dec 08 09:23:13 crc kubenswrapper[4776]: I1208 09:23:13.966929 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cb640491-a8e7-4f8d-b4bb-1d0124f5727f","Type":"ContainerStarted","Data":"811031d425771fde5afd5d2e98fc85cc5f5db8f30c9d67d87dc0b73e3637ea3e"} Dec 08 09:23:13 crc kubenswrapper[4776]: I1208 09:23:13.969528 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"fab03865-02a6-4cd2-bf78-22ed25534301","Type":"ContainerStarted","Data":"007fff4bf45758cb30afb7d09d5451695d3f321328fa25d9b157ce8986e3268e"} Dec 08 09:23:13 crc kubenswrapper[4776]: I1208 09:23:13.975086 4776 generic.go:334] "Generic (PLEG): container finished" podID="001c9704-cf27-4a31-8a61-3e5ce2e272eb" containerID="9fe6f989e54710e3f545abe625646a7377c16128fb3ee99a4312085f9bc6cc7b" exitCode=0 Dec 08 09:23:13 crc kubenswrapper[4776]: I1208 09:23:13.975157 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fb745b69-9lrkl" event={"ID":"001c9704-cf27-4a31-8a61-3e5ce2e272eb","Type":"ContainerDied","Data":"9fe6f989e54710e3f545abe625646a7377c16128fb3ee99a4312085f9bc6cc7b"} Dec 08 09:23:13 crc kubenswrapper[4776]: I1208 09:23:13.987623 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6047fbdd-490d-4da5-ac61-58c28c7aa66c","Type":"ContainerStarted","Data":"c7f8e39f63069acaa5b38aadd14ee4f2b28616071ea662e84a1384172dd10fe6"} Dec 08 09:23:13 crc kubenswrapper[4776]: I1208 09:23:13.987985 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6047fbdd-490d-4da5-ac61-58c28c7aa66c","Type":"ContainerStarted","Data":"393fbc9ce23f3da8f4942b4c971b1b8098af680ee665fda9cca9120ff563c87c"} Dec 08 09:23:13 crc kubenswrapper[4776]: I1208 09:23:13.990722 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-775f79cd-lq4qd" event={"ID":"84196301-9fa2-4acb-9a49-d87fdb571dfe","Type":"ContainerStarted","Data":"daea7072fefcc23dac165f473d3f4cb1359947754ada16949fc11a82d8c6acac"} Dec 08 09:23:13 crc kubenswrapper[4776]: I1208 09:23:13.990864 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-775f79cd-lq4qd" event={"ID":"84196301-9fa2-4acb-9a49-d87fdb571dfe","Type":"ContainerStarted","Data":"b6e086130db2fd5957987439f44ccd67cfe0b9761b9f5b4c3f382d70be798431"} Dec 08 09:23:13 crc 
kubenswrapper[4776]: I1208 09:23:13.992125 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-775f79cd-lq4qd" Dec 08 09:23:14 crc kubenswrapper[4776]: I1208 09:23:14.043968 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-775f79cd-lq4qd" podStartSLOduration=5.043949904 podStartE2EDuration="5.043949904s" podCreationTimestamp="2025-12-08 09:23:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:23:14.030914034 +0000 UTC m=+1470.294139066" watchObservedRunningTime="2025-12-08 09:23:14.043949904 +0000 UTC m=+1470.307174926" Dec 08 09:23:14 crc kubenswrapper[4776]: I1208 09:23:14.449241 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 08 09:23:14 crc kubenswrapper[4776]: I1208 09:23:14.513151 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 08 09:23:14 crc kubenswrapper[4776]: I1208 09:23:14.698218 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5fd69d7-r446k"] Dec 08 09:23:14 crc kubenswrapper[4776]: I1208 09:23:14.700160 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5fd69d7-r446k" Dec 08 09:23:14 crc kubenswrapper[4776]: I1208 09:23:14.701862 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 08 09:23:14 crc kubenswrapper[4776]: I1208 09:23:14.702088 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 08 09:23:14 crc kubenswrapper[4776]: I1208 09:23:14.749761 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5fd69d7-r446k"] Dec 08 09:23:14 crc kubenswrapper[4776]: I1208 09:23:14.888788 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f84e2e46-bb9f-4b55-afd1-683f365c5417-config\") pod \"neutron-5fd69d7-r446k\" (UID: \"f84e2e46-bb9f-4b55-afd1-683f365c5417\") " pod="openstack/neutron-5fd69d7-r446k" Dec 08 09:23:14 crc kubenswrapper[4776]: I1208 09:23:14.888872 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f84e2e46-bb9f-4b55-afd1-683f365c5417-internal-tls-certs\") pod \"neutron-5fd69d7-r446k\" (UID: \"f84e2e46-bb9f-4b55-afd1-683f365c5417\") " pod="openstack/neutron-5fd69d7-r446k" Dec 08 09:23:14 crc kubenswrapper[4776]: I1208 09:23:14.888922 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f84e2e46-bb9f-4b55-afd1-683f365c5417-httpd-config\") pod \"neutron-5fd69d7-r446k\" (UID: \"f84e2e46-bb9f-4b55-afd1-683f365c5417\") " pod="openstack/neutron-5fd69d7-r446k" Dec 08 09:23:14 crc kubenswrapper[4776]: I1208 09:23:14.888979 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnb7q\" (UniqueName: 
\"kubernetes.io/projected/f84e2e46-bb9f-4b55-afd1-683f365c5417-kube-api-access-wnb7q\") pod \"neutron-5fd69d7-r446k\" (UID: \"f84e2e46-bb9f-4b55-afd1-683f365c5417\") " pod="openstack/neutron-5fd69d7-r446k" Dec 08 09:23:14 crc kubenswrapper[4776]: I1208 09:23:14.888998 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f84e2e46-bb9f-4b55-afd1-683f365c5417-ovndb-tls-certs\") pod \"neutron-5fd69d7-r446k\" (UID: \"f84e2e46-bb9f-4b55-afd1-683f365c5417\") " pod="openstack/neutron-5fd69d7-r446k" Dec 08 09:23:14 crc kubenswrapper[4776]: I1208 09:23:14.889015 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f84e2e46-bb9f-4b55-afd1-683f365c5417-public-tls-certs\") pod \"neutron-5fd69d7-r446k\" (UID: \"f84e2e46-bb9f-4b55-afd1-683f365c5417\") " pod="openstack/neutron-5fd69d7-r446k" Dec 08 09:23:14 crc kubenswrapper[4776]: I1208 09:23:14.889092 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f84e2e46-bb9f-4b55-afd1-683f365c5417-combined-ca-bundle\") pod \"neutron-5fd69d7-r446k\" (UID: \"f84e2e46-bb9f-4b55-afd1-683f365c5417\") " pod="openstack/neutron-5fd69d7-r446k" Dec 08 09:23:14 crc kubenswrapper[4776]: I1208 09:23:14.991088 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f84e2e46-bb9f-4b55-afd1-683f365c5417-combined-ca-bundle\") pod \"neutron-5fd69d7-r446k\" (UID: \"f84e2e46-bb9f-4b55-afd1-683f365c5417\") " pod="openstack/neutron-5fd69d7-r446k" Dec 08 09:23:14 crc kubenswrapper[4776]: I1208 09:23:14.991206 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/f84e2e46-bb9f-4b55-afd1-683f365c5417-config\") pod \"neutron-5fd69d7-r446k\" (UID: \"f84e2e46-bb9f-4b55-afd1-683f365c5417\") " pod="openstack/neutron-5fd69d7-r446k" Dec 08 09:23:14 crc kubenswrapper[4776]: I1208 09:23:14.991253 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f84e2e46-bb9f-4b55-afd1-683f365c5417-internal-tls-certs\") pod \"neutron-5fd69d7-r446k\" (UID: \"f84e2e46-bb9f-4b55-afd1-683f365c5417\") " pod="openstack/neutron-5fd69d7-r446k" Dec 08 09:23:14 crc kubenswrapper[4776]: I1208 09:23:14.991298 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f84e2e46-bb9f-4b55-afd1-683f365c5417-httpd-config\") pod \"neutron-5fd69d7-r446k\" (UID: \"f84e2e46-bb9f-4b55-afd1-683f365c5417\") " pod="openstack/neutron-5fd69d7-r446k" Dec 08 09:23:14 crc kubenswrapper[4776]: I1208 09:23:14.991353 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnb7q\" (UniqueName: \"kubernetes.io/projected/f84e2e46-bb9f-4b55-afd1-683f365c5417-kube-api-access-wnb7q\") pod \"neutron-5fd69d7-r446k\" (UID: \"f84e2e46-bb9f-4b55-afd1-683f365c5417\") " pod="openstack/neutron-5fd69d7-r446k" Dec 08 09:23:14 crc kubenswrapper[4776]: I1208 09:23:14.991374 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f84e2e46-bb9f-4b55-afd1-683f365c5417-ovndb-tls-certs\") pod \"neutron-5fd69d7-r446k\" (UID: \"f84e2e46-bb9f-4b55-afd1-683f365c5417\") " pod="openstack/neutron-5fd69d7-r446k" Dec 08 09:23:14 crc kubenswrapper[4776]: I1208 09:23:14.991389 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f84e2e46-bb9f-4b55-afd1-683f365c5417-public-tls-certs\") pod \"neutron-5fd69d7-r446k\" (UID: 
\"f84e2e46-bb9f-4b55-afd1-683f365c5417\") " pod="openstack/neutron-5fd69d7-r446k" Dec 08 09:23:15 crc kubenswrapper[4776]: I1208 09:23:15.011472 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f46e578a-f471-4430-b7a1-095d9d295e2f","Type":"ContainerStarted","Data":"7c78049ebe3960a62fc3dddd323f61e860c19cb972414e3775dd58236c6f6259"} Dec 08 09:23:15 crc kubenswrapper[4776]: I1208 09:23:15.012208 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f46e578a-f471-4430-b7a1-095d9d295e2f","Type":"ContainerStarted","Data":"7a1a0a69cc07ddc7c2a0dd0eefec87ba8c4022d12b7f3b009677b04aa0ef0358"} Dec 08 09:23:15 crc kubenswrapper[4776]: I1208 09:23:15.012228 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f46e578a-f471-4430-b7a1-095d9d295e2f" containerName="glance-httpd" containerID="cri-o://7c78049ebe3960a62fc3dddd323f61e860c19cb972414e3775dd58236c6f6259" gracePeriod=30 Dec 08 09:23:15 crc kubenswrapper[4776]: I1208 09:23:15.011775 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f46e578a-f471-4430-b7a1-095d9d295e2f" containerName="glance-log" containerID="cri-o://7a1a0a69cc07ddc7c2a0dd0eefec87ba8c4022d12b7f3b009677b04aa0ef0358" gracePeriod=30 Dec 08 09:23:15 crc kubenswrapper[4776]: I1208 09:23:15.015207 4776 generic.go:334] "Generic (PLEG): container finished" podID="913d2881-9323-4503-b364-05de889fd095" containerID="e1e837bebb9bb5b35dabc32043d3992098ffa114681c7c6d190559d35cb0ab70" exitCode=0 Dec 08 09:23:15 crc kubenswrapper[4776]: I1208 09:23:15.015337 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ns7rc" event={"ID":"913d2881-9323-4503-b364-05de889fd095","Type":"ContainerDied","Data":"e1e837bebb9bb5b35dabc32043d3992098ffa114681c7c6d190559d35cb0ab70"} 
Dec 08 09:23:15 crc kubenswrapper[4776]: I1208 09:23:15.017112 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f84e2e46-bb9f-4b55-afd1-683f365c5417-public-tls-certs\") pod \"neutron-5fd69d7-r446k\" (UID: \"f84e2e46-bb9f-4b55-afd1-683f365c5417\") " pod="openstack/neutron-5fd69d7-r446k" Dec 08 09:23:15 crc kubenswrapper[4776]: I1208 09:23:15.018776 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f84e2e46-bb9f-4b55-afd1-683f365c5417-httpd-config\") pod \"neutron-5fd69d7-r446k\" (UID: \"f84e2e46-bb9f-4b55-afd1-683f365c5417\") " pod="openstack/neutron-5fd69d7-r446k" Dec 08 09:23:15 crc kubenswrapper[4776]: I1208 09:23:15.019283 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f84e2e46-bb9f-4b55-afd1-683f365c5417-config\") pod \"neutron-5fd69d7-r446k\" (UID: \"f84e2e46-bb9f-4b55-afd1-683f365c5417\") " pod="openstack/neutron-5fd69d7-r446k" Dec 08 09:23:15 crc kubenswrapper[4776]: I1208 09:23:15.019796 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fb745b69-9lrkl" event={"ID":"001c9704-cf27-4a31-8a61-3e5ce2e272eb","Type":"ContainerStarted","Data":"6ac391ca9825822a2c218f1f30e3333d2154ab78fce7cfd6da43a7327afe0a0d"} Dec 08 09:23:15 crc kubenswrapper[4776]: I1208 09:23:15.020699 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-fb745b69-9lrkl" Dec 08 09:23:15 crc kubenswrapper[4776]: I1208 09:23:15.024709 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6047fbdd-490d-4da5-ac61-58c28c7aa66c","Type":"ContainerStarted","Data":"016444857ae23105a9f384ccf00f95aa8bef76e204cf26929ccc92716629124d"} Dec 08 09:23:15 crc kubenswrapper[4776]: I1208 09:23:15.025101 4776 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/glance-default-external-api-0" podUID="6047fbdd-490d-4da5-ac61-58c28c7aa66c" containerName="glance-log" containerID="cri-o://c7f8e39f63069acaa5b38aadd14ee4f2b28616071ea662e84a1384172dd10fe6" gracePeriod=30 Dec 08 09:23:15 crc kubenswrapper[4776]: I1208 09:23:15.025317 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6047fbdd-490d-4da5-ac61-58c28c7aa66c" containerName="glance-httpd" containerID="cri-o://016444857ae23105a9f384ccf00f95aa8bef76e204cf26929ccc92716629124d" gracePeriod=30 Dec 08 09:23:15 crc kubenswrapper[4776]: I1208 09:23:15.033831 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f84e2e46-bb9f-4b55-afd1-683f365c5417-internal-tls-certs\") pod \"neutron-5fd69d7-r446k\" (UID: \"f84e2e46-bb9f-4b55-afd1-683f365c5417\") " pod="openstack/neutron-5fd69d7-r446k" Dec 08 09:23:15 crc kubenswrapper[4776]: I1208 09:23:15.034004 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f84e2e46-bb9f-4b55-afd1-683f365c5417-combined-ca-bundle\") pod \"neutron-5fd69d7-r446k\" (UID: \"f84e2e46-bb9f-4b55-afd1-683f365c5417\") " pod="openstack/neutron-5fd69d7-r446k" Dec 08 09:23:15 crc kubenswrapper[4776]: I1208 09:23:15.034196 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnb7q\" (UniqueName: \"kubernetes.io/projected/f84e2e46-bb9f-4b55-afd1-683f365c5417-kube-api-access-wnb7q\") pod \"neutron-5fd69d7-r446k\" (UID: \"f84e2e46-bb9f-4b55-afd1-683f365c5417\") " pod="openstack/neutron-5fd69d7-r446k" Dec 08 09:23:15 crc kubenswrapper[4776]: I1208 09:23:15.055067 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.055048793 podStartE2EDuration="6.055048793s" 
podCreationTimestamp="2025-12-08 09:23:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:23:15.05160864 +0000 UTC m=+1471.314833662" watchObservedRunningTime="2025-12-08 09:23:15.055048793 +0000 UTC m=+1471.318273815" Dec 08 09:23:15 crc kubenswrapper[4776]: I1208 09:23:15.059526 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f84e2e46-bb9f-4b55-afd1-683f365c5417-ovndb-tls-certs\") pod \"neutron-5fd69d7-r446k\" (UID: \"f84e2e46-bb9f-4b55-afd1-683f365c5417\") " pod="openstack/neutron-5fd69d7-r446k" Dec 08 09:23:15 crc kubenswrapper[4776]: I1208 09:23:15.060008 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5fd69d7-r446k" Dec 08 09:23:15 crc kubenswrapper[4776]: I1208 09:23:15.102968 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.102949958 podStartE2EDuration="6.102949958s" podCreationTimestamp="2025-12-08 09:23:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:23:15.0888642 +0000 UTC m=+1471.352089222" watchObservedRunningTime="2025-12-08 09:23:15.102949958 +0000 UTC m=+1471.366174980" Dec 08 09:23:15 crc kubenswrapper[4776]: I1208 09:23:15.117350 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-fb745b69-9lrkl" podStartSLOduration=6.117335604 podStartE2EDuration="6.117335604s" podCreationTimestamp="2025-12-08 09:23:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:23:15.115285339 +0000 UTC m=+1471.378510361" watchObservedRunningTime="2025-12-08 09:23:15.117335604 +0000 UTC 
m=+1471.380560626" Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.053953 4776 generic.go:334] "Generic (PLEG): container finished" podID="f46e578a-f471-4430-b7a1-095d9d295e2f" containerID="7c78049ebe3960a62fc3dddd323f61e860c19cb972414e3775dd58236c6f6259" exitCode=143 Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.054045 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f46e578a-f471-4430-b7a1-095d9d295e2f","Type":"ContainerDied","Data":"7c78049ebe3960a62fc3dddd323f61e860c19cb972414e3775dd58236c6f6259"} Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.056020 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f46e578a-f471-4430-b7a1-095d9d295e2f","Type":"ContainerDied","Data":"7a1a0a69cc07ddc7c2a0dd0eefec87ba8c4022d12b7f3b009677b04aa0ef0358"} Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.056262 4776 generic.go:334] "Generic (PLEG): container finished" podID="f46e578a-f471-4430-b7a1-095d9d295e2f" containerID="7a1a0a69cc07ddc7c2a0dd0eefec87ba8c4022d12b7f3b009677b04aa0ef0358" exitCode=143 Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.061939 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cb640491-a8e7-4f8d-b4bb-1d0124f5727f","Type":"ContainerStarted","Data":"79a25671e82595ed518d52926a53051583b59d85827855e4133c686f91354556"} Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.077102 4776 generic.go:334] "Generic (PLEG): container finished" podID="6047fbdd-490d-4da5-ac61-58c28c7aa66c" containerID="016444857ae23105a9f384ccf00f95aa8bef76e204cf26929ccc92716629124d" exitCode=0 Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.077286 4776 generic.go:334] "Generic (PLEG): container finished" podID="6047fbdd-490d-4da5-ac61-58c28c7aa66c" containerID="c7f8e39f63069acaa5b38aadd14ee4f2b28616071ea662e84a1384172dd10fe6" exitCode=143 Dec 08 09:23:16 crc 
kubenswrapper[4776]: I1208 09:23:16.077589 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6047fbdd-490d-4da5-ac61-58c28c7aa66c","Type":"ContainerDied","Data":"016444857ae23105a9f384ccf00f95aa8bef76e204cf26929ccc92716629124d"} Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.077708 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6047fbdd-490d-4da5-ac61-58c28c7aa66c","Type":"ContainerDied","Data":"c7f8e39f63069acaa5b38aadd14ee4f2b28616071ea662e84a1384172dd10fe6"} Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.237092 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.239090 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f46e578a-f471-4430-b7a1-095d9d295e2f-config-data\") pod \"f46e578a-f471-4430-b7a1-095d9d295e2f\" (UID: \"f46e578a-f471-4430-b7a1-095d9d295e2f\") " Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.239146 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f46e578a-f471-4430-b7a1-095d9d295e2f-httpd-run\") pod \"f46e578a-f471-4430-b7a1-095d9d295e2f\" (UID: \"f46e578a-f471-4430-b7a1-095d9d295e2f\") " Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.239213 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f46e578a-f471-4430-b7a1-095d9d295e2f-logs\") pod \"f46e578a-f471-4430-b7a1-095d9d295e2f\" (UID: \"f46e578a-f471-4430-b7a1-095d9d295e2f\") " Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.239286 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fnxm\" 
(UniqueName: \"kubernetes.io/projected/f46e578a-f471-4430-b7a1-095d9d295e2f-kube-api-access-9fnxm\") pod \"f46e578a-f471-4430-b7a1-095d9d295e2f\" (UID: \"f46e578a-f471-4430-b7a1-095d9d295e2f\") " Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.239311 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f46e578a-f471-4430-b7a1-095d9d295e2f-combined-ca-bundle\") pod \"f46e578a-f471-4430-b7a1-095d9d295e2f\" (UID: \"f46e578a-f471-4430-b7a1-095d9d295e2f\") " Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.239326 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"f46e578a-f471-4430-b7a1-095d9d295e2f\" (UID: \"f46e578a-f471-4430-b7a1-095d9d295e2f\") " Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.239342 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f46e578a-f471-4430-b7a1-095d9d295e2f-scripts\") pod \"f46e578a-f471-4430-b7a1-095d9d295e2f\" (UID: \"f46e578a-f471-4430-b7a1-095d9d295e2f\") " Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.241249 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f46e578a-f471-4430-b7a1-095d9d295e2f-logs" (OuterVolumeSpecName: "logs") pod "f46e578a-f471-4430-b7a1-095d9d295e2f" (UID: "f46e578a-f471-4430-b7a1-095d9d295e2f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.241439 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f46e578a-f471-4430-b7a1-095d9d295e2f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f46e578a-f471-4430-b7a1-095d9d295e2f" (UID: "f46e578a-f471-4430-b7a1-095d9d295e2f"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.245188 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "f46e578a-f471-4430-b7a1-095d9d295e2f" (UID: "f46e578a-f471-4430-b7a1-095d9d295e2f"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.248820 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f46e578a-f471-4430-b7a1-095d9d295e2f-kube-api-access-9fnxm" (OuterVolumeSpecName: "kube-api-access-9fnxm") pod "f46e578a-f471-4430-b7a1-095d9d295e2f" (UID: "f46e578a-f471-4430-b7a1-095d9d295e2f"). InnerVolumeSpecName "kube-api-access-9fnxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.251309 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f46e578a-f471-4430-b7a1-095d9d295e2f-scripts" (OuterVolumeSpecName: "scripts") pod "f46e578a-f471-4430-b7a1-095d9d295e2f" (UID: "f46e578a-f471-4430-b7a1-095d9d295e2f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.295164 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f46e578a-f471-4430-b7a1-095d9d295e2f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f46e578a-f471-4430-b7a1-095d9d295e2f" (UID: "f46e578a-f471-4430-b7a1-095d9d295e2f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.326899 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f46e578a-f471-4430-b7a1-095d9d295e2f-config-data" (OuterVolumeSpecName: "config-data") pod "f46e578a-f471-4430-b7a1-095d9d295e2f" (UID: "f46e578a-f471-4430-b7a1-095d9d295e2f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.342129 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f46e578a-f471-4430-b7a1-095d9d295e2f-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.342156 4776 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f46e578a-f471-4430-b7a1-095d9d295e2f-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.342169 4776 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f46e578a-f471-4430-b7a1-095d9d295e2f-logs\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.342211 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fnxm\" (UniqueName: \"kubernetes.io/projected/f46e578a-f471-4430-b7a1-095d9d295e2f-kube-api-access-9fnxm\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.342224 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f46e578a-f471-4430-b7a1-095d9d295e2f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.342261 4776 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.342271 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f46e578a-f471-4430-b7a1-095d9d295e2f-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.364068 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.380442 4776 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.443693 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6047fbdd-490d-4da5-ac61-58c28c7aa66c-httpd-run\") pod \"6047fbdd-490d-4da5-ac61-58c28c7aa66c\" (UID: \"6047fbdd-490d-4da5-ac61-58c28c7aa66c\") " Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.443785 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lx875\" (UniqueName: \"kubernetes.io/projected/6047fbdd-490d-4da5-ac61-58c28c7aa66c-kube-api-access-lx875\") pod \"6047fbdd-490d-4da5-ac61-58c28c7aa66c\" (UID: \"6047fbdd-490d-4da5-ac61-58c28c7aa66c\") " Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.443829 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"6047fbdd-490d-4da5-ac61-58c28c7aa66c\" (UID: \"6047fbdd-490d-4da5-ac61-58c28c7aa66c\") " Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.443897 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/6047fbdd-490d-4da5-ac61-58c28c7aa66c-logs\") pod \"6047fbdd-490d-4da5-ac61-58c28c7aa66c\" (UID: \"6047fbdd-490d-4da5-ac61-58c28c7aa66c\") " Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.443998 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6047fbdd-490d-4da5-ac61-58c28c7aa66c-scripts\") pod \"6047fbdd-490d-4da5-ac61-58c28c7aa66c\" (UID: \"6047fbdd-490d-4da5-ac61-58c28c7aa66c\") " Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.444038 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6047fbdd-490d-4da5-ac61-58c28c7aa66c-combined-ca-bundle\") pod \"6047fbdd-490d-4da5-ac61-58c28c7aa66c\" (UID: \"6047fbdd-490d-4da5-ac61-58c28c7aa66c\") " Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.444118 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6047fbdd-490d-4da5-ac61-58c28c7aa66c-config-data\") pod \"6047fbdd-490d-4da5-ac61-58c28c7aa66c\" (UID: \"6047fbdd-490d-4da5-ac61-58c28c7aa66c\") " Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.444601 4776 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.451102 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6047fbdd-490d-4da5-ac61-58c28c7aa66c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6047fbdd-490d-4da5-ac61-58c28c7aa66c" (UID: "6047fbdd-490d-4da5-ac61-58c28c7aa66c"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.451329 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6047fbdd-490d-4da5-ac61-58c28c7aa66c-logs" (OuterVolumeSpecName: "logs") pod "6047fbdd-490d-4da5-ac61-58c28c7aa66c" (UID: "6047fbdd-490d-4da5-ac61-58c28c7aa66c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.454440 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6047fbdd-490d-4da5-ac61-58c28c7aa66c-scripts" (OuterVolumeSpecName: "scripts") pod "6047fbdd-490d-4da5-ac61-58c28c7aa66c" (UID: "6047fbdd-490d-4da5-ac61-58c28c7aa66c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.454158 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "6047fbdd-490d-4da5-ac61-58c28c7aa66c" (UID: "6047fbdd-490d-4da5-ac61-58c28c7aa66c"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.457132 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6047fbdd-490d-4da5-ac61-58c28c7aa66c-kube-api-access-lx875" (OuterVolumeSpecName: "kube-api-access-lx875") pod "6047fbdd-490d-4da5-ac61-58c28c7aa66c" (UID: "6047fbdd-490d-4da5-ac61-58c28c7aa66c"). InnerVolumeSpecName "kube-api-access-lx875". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.498281 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6047fbdd-490d-4da5-ac61-58c28c7aa66c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6047fbdd-490d-4da5-ac61-58c28c7aa66c" (UID: "6047fbdd-490d-4da5-ac61-58c28c7aa66c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.547042 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6047fbdd-490d-4da5-ac61-58c28c7aa66c-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.547081 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6047fbdd-490d-4da5-ac61-58c28c7aa66c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.547093 4776 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6047fbdd-490d-4da5-ac61-58c28c7aa66c-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.547105 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lx875\" (UniqueName: \"kubernetes.io/projected/6047fbdd-490d-4da5-ac61-58c28c7aa66c-kube-api-access-lx875\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.547131 4776 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.547143 4776 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/6047fbdd-490d-4da5-ac61-58c28c7aa66c-logs\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.567603 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5fd69d7-r446k"] Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.624668 4776 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.648954 4776 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.655733 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6047fbdd-490d-4da5-ac61-58c28c7aa66c-config-data" (OuterVolumeSpecName: "config-data") pod "6047fbdd-490d-4da5-ac61-58c28c7aa66c" (UID: "6047fbdd-490d-4da5-ac61-58c28c7aa66c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.751000 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6047fbdd-490d-4da5-ac61-58c28c7aa66c-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:16 crc kubenswrapper[4776]: I1208 09:23:16.924211 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-ns7rc" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.059199 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/913d2881-9323-4503-b364-05de889fd095-fernet-keys\") pod \"913d2881-9323-4503-b364-05de889fd095\" (UID: \"913d2881-9323-4503-b364-05de889fd095\") " Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.059816 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fq2d\" (UniqueName: \"kubernetes.io/projected/913d2881-9323-4503-b364-05de889fd095-kube-api-access-8fq2d\") pod \"913d2881-9323-4503-b364-05de889fd095\" (UID: \"913d2881-9323-4503-b364-05de889fd095\") " Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.059894 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/913d2881-9323-4503-b364-05de889fd095-combined-ca-bundle\") pod \"913d2881-9323-4503-b364-05de889fd095\" (UID: \"913d2881-9323-4503-b364-05de889fd095\") " Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.059953 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/913d2881-9323-4503-b364-05de889fd095-scripts\") pod \"913d2881-9323-4503-b364-05de889fd095\" (UID: \"913d2881-9323-4503-b364-05de889fd095\") " Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.060048 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/913d2881-9323-4503-b364-05de889fd095-config-data\") pod \"913d2881-9323-4503-b364-05de889fd095\" (UID: \"913d2881-9323-4503-b364-05de889fd095\") " Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.060403 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/913d2881-9323-4503-b364-05de889fd095-credential-keys\") pod \"913d2881-9323-4503-b364-05de889fd095\" (UID: \"913d2881-9323-4503-b364-05de889fd095\") " Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.068147 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/913d2881-9323-4503-b364-05de889fd095-kube-api-access-8fq2d" (OuterVolumeSpecName: "kube-api-access-8fq2d") pod "913d2881-9323-4503-b364-05de889fd095" (UID: "913d2881-9323-4503-b364-05de889fd095"). InnerVolumeSpecName "kube-api-access-8fq2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.070059 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/913d2881-9323-4503-b364-05de889fd095-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "913d2881-9323-4503-b364-05de889fd095" (UID: "913d2881-9323-4503-b364-05de889fd095"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.071116 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/913d2881-9323-4503-b364-05de889fd095-scripts" (OuterVolumeSpecName: "scripts") pod "913d2881-9323-4503-b364-05de889fd095" (UID: "913d2881-9323-4503-b364-05de889fd095"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.071315 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/913d2881-9323-4503-b364-05de889fd095-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "913d2881-9323-4503-b364-05de889fd095" (UID: "913d2881-9323-4503-b364-05de889fd095"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.097411 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/913d2881-9323-4503-b364-05de889fd095-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "913d2881-9323-4503-b364-05de889fd095" (UID: "913d2881-9323-4503-b364-05de889fd095"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.115074 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6047fbdd-490d-4da5-ac61-58c28c7aa66c","Type":"ContainerDied","Data":"393fbc9ce23f3da8f4942b4c971b1b8098af680ee665fda9cca9120ff563c87c"} Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.115122 4776 scope.go:117] "RemoveContainer" containerID="016444857ae23105a9f384ccf00f95aa8bef76e204cf26929ccc92716629124d" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.115250 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.117381 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/913d2881-9323-4503-b364-05de889fd095-config-data" (OuterVolumeSpecName: "config-data") pod "913d2881-9323-4503-b364-05de889fd095" (UID: "913d2881-9323-4503-b364-05de889fd095"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.134412 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f46e578a-f471-4430-b7a1-095d9d295e2f","Type":"ContainerDied","Data":"b70d0b1d2c70f2ac5787fb538ec58490637d8ff7ab5a04cfbe8070d26f154b04"} Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.134528 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.139842 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5fd69d7-r446k" event={"ID":"f84e2e46-bb9f-4b55-afd1-683f365c5417","Type":"ContainerStarted","Data":"393c035c806244ee8904fade9b56ff8ee0bfa6aa685ea46aa3cd7b0a21aa76b5"} Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.142820 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ns7rc" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.143060 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ns7rc" event={"ID":"913d2881-9323-4503-b364-05de889fd095","Type":"ContainerDied","Data":"44926e60f286229e0328f2084263b89c80c36aa4f67695a2f6e07e07667a89cc"} Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.143086 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44926e60f286229e0328f2084263b89c80c36aa4f67695a2f6e07e07667a89cc" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.165027 4776 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/913d2881-9323-4503-b364-05de889fd095-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.165056 4776 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/913d2881-9323-4503-b364-05de889fd095-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.165068 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fq2d\" (UniqueName: \"kubernetes.io/projected/913d2881-9323-4503-b364-05de889fd095-kube-api-access-8fq2d\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.165087 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/913d2881-9323-4503-b364-05de889fd095-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.165099 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/913d2881-9323-4503-b364-05de889fd095-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.165113 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/913d2881-9323-4503-b364-05de889fd095-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.173981 4776 scope.go:117] "RemoveContainer" containerID="c7f8e39f63069acaa5b38aadd14ee4f2b28616071ea662e84a1384172dd10fe6" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.188944 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cb640491-a8e7-4f8d-b4bb-1d0124f5727f","Type":"ContainerStarted","Data":"0d566e5db7f2345e8e625958067aef3d77449de834cb581bafd51d4698e1f1f6"} Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.188991 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cb640491-a8e7-4f8d-b4bb-1d0124f5727f","Type":"ContainerStarted","Data":"66f4413fc43936e3e6e972ff172d4ecde6f4adaf4b86e43deba14a123654126d"} Dec 08 09:23:17 crc kubenswrapper[4776]: 
I1208 09:23:17.189003 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cb640491-a8e7-4f8d-b4bb-1d0124f5727f","Type":"ContainerStarted","Data":"5e0da7473f2d8be41f53475c42f6e4c70f32ac385c5387107565e74df88fe8a1"} Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.236038 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.251018 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.302129 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.317524 4776 scope.go:117] "RemoveContainer" containerID="7c78049ebe3960a62fc3dddd323f61e860c19cb972414e3775dd58236c6f6259" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.341230 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.352807 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 08 09:23:17 crc kubenswrapper[4776]: E1208 09:23:17.353381 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f46e578a-f471-4430-b7a1-095d9d295e2f" containerName="glance-httpd" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.353405 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f46e578a-f471-4430-b7a1-095d9d295e2f" containerName="glance-httpd" Dec 08 09:23:17 crc kubenswrapper[4776]: E1208 09:23:17.353439 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f46e578a-f471-4430-b7a1-095d9d295e2f" containerName="glance-log" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.353448 4776 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f46e578a-f471-4430-b7a1-095d9d295e2f" containerName="glance-log" Dec 08 09:23:17 crc kubenswrapper[4776]: E1208 09:23:17.353470 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6047fbdd-490d-4da5-ac61-58c28c7aa66c" containerName="glance-log" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.353477 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="6047fbdd-490d-4da5-ac61-58c28c7aa66c" containerName="glance-log" Dec 08 09:23:17 crc kubenswrapper[4776]: E1208 09:23:17.353500 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6047fbdd-490d-4da5-ac61-58c28c7aa66c" containerName="glance-httpd" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.353507 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="6047fbdd-490d-4da5-ac61-58c28c7aa66c" containerName="glance-httpd" Dec 08 09:23:17 crc kubenswrapper[4776]: E1208 09:23:17.353519 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="913d2881-9323-4503-b364-05de889fd095" containerName="keystone-bootstrap" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.353528 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="913d2881-9323-4503-b364-05de889fd095" containerName="keystone-bootstrap" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.353723 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="f46e578a-f471-4430-b7a1-095d9d295e2f" containerName="glance-httpd" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.353743 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="f46e578a-f471-4430-b7a1-095d9d295e2f" containerName="glance-log" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.353756 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="6047fbdd-490d-4da5-ac61-58c28c7aa66c" containerName="glance-httpd" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.353771 4776 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6047fbdd-490d-4da5-ac61-58c28c7aa66c" containerName="glance-log" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.353779 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="913d2881-9323-4503-b364-05de889fd095" containerName="keystone-bootstrap" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.359892 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.366511 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.368581 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.381798 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.381832 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.381917 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-hszn2" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.382142 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.382207 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.382295 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.394772 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/glance-default-internal-api-0"] Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.430229 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.441738 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-77496dd4f7-8gxmg"] Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.443399 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-77496dd4f7-8gxmg" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.444864 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.445117 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.445368 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.445527 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-wz4bj" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.445742 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.445884 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.450981 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-77496dd4f7-8gxmg"] Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.473671 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0-public-tls-certs\") pod 
\"glance-default-external-api-0\" (UID: \"8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0\") " pod="openstack/glance-default-external-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.473715 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd19b615-0c04-4fd9-968c-dceb17256b34-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cd19b615-0c04-4fd9-968c-dceb17256b34\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.473738 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd19b615-0c04-4fd9-968c-dceb17256b34-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cd19b615-0c04-4fd9-968c-dceb17256b34\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.473909 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0\") " pod="openstack/glance-default-external-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.474019 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b2pg\" (UniqueName: \"kubernetes.io/projected/cd19b615-0c04-4fd9-968c-dceb17256b34-kube-api-access-8b2pg\") pod \"glance-default-internal-api-0\" (UID: \"cd19b615-0c04-4fd9-968c-dceb17256b34\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.474058 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0-logs\") pod \"glance-default-external-api-0\" (UID: \"8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0\") " pod="openstack/glance-default-external-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.474075 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd19b615-0c04-4fd9-968c-dceb17256b34-logs\") pod \"glance-default-internal-api-0\" (UID: \"cd19b615-0c04-4fd9-968c-dceb17256b34\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.474102 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0\") " pod="openstack/glance-default-external-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.474315 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmzxr\" (UniqueName: \"kubernetes.io/projected/8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0-kube-api-access-qmzxr\") pod \"glance-default-external-api-0\" (UID: \"8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0\") " pod="openstack/glance-default-external-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.474405 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd19b615-0c04-4fd9-968c-dceb17256b34-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cd19b615-0c04-4fd9-968c-dceb17256b34\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.474488 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cd19b615-0c04-4fd9-968c-dceb17256b34-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cd19b615-0c04-4fd9-968c-dceb17256b34\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.474622 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"cd19b615-0c04-4fd9-968c-dceb17256b34\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.474667 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd19b615-0c04-4fd9-968c-dceb17256b34-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cd19b615-0c04-4fd9-968c-dceb17256b34\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.474706 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0-scripts\") pod \"glance-default-external-api-0\" (UID: \"8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0\") " pod="openstack/glance-default-external-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.474771 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0-config-data\") pod \"glance-default-external-api-0\" (UID: \"8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0\") " pod="openstack/glance-default-external-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.474794 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0\") " pod="openstack/glance-default-external-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.493369 4776 scope.go:117] "RemoveContainer" containerID="7a1a0a69cc07ddc7c2a0dd0eefec87ba8c4022d12b7f3b009677b04aa0ef0358" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.577020 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0\") " pod="openstack/glance-default-external-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.577094 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6207b5a9-d7b8-4302-876c-c2a84bb352a1-internal-tls-certs\") pod \"keystone-77496dd4f7-8gxmg\" (UID: \"6207b5a9-d7b8-4302-876c-c2a84bb352a1\") " pod="openstack/keystone-77496dd4f7-8gxmg" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.577127 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b2pg\" (UniqueName: \"kubernetes.io/projected/cd19b615-0c04-4fd9-968c-dceb17256b34-kube-api-access-8b2pg\") pod \"glance-default-internal-api-0\" (UID: \"cd19b615-0c04-4fd9-968c-dceb17256b34\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.577149 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0-logs\") pod \"glance-default-external-api-0\" (UID: \"8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0\") " pod="openstack/glance-default-external-api-0" Dec 08 09:23:17 crc 
kubenswrapper[4776]: I1208 09:23:17.577181 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd19b615-0c04-4fd9-968c-dceb17256b34-logs\") pod \"glance-default-internal-api-0\" (UID: \"cd19b615-0c04-4fd9-968c-dceb17256b34\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.577692 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0\") " pod="openstack/glance-default-external-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.578308 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0-logs\") pod \"glance-default-external-api-0\" (UID: \"8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0\") " pod="openstack/glance-default-external-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.578356 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6207b5a9-d7b8-4302-876c-c2a84bb352a1-fernet-keys\") pod \"keystone-77496dd4f7-8gxmg\" (UID: \"6207b5a9-d7b8-4302-876c-c2a84bb352a1\") " pod="openstack/keystone-77496dd4f7-8gxmg" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.578383 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0\") " pod="openstack/glance-default-external-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.578503 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6207b5a9-d7b8-4302-876c-c2a84bb352a1-credential-keys\") pod \"keystone-77496dd4f7-8gxmg\" (UID: \"6207b5a9-d7b8-4302-876c-c2a84bb352a1\") " pod="openstack/keystone-77496dd4f7-8gxmg" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.578641 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmzxr\" (UniqueName: \"kubernetes.io/projected/8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0-kube-api-access-qmzxr\") pod \"glance-default-external-api-0\" (UID: \"8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0\") " pod="openstack/glance-default-external-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.578682 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tx46\" (UniqueName: \"kubernetes.io/projected/6207b5a9-d7b8-4302-876c-c2a84bb352a1-kube-api-access-9tx46\") pod \"keystone-77496dd4f7-8gxmg\" (UID: \"6207b5a9-d7b8-4302-876c-c2a84bb352a1\") " pod="openstack/keystone-77496dd4f7-8gxmg" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.578716 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6207b5a9-d7b8-4302-876c-c2a84bb352a1-combined-ca-bundle\") pod \"keystone-77496dd4f7-8gxmg\" (UID: \"6207b5a9-d7b8-4302-876c-c2a84bb352a1\") " pod="openstack/keystone-77496dd4f7-8gxmg" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.578740 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd19b615-0c04-4fd9-968c-dceb17256b34-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cd19b615-0c04-4fd9-968c-dceb17256b34\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.578769 4776 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6207b5a9-d7b8-4302-876c-c2a84bb352a1-scripts\") pod \"keystone-77496dd4f7-8gxmg\" (UID: \"6207b5a9-d7b8-4302-876c-c2a84bb352a1\") " pod="openstack/keystone-77496dd4f7-8gxmg" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.578793 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cd19b615-0c04-4fd9-968c-dceb17256b34-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cd19b615-0c04-4fd9-968c-dceb17256b34\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.578837 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"cd19b615-0c04-4fd9-968c-dceb17256b34\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.578855 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd19b615-0c04-4fd9-968c-dceb17256b34-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cd19b615-0c04-4fd9-968c-dceb17256b34\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.578879 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0-scripts\") pod \"glance-default-external-api-0\" (UID: \"8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0\") " pod="openstack/glance-default-external-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.578907 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0-config-data\") pod \"glance-default-external-api-0\" (UID: \"8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0\") " pod="openstack/glance-default-external-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.578963 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0\") " pod="openstack/glance-default-external-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.578998 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0\") " pod="openstack/glance-default-external-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.579021 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6207b5a9-d7b8-4302-876c-c2a84bb352a1-public-tls-certs\") pod \"keystone-77496dd4f7-8gxmg\" (UID: \"6207b5a9-d7b8-4302-876c-c2a84bb352a1\") " pod="openstack/keystone-77496dd4f7-8gxmg" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.579042 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd19b615-0c04-4fd9-968c-dceb17256b34-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cd19b615-0c04-4fd9-968c-dceb17256b34\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.579060 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/cd19b615-0c04-4fd9-968c-dceb17256b34-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cd19b615-0c04-4fd9-968c-dceb17256b34\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.579080 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6207b5a9-d7b8-4302-876c-c2a84bb352a1-config-data\") pod \"keystone-77496dd4f7-8gxmg\" (UID: \"6207b5a9-d7b8-4302-876c-c2a84bb352a1\") " pod="openstack/keystone-77496dd4f7-8gxmg" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.580602 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"cd19b615-0c04-4fd9-968c-dceb17256b34\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.581212 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.581342 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd19b615-0c04-4fd9-968c-dceb17256b34-logs\") pod \"glance-default-internal-api-0\" (UID: \"cd19b615-0c04-4fd9-968c-dceb17256b34\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.591465 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/cd19b615-0c04-4fd9-968c-dceb17256b34-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cd19b615-0c04-4fd9-968c-dceb17256b34\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.593054 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd19b615-0c04-4fd9-968c-dceb17256b34-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cd19b615-0c04-4fd9-968c-dceb17256b34\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.594666 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0\") " pod="openstack/glance-default-external-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.599287 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd19b615-0c04-4fd9-968c-dceb17256b34-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cd19b615-0c04-4fd9-968c-dceb17256b34\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.599779 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0-scripts\") pod \"glance-default-external-api-0\" (UID: \"8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0\") " pod="openstack/glance-default-external-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.601527 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd19b615-0c04-4fd9-968c-dceb17256b34-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"cd19b615-0c04-4fd9-968c-dceb17256b34\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.602063 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd19b615-0c04-4fd9-968c-dceb17256b34-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cd19b615-0c04-4fd9-968c-dceb17256b34\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.605810 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b2pg\" (UniqueName: \"kubernetes.io/projected/cd19b615-0c04-4fd9-968c-dceb17256b34-kube-api-access-8b2pg\") pod \"glance-default-internal-api-0\" (UID: \"cd19b615-0c04-4fd9-968c-dceb17256b34\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.607961 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmzxr\" (UniqueName: \"kubernetes.io/projected/8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0-kube-api-access-qmzxr\") pod \"glance-default-external-api-0\" (UID: \"8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0\") " pod="openstack/glance-default-external-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.608621 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0-config-data\") pod \"glance-default-external-api-0\" (UID: \"8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0\") " pod="openstack/glance-default-external-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.610011 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0\") 
" pod="openstack/glance-default-external-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.680751 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6207b5a9-d7b8-4302-876c-c2a84bb352a1-internal-tls-certs\") pod \"keystone-77496dd4f7-8gxmg\" (UID: \"6207b5a9-d7b8-4302-876c-c2a84bb352a1\") " pod="openstack/keystone-77496dd4f7-8gxmg" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.680821 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6207b5a9-d7b8-4302-876c-c2a84bb352a1-fernet-keys\") pod \"keystone-77496dd4f7-8gxmg\" (UID: \"6207b5a9-d7b8-4302-876c-c2a84bb352a1\") " pod="openstack/keystone-77496dd4f7-8gxmg" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.680885 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6207b5a9-d7b8-4302-876c-c2a84bb352a1-credential-keys\") pod \"keystone-77496dd4f7-8gxmg\" (UID: \"6207b5a9-d7b8-4302-876c-c2a84bb352a1\") " pod="openstack/keystone-77496dd4f7-8gxmg" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.680910 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tx46\" (UniqueName: \"kubernetes.io/projected/6207b5a9-d7b8-4302-876c-c2a84bb352a1-kube-api-access-9tx46\") pod \"keystone-77496dd4f7-8gxmg\" (UID: \"6207b5a9-d7b8-4302-876c-c2a84bb352a1\") " pod="openstack/keystone-77496dd4f7-8gxmg" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.680935 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6207b5a9-d7b8-4302-876c-c2a84bb352a1-combined-ca-bundle\") pod \"keystone-77496dd4f7-8gxmg\" (UID: \"6207b5a9-d7b8-4302-876c-c2a84bb352a1\") " pod="openstack/keystone-77496dd4f7-8gxmg" Dec 08 09:23:17 crc 
kubenswrapper[4776]: I1208 09:23:17.680967 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6207b5a9-d7b8-4302-876c-c2a84bb352a1-scripts\") pod \"keystone-77496dd4f7-8gxmg\" (UID: \"6207b5a9-d7b8-4302-876c-c2a84bb352a1\") " pod="openstack/keystone-77496dd4f7-8gxmg" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.681052 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6207b5a9-d7b8-4302-876c-c2a84bb352a1-public-tls-certs\") pod \"keystone-77496dd4f7-8gxmg\" (UID: \"6207b5a9-d7b8-4302-876c-c2a84bb352a1\") " pod="openstack/keystone-77496dd4f7-8gxmg" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.681075 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6207b5a9-d7b8-4302-876c-c2a84bb352a1-config-data\") pod \"keystone-77496dd4f7-8gxmg\" (UID: \"6207b5a9-d7b8-4302-876c-c2a84bb352a1\") " pod="openstack/keystone-77496dd4f7-8gxmg" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.689460 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6207b5a9-d7b8-4302-876c-c2a84bb352a1-scripts\") pod \"keystone-77496dd4f7-8gxmg\" (UID: \"6207b5a9-d7b8-4302-876c-c2a84bb352a1\") " pod="openstack/keystone-77496dd4f7-8gxmg" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.693351 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6207b5a9-d7b8-4302-876c-c2a84bb352a1-credential-keys\") pod \"keystone-77496dd4f7-8gxmg\" (UID: \"6207b5a9-d7b8-4302-876c-c2a84bb352a1\") " pod="openstack/keystone-77496dd4f7-8gxmg" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.693627 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" 
(UniqueName: \"kubernetes.io/secret/6207b5a9-d7b8-4302-876c-c2a84bb352a1-fernet-keys\") pod \"keystone-77496dd4f7-8gxmg\" (UID: \"6207b5a9-d7b8-4302-876c-c2a84bb352a1\") " pod="openstack/keystone-77496dd4f7-8gxmg" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.694338 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6207b5a9-d7b8-4302-876c-c2a84bb352a1-combined-ca-bundle\") pod \"keystone-77496dd4f7-8gxmg\" (UID: \"6207b5a9-d7b8-4302-876c-c2a84bb352a1\") " pod="openstack/keystone-77496dd4f7-8gxmg" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.696457 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6207b5a9-d7b8-4302-876c-c2a84bb352a1-internal-tls-certs\") pod \"keystone-77496dd4f7-8gxmg\" (UID: \"6207b5a9-d7b8-4302-876c-c2a84bb352a1\") " pod="openstack/keystone-77496dd4f7-8gxmg" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.699078 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6207b5a9-d7b8-4302-876c-c2a84bb352a1-config-data\") pod \"keystone-77496dd4f7-8gxmg\" (UID: \"6207b5a9-d7b8-4302-876c-c2a84bb352a1\") " pod="openstack/keystone-77496dd4f7-8gxmg" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.700518 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6207b5a9-d7b8-4302-876c-c2a84bb352a1-public-tls-certs\") pod \"keystone-77496dd4f7-8gxmg\" (UID: \"6207b5a9-d7b8-4302-876c-c2a84bb352a1\") " pod="openstack/keystone-77496dd4f7-8gxmg" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.713408 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: 
\"8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0\") " pod="openstack/glance-default-external-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.719740 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tx46\" (UniqueName: \"kubernetes.io/projected/6207b5a9-d7b8-4302-876c-c2a84bb352a1-kube-api-access-9tx46\") pod \"keystone-77496dd4f7-8gxmg\" (UID: \"6207b5a9-d7b8-4302-876c-c2a84bb352a1\") " pod="openstack/keystone-77496dd4f7-8gxmg" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.721153 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"cd19b615-0c04-4fd9-968c-dceb17256b34\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:23:17 crc kubenswrapper[4776]: I1208 09:23:17.757740 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-77496dd4f7-8gxmg" Dec 08 09:23:18 crc kubenswrapper[4776]: I1208 09:23:18.001113 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 08 09:23:18 crc kubenswrapper[4776]: I1208 09:23:18.014004 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 08 09:23:18 crc kubenswrapper[4776]: I1208 09:23:18.213701 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5fd69d7-r446k" event={"ID":"f84e2e46-bb9f-4b55-afd1-683f365c5417","Type":"ContainerStarted","Data":"2d2c3f2e25ff753bfabded6c6c1ac8239d358c10809cd2ab5766f33ea3eea034"} Dec 08 09:23:18 crc kubenswrapper[4776]: I1208 09:23:18.214023 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5fd69d7-r446k" event={"ID":"f84e2e46-bb9f-4b55-afd1-683f365c5417","Type":"ContainerStarted","Data":"bf9c3b5c1a19d0344ec0b8938d975cce74c0f0694530e35493b96a0f88866d3b"} Dec 08 09:23:18 crc kubenswrapper[4776]: I1208 09:23:18.215614 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5fd69d7-r446k" Dec 08 09:23:18 crc kubenswrapper[4776]: I1208 09:23:18.218412 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-xksjb" event={"ID":"6d7c64ff-eec0-48d3-bba8-724158787096","Type":"ContainerStarted","Data":"29fd9693fc3d83cc2e5ce55a3d87e9bcb1804c6861d8e11c13aa0af1ba9823d9"} Dec 08 09:23:18 crc kubenswrapper[4776]: I1208 09:23:18.271040 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5fd69d7-r446k" podStartSLOduration=4.271020542 podStartE2EDuration="4.271020542s" podCreationTimestamp="2025-12-08 09:23:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:23:18.265924375 +0000 UTC m=+1474.529149397" watchObservedRunningTime="2025-12-08 09:23:18.271020542 +0000 UTC m=+1474.534245564" Dec 08 09:23:18 crc kubenswrapper[4776]: I1208 09:23:18.287071 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-xksjb" podStartSLOduration=4.289739218 podStartE2EDuration="41.287052851s" 
podCreationTimestamp="2025-12-08 09:22:37 +0000 UTC" firstStartedPulling="2025-12-08 09:22:38.875954744 +0000 UTC m=+1435.139179766" lastFinishedPulling="2025-12-08 09:23:15.873268377 +0000 UTC m=+1472.136493399" observedRunningTime="2025-12-08 09:23:18.279621162 +0000 UTC m=+1474.542846184" watchObservedRunningTime="2025-12-08 09:23:18.287052851 +0000 UTC m=+1474.550277873" Dec 08 09:23:18 crc kubenswrapper[4776]: I1208 09:23:18.364576 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6047fbdd-490d-4da5-ac61-58c28c7aa66c" path="/var/lib/kubelet/pods/6047fbdd-490d-4da5-ac61-58c28c7aa66c/volumes" Dec 08 09:23:18 crc kubenswrapper[4776]: I1208 09:23:18.365424 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f46e578a-f471-4430-b7a1-095d9d295e2f" path="/var/lib/kubelet/pods/f46e578a-f471-4430-b7a1-095d9d295e2f/volumes" Dec 08 09:23:18 crc kubenswrapper[4776]: I1208 09:23:18.394441 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-77496dd4f7-8gxmg"] Dec 08 09:23:19 crc kubenswrapper[4776]: I1208 09:23:19.879300 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-fb745b69-9lrkl" Dec 08 09:23:19 crc kubenswrapper[4776]: I1208 09:23:19.935419 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-4zwkd"] Dec 08 09:23:19 crc kubenswrapper[4776]: I1208 09:23:19.935674 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-68dcc9cf6f-4zwkd" podUID="0918e3b0-3fba-4bd7-b0fc-dae247fd9417" containerName="dnsmasq-dns" containerID="cri-o://80c72e2eaf16d04fcdc06df7263b5d1c033cb5c847b653aa35f5e103863a0442" gracePeriod=10 Dec 08 09:23:20 crc kubenswrapper[4776]: I1208 09:23:20.245441 4776 generic.go:334] "Generic (PLEG): container finished" podID="0918e3b0-3fba-4bd7-b0fc-dae247fd9417" containerID="80c72e2eaf16d04fcdc06df7263b5d1c033cb5c847b653aa35f5e103863a0442" exitCode=0 
Dec 08 09:23:20 crc kubenswrapper[4776]: I1208 09:23:20.245552 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68dcc9cf6f-4zwkd" event={"ID":"0918e3b0-3fba-4bd7-b0fc-dae247fd9417","Type":"ContainerDied","Data":"80c72e2eaf16d04fcdc06df7263b5d1c033cb5c847b653aa35f5e103863a0442"} Dec 08 09:23:20 crc kubenswrapper[4776]: I1208 09:23:20.259379 4776 generic.go:334] "Generic (PLEG): container finished" podID="f4a95eb4-92c1-4eff-940b-37f74dd3dc18" containerID="8833fa7626a819ff1247acadca4b9b2a9355a6d206fca9d0ed67d2f0cae0998b" exitCode=0 Dec 08 09:23:20 crc kubenswrapper[4776]: I1208 09:23:20.259439 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-zhhw6" event={"ID":"f4a95eb4-92c1-4eff-940b-37f74dd3dc18","Type":"ContainerDied","Data":"8833fa7626a819ff1247acadca4b9b2a9355a6d206fca9d0ed67d2f0cae0998b"} Dec 08 09:23:22 crc kubenswrapper[4776]: I1208 09:23:22.054115 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-zhhw6" Dec 08 09:23:22 crc kubenswrapper[4776]: I1208 09:23:22.069704 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68dcc9cf6f-4zwkd" Dec 08 09:23:22 crc kubenswrapper[4776]: I1208 09:23:22.124801 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 08 09:23:22 crc kubenswrapper[4776]: W1208 09:23:22.127287 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c7f76a4_e7c3_4534_ad0a_69ea872fa9d0.slice/crio-2cb3fde0c23ccea5ad93577a7da59eddd9809012304cc3df660f7425236846c3 WatchSource:0}: Error finding container 2cb3fde0c23ccea5ad93577a7da59eddd9809012304cc3df660f7425236846c3: Status 404 returned error can't find the container with id 2cb3fde0c23ccea5ad93577a7da59eddd9809012304cc3df660f7425236846c3 Dec 08 09:23:22 crc kubenswrapper[4776]: I1208 09:23:22.202066 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4a95eb4-92c1-4eff-940b-37f74dd3dc18-combined-ca-bundle\") pod \"f4a95eb4-92c1-4eff-940b-37f74dd3dc18\" (UID: \"f4a95eb4-92c1-4eff-940b-37f74dd3dc18\") " Dec 08 09:23:22 crc kubenswrapper[4776]: I1208 09:23:22.205389 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0918e3b0-3fba-4bd7-b0fc-dae247fd9417-ovsdbserver-nb\") pod \"0918e3b0-3fba-4bd7-b0fc-dae247fd9417\" (UID: \"0918e3b0-3fba-4bd7-b0fc-dae247fd9417\") " Dec 08 09:23:22 crc kubenswrapper[4776]: I1208 09:23:22.205443 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0918e3b0-3fba-4bd7-b0fc-dae247fd9417-ovsdbserver-sb\") pod \"0918e3b0-3fba-4bd7-b0fc-dae247fd9417\" (UID: \"0918e3b0-3fba-4bd7-b0fc-dae247fd9417\") " Dec 08 09:23:22 crc kubenswrapper[4776]: I1208 09:23:22.205501 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0918e3b0-3fba-4bd7-b0fc-dae247fd9417-dns-svc\") pod \"0918e3b0-3fba-4bd7-b0fc-dae247fd9417\" (UID: \"0918e3b0-3fba-4bd7-b0fc-dae247fd9417\") " Dec 08 09:23:22 crc kubenswrapper[4776]: I1208 09:23:22.205564 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txtz4\" (UniqueName: \"kubernetes.io/projected/f4a95eb4-92c1-4eff-940b-37f74dd3dc18-kube-api-access-txtz4\") pod \"f4a95eb4-92c1-4eff-940b-37f74dd3dc18\" (UID: \"f4a95eb4-92c1-4eff-940b-37f74dd3dc18\") " Dec 08 09:23:22 crc kubenswrapper[4776]: I1208 09:23:22.205598 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4a95eb4-92c1-4eff-940b-37f74dd3dc18-config-data\") pod \"f4a95eb4-92c1-4eff-940b-37f74dd3dc18\" (UID: \"f4a95eb4-92c1-4eff-940b-37f74dd3dc18\") " Dec 08 09:23:22 crc kubenswrapper[4776]: I1208 09:23:22.205739 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsjk6\" (UniqueName: \"kubernetes.io/projected/0918e3b0-3fba-4bd7-b0fc-dae247fd9417-kube-api-access-lsjk6\") pod \"0918e3b0-3fba-4bd7-b0fc-dae247fd9417\" (UID: \"0918e3b0-3fba-4bd7-b0fc-dae247fd9417\") " Dec 08 09:23:22 crc kubenswrapper[4776]: I1208 09:23:22.205772 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0918e3b0-3fba-4bd7-b0fc-dae247fd9417-config\") pod \"0918e3b0-3fba-4bd7-b0fc-dae247fd9417\" (UID: \"0918e3b0-3fba-4bd7-b0fc-dae247fd9417\") " Dec 08 09:23:22 crc kubenswrapper[4776]: I1208 09:23:22.241536 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4a95eb4-92c1-4eff-940b-37f74dd3dc18-kube-api-access-txtz4" (OuterVolumeSpecName: "kube-api-access-txtz4") pod "f4a95eb4-92c1-4eff-940b-37f74dd3dc18" (UID: "f4a95eb4-92c1-4eff-940b-37f74dd3dc18"). 
InnerVolumeSpecName "kube-api-access-txtz4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:23:22 crc kubenswrapper[4776]: I1208 09:23:22.258432 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0918e3b0-3fba-4bd7-b0fc-dae247fd9417-kube-api-access-lsjk6" (OuterVolumeSpecName: "kube-api-access-lsjk6") pod "0918e3b0-3fba-4bd7-b0fc-dae247fd9417" (UID: "0918e3b0-3fba-4bd7-b0fc-dae247fd9417"). InnerVolumeSpecName "kube-api-access-lsjk6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:23:22 crc kubenswrapper[4776]: I1208 09:23:22.280435 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68dcc9cf6f-4zwkd" event={"ID":"0918e3b0-3fba-4bd7-b0fc-dae247fd9417","Type":"ContainerDied","Data":"bd2571fae1e8eb5cd93f7d4c94ce91e774f8375814ebf2e1ed80fa11e449c2b0"} Dec 08 09:23:22 crc kubenswrapper[4776]: I1208 09:23:22.280496 4776 scope.go:117] "RemoveContainer" containerID="80c72e2eaf16d04fcdc06df7263b5d1c033cb5c847b653aa35f5e103863a0442" Dec 08 09:23:22 crc kubenswrapper[4776]: I1208 09:23:22.280520 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68dcc9cf6f-4zwkd" Dec 08 09:23:22 crc kubenswrapper[4776]: I1208 09:23:22.282095 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-77496dd4f7-8gxmg" event={"ID":"6207b5a9-d7b8-4302-876c-c2a84bb352a1","Type":"ContainerStarted","Data":"17f1dd20dbd7a88c0546a4316fd6fb64885f8933b09b04c01fc28e524c28196c"} Dec 08 09:23:22 crc kubenswrapper[4776]: I1208 09:23:22.283274 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0","Type":"ContainerStarted","Data":"2cb3fde0c23ccea5ad93577a7da59eddd9809012304cc3df660f7425236846c3"} Dec 08 09:23:22 crc kubenswrapper[4776]: I1208 09:23:22.284590 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-zhhw6" event={"ID":"f4a95eb4-92c1-4eff-940b-37f74dd3dc18","Type":"ContainerDied","Data":"fa17e8c3df5cbab401a0e22eb3a2961ccec569ee0742f4db79265ae8496978a0"} Dec 08 09:23:22 crc kubenswrapper[4776]: I1208 09:23:22.284633 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa17e8c3df5cbab401a0e22eb3a2961ccec569ee0742f4db79265ae8496978a0" Dec 08 09:23:22 crc kubenswrapper[4776]: I1208 09:23:22.284681 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-zhhw6" Dec 08 09:23:22 crc kubenswrapper[4776]: I1208 09:23:22.344695 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsjk6\" (UniqueName: \"kubernetes.io/projected/0918e3b0-3fba-4bd7-b0fc-dae247fd9417-kube-api-access-lsjk6\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:22 crc kubenswrapper[4776]: I1208 09:23:22.345112 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txtz4\" (UniqueName: \"kubernetes.io/projected/f4a95eb4-92c1-4eff-940b-37f74dd3dc18-kube-api-access-txtz4\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:22 crc kubenswrapper[4776]: I1208 09:23:22.355443 4776 scope.go:117] "RemoveContainer" containerID="fb905a93aa19d41b54b74fc9b9442af2c9871a86fbd701ef2976329e07c82896" Dec 08 09:23:22 crc kubenswrapper[4776]: I1208 09:23:22.455825 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4a95eb4-92c1-4eff-940b-37f74dd3dc18-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f4a95eb4-92c1-4eff-940b-37f74dd3dc18" (UID: "f4a95eb4-92c1-4eff-940b-37f74dd3dc18"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:23:22 crc kubenswrapper[4776]: I1208 09:23:22.464498 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0918e3b0-3fba-4bd7-b0fc-dae247fd9417-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0918e3b0-3fba-4bd7-b0fc-dae247fd9417" (UID: "0918e3b0-3fba-4bd7-b0fc-dae247fd9417"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:23:22 crc kubenswrapper[4776]: I1208 09:23:22.516426 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0918e3b0-3fba-4bd7-b0fc-dae247fd9417-config" (OuterVolumeSpecName: "config") pod "0918e3b0-3fba-4bd7-b0fc-dae247fd9417" (UID: "0918e3b0-3fba-4bd7-b0fc-dae247fd9417"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:23:22 crc kubenswrapper[4776]: I1208 09:23:22.517611 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0918e3b0-3fba-4bd7-b0fc-dae247fd9417-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0918e3b0-3fba-4bd7-b0fc-dae247fd9417" (UID: "0918e3b0-3fba-4bd7-b0fc-dae247fd9417"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:23:22 crc kubenswrapper[4776]: I1208 09:23:22.527634 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0918e3b0-3fba-4bd7-b0fc-dae247fd9417-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0918e3b0-3fba-4bd7-b0fc-dae247fd9417" (UID: "0918e3b0-3fba-4bd7-b0fc-dae247fd9417"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:23:22 crc kubenswrapper[4776]: I1208 09:23:22.549785 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4a95eb4-92c1-4eff-940b-37f74dd3dc18-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:22 crc kubenswrapper[4776]: I1208 09:23:22.549817 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0918e3b0-3fba-4bd7-b0fc-dae247fd9417-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:22 crc kubenswrapper[4776]: I1208 09:23:22.549826 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0918e3b0-3fba-4bd7-b0fc-dae247fd9417-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:22 crc kubenswrapper[4776]: I1208 09:23:22.549835 4776 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0918e3b0-3fba-4bd7-b0fc-dae247fd9417-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:22 crc kubenswrapper[4776]: I1208 09:23:22.549843 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0918e3b0-3fba-4bd7-b0fc-dae247fd9417-config\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:22 crc kubenswrapper[4776]: I1208 09:23:22.559894 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4a95eb4-92c1-4eff-940b-37f74dd3dc18-config-data" (OuterVolumeSpecName: "config-data") pod "f4a95eb4-92c1-4eff-940b-37f74dd3dc18" (UID: "f4a95eb4-92c1-4eff-940b-37f74dd3dc18"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:23:22 crc kubenswrapper[4776]: W1208 09:23:22.610784 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd19b615_0c04_4fd9_968c_dceb17256b34.slice/crio-472711f1218e52f1d5eb139a050b8e3b93a5a484b0306bb71630afd4e047488a WatchSource:0}: Error finding container 472711f1218e52f1d5eb139a050b8e3b93a5a484b0306bb71630afd4e047488a: Status 404 returned error can't find the container with id 472711f1218e52f1d5eb139a050b8e3b93a5a484b0306bb71630afd4e047488a Dec 08 09:23:22 crc kubenswrapper[4776]: I1208 09:23:22.652095 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4a95eb4-92c1-4eff-940b-37f74dd3dc18-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:22 crc kubenswrapper[4776]: I1208 09:23:22.760743 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 08 09:23:22 crc kubenswrapper[4776]: I1208 09:23:22.824536 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-4zwkd"] Dec 08 09:23:22 crc kubenswrapper[4776]: I1208 09:23:22.859500 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-4zwkd"] Dec 08 09:23:23 crc kubenswrapper[4776]: I1208 09:23:23.300249 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cd19b615-0c04-4fd9-968c-dceb17256b34","Type":"ContainerStarted","Data":"837a93d35bb23bc5d540f3124197ef476a37dccdb8f26d3ac00a99933f14c4d7"} Dec 08 09:23:23 crc kubenswrapper[4776]: I1208 09:23:23.300299 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cd19b615-0c04-4fd9-968c-dceb17256b34","Type":"ContainerStarted","Data":"472711f1218e52f1d5eb139a050b8e3b93a5a484b0306bb71630afd4e047488a"} Dec 08 09:23:23 crc 
kubenswrapper[4776]: I1208 09:23:23.303556 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fab03865-02a6-4cd2-bf78-22ed25534301","Type":"ContainerStarted","Data":"33df0a20f59c354f005d663c841d3064254836a3cc1ba39fa5380890fba6d42d"} Dec 08 09:23:23 crc kubenswrapper[4776]: I1208 09:23:23.305249 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2s7n9" event={"ID":"9dff1e28-5d80-48af-b348-cfd6080d3e37","Type":"ContainerStarted","Data":"c6fadcb238ee53d36512092bee0f72b706c31a0875b3785872e7aac1b82a72da"} Dec 08 09:23:23 crc kubenswrapper[4776]: I1208 09:23:23.309008 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-77496dd4f7-8gxmg" event={"ID":"6207b5a9-d7b8-4302-876c-c2a84bb352a1","Type":"ContainerStarted","Data":"8f07e749684668ceb1cdb20071d553e44b0416ef4a38c8cdf955c94711bf1338"} Dec 08 09:23:23 crc kubenswrapper[4776]: I1208 09:23:23.309878 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-77496dd4f7-8gxmg" Dec 08 09:23:23 crc kubenswrapper[4776]: I1208 09:23:23.330771 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0","Type":"ContainerStarted","Data":"66a083d4a17692f828c222f667901a8c63cbc10353a50fcabcdc1b39b5aadae0"} Dec 08 09:23:23 crc kubenswrapper[4776]: I1208 09:23:23.338130 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-2s7n9" podStartSLOduration=3.7397743329999997 podStartE2EDuration="46.338107617s" podCreationTimestamp="2025-12-08 09:22:37 +0000 UTC" firstStartedPulling="2025-12-08 09:22:39.450760707 +0000 UTC m=+1435.713985729" lastFinishedPulling="2025-12-08 09:23:22.049093991 +0000 UTC m=+1478.312319013" observedRunningTime="2025-12-08 09:23:23.326501365 +0000 UTC m=+1479.589726387" watchObservedRunningTime="2025-12-08 09:23:23.338107617 +0000 
UTC m=+1479.601332639" Dec 08 09:23:23 crc kubenswrapper[4776]: I1208 09:23:23.353888 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-626nj" event={"ID":"7c962dc3-3c64-4b5d-a740-a790a5fa10f9","Type":"ContainerStarted","Data":"11d7debca6d75173ba761b687f25eb76e43517200472766b13c3590c38eeb450"} Dec 08 09:23:23 crc kubenswrapper[4776]: I1208 09:23:23.359754 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-77496dd4f7-8gxmg" podStartSLOduration=6.359707716 podStartE2EDuration="6.359707716s" podCreationTimestamp="2025-12-08 09:23:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:23:23.349253046 +0000 UTC m=+1479.612478068" watchObservedRunningTime="2025-12-08 09:23:23.359707716 +0000 UTC m=+1479.622932738" Dec 08 09:23:23 crc kubenswrapper[4776]: I1208 09:23:23.364809 4776 generic.go:334] "Generic (PLEG): container finished" podID="6d7c64ff-eec0-48d3-bba8-724158787096" containerID="29fd9693fc3d83cc2e5ce55a3d87e9bcb1804c6861d8e11c13aa0af1ba9823d9" exitCode=0 Dec 08 09:23:23 crc kubenswrapper[4776]: I1208 09:23:23.364885 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-xksjb" event={"ID":"6d7c64ff-eec0-48d3-bba8-724158787096","Type":"ContainerDied","Data":"29fd9693fc3d83cc2e5ce55a3d87e9bcb1804c6861d8e11c13aa0af1ba9823d9"} Dec 08 09:23:23 crc kubenswrapper[4776]: I1208 09:23:23.389475 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-626nj" podStartSLOduration=3.793648979 podStartE2EDuration="46.389457795s" podCreationTimestamp="2025-12-08 09:22:37 +0000 UTC" firstStartedPulling="2025-12-08 09:22:39.448310261 +0000 UTC m=+1435.711535283" lastFinishedPulling="2025-12-08 09:23:22.044119067 +0000 UTC m=+1478.307344099" observedRunningTime="2025-12-08 09:23:23.377606896 +0000 UTC m=+1479.640831918" 
watchObservedRunningTime="2025-12-08 09:23:23.389457795 +0000 UTC m=+1479.652682817" Dec 08 09:23:23 crc kubenswrapper[4776]: I1208 09:23:23.403772 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cb640491-a8e7-4f8d-b4bb-1d0124f5727f","Type":"ContainerStarted","Data":"a75ed350f68b657ed3cff2532af4d3b4e6e65b5fe769df242a426b940aab7892"} Dec 08 09:23:23 crc kubenswrapper[4776]: I1208 09:23:23.404012 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cb640491-a8e7-4f8d-b4bb-1d0124f5727f","Type":"ContainerStarted","Data":"3ceca6b93263e5a66a721fab7ed0f5fbd63f87b44bea37fbdabcc15b24e9564e"} Dec 08 09:23:24 crc kubenswrapper[4776]: I1208 09:23:24.360450 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0918e3b0-3fba-4bd7-b0fc-dae247fd9417" path="/var/lib/kubelet/pods/0918e3b0-3fba-4bd7-b0fc-dae247fd9417/volumes" Dec 08 09:23:24 crc kubenswrapper[4776]: I1208 09:23:24.443825 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cb640491-a8e7-4f8d-b4bb-1d0124f5727f","Type":"ContainerStarted","Data":"e70610190e7cd4ad949c5038b2315f6d3287932ebde71c84aeec5c164b40ca80"} Dec 08 09:23:24 crc kubenswrapper[4776]: I1208 09:23:24.827908 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-xksjb" Dec 08 09:23:24 crc kubenswrapper[4776]: I1208 09:23:24.942607 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8z6p\" (UniqueName: \"kubernetes.io/projected/6d7c64ff-eec0-48d3-bba8-724158787096-kube-api-access-x8z6p\") pod \"6d7c64ff-eec0-48d3-bba8-724158787096\" (UID: \"6d7c64ff-eec0-48d3-bba8-724158787096\") " Dec 08 09:23:24 crc kubenswrapper[4776]: I1208 09:23:24.943190 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6d7c64ff-eec0-48d3-bba8-724158787096-etc-machine-id\") pod \"6d7c64ff-eec0-48d3-bba8-724158787096\" (UID: \"6d7c64ff-eec0-48d3-bba8-724158787096\") " Dec 08 09:23:24 crc kubenswrapper[4776]: I1208 09:23:24.943252 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d7c64ff-eec0-48d3-bba8-724158787096-combined-ca-bundle\") pod \"6d7c64ff-eec0-48d3-bba8-724158787096\" (UID: \"6d7c64ff-eec0-48d3-bba8-724158787096\") " Dec 08 09:23:24 crc kubenswrapper[4776]: I1208 09:23:24.943319 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6d7c64ff-eec0-48d3-bba8-724158787096-db-sync-config-data\") pod \"6d7c64ff-eec0-48d3-bba8-724158787096\" (UID: \"6d7c64ff-eec0-48d3-bba8-724158787096\") " Dec 08 09:23:24 crc kubenswrapper[4776]: I1208 09:23:24.943348 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d7c64ff-eec0-48d3-bba8-724158787096-config-data\") pod \"6d7c64ff-eec0-48d3-bba8-724158787096\" (UID: \"6d7c64ff-eec0-48d3-bba8-724158787096\") " Dec 08 09:23:24 crc kubenswrapper[4776]: I1208 09:23:24.943364 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/6d7c64ff-eec0-48d3-bba8-724158787096-scripts\") pod \"6d7c64ff-eec0-48d3-bba8-724158787096\" (UID: \"6d7c64ff-eec0-48d3-bba8-724158787096\") " Dec 08 09:23:24 crc kubenswrapper[4776]: I1208 09:23:24.943336 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6d7c64ff-eec0-48d3-bba8-724158787096-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6d7c64ff-eec0-48d3-bba8-724158787096" (UID: "6d7c64ff-eec0-48d3-bba8-724158787096"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 09:23:24 crc kubenswrapper[4776]: I1208 09:23:24.944598 4776 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6d7c64ff-eec0-48d3-bba8-724158787096-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:24 crc kubenswrapper[4776]: I1208 09:23:24.950011 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d7c64ff-eec0-48d3-bba8-724158787096-scripts" (OuterVolumeSpecName: "scripts") pod "6d7c64ff-eec0-48d3-bba8-724158787096" (UID: "6d7c64ff-eec0-48d3-bba8-724158787096"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:23:24 crc kubenswrapper[4776]: I1208 09:23:24.951268 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d7c64ff-eec0-48d3-bba8-724158787096-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6d7c64ff-eec0-48d3-bba8-724158787096" (UID: "6d7c64ff-eec0-48d3-bba8-724158787096"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:23:24 crc kubenswrapper[4776]: I1208 09:23:24.951333 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d7c64ff-eec0-48d3-bba8-724158787096-kube-api-access-x8z6p" (OuterVolumeSpecName: "kube-api-access-x8z6p") pod "6d7c64ff-eec0-48d3-bba8-724158787096" (UID: "6d7c64ff-eec0-48d3-bba8-724158787096"). InnerVolumeSpecName "kube-api-access-x8z6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:23:25 crc kubenswrapper[4776]: I1208 09:23:25.000275 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d7c64ff-eec0-48d3-bba8-724158787096-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6d7c64ff-eec0-48d3-bba8-724158787096" (UID: "6d7c64ff-eec0-48d3-bba8-724158787096"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:23:25 crc kubenswrapper[4776]: I1208 09:23:25.044056 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d7c64ff-eec0-48d3-bba8-724158787096-config-data" (OuterVolumeSpecName: "config-data") pod "6d7c64ff-eec0-48d3-bba8-724158787096" (UID: "6d7c64ff-eec0-48d3-bba8-724158787096"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:23:25 crc kubenswrapper[4776]: I1208 09:23:25.047034 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8z6p\" (UniqueName: \"kubernetes.io/projected/6d7c64ff-eec0-48d3-bba8-724158787096-kube-api-access-x8z6p\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:25 crc kubenswrapper[4776]: I1208 09:23:25.047069 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d7c64ff-eec0-48d3-bba8-724158787096-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:25 crc kubenswrapper[4776]: I1208 09:23:25.047079 4776 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6d7c64ff-eec0-48d3-bba8-724158787096-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:25 crc kubenswrapper[4776]: I1208 09:23:25.047097 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d7c64ff-eec0-48d3-bba8-724158787096-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:25 crc kubenswrapper[4776]: I1208 09:23:25.047116 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d7c64ff-eec0-48d3-bba8-724158787096-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:25 crc kubenswrapper[4776]: I1208 09:23:25.478798 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cb640491-a8e7-4f8d-b4bb-1d0124f5727f","Type":"ContainerStarted","Data":"9a8e848e33ed48bc171c50a87bb9efe3c6668ae4dba2038e61db9fb11e6d4c26"} Dec 08 09:23:25 crc kubenswrapper[4776]: I1208 09:23:25.478868 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cb640491-a8e7-4f8d-b4bb-1d0124f5727f","Type":"ContainerStarted","Data":"15ba259236ee0e1324c2d495b4a1c2d0dad1d76e9a2c2de84ad6d7a8daab3e77"} 
Dec 08 09:23:25 crc kubenswrapper[4776]: I1208 09:23:25.478878 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cb640491-a8e7-4f8d-b4bb-1d0124f5727f","Type":"ContainerStarted","Data":"abdf0d76fca99963b0dd18a76eda7502a9eb6c350cceead301ab5d724232b5cd"} Dec 08 09:23:25 crc kubenswrapper[4776]: I1208 09:23:25.482513 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cd19b615-0c04-4fd9-968c-dceb17256b34","Type":"ContainerStarted","Data":"81217682482f4d408f75b1fcb5a2c470b36a8dbb841bba1d1db1e01277eabbc3"} Dec 08 09:23:25 crc kubenswrapper[4776]: I1208 09:23:25.489723 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0","Type":"ContainerStarted","Data":"ee9ecef0b47f7d54e0dbe04ffcfe752ccb4a3af5961fd3824353e635925dcff3"} Dec 08 09:23:25 crc kubenswrapper[4776]: I1208 09:23:25.495646 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-xksjb" Dec 08 09:23:25 crc kubenswrapper[4776]: I1208 09:23:25.495833 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-xksjb" event={"ID":"6d7c64ff-eec0-48d3-bba8-724158787096","Type":"ContainerDied","Data":"45e9d04376a91e26f6d2b35bbc0678ed858e8ddf53e97216124bc6349d102798"} Dec 08 09:23:25 crc kubenswrapper[4776]: I1208 09:23:25.495893 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45e9d04376a91e26f6d2b35bbc0678ed858e8ddf53e97216124bc6349d102798" Dec 08 09:23:25 crc kubenswrapper[4776]: I1208 09:23:25.508486 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=8.50846658 podStartE2EDuration="8.50846658s" podCreationTimestamp="2025-12-08 09:23:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:23:25.504371161 +0000 UTC m=+1481.767596193" watchObservedRunningTime="2025-12-08 09:23:25.50846658 +0000 UTC m=+1481.771691602" Dec 08 09:23:25 crc kubenswrapper[4776]: I1208 09:23:25.540211 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=8.540188091 podStartE2EDuration="8.540188091s" podCreationTimestamp="2025-12-08 09:23:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:23:25.527307236 +0000 UTC m=+1481.790532258" watchObservedRunningTime="2025-12-08 09:23:25.540188091 +0000 UTC m=+1481.803413113" Dec 08 09:23:25 crc kubenswrapper[4776]: I1208 09:23:25.749938 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 08 09:23:25 crc kubenswrapper[4776]: E1208 09:23:25.750928 4776 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="0918e3b0-3fba-4bd7-b0fc-dae247fd9417" containerName="dnsmasq-dns" Dec 08 09:23:25 crc kubenswrapper[4776]: I1208 09:23:25.750944 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="0918e3b0-3fba-4bd7-b0fc-dae247fd9417" containerName="dnsmasq-dns" Dec 08 09:23:25 crc kubenswrapper[4776]: E1208 09:23:25.750966 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4a95eb4-92c1-4eff-940b-37f74dd3dc18" containerName="heat-db-sync" Dec 08 09:23:25 crc kubenswrapper[4776]: I1208 09:23:25.750973 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4a95eb4-92c1-4eff-940b-37f74dd3dc18" containerName="heat-db-sync" Dec 08 09:23:25 crc kubenswrapper[4776]: E1208 09:23:25.750997 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0918e3b0-3fba-4bd7-b0fc-dae247fd9417" containerName="init" Dec 08 09:23:25 crc kubenswrapper[4776]: I1208 09:23:25.751008 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="0918e3b0-3fba-4bd7-b0fc-dae247fd9417" containerName="init" Dec 08 09:23:25 crc kubenswrapper[4776]: E1208 09:23:25.751025 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d7c64ff-eec0-48d3-bba8-724158787096" containerName="cinder-db-sync" Dec 08 09:23:25 crc kubenswrapper[4776]: I1208 09:23:25.751033 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d7c64ff-eec0-48d3-bba8-724158787096" containerName="cinder-db-sync" Dec 08 09:23:25 crc kubenswrapper[4776]: I1208 09:23:25.761082 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d7c64ff-eec0-48d3-bba8-724158787096" containerName="cinder-db-sync" Dec 08 09:23:25 crc kubenswrapper[4776]: I1208 09:23:25.761154 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="0918e3b0-3fba-4bd7-b0fc-dae247fd9417" containerName="dnsmasq-dns" Dec 08 09:23:25 crc kubenswrapper[4776]: I1208 09:23:25.761186 4776 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4a95eb4-92c1-4eff-940b-37f74dd3dc18" containerName="heat-db-sync" Dec 08 09:23:25 crc kubenswrapper[4776]: I1208 09:23:25.769548 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 08 09:23:25 crc kubenswrapper[4776]: I1208 09:23:25.780589 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-fvv68" Dec 08 09:23:25 crc kubenswrapper[4776]: I1208 09:23:25.780852 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 08 09:23:25 crc kubenswrapper[4776]: I1208 09:23:25.780992 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 08 09:23:25 crc kubenswrapper[4776]: I1208 09:23:25.781110 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 08 09:23:25 crc kubenswrapper[4776]: I1208 09:23:25.831864 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 08 09:23:25 crc kubenswrapper[4776]: I1208 09:23:25.873466 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1094f09-518a-45fc-b0d5-b204ddf8ec85-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e1094f09-518a-45fc-b0d5-b204ddf8ec85\") " pod="openstack/cinder-scheduler-0" Dec 08 09:23:25 crc kubenswrapper[4776]: I1208 09:23:25.873527 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e1094f09-518a-45fc-b0d5-b204ddf8ec85-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e1094f09-518a-45fc-b0d5-b204ddf8ec85\") " pod="openstack/cinder-scheduler-0" Dec 08 09:23:25 crc kubenswrapper[4776]: I1208 09:23:25.873602 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1094f09-518a-45fc-b0d5-b204ddf8ec85-scripts\") pod \"cinder-scheduler-0\" (UID: \"e1094f09-518a-45fc-b0d5-b204ddf8ec85\") " pod="openstack/cinder-scheduler-0" Dec 08 09:23:25 crc kubenswrapper[4776]: I1208 09:23:25.873633 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1094f09-518a-45fc-b0d5-b204ddf8ec85-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e1094f09-518a-45fc-b0d5-b204ddf8ec85\") " pod="openstack/cinder-scheduler-0" Dec 08 09:23:25 crc kubenswrapper[4776]: I1208 09:23:25.873680 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5nbt\" (UniqueName: \"kubernetes.io/projected/e1094f09-518a-45fc-b0d5-b204ddf8ec85-kube-api-access-g5nbt\") pod \"cinder-scheduler-0\" (UID: \"e1094f09-518a-45fc-b0d5-b204ddf8ec85\") " pod="openstack/cinder-scheduler-0" Dec 08 09:23:25 crc kubenswrapper[4776]: I1208 09:23:25.873730 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1094f09-518a-45fc-b0d5-b204ddf8ec85-config-data\") pod \"cinder-scheduler-0\" (UID: \"e1094f09-518a-45fc-b0d5-b204ddf8ec85\") " pod="openstack/cinder-scheduler-0" Dec 08 09:23:25 crc kubenswrapper[4776]: I1208 09:23:25.910723 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-699896b9f7-knsr4"] Dec 08 09:23:25 crc kubenswrapper[4776]: I1208 09:23:25.914407 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-699896b9f7-knsr4" Dec 08 09:23:25 crc kubenswrapper[4776]: I1208 09:23:25.928937 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-699896b9f7-knsr4"] Dec 08 09:23:25 crc kubenswrapper[4776]: I1208 09:23:25.977017 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1094f09-518a-45fc-b0d5-b204ddf8ec85-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e1094f09-518a-45fc-b0d5-b204ddf8ec85\") " pod="openstack/cinder-scheduler-0" Dec 08 09:23:25 crc kubenswrapper[4776]: I1208 09:23:25.977142 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5nbt\" (UniqueName: \"kubernetes.io/projected/e1094f09-518a-45fc-b0d5-b204ddf8ec85-kube-api-access-g5nbt\") pod \"cinder-scheduler-0\" (UID: \"e1094f09-518a-45fc-b0d5-b204ddf8ec85\") " pod="openstack/cinder-scheduler-0" Dec 08 09:23:25 crc kubenswrapper[4776]: I1208 09:23:25.977232 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1094f09-518a-45fc-b0d5-b204ddf8ec85-config-data\") pod \"cinder-scheduler-0\" (UID: \"e1094f09-518a-45fc-b0d5-b204ddf8ec85\") " pod="openstack/cinder-scheduler-0" Dec 08 09:23:25 crc kubenswrapper[4776]: I1208 09:23:25.977370 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1094f09-518a-45fc-b0d5-b204ddf8ec85-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e1094f09-518a-45fc-b0d5-b204ddf8ec85\") " pod="openstack/cinder-scheduler-0" Dec 08 09:23:25 crc kubenswrapper[4776]: I1208 09:23:25.977841 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e1094f09-518a-45fc-b0d5-b204ddf8ec85-etc-machine-id\") pod 
\"cinder-scheduler-0\" (UID: \"e1094f09-518a-45fc-b0d5-b204ddf8ec85\") " pod="openstack/cinder-scheduler-0" Dec 08 09:23:25 crc kubenswrapper[4776]: I1208 09:23:25.978740 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1094f09-518a-45fc-b0d5-b204ddf8ec85-scripts\") pod \"cinder-scheduler-0\" (UID: \"e1094f09-518a-45fc-b0d5-b204ddf8ec85\") " pod="openstack/cinder-scheduler-0" Dec 08 09:23:25 crc kubenswrapper[4776]: I1208 09:23:25.980703 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e1094f09-518a-45fc-b0d5-b204ddf8ec85-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e1094f09-518a-45fc-b0d5-b204ddf8ec85\") " pod="openstack/cinder-scheduler-0" Dec 08 09:23:25 crc kubenswrapper[4776]: I1208 09:23:25.994932 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1094f09-518a-45fc-b0d5-b204ddf8ec85-scripts\") pod \"cinder-scheduler-0\" (UID: \"e1094f09-518a-45fc-b0d5-b204ddf8ec85\") " pod="openstack/cinder-scheduler-0" Dec 08 09:23:26 crc kubenswrapper[4776]: I1208 09:23:26.000233 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1094f09-518a-45fc-b0d5-b204ddf8ec85-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e1094f09-518a-45fc-b0d5-b204ddf8ec85\") " pod="openstack/cinder-scheduler-0" Dec 08 09:23:26 crc kubenswrapper[4776]: I1208 09:23:26.004656 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1094f09-518a-45fc-b0d5-b204ddf8ec85-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e1094f09-518a-45fc-b0d5-b204ddf8ec85\") " pod="openstack/cinder-scheduler-0" Dec 08 09:23:26 crc kubenswrapper[4776]: I1208 09:23:26.005508 4776 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1094f09-518a-45fc-b0d5-b204ddf8ec85-config-data\") pod \"cinder-scheduler-0\" (UID: \"e1094f09-518a-45fc-b0d5-b204ddf8ec85\") " pod="openstack/cinder-scheduler-0" Dec 08 09:23:26 crc kubenswrapper[4776]: I1208 09:23:26.010490 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 08 09:23:26 crc kubenswrapper[4776]: I1208 09:23:26.012155 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 08 09:23:26 crc kubenswrapper[4776]: I1208 09:23:26.030295 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5nbt\" (UniqueName: \"kubernetes.io/projected/e1094f09-518a-45fc-b0d5-b204ddf8ec85-kube-api-access-g5nbt\") pod \"cinder-scheduler-0\" (UID: \"e1094f09-518a-45fc-b0d5-b204ddf8ec85\") " pod="openstack/cinder-scheduler-0" Dec 08 09:23:26 crc kubenswrapper[4776]: I1208 09:23:26.030647 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 08 09:23:26 crc kubenswrapper[4776]: I1208 09:23:26.046548 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 08 09:23:26 crc kubenswrapper[4776]: I1208 09:23:26.080814 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7r87\" (UniqueName: \"kubernetes.io/projected/00331a6c-dea4-4b2a-b421-51e04b02fd91-kube-api-access-m7r87\") pod \"dnsmasq-dns-699896b9f7-knsr4\" (UID: \"00331a6c-dea4-4b2a-b421-51e04b02fd91\") " pod="openstack/dnsmasq-dns-699896b9f7-knsr4" Dec 08 09:23:26 crc kubenswrapper[4776]: I1208 09:23:26.081245 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00331a6c-dea4-4b2a-b421-51e04b02fd91-ovsdbserver-sb\") pod \"dnsmasq-dns-699896b9f7-knsr4\" (UID: 
\"00331a6c-dea4-4b2a-b421-51e04b02fd91\") " pod="openstack/dnsmasq-dns-699896b9f7-knsr4" Dec 08 09:23:26 crc kubenswrapper[4776]: I1208 09:23:26.081263 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00331a6c-dea4-4b2a-b421-51e04b02fd91-config\") pod \"dnsmasq-dns-699896b9f7-knsr4\" (UID: \"00331a6c-dea4-4b2a-b421-51e04b02fd91\") " pod="openstack/dnsmasq-dns-699896b9f7-knsr4" Dec 08 09:23:26 crc kubenswrapper[4776]: I1208 09:23:26.081376 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00331a6c-dea4-4b2a-b421-51e04b02fd91-ovsdbserver-nb\") pod \"dnsmasq-dns-699896b9f7-knsr4\" (UID: \"00331a6c-dea4-4b2a-b421-51e04b02fd91\") " pod="openstack/dnsmasq-dns-699896b9f7-knsr4" Dec 08 09:23:26 crc kubenswrapper[4776]: I1208 09:23:26.081396 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00331a6c-dea4-4b2a-b421-51e04b02fd91-dns-svc\") pod \"dnsmasq-dns-699896b9f7-knsr4\" (UID: \"00331a6c-dea4-4b2a-b421-51e04b02fd91\") " pod="openstack/dnsmasq-dns-699896b9f7-knsr4" Dec 08 09:23:26 crc kubenswrapper[4776]: I1208 09:23:26.148991 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 08 09:23:26 crc kubenswrapper[4776]: I1208 09:23:26.183710 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f88f901-e706-4892-aa2a-48a97c28a699-logs\") pod \"cinder-api-0\" (UID: \"8f88f901-e706-4892-aa2a-48a97c28a699\") " pod="openstack/cinder-api-0" Dec 08 09:23:26 crc kubenswrapper[4776]: I1208 09:23:26.183769 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f88f901-e706-4892-aa2a-48a97c28a699-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8f88f901-e706-4892-aa2a-48a97c28a699\") " pod="openstack/cinder-api-0" Dec 08 09:23:26 crc kubenswrapper[4776]: I1208 09:23:26.183808 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8f88f901-e706-4892-aa2a-48a97c28a699-config-data-custom\") pod \"cinder-api-0\" (UID: \"8f88f901-e706-4892-aa2a-48a97c28a699\") " pod="openstack/cinder-api-0" Dec 08 09:23:26 crc kubenswrapper[4776]: I1208 09:23:26.183838 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8f88f901-e706-4892-aa2a-48a97c28a699-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8f88f901-e706-4892-aa2a-48a97c28a699\") " pod="openstack/cinder-api-0" Dec 08 09:23:26 crc kubenswrapper[4776]: I1208 09:23:26.184185 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00331a6c-dea4-4b2a-b421-51e04b02fd91-ovsdbserver-nb\") pod \"dnsmasq-dns-699896b9f7-knsr4\" (UID: \"00331a6c-dea4-4b2a-b421-51e04b02fd91\") " pod="openstack/dnsmasq-dns-699896b9f7-knsr4" Dec 08 09:23:26 crc kubenswrapper[4776]: I1208 
09:23:26.184239 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00331a6c-dea4-4b2a-b421-51e04b02fd91-dns-svc\") pod \"dnsmasq-dns-699896b9f7-knsr4\" (UID: \"00331a6c-dea4-4b2a-b421-51e04b02fd91\") " pod="openstack/dnsmasq-dns-699896b9f7-knsr4" Dec 08 09:23:26 crc kubenswrapper[4776]: I1208 09:23:26.184284 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f88f901-e706-4892-aa2a-48a97c28a699-scripts\") pod \"cinder-api-0\" (UID: \"8f88f901-e706-4892-aa2a-48a97c28a699\") " pod="openstack/cinder-api-0" Dec 08 09:23:26 crc kubenswrapper[4776]: I1208 09:23:26.184361 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f88f901-e706-4892-aa2a-48a97c28a699-config-data\") pod \"cinder-api-0\" (UID: \"8f88f901-e706-4892-aa2a-48a97c28a699\") " pod="openstack/cinder-api-0" Dec 08 09:23:26 crc kubenswrapper[4776]: I1208 09:23:26.184590 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5lgb\" (UniqueName: \"kubernetes.io/projected/8f88f901-e706-4892-aa2a-48a97c28a699-kube-api-access-q5lgb\") pod \"cinder-api-0\" (UID: \"8f88f901-e706-4892-aa2a-48a97c28a699\") " pod="openstack/cinder-api-0" Dec 08 09:23:26 crc kubenswrapper[4776]: I1208 09:23:26.184619 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7r87\" (UniqueName: \"kubernetes.io/projected/00331a6c-dea4-4b2a-b421-51e04b02fd91-kube-api-access-m7r87\") pod \"dnsmasq-dns-699896b9f7-knsr4\" (UID: \"00331a6c-dea4-4b2a-b421-51e04b02fd91\") " pod="openstack/dnsmasq-dns-699896b9f7-knsr4" Dec 08 09:23:26 crc kubenswrapper[4776]: I1208 09:23:26.184676 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00331a6c-dea4-4b2a-b421-51e04b02fd91-ovsdbserver-sb\") pod \"dnsmasq-dns-699896b9f7-knsr4\" (UID: \"00331a6c-dea4-4b2a-b421-51e04b02fd91\") " pod="openstack/dnsmasq-dns-699896b9f7-knsr4" Dec 08 09:23:26 crc kubenswrapper[4776]: I1208 09:23:26.184696 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00331a6c-dea4-4b2a-b421-51e04b02fd91-config\") pod \"dnsmasq-dns-699896b9f7-knsr4\" (UID: \"00331a6c-dea4-4b2a-b421-51e04b02fd91\") " pod="openstack/dnsmasq-dns-699896b9f7-knsr4" Dec 08 09:23:26 crc kubenswrapper[4776]: I1208 09:23:26.185891 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00331a6c-dea4-4b2a-b421-51e04b02fd91-config\") pod \"dnsmasq-dns-699896b9f7-knsr4\" (UID: \"00331a6c-dea4-4b2a-b421-51e04b02fd91\") " pod="openstack/dnsmasq-dns-699896b9f7-knsr4" Dec 08 09:23:26 crc kubenswrapper[4776]: I1208 09:23:26.187475 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00331a6c-dea4-4b2a-b421-51e04b02fd91-ovsdbserver-sb\") pod \"dnsmasq-dns-699896b9f7-knsr4\" (UID: \"00331a6c-dea4-4b2a-b421-51e04b02fd91\") " pod="openstack/dnsmasq-dns-699896b9f7-knsr4" Dec 08 09:23:26 crc kubenswrapper[4776]: I1208 09:23:26.187840 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00331a6c-dea4-4b2a-b421-51e04b02fd91-dns-svc\") pod \"dnsmasq-dns-699896b9f7-knsr4\" (UID: \"00331a6c-dea4-4b2a-b421-51e04b02fd91\") " pod="openstack/dnsmasq-dns-699896b9f7-knsr4" Dec 08 09:23:26 crc kubenswrapper[4776]: I1208 09:23:26.188190 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00331a6c-dea4-4b2a-b421-51e04b02fd91-ovsdbserver-nb\") pod 
\"dnsmasq-dns-699896b9f7-knsr4\" (UID: \"00331a6c-dea4-4b2a-b421-51e04b02fd91\") " pod="openstack/dnsmasq-dns-699896b9f7-knsr4" Dec 08 09:23:26 crc kubenswrapper[4776]: I1208 09:23:26.215851 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7r87\" (UniqueName: \"kubernetes.io/projected/00331a6c-dea4-4b2a-b421-51e04b02fd91-kube-api-access-m7r87\") pod \"dnsmasq-dns-699896b9f7-knsr4\" (UID: \"00331a6c-dea4-4b2a-b421-51e04b02fd91\") " pod="openstack/dnsmasq-dns-699896b9f7-knsr4" Dec 08 09:23:26 crc kubenswrapper[4776]: I1208 09:23:26.259918 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-699896b9f7-knsr4" Dec 08 09:23:26 crc kubenswrapper[4776]: I1208 09:23:26.288318 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f88f901-e706-4892-aa2a-48a97c28a699-config-data\") pod \"cinder-api-0\" (UID: \"8f88f901-e706-4892-aa2a-48a97c28a699\") " pod="openstack/cinder-api-0" Dec 08 09:23:26 crc kubenswrapper[4776]: I1208 09:23:26.288473 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5lgb\" (UniqueName: \"kubernetes.io/projected/8f88f901-e706-4892-aa2a-48a97c28a699-kube-api-access-q5lgb\") pod \"cinder-api-0\" (UID: \"8f88f901-e706-4892-aa2a-48a97c28a699\") " pod="openstack/cinder-api-0" Dec 08 09:23:26 crc kubenswrapper[4776]: I1208 09:23:26.288684 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f88f901-e706-4892-aa2a-48a97c28a699-logs\") pod \"cinder-api-0\" (UID: \"8f88f901-e706-4892-aa2a-48a97c28a699\") " pod="openstack/cinder-api-0" Dec 08 09:23:26 crc kubenswrapper[4776]: I1208 09:23:26.288741 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8f88f901-e706-4892-aa2a-48a97c28a699-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8f88f901-e706-4892-aa2a-48a97c28a699\") " pod="openstack/cinder-api-0" Dec 08 09:23:26 crc kubenswrapper[4776]: I1208 09:23:26.288762 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8f88f901-e706-4892-aa2a-48a97c28a699-config-data-custom\") pod \"cinder-api-0\" (UID: \"8f88f901-e706-4892-aa2a-48a97c28a699\") " pod="openstack/cinder-api-0" Dec 08 09:23:26 crc kubenswrapper[4776]: I1208 09:23:26.288922 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8f88f901-e706-4892-aa2a-48a97c28a699-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8f88f901-e706-4892-aa2a-48a97c28a699\") " pod="openstack/cinder-api-0" Dec 08 09:23:26 crc kubenswrapper[4776]: I1208 09:23:26.289085 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f88f901-e706-4892-aa2a-48a97c28a699-scripts\") pod \"cinder-api-0\" (UID: \"8f88f901-e706-4892-aa2a-48a97c28a699\") " pod="openstack/cinder-api-0" Dec 08 09:23:26 crc kubenswrapper[4776]: I1208 09:23:26.289120 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f88f901-e706-4892-aa2a-48a97c28a699-logs\") pod \"cinder-api-0\" (UID: \"8f88f901-e706-4892-aa2a-48a97c28a699\") " pod="openstack/cinder-api-0" Dec 08 09:23:26 crc kubenswrapper[4776]: I1208 09:23:26.289225 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8f88f901-e706-4892-aa2a-48a97c28a699-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8f88f901-e706-4892-aa2a-48a97c28a699\") " pod="openstack/cinder-api-0" Dec 08 09:23:26 crc kubenswrapper[4776]: I1208 09:23:26.293498 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f88f901-e706-4892-aa2a-48a97c28a699-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8f88f901-e706-4892-aa2a-48a97c28a699\") " pod="openstack/cinder-api-0" Dec 08 09:23:26 crc kubenswrapper[4776]: I1208 09:23:26.293495 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f88f901-e706-4892-aa2a-48a97c28a699-scripts\") pod \"cinder-api-0\" (UID: \"8f88f901-e706-4892-aa2a-48a97c28a699\") " pod="openstack/cinder-api-0" Dec 08 09:23:26 crc kubenswrapper[4776]: I1208 09:23:26.294634 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8f88f901-e706-4892-aa2a-48a97c28a699-config-data-custom\") pod \"cinder-api-0\" (UID: \"8f88f901-e706-4892-aa2a-48a97c28a699\") " pod="openstack/cinder-api-0" Dec 08 09:23:26 crc kubenswrapper[4776]: I1208 09:23:26.295750 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f88f901-e706-4892-aa2a-48a97c28a699-config-data\") pod \"cinder-api-0\" (UID: \"8f88f901-e706-4892-aa2a-48a97c28a699\") " pod="openstack/cinder-api-0" Dec 08 09:23:26 crc kubenswrapper[4776]: I1208 09:23:26.315006 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5lgb\" (UniqueName: \"kubernetes.io/projected/8f88f901-e706-4892-aa2a-48a97c28a699-kube-api-access-q5lgb\") pod \"cinder-api-0\" (UID: \"8f88f901-e706-4892-aa2a-48a97c28a699\") " pod="openstack/cinder-api-0" Dec 08 09:23:26 crc kubenswrapper[4776]: I1208 09:23:26.427378 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 08 09:23:26 crc kubenswrapper[4776]: I1208 09:23:26.594493 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cb640491-a8e7-4f8d-b4bb-1d0124f5727f","Type":"ContainerStarted","Data":"f8fb228d065867c4e8ebacc2bb3b2687b19d131e3f4fa80747a3264d4bff909a"} Dec 08 09:23:26 crc kubenswrapper[4776]: I1208 09:23:26.644962 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=39.385007384 podStartE2EDuration="1m17.644941614s" podCreationTimestamp="2025-12-08 09:22:09 +0000 UTC" firstStartedPulling="2025-12-08 09:22:43.872814836 +0000 UTC m=+1440.136039858" lastFinishedPulling="2025-12-08 09:23:22.132749066 +0000 UTC m=+1478.395974088" observedRunningTime="2025-12-08 09:23:26.636315342 +0000 UTC m=+1482.899540364" watchObservedRunningTime="2025-12-08 09:23:26.644941614 +0000 UTC m=+1482.908166636" Dec 08 09:23:26 crc kubenswrapper[4776]: I1208 09:23:26.694902 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 08 09:23:26 crc kubenswrapper[4776]: I1208 09:23:26.874146 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-699896b9f7-knsr4"] Dec 08 09:23:26 crc kubenswrapper[4776]: W1208 09:23:26.891798 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00331a6c_dea4_4b2a_b421_51e04b02fd91.slice/crio-5e5297c9d2bbfa9f08f377a6286a0e21543c93c13371df933919802454882fd0 WatchSource:0}: Error finding container 5e5297c9d2bbfa9f08f377a6286a0e21543c93c13371df933919802454882fd0: Status 404 returned error can't find the container with id 5e5297c9d2bbfa9f08f377a6286a0e21543c93c13371df933919802454882fd0 Dec 08 09:23:27 crc kubenswrapper[4776]: I1208 09:23:27.020701 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-699896b9f7-knsr4"] Dec 
08 09:23:27 crc kubenswrapper[4776]: I1208 09:23:27.058468 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b895b5785-9ljbr"] Dec 08 09:23:27 crc kubenswrapper[4776]: I1208 09:23:27.061878 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b895b5785-9ljbr" Dec 08 09:23:27 crc kubenswrapper[4776]: I1208 09:23:27.069231 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 08 09:23:27 crc kubenswrapper[4776]: I1208 09:23:27.074363 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b895b5785-9ljbr"] Dec 08 09:23:27 crc kubenswrapper[4776]: I1208 09:23:27.129472 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c968f892-a097-4b6d-885d-8a7b849714a3-dns-svc\") pod \"dnsmasq-dns-b895b5785-9ljbr\" (UID: \"c968f892-a097-4b6d-885d-8a7b849714a3\") " pod="openstack/dnsmasq-dns-b895b5785-9ljbr" Dec 08 09:23:27 crc kubenswrapper[4776]: I1208 09:23:27.129526 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c968f892-a097-4b6d-885d-8a7b849714a3-ovsdbserver-nb\") pod \"dnsmasq-dns-b895b5785-9ljbr\" (UID: \"c968f892-a097-4b6d-885d-8a7b849714a3\") " pod="openstack/dnsmasq-dns-b895b5785-9ljbr" Dec 08 09:23:27 crc kubenswrapper[4776]: I1208 09:23:27.129570 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c968f892-a097-4b6d-885d-8a7b849714a3-dns-swift-storage-0\") pod \"dnsmasq-dns-b895b5785-9ljbr\" (UID: \"c968f892-a097-4b6d-885d-8a7b849714a3\") " pod="openstack/dnsmasq-dns-b895b5785-9ljbr" Dec 08 09:23:27 crc kubenswrapper[4776]: I1208 09:23:27.129593 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2s8g\" (UniqueName: \"kubernetes.io/projected/c968f892-a097-4b6d-885d-8a7b849714a3-kube-api-access-v2s8g\") pod \"dnsmasq-dns-b895b5785-9ljbr\" (UID: \"c968f892-a097-4b6d-885d-8a7b849714a3\") " pod="openstack/dnsmasq-dns-b895b5785-9ljbr" Dec 08 09:23:27 crc kubenswrapper[4776]: I1208 09:23:27.129726 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c968f892-a097-4b6d-885d-8a7b849714a3-config\") pod \"dnsmasq-dns-b895b5785-9ljbr\" (UID: \"c968f892-a097-4b6d-885d-8a7b849714a3\") " pod="openstack/dnsmasq-dns-b895b5785-9ljbr" Dec 08 09:23:27 crc kubenswrapper[4776]: I1208 09:23:27.129751 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c968f892-a097-4b6d-885d-8a7b849714a3-ovsdbserver-sb\") pod \"dnsmasq-dns-b895b5785-9ljbr\" (UID: \"c968f892-a097-4b6d-885d-8a7b849714a3\") " pod="openstack/dnsmasq-dns-b895b5785-9ljbr" Dec 08 09:23:27 crc kubenswrapper[4776]: I1208 09:23:27.170709 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 08 09:23:27 crc kubenswrapper[4776]: W1208 09:23:27.172942 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f88f901_e706_4892_aa2a_48a97c28a699.slice/crio-208f5cbf07a076a4511458bc65162a21690159a96f54085dc6f850c10989918f WatchSource:0}: Error finding container 208f5cbf07a076a4511458bc65162a21690159a96f54085dc6f850c10989918f: Status 404 returned error can't find the container with id 208f5cbf07a076a4511458bc65162a21690159a96f54085dc6f850c10989918f Dec 08 09:23:27 crc kubenswrapper[4776]: I1208 09:23:27.232163 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c968f892-a097-4b6d-885d-8a7b849714a3-config\") pod \"dnsmasq-dns-b895b5785-9ljbr\" (UID: \"c968f892-a097-4b6d-885d-8a7b849714a3\") " pod="openstack/dnsmasq-dns-b895b5785-9ljbr" Dec 08 09:23:27 crc kubenswrapper[4776]: I1208 09:23:27.232249 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c968f892-a097-4b6d-885d-8a7b849714a3-ovsdbserver-sb\") pod \"dnsmasq-dns-b895b5785-9ljbr\" (UID: \"c968f892-a097-4b6d-885d-8a7b849714a3\") " pod="openstack/dnsmasq-dns-b895b5785-9ljbr" Dec 08 09:23:27 crc kubenswrapper[4776]: I1208 09:23:27.232297 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c968f892-a097-4b6d-885d-8a7b849714a3-dns-svc\") pod \"dnsmasq-dns-b895b5785-9ljbr\" (UID: \"c968f892-a097-4b6d-885d-8a7b849714a3\") " pod="openstack/dnsmasq-dns-b895b5785-9ljbr" Dec 08 09:23:27 crc kubenswrapper[4776]: I1208 09:23:27.232324 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c968f892-a097-4b6d-885d-8a7b849714a3-ovsdbserver-nb\") pod \"dnsmasq-dns-b895b5785-9ljbr\" (UID: \"c968f892-a097-4b6d-885d-8a7b849714a3\") " pod="openstack/dnsmasq-dns-b895b5785-9ljbr" Dec 08 09:23:27 crc kubenswrapper[4776]: I1208 09:23:27.232361 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c968f892-a097-4b6d-885d-8a7b849714a3-dns-swift-storage-0\") pod \"dnsmasq-dns-b895b5785-9ljbr\" (UID: \"c968f892-a097-4b6d-885d-8a7b849714a3\") " pod="openstack/dnsmasq-dns-b895b5785-9ljbr" Dec 08 09:23:27 crc kubenswrapper[4776]: I1208 09:23:27.232382 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2s8g\" (UniqueName: 
\"kubernetes.io/projected/c968f892-a097-4b6d-885d-8a7b849714a3-kube-api-access-v2s8g\") pod \"dnsmasq-dns-b895b5785-9ljbr\" (UID: \"c968f892-a097-4b6d-885d-8a7b849714a3\") " pod="openstack/dnsmasq-dns-b895b5785-9ljbr" Dec 08 09:23:27 crc kubenswrapper[4776]: I1208 09:23:27.233261 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c968f892-a097-4b6d-885d-8a7b849714a3-ovsdbserver-sb\") pod \"dnsmasq-dns-b895b5785-9ljbr\" (UID: \"c968f892-a097-4b6d-885d-8a7b849714a3\") " pod="openstack/dnsmasq-dns-b895b5785-9ljbr" Dec 08 09:23:27 crc kubenswrapper[4776]: I1208 09:23:27.233525 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c968f892-a097-4b6d-885d-8a7b849714a3-config\") pod \"dnsmasq-dns-b895b5785-9ljbr\" (UID: \"c968f892-a097-4b6d-885d-8a7b849714a3\") " pod="openstack/dnsmasq-dns-b895b5785-9ljbr" Dec 08 09:23:27 crc kubenswrapper[4776]: I1208 09:23:27.233841 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c968f892-a097-4b6d-885d-8a7b849714a3-dns-svc\") pod \"dnsmasq-dns-b895b5785-9ljbr\" (UID: \"c968f892-a097-4b6d-885d-8a7b849714a3\") " pod="openstack/dnsmasq-dns-b895b5785-9ljbr" Dec 08 09:23:27 crc kubenswrapper[4776]: I1208 09:23:27.234406 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c968f892-a097-4b6d-885d-8a7b849714a3-dns-swift-storage-0\") pod \"dnsmasq-dns-b895b5785-9ljbr\" (UID: \"c968f892-a097-4b6d-885d-8a7b849714a3\") " pod="openstack/dnsmasq-dns-b895b5785-9ljbr" Dec 08 09:23:27 crc kubenswrapper[4776]: I1208 09:23:27.234859 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c968f892-a097-4b6d-885d-8a7b849714a3-ovsdbserver-nb\") pod \"dnsmasq-dns-b895b5785-9ljbr\" 
(UID: \"c968f892-a097-4b6d-885d-8a7b849714a3\") " pod="openstack/dnsmasq-dns-b895b5785-9ljbr" Dec 08 09:23:27 crc kubenswrapper[4776]: I1208 09:23:27.256322 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2s8g\" (UniqueName: \"kubernetes.io/projected/c968f892-a097-4b6d-885d-8a7b849714a3-kube-api-access-v2s8g\") pod \"dnsmasq-dns-b895b5785-9ljbr\" (UID: \"c968f892-a097-4b6d-885d-8a7b849714a3\") " pod="openstack/dnsmasq-dns-b895b5785-9ljbr" Dec 08 09:23:27 crc kubenswrapper[4776]: I1208 09:23:27.466287 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b895b5785-9ljbr" Dec 08 09:23:27 crc kubenswrapper[4776]: I1208 09:23:27.609115 4776 generic.go:334] "Generic (PLEG): container finished" podID="7c962dc3-3c64-4b5d-a740-a790a5fa10f9" containerID="11d7debca6d75173ba761b687f25eb76e43517200472766b13c3590c38eeb450" exitCode=0 Dec 08 09:23:27 crc kubenswrapper[4776]: I1208 09:23:27.609209 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-626nj" event={"ID":"7c962dc3-3c64-4b5d-a740-a790a5fa10f9","Type":"ContainerDied","Data":"11d7debca6d75173ba761b687f25eb76e43517200472766b13c3590c38eeb450"} Dec 08 09:23:27 crc kubenswrapper[4776]: I1208 09:23:27.610630 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e1094f09-518a-45fc-b0d5-b204ddf8ec85","Type":"ContainerStarted","Data":"9409d8efba8d68b6b1539f819b411409a1a82c6e1cab4775c77b9339d8b5726d"} Dec 08 09:23:27 crc kubenswrapper[4776]: I1208 09:23:27.617601 4776 generic.go:334] "Generic (PLEG): container finished" podID="00331a6c-dea4-4b2a-b421-51e04b02fd91" containerID="fe35118a3af2f669783a81ec52379a5889ef756d1b196259de15d30b08ee77f7" exitCode=0 Dec 08 09:23:27 crc kubenswrapper[4776]: I1208 09:23:27.617819 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699896b9f7-knsr4" 
event={"ID":"00331a6c-dea4-4b2a-b421-51e04b02fd91","Type":"ContainerDied","Data":"fe35118a3af2f669783a81ec52379a5889ef756d1b196259de15d30b08ee77f7"} Dec 08 09:23:27 crc kubenswrapper[4776]: I1208 09:23:27.617855 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699896b9f7-knsr4" event={"ID":"00331a6c-dea4-4b2a-b421-51e04b02fd91","Type":"ContainerStarted","Data":"5e5297c9d2bbfa9f08f377a6286a0e21543c93c13371df933919802454882fd0"} Dec 08 09:23:27 crc kubenswrapper[4776]: I1208 09:23:27.620164 4776 generic.go:334] "Generic (PLEG): container finished" podID="9dff1e28-5d80-48af-b348-cfd6080d3e37" containerID="c6fadcb238ee53d36512092bee0f72b706c31a0875b3785872e7aac1b82a72da" exitCode=0 Dec 08 09:23:27 crc kubenswrapper[4776]: I1208 09:23:27.620232 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2s7n9" event={"ID":"9dff1e28-5d80-48af-b348-cfd6080d3e37","Type":"ContainerDied","Data":"c6fadcb238ee53d36512092bee0f72b706c31a0875b3785872e7aac1b82a72da"} Dec 08 09:23:27 crc kubenswrapper[4776]: I1208 09:23:27.623977 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8f88f901-e706-4892-aa2a-48a97c28a699","Type":"ContainerStarted","Data":"208f5cbf07a076a4511458bc65162a21690159a96f54085dc6f850c10989918f"} Dec 08 09:23:28 crc kubenswrapper[4776]: I1208 09:23:28.002411 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 08 09:23:28 crc kubenswrapper[4776]: I1208 09:23:28.002694 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 08 09:23:28 crc kubenswrapper[4776]: I1208 09:23:28.021468 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 08 09:23:28 crc kubenswrapper[4776]: I1208 09:23:28.021511 4776 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 08 09:23:28 crc kubenswrapper[4776]: I1208 09:23:28.049837 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 08 09:23:28 crc kubenswrapper[4776]: I1208 09:23:28.095010 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 08 09:23:28 crc kubenswrapper[4776]: I1208 09:23:28.130246 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b895b5785-9ljbr"] Dec 08 09:23:28 crc kubenswrapper[4776]: I1208 09:23:28.210974 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 08 09:23:28 crc kubenswrapper[4776]: I1208 09:23:28.383553 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 08 09:23:28 crc kubenswrapper[4776]: I1208 09:23:28.384786 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 08 09:23:28 crc kubenswrapper[4776]: I1208 09:23:28.436310 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-699896b9f7-knsr4" Dec 08 09:23:28 crc kubenswrapper[4776]: I1208 09:23:28.496897 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00331a6c-dea4-4b2a-b421-51e04b02fd91-dns-svc\") pod \"00331a6c-dea4-4b2a-b421-51e04b02fd91\" (UID: \"00331a6c-dea4-4b2a-b421-51e04b02fd91\") " Dec 08 09:23:28 crc kubenswrapper[4776]: I1208 09:23:28.496954 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00331a6c-dea4-4b2a-b421-51e04b02fd91-ovsdbserver-sb\") pod \"00331a6c-dea4-4b2a-b421-51e04b02fd91\" (UID: \"00331a6c-dea4-4b2a-b421-51e04b02fd91\") " Dec 08 09:23:28 crc kubenswrapper[4776]: I1208 09:23:28.497023 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00331a6c-dea4-4b2a-b421-51e04b02fd91-config\") pod \"00331a6c-dea4-4b2a-b421-51e04b02fd91\" (UID: \"00331a6c-dea4-4b2a-b421-51e04b02fd91\") " Dec 08 09:23:28 crc kubenswrapper[4776]: I1208 09:23:28.497151 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7r87\" (UniqueName: \"kubernetes.io/projected/00331a6c-dea4-4b2a-b421-51e04b02fd91-kube-api-access-m7r87\") pod \"00331a6c-dea4-4b2a-b421-51e04b02fd91\" (UID: \"00331a6c-dea4-4b2a-b421-51e04b02fd91\") " Dec 08 09:23:28 crc kubenswrapper[4776]: I1208 09:23:28.497315 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00331a6c-dea4-4b2a-b421-51e04b02fd91-ovsdbserver-nb\") pod \"00331a6c-dea4-4b2a-b421-51e04b02fd91\" (UID: \"00331a6c-dea4-4b2a-b421-51e04b02fd91\") " Dec 08 09:23:28 crc kubenswrapper[4776]: I1208 09:23:28.508654 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/00331a6c-dea4-4b2a-b421-51e04b02fd91-kube-api-access-m7r87" (OuterVolumeSpecName: "kube-api-access-m7r87") pod "00331a6c-dea4-4b2a-b421-51e04b02fd91" (UID: "00331a6c-dea4-4b2a-b421-51e04b02fd91"). InnerVolumeSpecName "kube-api-access-m7r87". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:23:28 crc kubenswrapper[4776]: I1208 09:23:28.563864 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00331a6c-dea4-4b2a-b421-51e04b02fd91-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "00331a6c-dea4-4b2a-b421-51e04b02fd91" (UID: "00331a6c-dea4-4b2a-b421-51e04b02fd91"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:23:28 crc kubenswrapper[4776]: I1208 09:23:28.609449 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00331a6c-dea4-4b2a-b421-51e04b02fd91-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:28 crc kubenswrapper[4776]: I1208 09:23:28.609509 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7r87\" (UniqueName: \"kubernetes.io/projected/00331a6c-dea4-4b2a-b421-51e04b02fd91-kube-api-access-m7r87\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:28 crc kubenswrapper[4776]: I1208 09:23:28.612194 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00331a6c-dea4-4b2a-b421-51e04b02fd91-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "00331a6c-dea4-4b2a-b421-51e04b02fd91" (UID: "00331a6c-dea4-4b2a-b421-51e04b02fd91"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:23:28 crc kubenswrapper[4776]: I1208 09:23:28.641023 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699896b9f7-knsr4" event={"ID":"00331a6c-dea4-4b2a-b421-51e04b02fd91","Type":"ContainerDied","Data":"5e5297c9d2bbfa9f08f377a6286a0e21543c93c13371df933919802454882fd0"} Dec 08 09:23:28 crc kubenswrapper[4776]: I1208 09:23:28.641427 4776 scope.go:117] "RemoveContainer" containerID="fe35118a3af2f669783a81ec52379a5889ef756d1b196259de15d30b08ee77f7" Dec 08 09:23:28 crc kubenswrapper[4776]: I1208 09:23:28.641604 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-699896b9f7-knsr4" Dec 08 09:23:28 crc kubenswrapper[4776]: I1208 09:23:28.647386 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b895b5785-9ljbr" event={"ID":"c968f892-a097-4b6d-885d-8a7b849714a3","Type":"ContainerStarted","Data":"931c6360e9af654b42f7c49ce37a38385acc9783e0c6b8a3d6c1468115c14f83"} Dec 08 09:23:28 crc kubenswrapper[4776]: I1208 09:23:28.652310 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8f88f901-e706-4892-aa2a-48a97c28a699","Type":"ContainerStarted","Data":"c22380e48b4e8549b299b571cd0e4721ac50057cdad3d69c654f575cdd1eb07e"} Dec 08 09:23:28 crc kubenswrapper[4776]: I1208 09:23:28.652586 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 08 09:23:28 crc kubenswrapper[4776]: I1208 09:23:28.652615 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 08 09:23:28 crc kubenswrapper[4776]: I1208 09:23:28.652625 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 08 09:23:28 crc kubenswrapper[4776]: I1208 09:23:28.652652 4776 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 08 09:23:28 crc kubenswrapper[4776]: I1208 09:23:28.665255 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00331a6c-dea4-4b2a-b421-51e04b02fd91-config" (OuterVolumeSpecName: "config") pod "00331a6c-dea4-4b2a-b421-51e04b02fd91" (UID: "00331a6c-dea4-4b2a-b421-51e04b02fd91"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:23:28 crc kubenswrapper[4776]: I1208 09:23:28.677448 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00331a6c-dea4-4b2a-b421-51e04b02fd91-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "00331a6c-dea4-4b2a-b421-51e04b02fd91" (UID: "00331a6c-dea4-4b2a-b421-51e04b02fd91"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:23:28 crc kubenswrapper[4776]: I1208 09:23:28.712154 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00331a6c-dea4-4b2a-b421-51e04b02fd91-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:28 crc kubenswrapper[4776]: I1208 09:23:28.712209 4776 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00331a6c-dea4-4b2a-b421-51e04b02fd91-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:28 crc kubenswrapper[4776]: I1208 09:23:28.712218 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00331a6c-dea4-4b2a-b421-51e04b02fd91-config\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.101470 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-699896b9f7-knsr4"] Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.123346 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-699896b9f7-knsr4"] Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.242350 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-2s7n9" Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.278466 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-626nj" Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.325110 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c962dc3-3c64-4b5d-a740-a790a5fa10f9-combined-ca-bundle\") pod \"7c962dc3-3c64-4b5d-a740-a790a5fa10f9\" (UID: \"7c962dc3-3c64-4b5d-a740-a790a5fa10f9\") " Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.325256 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9dff1e28-5d80-48af-b348-cfd6080d3e37-logs\") pod \"9dff1e28-5d80-48af-b348-cfd6080d3e37\" (UID: \"9dff1e28-5d80-48af-b348-cfd6080d3e37\") " Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.325308 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dff1e28-5d80-48af-b348-cfd6080d3e37-combined-ca-bundle\") pod \"9dff1e28-5d80-48af-b348-cfd6080d3e37\" (UID: \"9dff1e28-5d80-48af-b348-cfd6080d3e37\") " Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.325338 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wpcz\" (UniqueName: \"kubernetes.io/projected/7c962dc3-3c64-4b5d-a740-a790a5fa10f9-kube-api-access-2wpcz\") pod \"7c962dc3-3c64-4b5d-a740-a790a5fa10f9\" (UID: \"7c962dc3-3c64-4b5d-a740-a790a5fa10f9\") " Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.325369 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-vwnsq\" (UniqueName: \"kubernetes.io/projected/9dff1e28-5d80-48af-b348-cfd6080d3e37-kube-api-access-vwnsq\") pod \"9dff1e28-5d80-48af-b348-cfd6080d3e37\" (UID: \"9dff1e28-5d80-48af-b348-cfd6080d3e37\") " Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.325394 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dff1e28-5d80-48af-b348-cfd6080d3e37-scripts\") pod \"9dff1e28-5d80-48af-b348-cfd6080d3e37\" (UID: \"9dff1e28-5d80-48af-b348-cfd6080d3e37\") " Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.325452 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dff1e28-5d80-48af-b348-cfd6080d3e37-config-data\") pod \"9dff1e28-5d80-48af-b348-cfd6080d3e37\" (UID: \"9dff1e28-5d80-48af-b348-cfd6080d3e37\") " Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.325487 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7c962dc3-3c64-4b5d-a740-a790a5fa10f9-db-sync-config-data\") pod \"7c962dc3-3c64-4b5d-a740-a790a5fa10f9\" (UID: \"7c962dc3-3c64-4b5d-a740-a790a5fa10f9\") " Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.325859 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9dff1e28-5d80-48af-b348-cfd6080d3e37-logs" (OuterVolumeSpecName: "logs") pod "9dff1e28-5d80-48af-b348-cfd6080d3e37" (UID: "9dff1e28-5d80-48af-b348-cfd6080d3e37"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.326493 4776 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9dff1e28-5d80-48af-b348-cfd6080d3e37-logs\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.332970 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dff1e28-5d80-48af-b348-cfd6080d3e37-scripts" (OuterVolumeSpecName: "scripts") pod "9dff1e28-5d80-48af-b348-cfd6080d3e37" (UID: "9dff1e28-5d80-48af-b348-cfd6080d3e37"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.333738 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c962dc3-3c64-4b5d-a740-a790a5fa10f9-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "7c962dc3-3c64-4b5d-a740-a790a5fa10f9" (UID: "7c962dc3-3c64-4b5d-a740-a790a5fa10f9"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.346191 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dff1e28-5d80-48af-b348-cfd6080d3e37-kube-api-access-vwnsq" (OuterVolumeSpecName: "kube-api-access-vwnsq") pod "9dff1e28-5d80-48af-b348-cfd6080d3e37" (UID: "9dff1e28-5d80-48af-b348-cfd6080d3e37"). InnerVolumeSpecName "kube-api-access-vwnsq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.347070 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c962dc3-3c64-4b5d-a740-a790a5fa10f9-kube-api-access-2wpcz" (OuterVolumeSpecName: "kube-api-access-2wpcz") pod "7c962dc3-3c64-4b5d-a740-a790a5fa10f9" (UID: "7c962dc3-3c64-4b5d-a740-a790a5fa10f9"). 
InnerVolumeSpecName "kube-api-access-2wpcz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.364377 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dff1e28-5d80-48af-b348-cfd6080d3e37-config-data" (OuterVolumeSpecName: "config-data") pod "9dff1e28-5d80-48af-b348-cfd6080d3e37" (UID: "9dff1e28-5d80-48af-b348-cfd6080d3e37"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.369949 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c962dc3-3c64-4b5d-a740-a790a5fa10f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7c962dc3-3c64-4b5d-a740-a790a5fa10f9" (UID: "7c962dc3-3c64-4b5d-a740-a790a5fa10f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.379545 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dff1e28-5d80-48af-b348-cfd6080d3e37-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9dff1e28-5d80-48af-b348-cfd6080d3e37" (UID: "9dff1e28-5d80-48af-b348-cfd6080d3e37"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.428794 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c962dc3-3c64-4b5d-a740-a790a5fa10f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.428835 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dff1e28-5d80-48af-b348-cfd6080d3e37-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.428847 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wpcz\" (UniqueName: \"kubernetes.io/projected/7c962dc3-3c64-4b5d-a740-a790a5fa10f9-kube-api-access-2wpcz\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.428862 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwnsq\" (UniqueName: \"kubernetes.io/projected/9dff1e28-5d80-48af-b348-cfd6080d3e37-kube-api-access-vwnsq\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.428874 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dff1e28-5d80-48af-b348-cfd6080d3e37-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.428884 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dff1e28-5d80-48af-b348-cfd6080d3e37-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.428894 4776 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7c962dc3-3c64-4b5d-a740-a790a5fa10f9-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 
09:23:29.672874 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e1094f09-518a-45fc-b0d5-b204ddf8ec85","Type":"ContainerStarted","Data":"8dd0b38b6b507cba9588499d3306ebee7fa547826a872bdce4fd32cf701505f0"} Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.672918 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e1094f09-518a-45fc-b0d5-b204ddf8ec85","Type":"ContainerStarted","Data":"f4ea81c1c16b42e732fc7c8826f147f7d03faf0335b9cdfcafbeb2c392206d3d"} Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.677064 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2s7n9" event={"ID":"9dff1e28-5d80-48af-b348-cfd6080d3e37","Type":"ContainerDied","Data":"3b7e52de8edb82ea85cdf139c7c9a8d2f7576c948f158819b9375584095b4669"} Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.677095 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b7e52de8edb82ea85cdf139c7c9a8d2f7576c948f158819b9375584095b4669" Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.677104 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-2s7n9" Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.681307 4776 generic.go:334] "Generic (PLEG): container finished" podID="c968f892-a097-4b6d-885d-8a7b849714a3" containerID="13e13279c8a4dc18d16ebc1901d1421b00784ba9ab0675a562a63307ddb6f651" exitCode=0 Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.681352 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b895b5785-9ljbr" event={"ID":"c968f892-a097-4b6d-885d-8a7b849714a3","Type":"ContainerDied","Data":"13e13279c8a4dc18d16ebc1901d1421b00784ba9ab0675a562a63307ddb6f651"} Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.691110 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8f88f901-e706-4892-aa2a-48a97c28a699","Type":"ContainerStarted","Data":"de627a7daaf25c6012f0a1601130543b8d18df127d0afe55ac58892054ed5967"} Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.691193 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="8f88f901-e706-4892-aa2a-48a97c28a699" containerName="cinder-api-log" containerID="cri-o://c22380e48b4e8549b299b571cd0e4721ac50057cdad3d69c654f575cdd1eb07e" gracePeriod=30 Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.691217 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.691270 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="8f88f901-e706-4892-aa2a-48a97c28a699" containerName="cinder-api" containerID="cri-o://de627a7daaf25c6012f0a1601130543b8d18df127d0afe55ac58892054ed5967" gracePeriod=30 Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.696730 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-626nj" Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.697235 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-626nj" event={"ID":"7c962dc3-3c64-4b5d-a740-a790a5fa10f9","Type":"ContainerDied","Data":"373778421c4908c11862aee506bb80d21b15e77f3a1d9cf5cb226c594ebe243c"} Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.697263 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="373778421c4908c11862aee506bb80d21b15e77f3a1d9cf5cb226c594ebe243c" Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.739436 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.892882067 podStartE2EDuration="4.739419792s" podCreationTimestamp="2025-12-08 09:23:25 +0000 UTC" firstStartedPulling="2025-12-08 09:23:26.719603086 +0000 UTC m=+1482.982828108" lastFinishedPulling="2025-12-08 09:23:27.566140811 +0000 UTC m=+1483.829365833" observedRunningTime="2025-12-08 09:23:29.688990088 +0000 UTC m=+1485.952215110" watchObservedRunningTime="2025-12-08 09:23:29.739419792 +0000 UTC m=+1486.002644814" Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.826780 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.826756566 podStartE2EDuration="4.826756566s" podCreationTimestamp="2025-12-08 09:23:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:23:29.729617459 +0000 UTC m=+1485.992842481" watchObservedRunningTime="2025-12-08 09:23:29.826756566 +0000 UTC m=+1486.089981588" Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.946394 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-78d78f694b-ck9wf"] Dec 08 09:23:29 crc kubenswrapper[4776]: E1208 09:23:29.947113 
4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c962dc3-3c64-4b5d-a740-a790a5fa10f9" containerName="barbican-db-sync" Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.947151 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c962dc3-3c64-4b5d-a740-a790a5fa10f9" containerName="barbican-db-sync" Dec 08 09:23:29 crc kubenswrapper[4776]: E1208 09:23:29.947208 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00331a6c-dea4-4b2a-b421-51e04b02fd91" containerName="init" Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.947216 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="00331a6c-dea4-4b2a-b421-51e04b02fd91" containerName="init" Dec 08 09:23:29 crc kubenswrapper[4776]: E1208 09:23:29.947238 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dff1e28-5d80-48af-b348-cfd6080d3e37" containerName="placement-db-sync" Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.947245 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dff1e28-5d80-48af-b348-cfd6080d3e37" containerName="placement-db-sync" Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.947533 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dff1e28-5d80-48af-b348-cfd6080d3e37" containerName="placement-db-sync" Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.947554 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c962dc3-3c64-4b5d-a740-a790a5fa10f9" containerName="barbican-db-sync" Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.947597 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="00331a6c-dea4-4b2a-b421-51e04b02fd91" containerName="init" Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.949232 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-78d78f694b-ck9wf" Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.954282 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.954447 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.965635 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-tjbl6" Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.975496 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-75876fb99b-xnbd7"] Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.977547 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-75876fb99b-xnbd7" Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.982201 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-64vhk" Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.982362 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.982425 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.982546 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 08 09:23:29 crc kubenswrapper[4776]: I1208 09:23:29.982657 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:29.998218 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-664c575c59-ncvpr"] Dec 08 09:23:30 crc 
kubenswrapper[4776]: I1208 09:23:30.000564 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-664c575c59-ncvpr" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.003214 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.019853 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-78d78f694b-ck9wf"] Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.047013 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-75876fb99b-xnbd7"] Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.063391 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e501058f-25e0-456c-b23d-c7caafa729c3-combined-ca-bundle\") pod \"barbican-keystone-listener-78d78f694b-ck9wf\" (UID: \"e501058f-25e0-456c-b23d-c7caafa729c3\") " pod="openstack/barbican-keystone-listener-78d78f694b-ck9wf" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.063448 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae330c18-0140-4bc4-8503-cf6c3bbce3d8-scripts\") pod \"placement-75876fb99b-xnbd7\" (UID: \"ae330c18-0140-4bc4-8503-cf6c3bbce3d8\") " pod="openstack/placement-75876fb99b-xnbd7" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.063464 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae330c18-0140-4bc4-8503-cf6c3bbce3d8-logs\") pod \"placement-75876fb99b-xnbd7\" (UID: \"ae330c18-0140-4bc4-8503-cf6c3bbce3d8\") " pod="openstack/placement-75876fb99b-xnbd7" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.063486 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae330c18-0140-4bc4-8503-cf6c3bbce3d8-config-data\") pod \"placement-75876fb99b-xnbd7\" (UID: \"ae330c18-0140-4bc4-8503-cf6c3bbce3d8\") " pod="openstack/placement-75876fb99b-xnbd7" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.063532 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae330c18-0140-4bc4-8503-cf6c3bbce3d8-internal-tls-certs\") pod \"placement-75876fb99b-xnbd7\" (UID: \"ae330c18-0140-4bc4-8503-cf6c3bbce3d8\") " pod="openstack/placement-75876fb99b-xnbd7" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.063555 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4d85\" (UniqueName: \"kubernetes.io/projected/d6258a3d-a50e-4cf4-af4d-e6f588d8744a-kube-api-access-l4d85\") pod \"barbican-worker-664c575c59-ncvpr\" (UID: \"d6258a3d-a50e-4cf4-af4d-e6f588d8744a\") " pod="openstack/barbican-worker-664c575c59-ncvpr" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.063582 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e501058f-25e0-456c-b23d-c7caafa729c3-config-data\") pod \"barbican-keystone-listener-78d78f694b-ck9wf\" (UID: \"e501058f-25e0-456c-b23d-c7caafa729c3\") " pod="openstack/barbican-keystone-listener-78d78f694b-ck9wf" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.063623 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6258a3d-a50e-4cf4-af4d-e6f588d8744a-config-data\") pod \"barbican-worker-664c575c59-ncvpr\" (UID: \"d6258a3d-a50e-4cf4-af4d-e6f588d8744a\") " pod="openstack/barbican-worker-664c575c59-ncvpr" Dec 08 09:23:30 crc 
kubenswrapper[4776]: I1208 09:23:30.063645 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae330c18-0140-4bc4-8503-cf6c3bbce3d8-public-tls-certs\") pod \"placement-75876fb99b-xnbd7\" (UID: \"ae330c18-0140-4bc4-8503-cf6c3bbce3d8\") " pod="openstack/placement-75876fb99b-xnbd7" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.063673 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e501058f-25e0-456c-b23d-c7caafa729c3-logs\") pod \"barbican-keystone-listener-78d78f694b-ck9wf\" (UID: \"e501058f-25e0-456c-b23d-c7caafa729c3\") " pod="openstack/barbican-keystone-listener-78d78f694b-ck9wf" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.063692 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d6258a3d-a50e-4cf4-af4d-e6f588d8744a-config-data-custom\") pod \"barbican-worker-664c575c59-ncvpr\" (UID: \"d6258a3d-a50e-4cf4-af4d-e6f588d8744a\") " pod="openstack/barbican-worker-664c575c59-ncvpr" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.063714 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdxrz\" (UniqueName: \"kubernetes.io/projected/e501058f-25e0-456c-b23d-c7caafa729c3-kube-api-access-gdxrz\") pod \"barbican-keystone-listener-78d78f694b-ck9wf\" (UID: \"e501058f-25e0-456c-b23d-c7caafa729c3\") " pod="openstack/barbican-keystone-listener-78d78f694b-ck9wf" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.063731 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e501058f-25e0-456c-b23d-c7caafa729c3-config-data-custom\") pod 
\"barbican-keystone-listener-78d78f694b-ck9wf\" (UID: \"e501058f-25e0-456c-b23d-c7caafa729c3\") " pod="openstack/barbican-keystone-listener-78d78f694b-ck9wf" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.063745 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6258a3d-a50e-4cf4-af4d-e6f588d8744a-combined-ca-bundle\") pod \"barbican-worker-664c575c59-ncvpr\" (UID: \"d6258a3d-a50e-4cf4-af4d-e6f588d8744a\") " pod="openstack/barbican-worker-664c575c59-ncvpr" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.063763 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6258a3d-a50e-4cf4-af4d-e6f588d8744a-logs\") pod \"barbican-worker-664c575c59-ncvpr\" (UID: \"d6258a3d-a50e-4cf4-af4d-e6f588d8744a\") " pod="openstack/barbican-worker-664c575c59-ncvpr" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.063793 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4554\" (UniqueName: \"kubernetes.io/projected/ae330c18-0140-4bc4-8503-cf6c3bbce3d8-kube-api-access-s4554\") pod \"placement-75876fb99b-xnbd7\" (UID: \"ae330c18-0140-4bc4-8503-cf6c3bbce3d8\") " pod="openstack/placement-75876fb99b-xnbd7" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.063815 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae330c18-0140-4bc4-8503-cf6c3bbce3d8-combined-ca-bundle\") pod \"placement-75876fb99b-xnbd7\" (UID: \"ae330c18-0140-4bc4-8503-cf6c3bbce3d8\") " pod="openstack/placement-75876fb99b-xnbd7" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.063895 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-664c575c59-ncvpr"] Dec 08 09:23:30 crc 
kubenswrapper[4776]: I1208 09:23:30.168417 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae330c18-0140-4bc4-8503-cf6c3bbce3d8-internal-tls-certs\") pod \"placement-75876fb99b-xnbd7\" (UID: \"ae330c18-0140-4bc4-8503-cf6c3bbce3d8\") " pod="openstack/placement-75876fb99b-xnbd7" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.168895 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4d85\" (UniqueName: \"kubernetes.io/projected/d6258a3d-a50e-4cf4-af4d-e6f588d8744a-kube-api-access-l4d85\") pod \"barbican-worker-664c575c59-ncvpr\" (UID: \"d6258a3d-a50e-4cf4-af4d-e6f588d8744a\") " pod="openstack/barbican-worker-664c575c59-ncvpr" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.168931 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e501058f-25e0-456c-b23d-c7caafa729c3-config-data\") pod \"barbican-keystone-listener-78d78f694b-ck9wf\" (UID: \"e501058f-25e0-456c-b23d-c7caafa729c3\") " pod="openstack/barbican-keystone-listener-78d78f694b-ck9wf" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.168982 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6258a3d-a50e-4cf4-af4d-e6f588d8744a-config-data\") pod \"barbican-worker-664c575c59-ncvpr\" (UID: \"d6258a3d-a50e-4cf4-af4d-e6f588d8744a\") " pod="openstack/barbican-worker-664c575c59-ncvpr" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.169005 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae330c18-0140-4bc4-8503-cf6c3bbce3d8-public-tls-certs\") pod \"placement-75876fb99b-xnbd7\" (UID: \"ae330c18-0140-4bc4-8503-cf6c3bbce3d8\") " pod="openstack/placement-75876fb99b-xnbd7" Dec 08 09:23:30 crc 
kubenswrapper[4776]: I1208 09:23:30.169035 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e501058f-25e0-456c-b23d-c7caafa729c3-logs\") pod \"barbican-keystone-listener-78d78f694b-ck9wf\" (UID: \"e501058f-25e0-456c-b23d-c7caafa729c3\") " pod="openstack/barbican-keystone-listener-78d78f694b-ck9wf" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.169058 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d6258a3d-a50e-4cf4-af4d-e6f588d8744a-config-data-custom\") pod \"barbican-worker-664c575c59-ncvpr\" (UID: \"d6258a3d-a50e-4cf4-af4d-e6f588d8744a\") " pod="openstack/barbican-worker-664c575c59-ncvpr" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.169082 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdxrz\" (UniqueName: \"kubernetes.io/projected/e501058f-25e0-456c-b23d-c7caafa729c3-kube-api-access-gdxrz\") pod \"barbican-keystone-listener-78d78f694b-ck9wf\" (UID: \"e501058f-25e0-456c-b23d-c7caafa729c3\") " pod="openstack/barbican-keystone-listener-78d78f694b-ck9wf" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.169099 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6258a3d-a50e-4cf4-af4d-e6f588d8744a-combined-ca-bundle\") pod \"barbican-worker-664c575c59-ncvpr\" (UID: \"d6258a3d-a50e-4cf4-af4d-e6f588d8744a\") " pod="openstack/barbican-worker-664c575c59-ncvpr" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.169113 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e501058f-25e0-456c-b23d-c7caafa729c3-config-data-custom\") pod \"barbican-keystone-listener-78d78f694b-ck9wf\" (UID: \"e501058f-25e0-456c-b23d-c7caafa729c3\") " 
pod="openstack/barbican-keystone-listener-78d78f694b-ck9wf" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.169136 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6258a3d-a50e-4cf4-af4d-e6f588d8744a-logs\") pod \"barbican-worker-664c575c59-ncvpr\" (UID: \"d6258a3d-a50e-4cf4-af4d-e6f588d8744a\") " pod="openstack/barbican-worker-664c575c59-ncvpr" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.169185 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4554\" (UniqueName: \"kubernetes.io/projected/ae330c18-0140-4bc4-8503-cf6c3bbce3d8-kube-api-access-s4554\") pod \"placement-75876fb99b-xnbd7\" (UID: \"ae330c18-0140-4bc4-8503-cf6c3bbce3d8\") " pod="openstack/placement-75876fb99b-xnbd7" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.169208 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae330c18-0140-4bc4-8503-cf6c3bbce3d8-combined-ca-bundle\") pod \"placement-75876fb99b-xnbd7\" (UID: \"ae330c18-0140-4bc4-8503-cf6c3bbce3d8\") " pod="openstack/placement-75876fb99b-xnbd7" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.169271 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e501058f-25e0-456c-b23d-c7caafa729c3-combined-ca-bundle\") pod \"barbican-keystone-listener-78d78f694b-ck9wf\" (UID: \"e501058f-25e0-456c-b23d-c7caafa729c3\") " pod="openstack/barbican-keystone-listener-78d78f694b-ck9wf" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.169300 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae330c18-0140-4bc4-8503-cf6c3bbce3d8-scripts\") pod \"placement-75876fb99b-xnbd7\" (UID: \"ae330c18-0140-4bc4-8503-cf6c3bbce3d8\") " 
pod="openstack/placement-75876fb99b-xnbd7" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.169316 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae330c18-0140-4bc4-8503-cf6c3bbce3d8-logs\") pod \"placement-75876fb99b-xnbd7\" (UID: \"ae330c18-0140-4bc4-8503-cf6c3bbce3d8\") " pod="openstack/placement-75876fb99b-xnbd7" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.169339 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae330c18-0140-4bc4-8503-cf6c3bbce3d8-config-data\") pod \"placement-75876fb99b-xnbd7\" (UID: \"ae330c18-0140-4bc4-8503-cf6c3bbce3d8\") " pod="openstack/placement-75876fb99b-xnbd7" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.179492 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e501058f-25e0-456c-b23d-c7caafa729c3-logs\") pod \"barbican-keystone-listener-78d78f694b-ck9wf\" (UID: \"e501058f-25e0-456c-b23d-c7caafa729c3\") " pod="openstack/barbican-keystone-listener-78d78f694b-ck9wf" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.187894 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6258a3d-a50e-4cf4-af4d-e6f588d8744a-logs\") pod \"barbican-worker-664c575c59-ncvpr\" (UID: \"d6258a3d-a50e-4cf4-af4d-e6f588d8744a\") " pod="openstack/barbican-worker-664c575c59-ncvpr" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.190668 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae330c18-0140-4bc4-8503-cf6c3bbce3d8-logs\") pod \"placement-75876fb99b-xnbd7\" (UID: \"ae330c18-0140-4bc4-8503-cf6c3bbce3d8\") " pod="openstack/placement-75876fb99b-xnbd7" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.199665 4776 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/dnsmasq-dns-b895b5785-9ljbr"] Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.213114 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d6258a3d-a50e-4cf4-af4d-e6f588d8744a-config-data-custom\") pod \"barbican-worker-664c575c59-ncvpr\" (UID: \"d6258a3d-a50e-4cf4-af4d-e6f588d8744a\") " pod="openstack/barbican-worker-664c575c59-ncvpr" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.213715 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6258a3d-a50e-4cf4-af4d-e6f588d8744a-combined-ca-bundle\") pod \"barbican-worker-664c575c59-ncvpr\" (UID: \"d6258a3d-a50e-4cf4-af4d-e6f588d8744a\") " pod="openstack/barbican-worker-664c575c59-ncvpr" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.216285 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae330c18-0140-4bc4-8503-cf6c3bbce3d8-internal-tls-certs\") pod \"placement-75876fb99b-xnbd7\" (UID: \"ae330c18-0140-4bc4-8503-cf6c3bbce3d8\") " pod="openstack/placement-75876fb99b-xnbd7" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.216717 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae330c18-0140-4bc4-8503-cf6c3bbce3d8-combined-ca-bundle\") pod \"placement-75876fb99b-xnbd7\" (UID: \"ae330c18-0140-4bc4-8503-cf6c3bbce3d8\") " pod="openstack/placement-75876fb99b-xnbd7" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.231623 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e501058f-25e0-456c-b23d-c7caafa729c3-config-data-custom\") pod \"barbican-keystone-listener-78d78f694b-ck9wf\" (UID: \"e501058f-25e0-456c-b23d-c7caafa729c3\") " 
pod="openstack/barbican-keystone-listener-78d78f694b-ck9wf" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.238244 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae330c18-0140-4bc4-8503-cf6c3bbce3d8-public-tls-certs\") pod \"placement-75876fb99b-xnbd7\" (UID: \"ae330c18-0140-4bc4-8503-cf6c3bbce3d8\") " pod="openstack/placement-75876fb99b-xnbd7" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.238730 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae330c18-0140-4bc4-8503-cf6c3bbce3d8-config-data\") pod \"placement-75876fb99b-xnbd7\" (UID: \"ae330c18-0140-4bc4-8503-cf6c3bbce3d8\") " pod="openstack/placement-75876fb99b-xnbd7" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.239195 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e501058f-25e0-456c-b23d-c7caafa729c3-combined-ca-bundle\") pod \"barbican-keystone-listener-78d78f694b-ck9wf\" (UID: \"e501058f-25e0-456c-b23d-c7caafa729c3\") " pod="openstack/barbican-keystone-listener-78d78f694b-ck9wf" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.240358 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e501058f-25e0-456c-b23d-c7caafa729c3-config-data\") pod \"barbican-keystone-listener-78d78f694b-ck9wf\" (UID: \"e501058f-25e0-456c-b23d-c7caafa729c3\") " pod="openstack/barbican-keystone-listener-78d78f694b-ck9wf" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.240673 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae330c18-0140-4bc4-8503-cf6c3bbce3d8-scripts\") pod \"placement-75876fb99b-xnbd7\" (UID: \"ae330c18-0140-4bc4-8503-cf6c3bbce3d8\") " pod="openstack/placement-75876fb99b-xnbd7" Dec 08 09:23:30 
crc kubenswrapper[4776]: I1208 09:23:30.257388 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-lpj75"] Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.259294 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4554\" (UniqueName: \"kubernetes.io/projected/ae330c18-0140-4bc4-8503-cf6c3bbce3d8-kube-api-access-s4554\") pod \"placement-75876fb99b-xnbd7\" (UID: \"ae330c18-0140-4bc4-8503-cf6c3bbce3d8\") " pod="openstack/placement-75876fb99b-xnbd7" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.260460 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-lpj75" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.267761 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-lpj75"] Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.277317 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7ccc58565d-s4sc8"] Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.280580 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7ccc58565d-s4sc8" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.281199 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6258a3d-a50e-4cf4-af4d-e6f588d8744a-config-data\") pod \"barbican-worker-664c575c59-ncvpr\" (UID: \"d6258a3d-a50e-4cf4-af4d-e6f588d8744a\") " pod="openstack/barbican-worker-664c575c59-ncvpr" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.282591 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4d85\" (UniqueName: \"kubernetes.io/projected/d6258a3d-a50e-4cf4-af4d-e6f588d8744a-kube-api-access-l4d85\") pod \"barbican-worker-664c575c59-ncvpr\" (UID: \"d6258a3d-a50e-4cf4-af4d-e6f588d8744a\") " pod="openstack/barbican-worker-664c575c59-ncvpr" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.283716 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.289356 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdxrz\" (UniqueName: \"kubernetes.io/projected/e501058f-25e0-456c-b23d-c7caafa729c3-kube-api-access-gdxrz\") pod \"barbican-keystone-listener-78d78f694b-ck9wf\" (UID: \"e501058f-25e0-456c-b23d-c7caafa729c3\") " pod="openstack/barbican-keystone-listener-78d78f694b-ck9wf" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.290072 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7ccc58565d-s4sc8"] Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.368279 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-78d78f694b-ck9wf" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.381771 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5097e9b7-c005-4e68-bc20-bbd6f8b8a290-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-lpj75\" (UID: \"5097e9b7-c005-4e68-bc20-bbd6f8b8a290\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lpj75" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.381852 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5097e9b7-c005-4e68-bc20-bbd6f8b8a290-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-lpj75\" (UID: \"5097e9b7-c005-4e68-bc20-bbd6f8b8a290\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lpj75" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.381885 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5097e9b7-c005-4e68-bc20-bbd6f8b8a290-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-lpj75\" (UID: \"5097e9b7-c005-4e68-bc20-bbd6f8b8a290\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lpj75" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.382301 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hffl9\" (UniqueName: \"kubernetes.io/projected/5097e9b7-c005-4e68-bc20-bbd6f8b8a290-kube-api-access-hffl9\") pod \"dnsmasq-dns-5c9776ccc5-lpj75\" (UID: \"5097e9b7-c005-4e68-bc20-bbd6f8b8a290\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lpj75" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.382486 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5097e9b7-c005-4e68-bc20-bbd6f8b8a290-config\") pod 
\"dnsmasq-dns-5c9776ccc5-lpj75\" (UID: \"5097e9b7-c005-4e68-bc20-bbd6f8b8a290\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lpj75" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.382574 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8k4bq\" (UniqueName: \"kubernetes.io/projected/f5472c33-1c77-4bde-a438-924aa6a53a78-kube-api-access-8k4bq\") pod \"barbican-api-7ccc58565d-s4sc8\" (UID: \"f5472c33-1c77-4bde-a438-924aa6a53a78\") " pod="openstack/barbican-api-7ccc58565d-s4sc8" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.382746 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f5472c33-1c77-4bde-a438-924aa6a53a78-config-data-custom\") pod \"barbican-api-7ccc58565d-s4sc8\" (UID: \"f5472c33-1c77-4bde-a438-924aa6a53a78\") " pod="openstack/barbican-api-7ccc58565d-s4sc8" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.382819 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5472c33-1c77-4bde-a438-924aa6a53a78-logs\") pod \"barbican-api-7ccc58565d-s4sc8\" (UID: \"f5472c33-1c77-4bde-a438-924aa6a53a78\") " pod="openstack/barbican-api-7ccc58565d-s4sc8" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.382942 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5472c33-1c77-4bde-a438-924aa6a53a78-combined-ca-bundle\") pod \"barbican-api-7ccc58565d-s4sc8\" (UID: \"f5472c33-1c77-4bde-a438-924aa6a53a78\") " pod="openstack/barbican-api-7ccc58565d-s4sc8" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.383039 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f5472c33-1c77-4bde-a438-924aa6a53a78-config-data\") pod \"barbican-api-7ccc58565d-s4sc8\" (UID: \"f5472c33-1c77-4bde-a438-924aa6a53a78\") " pod="openstack/barbican-api-7ccc58565d-s4sc8" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.383154 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5097e9b7-c005-4e68-bc20-bbd6f8b8a290-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-lpj75\" (UID: \"5097e9b7-c005-4e68-bc20-bbd6f8b8a290\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lpj75" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.383651 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00331a6c-dea4-4b2a-b421-51e04b02fd91" path="/var/lib/kubelet/pods/00331a6c-dea4-4b2a-b421-51e04b02fd91/volumes" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.421993 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-75876fb99b-xnbd7" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.484905 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5097e9b7-c005-4e68-bc20-bbd6f8b8a290-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-lpj75\" (UID: \"5097e9b7-c005-4e68-bc20-bbd6f8b8a290\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lpj75" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.485556 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5097e9b7-c005-4e68-bc20-bbd6f8b8a290-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-lpj75\" (UID: \"5097e9b7-c005-4e68-bc20-bbd6f8b8a290\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lpj75" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.485631 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-hffl9\" (UniqueName: \"kubernetes.io/projected/5097e9b7-c005-4e68-bc20-bbd6f8b8a290-kube-api-access-hffl9\") pod \"dnsmasq-dns-5c9776ccc5-lpj75\" (UID: \"5097e9b7-c005-4e68-bc20-bbd6f8b8a290\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lpj75" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.485720 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5097e9b7-c005-4e68-bc20-bbd6f8b8a290-config\") pod \"dnsmasq-dns-5c9776ccc5-lpj75\" (UID: \"5097e9b7-c005-4e68-bc20-bbd6f8b8a290\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lpj75" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.485744 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8k4bq\" (UniqueName: \"kubernetes.io/projected/f5472c33-1c77-4bde-a438-924aa6a53a78-kube-api-access-8k4bq\") pod \"barbican-api-7ccc58565d-s4sc8\" (UID: \"f5472c33-1c77-4bde-a438-924aa6a53a78\") " pod="openstack/barbican-api-7ccc58565d-s4sc8" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.485787 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f5472c33-1c77-4bde-a438-924aa6a53a78-config-data-custom\") pod \"barbican-api-7ccc58565d-s4sc8\" (UID: \"f5472c33-1c77-4bde-a438-924aa6a53a78\") " pod="openstack/barbican-api-7ccc58565d-s4sc8" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.485802 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5472c33-1c77-4bde-a438-924aa6a53a78-logs\") pod \"barbican-api-7ccc58565d-s4sc8\" (UID: \"f5472c33-1c77-4bde-a438-924aa6a53a78\") " pod="openstack/barbican-api-7ccc58565d-s4sc8" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.485846 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f5472c33-1c77-4bde-a438-924aa6a53a78-combined-ca-bundle\") pod \"barbican-api-7ccc58565d-s4sc8\" (UID: \"f5472c33-1c77-4bde-a438-924aa6a53a78\") " pod="openstack/barbican-api-7ccc58565d-s4sc8" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.486382 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5097e9b7-c005-4e68-bc20-bbd6f8b8a290-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-lpj75\" (UID: \"5097e9b7-c005-4e68-bc20-bbd6f8b8a290\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lpj75" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.487741 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5472c33-1c77-4bde-a438-924aa6a53a78-logs\") pod \"barbican-api-7ccc58565d-s4sc8\" (UID: \"f5472c33-1c77-4bde-a438-924aa6a53a78\") " pod="openstack/barbican-api-7ccc58565d-s4sc8" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.488233 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5472c33-1c77-4bde-a438-924aa6a53a78-config-data\") pod \"barbican-api-7ccc58565d-s4sc8\" (UID: \"f5472c33-1c77-4bde-a438-924aa6a53a78\") " pod="openstack/barbican-api-7ccc58565d-s4sc8" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.488453 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5097e9b7-c005-4e68-bc20-bbd6f8b8a290-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-lpj75\" (UID: \"5097e9b7-c005-4e68-bc20-bbd6f8b8a290\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lpj75" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.488553 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5097e9b7-c005-4e68-bc20-bbd6f8b8a290-dns-svc\") pod 
\"dnsmasq-dns-5c9776ccc5-lpj75\" (UID: \"5097e9b7-c005-4e68-bc20-bbd6f8b8a290\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lpj75" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.489824 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5097e9b7-c005-4e68-bc20-bbd6f8b8a290-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-lpj75\" (UID: \"5097e9b7-c005-4e68-bc20-bbd6f8b8a290\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lpj75" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.490330 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-664c575c59-ncvpr" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.490725 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5097e9b7-c005-4e68-bc20-bbd6f8b8a290-config\") pod \"dnsmasq-dns-5c9776ccc5-lpj75\" (UID: \"5097e9b7-c005-4e68-bc20-bbd6f8b8a290\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lpj75" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.490730 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5097e9b7-c005-4e68-bc20-bbd6f8b8a290-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-lpj75\" (UID: \"5097e9b7-c005-4e68-bc20-bbd6f8b8a290\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lpj75" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.490867 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5097e9b7-c005-4e68-bc20-bbd6f8b8a290-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-lpj75\" (UID: \"5097e9b7-c005-4e68-bc20-bbd6f8b8a290\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lpj75" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.495314 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f5472c33-1c77-4bde-a438-924aa6a53a78-combined-ca-bundle\") pod \"barbican-api-7ccc58565d-s4sc8\" (UID: \"f5472c33-1c77-4bde-a438-924aa6a53a78\") " pod="openstack/barbican-api-7ccc58565d-s4sc8" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.495556 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5472c33-1c77-4bde-a438-924aa6a53a78-config-data\") pod \"barbican-api-7ccc58565d-s4sc8\" (UID: \"f5472c33-1c77-4bde-a438-924aa6a53a78\") " pod="openstack/barbican-api-7ccc58565d-s4sc8" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.495747 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f5472c33-1c77-4bde-a438-924aa6a53a78-config-data-custom\") pod \"barbican-api-7ccc58565d-s4sc8\" (UID: \"f5472c33-1c77-4bde-a438-924aa6a53a78\") " pod="openstack/barbican-api-7ccc58565d-s4sc8" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.504887 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hffl9\" (UniqueName: \"kubernetes.io/projected/5097e9b7-c005-4e68-bc20-bbd6f8b8a290-kube-api-access-hffl9\") pod \"dnsmasq-dns-5c9776ccc5-lpj75\" (UID: \"5097e9b7-c005-4e68-bc20-bbd6f8b8a290\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lpj75" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.509573 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-lpj75" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.515998 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8k4bq\" (UniqueName: \"kubernetes.io/projected/f5472c33-1c77-4bde-a438-924aa6a53a78-kube-api-access-8k4bq\") pod \"barbican-api-7ccc58565d-s4sc8\" (UID: \"f5472c33-1c77-4bde-a438-924aa6a53a78\") " pod="openstack/barbican-api-7ccc58565d-s4sc8" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.520157 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7ccc58565d-s4sc8" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.727256 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b895b5785-9ljbr" event={"ID":"c968f892-a097-4b6d-885d-8a7b849714a3","Type":"ContainerStarted","Data":"475fd31bb672d52517df75eba98ad986f7428d516a94113935e951d100bdc9e9"} Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.727691 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b895b5785-9ljbr" podUID="c968f892-a097-4b6d-885d-8a7b849714a3" containerName="dnsmasq-dns" containerID="cri-o://475fd31bb672d52517df75eba98ad986f7428d516a94113935e951d100bdc9e9" gracePeriod=10 Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.727981 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b895b5785-9ljbr" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.767626 4776 generic.go:334] "Generic (PLEG): container finished" podID="8f88f901-e706-4892-aa2a-48a97c28a699" containerID="c22380e48b4e8549b299b571cd0e4721ac50057cdad3d69c654f575cdd1eb07e" exitCode=143 Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.767681 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"8f88f901-e706-4892-aa2a-48a97c28a699","Type":"ContainerDied","Data":"c22380e48b4e8549b299b571cd0e4721ac50057cdad3d69c654f575cdd1eb07e"} Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.768773 4776 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 08 09:23:30 crc kubenswrapper[4776]: I1208 09:23:30.779089 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b895b5785-9ljbr" podStartSLOduration=3.779066077 podStartE2EDuration="3.779066077s" podCreationTimestamp="2025-12-08 09:23:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:23:30.762275006 +0000 UTC m=+1487.025500028" watchObservedRunningTime="2025-12-08 09:23:30.779066077 +0000 UTC m=+1487.042291099" Dec 08 09:23:31 crc kubenswrapper[4776]: I1208 09:23:31.149662 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 08 09:23:31 crc kubenswrapper[4776]: I1208 09:23:31.230186 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-78d78f694b-ck9wf"] Dec 08 09:23:31 crc kubenswrapper[4776]: I1208 09:23:31.244250 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-75876fb99b-xnbd7"] Dec 08 09:23:31 crc kubenswrapper[4776]: I1208 09:23:31.500543 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-664c575c59-ncvpr"] Dec 08 09:23:31 crc kubenswrapper[4776]: I1208 09:23:31.796878 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-78d78f694b-ck9wf" event={"ID":"e501058f-25e0-456c-b23d-c7caafa729c3","Type":"ContainerStarted","Data":"6a74eb576924bea7f2443bfb6c7d18db66b40b3b93a991e90f549a7db82f7fb5"} Dec 08 09:23:31 crc kubenswrapper[4776]: I1208 09:23:31.809086 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-worker-664c575c59-ncvpr" event={"ID":"d6258a3d-a50e-4cf4-af4d-e6f588d8744a","Type":"ContainerStarted","Data":"08f5d487b8529ee784b135d3c99526b661ce4d73e946915b1553fcf5ba355c57"} Dec 08 09:23:31 crc kubenswrapper[4776]: I1208 09:23:31.813918 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-75876fb99b-xnbd7" event={"ID":"ae330c18-0140-4bc4-8503-cf6c3bbce3d8","Type":"ContainerStarted","Data":"1882f55244bbe96a18495dd33a81de15f81ab7f0fd38cbba6072a8d5cce90ce8"} Dec 08 09:23:31 crc kubenswrapper[4776]: I1208 09:23:31.813964 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-75876fb99b-xnbd7" event={"ID":"ae330c18-0140-4bc4-8503-cf6c3bbce3d8","Type":"ContainerStarted","Data":"a6424a3b5a947722eb1d68e477c71c177598ae7227c58a1bbfaece46879a5823"} Dec 08 09:23:31 crc kubenswrapper[4776]: I1208 09:23:31.834807 4776 generic.go:334] "Generic (PLEG): container finished" podID="c968f892-a097-4b6d-885d-8a7b849714a3" containerID="475fd31bb672d52517df75eba98ad986f7428d516a94113935e951d100bdc9e9" exitCode=0 Dec 08 09:23:31 crc kubenswrapper[4776]: I1208 09:23:31.834953 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b895b5785-9ljbr" event={"ID":"c968f892-a097-4b6d-885d-8a7b849714a3","Type":"ContainerDied","Data":"475fd31bb672d52517df75eba98ad986f7428d516a94113935e951d100bdc9e9"} Dec 08 09:23:31 crc kubenswrapper[4776]: I1208 09:23:31.886835 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7ccc58565d-s4sc8"] Dec 08 09:23:31 crc kubenswrapper[4776]: I1208 09:23:31.902326 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-lpj75"] Dec 08 09:23:32 crc kubenswrapper[4776]: I1208 09:23:32.275659 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b895b5785-9ljbr" Dec 08 09:23:32 crc kubenswrapper[4776]: I1208 09:23:32.446767 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c968f892-a097-4b6d-885d-8a7b849714a3-ovsdbserver-nb\") pod \"c968f892-a097-4b6d-885d-8a7b849714a3\" (UID: \"c968f892-a097-4b6d-885d-8a7b849714a3\") " Dec 08 09:23:32 crc kubenswrapper[4776]: I1208 09:23:32.446849 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c968f892-a097-4b6d-885d-8a7b849714a3-config\") pod \"c968f892-a097-4b6d-885d-8a7b849714a3\" (UID: \"c968f892-a097-4b6d-885d-8a7b849714a3\") " Dec 08 09:23:32 crc kubenswrapper[4776]: I1208 09:23:32.447064 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2s8g\" (UniqueName: \"kubernetes.io/projected/c968f892-a097-4b6d-885d-8a7b849714a3-kube-api-access-v2s8g\") pod \"c968f892-a097-4b6d-885d-8a7b849714a3\" (UID: \"c968f892-a097-4b6d-885d-8a7b849714a3\") " Dec 08 09:23:32 crc kubenswrapper[4776]: I1208 09:23:32.447091 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c968f892-a097-4b6d-885d-8a7b849714a3-dns-swift-storage-0\") pod \"c968f892-a097-4b6d-885d-8a7b849714a3\" (UID: \"c968f892-a097-4b6d-885d-8a7b849714a3\") " Dec 08 09:23:32 crc kubenswrapper[4776]: I1208 09:23:32.447135 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c968f892-a097-4b6d-885d-8a7b849714a3-ovsdbserver-sb\") pod \"c968f892-a097-4b6d-885d-8a7b849714a3\" (UID: \"c968f892-a097-4b6d-885d-8a7b849714a3\") " Dec 08 09:23:32 crc kubenswrapper[4776]: I1208 09:23:32.447194 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c968f892-a097-4b6d-885d-8a7b849714a3-dns-svc\") pod \"c968f892-a097-4b6d-885d-8a7b849714a3\" (UID: \"c968f892-a097-4b6d-885d-8a7b849714a3\") " Dec 08 09:23:32 crc kubenswrapper[4776]: I1208 09:23:32.469579 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c968f892-a097-4b6d-885d-8a7b849714a3-kube-api-access-v2s8g" (OuterVolumeSpecName: "kube-api-access-v2s8g") pod "c968f892-a097-4b6d-885d-8a7b849714a3" (UID: "c968f892-a097-4b6d-885d-8a7b849714a3"). InnerVolumeSpecName "kube-api-access-v2s8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:23:32 crc kubenswrapper[4776]: I1208 09:23:32.551057 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2s8g\" (UniqueName: \"kubernetes.io/projected/c968f892-a097-4b6d-885d-8a7b849714a3-kube-api-access-v2s8g\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:32 crc kubenswrapper[4776]: I1208 09:23:32.593809 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c968f892-a097-4b6d-885d-8a7b849714a3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c968f892-a097-4b6d-885d-8a7b849714a3" (UID: "c968f892-a097-4b6d-885d-8a7b849714a3"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:23:32 crc kubenswrapper[4776]: I1208 09:23:32.602458 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c968f892-a097-4b6d-885d-8a7b849714a3-config" (OuterVolumeSpecName: "config") pod "c968f892-a097-4b6d-885d-8a7b849714a3" (UID: "c968f892-a097-4b6d-885d-8a7b849714a3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:23:32 crc kubenswrapper[4776]: I1208 09:23:32.617849 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c968f892-a097-4b6d-885d-8a7b849714a3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c968f892-a097-4b6d-885d-8a7b849714a3" (UID: "c968f892-a097-4b6d-885d-8a7b849714a3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:23:32 crc kubenswrapper[4776]: I1208 09:23:32.654774 4776 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c968f892-a097-4b6d-885d-8a7b849714a3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:32 crc kubenswrapper[4776]: I1208 09:23:32.654808 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c968f892-a097-4b6d-885d-8a7b849714a3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:32 crc kubenswrapper[4776]: I1208 09:23:32.654817 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c968f892-a097-4b6d-885d-8a7b849714a3-config\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:32 crc kubenswrapper[4776]: I1208 09:23:32.672438 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c968f892-a097-4b6d-885d-8a7b849714a3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c968f892-a097-4b6d-885d-8a7b849714a3" (UID: "c968f892-a097-4b6d-885d-8a7b849714a3"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:23:32 crc kubenswrapper[4776]: I1208 09:23:32.695668 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c968f892-a097-4b6d-885d-8a7b849714a3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c968f892-a097-4b6d-885d-8a7b849714a3" (UID: "c968f892-a097-4b6d-885d-8a7b849714a3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:23:32 crc kubenswrapper[4776]: I1208 09:23:32.756906 4776 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c968f892-a097-4b6d-885d-8a7b849714a3-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:32 crc kubenswrapper[4776]: I1208 09:23:32.756938 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c968f892-a097-4b6d-885d-8a7b849714a3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:32 crc kubenswrapper[4776]: I1208 09:23:32.848420 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7ccc58565d-s4sc8" event={"ID":"f5472c33-1c77-4bde-a438-924aa6a53a78","Type":"ContainerStarted","Data":"7e118c7e59dda3b43e33829c1d67a5fa9cf831f05c4577a876715fe7efaf21b0"} Dec 08 09:23:32 crc kubenswrapper[4776]: I1208 09:23:32.848483 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7ccc58565d-s4sc8" event={"ID":"f5472c33-1c77-4bde-a438-924aa6a53a78","Type":"ContainerStarted","Data":"fb41ff30a5752bdcb5a27fce5ea4df31b70151ed9494ed8e575dc6eb6ed93925"} Dec 08 09:23:32 crc kubenswrapper[4776]: I1208 09:23:32.850885 4776 generic.go:334] "Generic (PLEG): container finished" podID="5097e9b7-c005-4e68-bc20-bbd6f8b8a290" containerID="641d43bd51a47d050badbac4cbf44680b1e65636ade4a76fef3a08cba745fb00" exitCode=0 Dec 08 09:23:32 crc kubenswrapper[4776]: I1208 09:23:32.850957 4776 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/dnsmasq-dns-5c9776ccc5-lpj75" event={"ID":"5097e9b7-c005-4e68-bc20-bbd6f8b8a290","Type":"ContainerDied","Data":"641d43bd51a47d050badbac4cbf44680b1e65636ade4a76fef3a08cba745fb00"} Dec 08 09:23:32 crc kubenswrapper[4776]: I1208 09:23:32.851485 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-lpj75" event={"ID":"5097e9b7-c005-4e68-bc20-bbd6f8b8a290","Type":"ContainerStarted","Data":"1fcb3450aab63ec94c4d26b51f42fe4caef6b950f98b83e005cdc04fc9beb12c"} Dec 08 09:23:32 crc kubenswrapper[4776]: I1208 09:23:32.885433 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b895b5785-9ljbr" event={"ID":"c968f892-a097-4b6d-885d-8a7b849714a3","Type":"ContainerDied","Data":"931c6360e9af654b42f7c49ce37a38385acc9783e0c6b8a3d6c1468115c14f83"} Dec 08 09:23:32 crc kubenswrapper[4776]: I1208 09:23:32.885496 4776 scope.go:117] "RemoveContainer" containerID="475fd31bb672d52517df75eba98ad986f7428d516a94113935e951d100bdc9e9" Dec 08 09:23:32 crc kubenswrapper[4776]: I1208 09:23:32.885639 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b895b5785-9ljbr" Dec 08 09:23:32 crc kubenswrapper[4776]: I1208 09:23:32.897637 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-75876fb99b-xnbd7" event={"ID":"ae330c18-0140-4bc4-8503-cf6c3bbce3d8","Type":"ContainerStarted","Data":"c106b383d1cfcb69516877def41d8442d219b8666fcda33e51c910b70349f44b"} Dec 08 09:23:32 crc kubenswrapper[4776]: I1208 09:23:32.899077 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-75876fb99b-xnbd7" Dec 08 09:23:32 crc kubenswrapper[4776]: I1208 09:23:32.899161 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-75876fb99b-xnbd7" Dec 08 09:23:32 crc kubenswrapper[4776]: I1208 09:23:32.934765 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-75876fb99b-xnbd7" podStartSLOduration=3.934746447 podStartE2EDuration="3.934746447s" podCreationTimestamp="2025-12-08 09:23:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:23:32.919540189 +0000 UTC m=+1489.182765211" watchObservedRunningTime="2025-12-08 09:23:32.934746447 +0000 UTC m=+1489.197971469" Dec 08 09:23:32 crc kubenswrapper[4776]: I1208 09:23:32.961141 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b895b5785-9ljbr"] Dec 08 09:23:32 crc kubenswrapper[4776]: I1208 09:23:32.975511 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 08 09:23:32 crc kubenswrapper[4776]: I1208 09:23:32.975645 4776 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 08 09:23:32 crc kubenswrapper[4776]: I1208 09:23:32.976253 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 08 09:23:32 crc 
kubenswrapper[4776]: I1208 09:23:32.980522 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 08 09:23:32 crc kubenswrapper[4776]: I1208 09:23:32.982848 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b895b5785-9ljbr"] Dec 08 09:23:33 crc kubenswrapper[4776]: I1208 09:23:33.093787 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 08 09:23:33 crc kubenswrapper[4776]: I1208 09:23:33.440639 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5d7dd8bd8b-9z2p2"] Dec 08 09:23:33 crc kubenswrapper[4776]: E1208 09:23:33.441239 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c968f892-a097-4b6d-885d-8a7b849714a3" containerName="init" Dec 08 09:23:33 crc kubenswrapper[4776]: I1208 09:23:33.441259 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="c968f892-a097-4b6d-885d-8a7b849714a3" containerName="init" Dec 08 09:23:33 crc kubenswrapper[4776]: E1208 09:23:33.441290 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c968f892-a097-4b6d-885d-8a7b849714a3" containerName="dnsmasq-dns" Dec 08 09:23:33 crc kubenswrapper[4776]: I1208 09:23:33.441299 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="c968f892-a097-4b6d-885d-8a7b849714a3" containerName="dnsmasq-dns" Dec 08 09:23:33 crc kubenswrapper[4776]: I1208 09:23:33.441546 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="c968f892-a097-4b6d-885d-8a7b849714a3" containerName="dnsmasq-dns" Dec 08 09:23:33 crc kubenswrapper[4776]: I1208 09:23:33.443103 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5d7dd8bd8b-9z2p2" Dec 08 09:23:33 crc kubenswrapper[4776]: I1208 09:23:33.446333 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 08 09:23:33 crc kubenswrapper[4776]: I1208 09:23:33.446408 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 08 09:23:33 crc kubenswrapper[4776]: I1208 09:23:33.466450 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5d7dd8bd8b-9z2p2"] Dec 08 09:23:33 crc kubenswrapper[4776]: I1208 09:23:33.592213 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71c29885-fdf1-4500-bee7-2b4102fb2c7e-combined-ca-bundle\") pod \"barbican-api-5d7dd8bd8b-9z2p2\" (UID: \"71c29885-fdf1-4500-bee7-2b4102fb2c7e\") " pod="openstack/barbican-api-5d7dd8bd8b-9z2p2" Dec 08 09:23:33 crc kubenswrapper[4776]: I1208 09:23:33.592254 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/71c29885-fdf1-4500-bee7-2b4102fb2c7e-internal-tls-certs\") pod \"barbican-api-5d7dd8bd8b-9z2p2\" (UID: \"71c29885-fdf1-4500-bee7-2b4102fb2c7e\") " pod="openstack/barbican-api-5d7dd8bd8b-9z2p2" Dec 08 09:23:33 crc kubenswrapper[4776]: I1208 09:23:33.592332 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/71c29885-fdf1-4500-bee7-2b4102fb2c7e-config-data-custom\") pod \"barbican-api-5d7dd8bd8b-9z2p2\" (UID: \"71c29885-fdf1-4500-bee7-2b4102fb2c7e\") " pod="openstack/barbican-api-5d7dd8bd8b-9z2p2" Dec 08 09:23:33 crc kubenswrapper[4776]: I1208 09:23:33.592472 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/71c29885-fdf1-4500-bee7-2b4102fb2c7e-logs\") pod \"barbican-api-5d7dd8bd8b-9z2p2\" (UID: \"71c29885-fdf1-4500-bee7-2b4102fb2c7e\") " pod="openstack/barbican-api-5d7dd8bd8b-9z2p2" Dec 08 09:23:33 crc kubenswrapper[4776]: I1208 09:23:33.592616 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgk4j\" (UniqueName: \"kubernetes.io/projected/71c29885-fdf1-4500-bee7-2b4102fb2c7e-kube-api-access-pgk4j\") pod \"barbican-api-5d7dd8bd8b-9z2p2\" (UID: \"71c29885-fdf1-4500-bee7-2b4102fb2c7e\") " pod="openstack/barbican-api-5d7dd8bd8b-9z2p2" Dec 08 09:23:33 crc kubenswrapper[4776]: I1208 09:23:33.592813 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71c29885-fdf1-4500-bee7-2b4102fb2c7e-config-data\") pod \"barbican-api-5d7dd8bd8b-9z2p2\" (UID: \"71c29885-fdf1-4500-bee7-2b4102fb2c7e\") " pod="openstack/barbican-api-5d7dd8bd8b-9z2p2" Dec 08 09:23:33 crc kubenswrapper[4776]: I1208 09:23:33.592953 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/71c29885-fdf1-4500-bee7-2b4102fb2c7e-public-tls-certs\") pod \"barbican-api-5d7dd8bd8b-9z2p2\" (UID: \"71c29885-fdf1-4500-bee7-2b4102fb2c7e\") " pod="openstack/barbican-api-5d7dd8bd8b-9z2p2" Dec 08 09:23:33 crc kubenswrapper[4776]: I1208 09:23:33.695077 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71c29885-fdf1-4500-bee7-2b4102fb2c7e-config-data\") pod \"barbican-api-5d7dd8bd8b-9z2p2\" (UID: \"71c29885-fdf1-4500-bee7-2b4102fb2c7e\") " pod="openstack/barbican-api-5d7dd8bd8b-9z2p2" Dec 08 09:23:33 crc kubenswrapper[4776]: I1208 09:23:33.695157 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/71c29885-fdf1-4500-bee7-2b4102fb2c7e-public-tls-certs\") pod \"barbican-api-5d7dd8bd8b-9z2p2\" (UID: \"71c29885-fdf1-4500-bee7-2b4102fb2c7e\") " pod="openstack/barbican-api-5d7dd8bd8b-9z2p2" Dec 08 09:23:33 crc kubenswrapper[4776]: I1208 09:23:33.695220 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71c29885-fdf1-4500-bee7-2b4102fb2c7e-combined-ca-bundle\") pod \"barbican-api-5d7dd8bd8b-9z2p2\" (UID: \"71c29885-fdf1-4500-bee7-2b4102fb2c7e\") " pod="openstack/barbican-api-5d7dd8bd8b-9z2p2" Dec 08 09:23:33 crc kubenswrapper[4776]: I1208 09:23:33.695241 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/71c29885-fdf1-4500-bee7-2b4102fb2c7e-internal-tls-certs\") pod \"barbican-api-5d7dd8bd8b-9z2p2\" (UID: \"71c29885-fdf1-4500-bee7-2b4102fb2c7e\") " pod="openstack/barbican-api-5d7dd8bd8b-9z2p2" Dec 08 09:23:33 crc kubenswrapper[4776]: I1208 09:23:33.695277 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/71c29885-fdf1-4500-bee7-2b4102fb2c7e-config-data-custom\") pod \"barbican-api-5d7dd8bd8b-9z2p2\" (UID: \"71c29885-fdf1-4500-bee7-2b4102fb2c7e\") " pod="openstack/barbican-api-5d7dd8bd8b-9z2p2" Dec 08 09:23:33 crc kubenswrapper[4776]: I1208 09:23:33.695308 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71c29885-fdf1-4500-bee7-2b4102fb2c7e-logs\") pod \"barbican-api-5d7dd8bd8b-9z2p2\" (UID: \"71c29885-fdf1-4500-bee7-2b4102fb2c7e\") " pod="openstack/barbican-api-5d7dd8bd8b-9z2p2" Dec 08 09:23:33 crc kubenswrapper[4776]: I1208 09:23:33.695831 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/71c29885-fdf1-4500-bee7-2b4102fb2c7e-logs\") pod \"barbican-api-5d7dd8bd8b-9z2p2\" (UID: \"71c29885-fdf1-4500-bee7-2b4102fb2c7e\") " pod="openstack/barbican-api-5d7dd8bd8b-9z2p2" Dec 08 09:23:33 crc kubenswrapper[4776]: I1208 09:23:33.696289 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgk4j\" (UniqueName: \"kubernetes.io/projected/71c29885-fdf1-4500-bee7-2b4102fb2c7e-kube-api-access-pgk4j\") pod \"barbican-api-5d7dd8bd8b-9z2p2\" (UID: \"71c29885-fdf1-4500-bee7-2b4102fb2c7e\") " pod="openstack/barbican-api-5d7dd8bd8b-9z2p2" Dec 08 09:23:33 crc kubenswrapper[4776]: I1208 09:23:33.702353 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71c29885-fdf1-4500-bee7-2b4102fb2c7e-config-data\") pod \"barbican-api-5d7dd8bd8b-9z2p2\" (UID: \"71c29885-fdf1-4500-bee7-2b4102fb2c7e\") " pod="openstack/barbican-api-5d7dd8bd8b-9z2p2" Dec 08 09:23:33 crc kubenswrapper[4776]: I1208 09:23:33.702408 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/71c29885-fdf1-4500-bee7-2b4102fb2c7e-public-tls-certs\") pod \"barbican-api-5d7dd8bd8b-9z2p2\" (UID: \"71c29885-fdf1-4500-bee7-2b4102fb2c7e\") " pod="openstack/barbican-api-5d7dd8bd8b-9z2p2" Dec 08 09:23:33 crc kubenswrapper[4776]: I1208 09:23:33.704703 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/71c29885-fdf1-4500-bee7-2b4102fb2c7e-config-data-custom\") pod \"barbican-api-5d7dd8bd8b-9z2p2\" (UID: \"71c29885-fdf1-4500-bee7-2b4102fb2c7e\") " pod="openstack/barbican-api-5d7dd8bd8b-9z2p2" Dec 08 09:23:33 crc kubenswrapper[4776]: I1208 09:23:33.704736 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/71c29885-fdf1-4500-bee7-2b4102fb2c7e-combined-ca-bundle\") pod \"barbican-api-5d7dd8bd8b-9z2p2\" (UID: \"71c29885-fdf1-4500-bee7-2b4102fb2c7e\") " pod="openstack/barbican-api-5d7dd8bd8b-9z2p2" Dec 08 09:23:33 crc kubenswrapper[4776]: I1208 09:23:33.705395 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/71c29885-fdf1-4500-bee7-2b4102fb2c7e-internal-tls-certs\") pod \"barbican-api-5d7dd8bd8b-9z2p2\" (UID: \"71c29885-fdf1-4500-bee7-2b4102fb2c7e\") " pod="openstack/barbican-api-5d7dd8bd8b-9z2p2" Dec 08 09:23:33 crc kubenswrapper[4776]: I1208 09:23:33.720902 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgk4j\" (UniqueName: \"kubernetes.io/projected/71c29885-fdf1-4500-bee7-2b4102fb2c7e-kube-api-access-pgk4j\") pod \"barbican-api-5d7dd8bd8b-9z2p2\" (UID: \"71c29885-fdf1-4500-bee7-2b4102fb2c7e\") " pod="openstack/barbican-api-5d7dd8bd8b-9z2p2" Dec 08 09:23:33 crc kubenswrapper[4776]: I1208 09:23:33.776277 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5d7dd8bd8b-9z2p2" Dec 08 09:23:34 crc kubenswrapper[4776]: I1208 09:23:34.364534 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c968f892-a097-4b6d-885d-8a7b849714a3" path="/var/lib/kubelet/pods/c968f892-a097-4b6d-885d-8a7b849714a3/volumes" Dec 08 09:23:36 crc kubenswrapper[4776]: I1208 09:23:36.342534 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 08 09:23:36 crc kubenswrapper[4776]: I1208 09:23:36.422952 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 08 09:23:36 crc kubenswrapper[4776]: I1208 09:23:36.939530 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="e1094f09-518a-45fc-b0d5-b204ddf8ec85" containerName="cinder-scheduler" containerID="cri-o://f4ea81c1c16b42e732fc7c8826f147f7d03faf0335b9cdfcafbeb2c392206d3d" gracePeriod=30 Dec 08 09:23:36 crc kubenswrapper[4776]: I1208 09:23:36.939595 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="e1094f09-518a-45fc-b0d5-b204ddf8ec85" containerName="probe" containerID="cri-o://8dd0b38b6b507cba9588499d3306ebee7fa547826a872bdce4fd32cf701505f0" gracePeriod=30 Dec 08 09:23:37 crc kubenswrapper[4776]: I1208 09:23:37.821600 4776 scope.go:117] "RemoveContainer" containerID="13e13279c8a4dc18d16ebc1901d1421b00784ba9ab0675a562a63307ddb6f651" Dec 08 09:23:37 crc kubenswrapper[4776]: I1208 09:23:37.959798 4776 generic.go:334] "Generic (PLEG): container finished" podID="e1094f09-518a-45fc-b0d5-b204ddf8ec85" containerID="8dd0b38b6b507cba9588499d3306ebee7fa547826a872bdce4fd32cf701505f0" exitCode=0 Dec 08 09:23:37 crc kubenswrapper[4776]: I1208 09:23:37.960368 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"e1094f09-518a-45fc-b0d5-b204ddf8ec85","Type":"ContainerDied","Data":"8dd0b38b6b507cba9588499d3306ebee7fa547826a872bdce4fd32cf701505f0"} Dec 08 09:23:38 crc kubenswrapper[4776]: E1208 09:23:38.332031 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="fab03865-02a6-4cd2-bf78-22ed25534301" Dec 08 09:23:38 crc kubenswrapper[4776]: I1208 09:23:38.415656 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 08 09:23:38 crc kubenswrapper[4776]: I1208 09:23:38.473991 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5d7dd8bd8b-9z2p2"] Dec 08 09:23:38 crc kubenswrapper[4776]: W1208 09:23:38.488348 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71c29885_fdf1_4500_bee7_2b4102fb2c7e.slice/crio-f9c45c3231eeb99fa93e48de12df82ed097347586cd696a96ebe4d8d9cb82ae4 WatchSource:0}: Error finding container f9c45c3231eeb99fa93e48de12df82ed097347586cd696a96ebe4d8d9cb82ae4: Status 404 returned error can't find the container with id f9c45c3231eeb99fa93e48de12df82ed097347586cd696a96ebe4d8d9cb82ae4 Dec 08 09:23:38 crc kubenswrapper[4776]: I1208 09:23:38.976925 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fab03865-02a6-4cd2-bf78-22ed25534301","Type":"ContainerStarted","Data":"4d8193b912759b759044f4d0ed72d2b39076b00f490e90dab65610cd14758cc3"} Dec 08 09:23:38 crc kubenswrapper[4776]: I1208 09:23:38.977345 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fab03865-02a6-4cd2-bf78-22ed25534301" containerName="sg-core" containerID="cri-o://33df0a20f59c354f005d663c841d3064254836a3cc1ba39fa5380890fba6d42d" 
gracePeriod=30 Dec 08 09:23:38 crc kubenswrapper[4776]: I1208 09:23:38.977345 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fab03865-02a6-4cd2-bf78-22ed25534301" containerName="proxy-httpd" containerID="cri-o://4d8193b912759b759044f4d0ed72d2b39076b00f490e90dab65610cd14758cc3" gracePeriod=30 Dec 08 09:23:38 crc kubenswrapper[4776]: I1208 09:23:38.978483 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fab03865-02a6-4cd2-bf78-22ed25534301" containerName="ceilometer-notification-agent" containerID="cri-o://007fff4bf45758cb30afb7d09d5451695d3f321328fa25d9b157ce8986e3268e" gracePeriod=30 Dec 08 09:23:38 crc kubenswrapper[4776]: I1208 09:23:38.988280 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-78d78f694b-ck9wf" event={"ID":"e501058f-25e0-456c-b23d-c7caafa729c3","Type":"ContainerStarted","Data":"a67f57aabd60df86ed35ed03bbe1137c584f5d13e3d809f08f9cc86a74b63ffb"} Dec 08 09:23:38 crc kubenswrapper[4776]: I1208 09:23:38.988344 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-78d78f694b-ck9wf" event={"ID":"e501058f-25e0-456c-b23d-c7caafa729c3","Type":"ContainerStarted","Data":"a1777e11b28d4d99d62a4f824685e01d0f16b2399d9935d951d54c677c8dbacb"} Dec 08 09:23:38 crc kubenswrapper[4776]: I1208 09:23:38.995089 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7ccc58565d-s4sc8" event={"ID":"f5472c33-1c77-4bde-a438-924aa6a53a78","Type":"ContainerStarted","Data":"adca6abc9e27046c0e2a0229c71f12d80d1562e6c9d2cc08662fd7a3b828e352"} Dec 08 09:23:38 crc kubenswrapper[4776]: I1208 09:23:38.996387 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7ccc58565d-s4sc8" Dec 08 09:23:38 crc kubenswrapper[4776]: I1208 09:23:38.996408 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/barbican-api-7ccc58565d-s4sc8" Dec 08 09:23:38 crc kubenswrapper[4776]: I1208 09:23:38.999072 4776 generic.go:334] "Generic (PLEG): container finished" podID="e1094f09-518a-45fc-b0d5-b204ddf8ec85" containerID="f4ea81c1c16b42e732fc7c8826f147f7d03faf0335b9cdfcafbeb2c392206d3d" exitCode=0 Dec 08 09:23:38 crc kubenswrapper[4776]: I1208 09:23:38.999119 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e1094f09-518a-45fc-b0d5-b204ddf8ec85","Type":"ContainerDied","Data":"f4ea81c1c16b42e732fc7c8826f147f7d03faf0335b9cdfcafbeb2c392206d3d"} Dec 08 09:23:38 crc kubenswrapper[4776]: I1208 09:23:38.999137 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e1094f09-518a-45fc-b0d5-b204ddf8ec85","Type":"ContainerDied","Data":"9409d8efba8d68b6b1539f819b411409a1a82c6e1cab4775c77b9339d8b5726d"} Dec 08 09:23:38 crc kubenswrapper[4776]: I1208 09:23:38.999149 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9409d8efba8d68b6b1539f819b411409a1a82c6e1cab4775c77b9339d8b5726d" Dec 08 09:23:39 crc kubenswrapper[4776]: I1208 09:23:39.003143 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-lpj75" event={"ID":"5097e9b7-c005-4e68-bc20-bbd6f8b8a290","Type":"ContainerStarted","Data":"41725b711dd1839ad256cdbebf464427d14539155fc5d5914b3ae781490b9040"} Dec 08 09:23:39 crc kubenswrapper[4776]: I1208 09:23:39.003978 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-lpj75" Dec 08 09:23:39 crc kubenswrapper[4776]: I1208 09:23:39.024846 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-664c575c59-ncvpr" event={"ID":"d6258a3d-a50e-4cf4-af4d-e6f588d8744a","Type":"ContainerStarted","Data":"fe2a368d85859265db0267a2e0275d748a9525ca31d4a9eedef51832ba648933"} Dec 08 09:23:39 crc kubenswrapper[4776]: I1208 
09:23:39.024891 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-664c575c59-ncvpr" event={"ID":"d6258a3d-a50e-4cf4-af4d-e6f588d8744a","Type":"ContainerStarted","Data":"7c963c257cbb352e91eeadfce67a48baddb9f0f9253cb08a0c0bba5c1e382d02"} Dec 08 09:23:39 crc kubenswrapper[4776]: I1208 09:23:39.028865 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d7dd8bd8b-9z2p2" event={"ID":"71c29885-fdf1-4500-bee7-2b4102fb2c7e","Type":"ContainerStarted","Data":"bda50b8f7b759f0071bc70919ac26bd0f6f5452599c6d2f0c5256071d0910fb0"} Dec 08 09:23:39 crc kubenswrapper[4776]: I1208 09:23:39.028901 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d7dd8bd8b-9z2p2" event={"ID":"71c29885-fdf1-4500-bee7-2b4102fb2c7e","Type":"ContainerStarted","Data":"f9c45c3231eeb99fa93e48de12df82ed097347586cd696a96ebe4d8d9cb82ae4"} Dec 08 09:23:39 crc kubenswrapper[4776]: I1208 09:23:39.033379 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-lpj75" podStartSLOduration=9.033357159 podStartE2EDuration="9.033357159s" podCreationTimestamp="2025-12-08 09:23:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:23:39.032151108 +0000 UTC m=+1495.295376140" watchObservedRunningTime="2025-12-08 09:23:39.033357159 +0000 UTC m=+1495.296582181" Dec 08 09:23:39 crc kubenswrapper[4776]: I1208 09:23:39.073267 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 08 09:23:39 crc kubenswrapper[4776]: I1208 09:23:39.082354 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-78d78f694b-ck9wf" podStartSLOduration=3.291654501 podStartE2EDuration="10.082335313s" podCreationTimestamp="2025-12-08 09:23:29 +0000 UTC" firstStartedPulling="2025-12-08 09:23:31.239921222 +0000 UTC m=+1487.503146244" lastFinishedPulling="2025-12-08 09:23:38.030602024 +0000 UTC m=+1494.293827056" observedRunningTime="2025-12-08 09:23:39.057997921 +0000 UTC m=+1495.321222943" watchObservedRunningTime="2025-12-08 09:23:39.082335313 +0000 UTC m=+1495.345560335" Dec 08 09:23:39 crc kubenswrapper[4776]: I1208 09:23:39.109569 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7ccc58565d-s4sc8" podStartSLOduration=9.109545493 podStartE2EDuration="9.109545493s" podCreationTimestamp="2025-12-08 09:23:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:23:39.079510597 +0000 UTC m=+1495.342735619" watchObservedRunningTime="2025-12-08 09:23:39.109545493 +0000 UTC m=+1495.372770525" Dec 08 09:23:39 crc kubenswrapper[4776]: I1208 09:23:39.120462 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-664c575c59-ncvpr" podStartSLOduration=3.608907984 podStartE2EDuration="10.120438886s" podCreationTimestamp="2025-12-08 09:23:29 +0000 UTC" firstStartedPulling="2025-12-08 09:23:31.516068322 +0000 UTC m=+1487.779293344" lastFinishedPulling="2025-12-08 09:23:38.027599214 +0000 UTC m=+1494.290824246" observedRunningTime="2025-12-08 09:23:39.110319244 +0000 UTC m=+1495.373544276" watchObservedRunningTime="2025-12-08 09:23:39.120438886 +0000 UTC m=+1495.383663928" Dec 08 09:23:39 crc kubenswrapper[4776]: I1208 09:23:39.228306 4776 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1094f09-518a-45fc-b0d5-b204ddf8ec85-scripts\") pod \"e1094f09-518a-45fc-b0d5-b204ddf8ec85\" (UID: \"e1094f09-518a-45fc-b0d5-b204ddf8ec85\") " Dec 08 09:23:39 crc kubenswrapper[4776]: I1208 09:23:39.228369 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e1094f09-518a-45fc-b0d5-b204ddf8ec85-etc-machine-id\") pod \"e1094f09-518a-45fc-b0d5-b204ddf8ec85\" (UID: \"e1094f09-518a-45fc-b0d5-b204ddf8ec85\") " Dec 08 09:23:39 crc kubenswrapper[4776]: I1208 09:23:39.228594 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1094f09-518a-45fc-b0d5-b204ddf8ec85-config-data\") pod \"e1094f09-518a-45fc-b0d5-b204ddf8ec85\" (UID: \"e1094f09-518a-45fc-b0d5-b204ddf8ec85\") " Dec 08 09:23:39 crc kubenswrapper[4776]: I1208 09:23:39.228791 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5nbt\" (UniqueName: \"kubernetes.io/projected/e1094f09-518a-45fc-b0d5-b204ddf8ec85-kube-api-access-g5nbt\") pod \"e1094f09-518a-45fc-b0d5-b204ddf8ec85\" (UID: \"e1094f09-518a-45fc-b0d5-b204ddf8ec85\") " Dec 08 09:23:39 crc kubenswrapper[4776]: I1208 09:23:39.228860 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1094f09-518a-45fc-b0d5-b204ddf8ec85-combined-ca-bundle\") pod \"e1094f09-518a-45fc-b0d5-b204ddf8ec85\" (UID: \"e1094f09-518a-45fc-b0d5-b204ddf8ec85\") " Dec 08 09:23:39 crc kubenswrapper[4776]: I1208 09:23:39.228894 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1094f09-518a-45fc-b0d5-b204ddf8ec85-config-data-custom\") pod \"e1094f09-518a-45fc-b0d5-b204ddf8ec85\" (UID: 
\"e1094f09-518a-45fc-b0d5-b204ddf8ec85\") " Dec 08 09:23:39 crc kubenswrapper[4776]: I1208 09:23:39.229265 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1094f09-518a-45fc-b0d5-b204ddf8ec85-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e1094f09-518a-45fc-b0d5-b204ddf8ec85" (UID: "e1094f09-518a-45fc-b0d5-b204ddf8ec85"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 09:23:39 crc kubenswrapper[4776]: I1208 09:23:39.229520 4776 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e1094f09-518a-45fc-b0d5-b204ddf8ec85-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:39 crc kubenswrapper[4776]: I1208 09:23:39.237414 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1094f09-518a-45fc-b0d5-b204ddf8ec85-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e1094f09-518a-45fc-b0d5-b204ddf8ec85" (UID: "e1094f09-518a-45fc-b0d5-b204ddf8ec85"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:23:39 crc kubenswrapper[4776]: I1208 09:23:39.237468 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1094f09-518a-45fc-b0d5-b204ddf8ec85-kube-api-access-g5nbt" (OuterVolumeSpecName: "kube-api-access-g5nbt") pod "e1094f09-518a-45fc-b0d5-b204ddf8ec85" (UID: "e1094f09-518a-45fc-b0d5-b204ddf8ec85"). InnerVolumeSpecName "kube-api-access-g5nbt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:23:39 crc kubenswrapper[4776]: I1208 09:23:39.237571 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1094f09-518a-45fc-b0d5-b204ddf8ec85-scripts" (OuterVolumeSpecName: "scripts") pod "e1094f09-518a-45fc-b0d5-b204ddf8ec85" (UID: "e1094f09-518a-45fc-b0d5-b204ddf8ec85"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:23:39 crc kubenswrapper[4776]: I1208 09:23:39.331641 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5nbt\" (UniqueName: \"kubernetes.io/projected/e1094f09-518a-45fc-b0d5-b204ddf8ec85-kube-api-access-g5nbt\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:39 crc kubenswrapper[4776]: I1208 09:23:39.331673 4776 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1094f09-518a-45fc-b0d5-b204ddf8ec85-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:39 crc kubenswrapper[4776]: I1208 09:23:39.331682 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1094f09-518a-45fc-b0d5-b204ddf8ec85-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:39 crc kubenswrapper[4776]: I1208 09:23:39.338450 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1094f09-518a-45fc-b0d5-b204ddf8ec85-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1094f09-518a-45fc-b0d5-b204ddf8ec85" (UID: "e1094f09-518a-45fc-b0d5-b204ddf8ec85"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:23:39 crc kubenswrapper[4776]: I1208 09:23:39.395218 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1094f09-518a-45fc-b0d5-b204ddf8ec85-config-data" (OuterVolumeSpecName: "config-data") pod "e1094f09-518a-45fc-b0d5-b204ddf8ec85" (UID: "e1094f09-518a-45fc-b0d5-b204ddf8ec85"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:23:39 crc kubenswrapper[4776]: I1208 09:23:39.434157 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1094f09-518a-45fc-b0d5-b204ddf8ec85-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:39 crc kubenswrapper[4776]: I1208 09:23:39.434226 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1094f09-518a-45fc-b0d5-b204ddf8ec85-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:40 crc kubenswrapper[4776]: I1208 09:23:40.040111 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d7dd8bd8b-9z2p2" event={"ID":"71c29885-fdf1-4500-bee7-2b4102fb2c7e","Type":"ContainerStarted","Data":"8d66d34c84c4a9e31111ef5333d1ba4c092c8623fbde21d2b14ea9f9a9f3d7ce"} Dec 08 09:23:40 crc kubenswrapper[4776]: I1208 09:23:40.040505 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5d7dd8bd8b-9z2p2" Dec 08 09:23:40 crc kubenswrapper[4776]: I1208 09:23:40.040718 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5d7dd8bd8b-9z2p2" Dec 08 09:23:40 crc kubenswrapper[4776]: I1208 09:23:40.042842 4776 generic.go:334] "Generic (PLEG): container finished" podID="fab03865-02a6-4cd2-bf78-22ed25534301" containerID="4d8193b912759b759044f4d0ed72d2b39076b00f490e90dab65610cd14758cc3" exitCode=0 Dec 08 09:23:40 crc kubenswrapper[4776]: I1208 09:23:40.042882 4776 generic.go:334] "Generic (PLEG): container finished" podID="fab03865-02a6-4cd2-bf78-22ed25534301" containerID="33df0a20f59c354f005d663c841d3064254836a3cc1ba39fa5380890fba6d42d" exitCode=2 Dec 08 09:23:40 crc kubenswrapper[4776]: I1208 09:23:40.043071 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 08 09:23:40 crc kubenswrapper[4776]: I1208 09:23:40.043068 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fab03865-02a6-4cd2-bf78-22ed25534301","Type":"ContainerDied","Data":"4d8193b912759b759044f4d0ed72d2b39076b00f490e90dab65610cd14758cc3"} Dec 08 09:23:40 crc kubenswrapper[4776]: I1208 09:23:40.043252 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fab03865-02a6-4cd2-bf78-22ed25534301","Type":"ContainerDied","Data":"33df0a20f59c354f005d663c841d3064254836a3cc1ba39fa5380890fba6d42d"} Dec 08 09:23:40 crc kubenswrapper[4776]: I1208 09:23:40.055514 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5d7dd8bd8b-9z2p2" podStartSLOduration=7.055484484 podStartE2EDuration="7.055484484s" podCreationTimestamp="2025-12-08 09:23:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:23:40.05495596 +0000 UTC m=+1496.318180982" watchObservedRunningTime="2025-12-08 09:23:40.055484484 +0000 UTC m=+1496.318709506" Dec 08 09:23:40 crc kubenswrapper[4776]: I1208 09:23:40.082870 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 08 09:23:40 crc kubenswrapper[4776]: I1208 09:23:40.092671 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 08 09:23:40 crc kubenswrapper[4776]: I1208 09:23:40.135740 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 08 09:23:40 crc kubenswrapper[4776]: E1208 09:23:40.136482 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1094f09-518a-45fc-b0d5-b204ddf8ec85" containerName="probe" Dec 08 09:23:40 crc kubenswrapper[4776]: I1208 09:23:40.136576 4776 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e1094f09-518a-45fc-b0d5-b204ddf8ec85" containerName="probe" Dec 08 09:23:40 crc kubenswrapper[4776]: E1208 09:23:40.136659 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1094f09-518a-45fc-b0d5-b204ddf8ec85" containerName="cinder-scheduler" Dec 08 09:23:40 crc kubenswrapper[4776]: I1208 09:23:40.136720 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1094f09-518a-45fc-b0d5-b204ddf8ec85" containerName="cinder-scheduler" Dec 08 09:23:40 crc kubenswrapper[4776]: I1208 09:23:40.136999 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1094f09-518a-45fc-b0d5-b204ddf8ec85" containerName="probe" Dec 08 09:23:40 crc kubenswrapper[4776]: I1208 09:23:40.137083 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1094f09-518a-45fc-b0d5-b204ddf8ec85" containerName="cinder-scheduler" Dec 08 09:23:40 crc kubenswrapper[4776]: I1208 09:23:40.138409 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 08 09:23:40 crc kubenswrapper[4776]: I1208 09:23:40.143136 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 08 09:23:40 crc kubenswrapper[4776]: I1208 09:23:40.155478 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 08 09:23:40 crc kubenswrapper[4776]: I1208 09:23:40.251330 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/289a9d84-e76a-42e5-9524-7e9b244b8743-config-data\") pod \"cinder-scheduler-0\" (UID: \"289a9d84-e76a-42e5-9524-7e9b244b8743\") " pod="openstack/cinder-scheduler-0" Dec 08 09:23:40 crc kubenswrapper[4776]: I1208 09:23:40.251416 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwr6m\" (UniqueName: 
\"kubernetes.io/projected/289a9d84-e76a-42e5-9524-7e9b244b8743-kube-api-access-zwr6m\") pod \"cinder-scheduler-0\" (UID: \"289a9d84-e76a-42e5-9524-7e9b244b8743\") " pod="openstack/cinder-scheduler-0" Dec 08 09:23:40 crc kubenswrapper[4776]: I1208 09:23:40.251439 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/289a9d84-e76a-42e5-9524-7e9b244b8743-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"289a9d84-e76a-42e5-9524-7e9b244b8743\") " pod="openstack/cinder-scheduler-0" Dec 08 09:23:40 crc kubenswrapper[4776]: I1208 09:23:40.251525 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/289a9d84-e76a-42e5-9524-7e9b244b8743-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"289a9d84-e76a-42e5-9524-7e9b244b8743\") " pod="openstack/cinder-scheduler-0" Dec 08 09:23:40 crc kubenswrapper[4776]: I1208 09:23:40.251541 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/289a9d84-e76a-42e5-9524-7e9b244b8743-scripts\") pod \"cinder-scheduler-0\" (UID: \"289a9d84-e76a-42e5-9524-7e9b244b8743\") " pod="openstack/cinder-scheduler-0" Dec 08 09:23:40 crc kubenswrapper[4776]: I1208 09:23:40.251597 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/289a9d84-e76a-42e5-9524-7e9b244b8743-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"289a9d84-e76a-42e5-9524-7e9b244b8743\") " pod="openstack/cinder-scheduler-0" Dec 08 09:23:40 crc kubenswrapper[4776]: I1208 09:23:40.353120 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/289a9d84-e76a-42e5-9524-7e9b244b8743-config-data\") pod 
\"cinder-scheduler-0\" (UID: \"289a9d84-e76a-42e5-9524-7e9b244b8743\") " pod="openstack/cinder-scheduler-0" Dec 08 09:23:40 crc kubenswrapper[4776]: I1208 09:23:40.353517 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwr6m\" (UniqueName: \"kubernetes.io/projected/289a9d84-e76a-42e5-9524-7e9b244b8743-kube-api-access-zwr6m\") pod \"cinder-scheduler-0\" (UID: \"289a9d84-e76a-42e5-9524-7e9b244b8743\") " pod="openstack/cinder-scheduler-0" Dec 08 09:23:40 crc kubenswrapper[4776]: I1208 09:23:40.353543 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/289a9d84-e76a-42e5-9524-7e9b244b8743-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"289a9d84-e76a-42e5-9524-7e9b244b8743\") " pod="openstack/cinder-scheduler-0" Dec 08 09:23:40 crc kubenswrapper[4776]: I1208 09:23:40.353634 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/289a9d84-e76a-42e5-9524-7e9b244b8743-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"289a9d84-e76a-42e5-9524-7e9b244b8743\") " pod="openstack/cinder-scheduler-0" Dec 08 09:23:40 crc kubenswrapper[4776]: I1208 09:23:40.353651 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/289a9d84-e76a-42e5-9524-7e9b244b8743-scripts\") pod \"cinder-scheduler-0\" (UID: \"289a9d84-e76a-42e5-9524-7e9b244b8743\") " pod="openstack/cinder-scheduler-0" Dec 08 09:23:40 crc kubenswrapper[4776]: I1208 09:23:40.353712 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/289a9d84-e76a-42e5-9524-7e9b244b8743-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"289a9d84-e76a-42e5-9524-7e9b244b8743\") " pod="openstack/cinder-scheduler-0" Dec 08 09:23:40 crc 
kubenswrapper[4776]: I1208 09:23:40.354301 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/289a9d84-e76a-42e5-9524-7e9b244b8743-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"289a9d84-e76a-42e5-9524-7e9b244b8743\") " pod="openstack/cinder-scheduler-0" Dec 08 09:23:40 crc kubenswrapper[4776]: I1208 09:23:40.360545 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/289a9d84-e76a-42e5-9524-7e9b244b8743-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"289a9d84-e76a-42e5-9524-7e9b244b8743\") " pod="openstack/cinder-scheduler-0" Dec 08 09:23:40 crc kubenswrapper[4776]: I1208 09:23:40.370364 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/289a9d84-e76a-42e5-9524-7e9b244b8743-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"289a9d84-e76a-42e5-9524-7e9b244b8743\") " pod="openstack/cinder-scheduler-0" Dec 08 09:23:40 crc kubenswrapper[4776]: I1208 09:23:40.377487 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/289a9d84-e76a-42e5-9524-7e9b244b8743-config-data\") pod \"cinder-scheduler-0\" (UID: \"289a9d84-e76a-42e5-9524-7e9b244b8743\") " pod="openstack/cinder-scheduler-0" Dec 08 09:23:40 crc kubenswrapper[4776]: I1208 09:23:40.379334 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/289a9d84-e76a-42e5-9524-7e9b244b8743-scripts\") pod \"cinder-scheduler-0\" (UID: \"289a9d84-e76a-42e5-9524-7e9b244b8743\") " pod="openstack/cinder-scheduler-0" Dec 08 09:23:40 crc kubenswrapper[4776]: I1208 09:23:40.382073 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwr6m\" (UniqueName: 
\"kubernetes.io/projected/289a9d84-e76a-42e5-9524-7e9b244b8743-kube-api-access-zwr6m\") pod \"cinder-scheduler-0\" (UID: \"289a9d84-e76a-42e5-9524-7e9b244b8743\") " pod="openstack/cinder-scheduler-0" Dec 08 09:23:40 crc kubenswrapper[4776]: I1208 09:23:40.384655 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1094f09-518a-45fc-b0d5-b204ddf8ec85" path="/var/lib/kubelet/pods/e1094f09-518a-45fc-b0d5-b204ddf8ec85/volumes" Dec 08 09:23:40 crc kubenswrapper[4776]: I1208 09:23:40.389701 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-775f79cd-lq4qd" Dec 08 09:23:40 crc kubenswrapper[4776]: I1208 09:23:40.520504 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 08 09:23:40 crc kubenswrapper[4776]: I1208 09:23:40.992185 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 08 09:23:41 crc kubenswrapper[4776]: I1208 09:23:41.056279 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"289a9d84-e76a-42e5-9524-7e9b244b8743","Type":"ContainerStarted","Data":"b0e3204492d840ead8b972b71db2d41f8d0d7d73d248709b4bb35084856485d6"} Dec 08 09:23:41 crc kubenswrapper[4776]: I1208 09:23:41.060965 4776 generic.go:334] "Generic (PLEG): container finished" podID="fab03865-02a6-4cd2-bf78-22ed25534301" containerID="007fff4bf45758cb30afb7d09d5451695d3f321328fa25d9b157ce8986e3268e" exitCode=0 Dec 08 09:23:41 crc kubenswrapper[4776]: I1208 09:23:41.061063 4776 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 08 09:23:41 crc kubenswrapper[4776]: I1208 09:23:41.062324 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fab03865-02a6-4cd2-bf78-22ed25534301","Type":"ContainerDied","Data":"007fff4bf45758cb30afb7d09d5451695d3f321328fa25d9b157ce8986e3268e"} Dec 08 09:23:41 crc kubenswrapper[4776]: I1208 
09:23:41.169446 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 08 09:23:41 crc kubenswrapper[4776]: I1208 09:23:41.260929 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7ccc58565d-s4sc8" Dec 08 09:23:41 crc kubenswrapper[4776]: I1208 09:23:41.291674 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fab03865-02a6-4cd2-bf78-22ed25534301-log-httpd\") pod \"fab03865-02a6-4cd2-bf78-22ed25534301\" (UID: \"fab03865-02a6-4cd2-bf78-22ed25534301\") " Dec 08 09:23:41 crc kubenswrapper[4776]: I1208 09:23:41.291742 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fab03865-02a6-4cd2-bf78-22ed25534301-combined-ca-bundle\") pod \"fab03865-02a6-4cd2-bf78-22ed25534301\" (UID: \"fab03865-02a6-4cd2-bf78-22ed25534301\") " Dec 08 09:23:41 crc kubenswrapper[4776]: I1208 09:23:41.291768 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnvjn\" (UniqueName: \"kubernetes.io/projected/fab03865-02a6-4cd2-bf78-22ed25534301-kube-api-access-wnvjn\") pod \"fab03865-02a6-4cd2-bf78-22ed25534301\" (UID: \"fab03865-02a6-4cd2-bf78-22ed25534301\") " Dec 08 09:23:41 crc kubenswrapper[4776]: I1208 09:23:41.291801 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fab03865-02a6-4cd2-bf78-22ed25534301-run-httpd\") pod \"fab03865-02a6-4cd2-bf78-22ed25534301\" (UID: \"fab03865-02a6-4cd2-bf78-22ed25534301\") " Dec 08 09:23:41 crc kubenswrapper[4776]: I1208 09:23:41.291845 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fab03865-02a6-4cd2-bf78-22ed25534301-config-data\") pod 
\"fab03865-02a6-4cd2-bf78-22ed25534301\" (UID: \"fab03865-02a6-4cd2-bf78-22ed25534301\") " Dec 08 09:23:41 crc kubenswrapper[4776]: I1208 09:23:41.291891 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fab03865-02a6-4cd2-bf78-22ed25534301-sg-core-conf-yaml\") pod \"fab03865-02a6-4cd2-bf78-22ed25534301\" (UID: \"fab03865-02a6-4cd2-bf78-22ed25534301\") " Dec 08 09:23:41 crc kubenswrapper[4776]: I1208 09:23:41.291910 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fab03865-02a6-4cd2-bf78-22ed25534301-scripts\") pod \"fab03865-02a6-4cd2-bf78-22ed25534301\" (UID: \"fab03865-02a6-4cd2-bf78-22ed25534301\") " Dec 08 09:23:41 crc kubenswrapper[4776]: I1208 09:23:41.292614 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fab03865-02a6-4cd2-bf78-22ed25534301-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fab03865-02a6-4cd2-bf78-22ed25534301" (UID: "fab03865-02a6-4cd2-bf78-22ed25534301"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:23:41 crc kubenswrapper[4776]: I1208 09:23:41.292834 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fab03865-02a6-4cd2-bf78-22ed25534301-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fab03865-02a6-4cd2-bf78-22ed25534301" (UID: "fab03865-02a6-4cd2-bf78-22ed25534301"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:23:41 crc kubenswrapper[4776]: I1208 09:23:41.296493 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fab03865-02a6-4cd2-bf78-22ed25534301-scripts" (OuterVolumeSpecName: "scripts") pod "fab03865-02a6-4cd2-bf78-22ed25534301" (UID: "fab03865-02a6-4cd2-bf78-22ed25534301"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:23:41 crc kubenswrapper[4776]: I1208 09:23:41.311488 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fab03865-02a6-4cd2-bf78-22ed25534301-kube-api-access-wnvjn" (OuterVolumeSpecName: "kube-api-access-wnvjn") pod "fab03865-02a6-4cd2-bf78-22ed25534301" (UID: "fab03865-02a6-4cd2-bf78-22ed25534301"). InnerVolumeSpecName "kube-api-access-wnvjn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:23:41 crc kubenswrapper[4776]: I1208 09:23:41.323321 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fab03865-02a6-4cd2-bf78-22ed25534301-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fab03865-02a6-4cd2-bf78-22ed25534301" (UID: "fab03865-02a6-4cd2-bf78-22ed25534301"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:23:41 crc kubenswrapper[4776]: I1208 09:23:41.363338 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fab03865-02a6-4cd2-bf78-22ed25534301-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fab03865-02a6-4cd2-bf78-22ed25534301" (UID: "fab03865-02a6-4cd2-bf78-22ed25534301"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:23:41 crc kubenswrapper[4776]: I1208 09:23:41.394576 4776 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fab03865-02a6-4cd2-bf78-22ed25534301-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:41 crc kubenswrapper[4776]: I1208 09:23:41.394603 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fab03865-02a6-4cd2-bf78-22ed25534301-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:41 crc kubenswrapper[4776]: I1208 09:23:41.394614 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnvjn\" (UniqueName: \"kubernetes.io/projected/fab03865-02a6-4cd2-bf78-22ed25534301-kube-api-access-wnvjn\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:41 crc kubenswrapper[4776]: I1208 09:23:41.394622 4776 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fab03865-02a6-4cd2-bf78-22ed25534301-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:41 crc kubenswrapper[4776]: I1208 09:23:41.394630 4776 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fab03865-02a6-4cd2-bf78-22ed25534301-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:41 crc kubenswrapper[4776]: I1208 09:23:41.394639 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fab03865-02a6-4cd2-bf78-22ed25534301-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:41 crc kubenswrapper[4776]: I1208 09:23:41.403614 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fab03865-02a6-4cd2-bf78-22ed25534301-config-data" (OuterVolumeSpecName: "config-data") pod "fab03865-02a6-4cd2-bf78-22ed25534301" (UID: "fab03865-02a6-4cd2-bf78-22ed25534301"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:23:41 crc kubenswrapper[4776]: I1208 09:23:41.497166 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fab03865-02a6-4cd2-bf78-22ed25534301-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:42 crc kubenswrapper[4776]: I1208 09:23:42.073667 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"289a9d84-e76a-42e5-9524-7e9b244b8743","Type":"ContainerStarted","Data":"476e91baf266b535786764ff829209971c32146c31f04cfa54ea5ef259d313fd"} Dec 08 09:23:42 crc kubenswrapper[4776]: I1208 09:23:42.076402 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fab03865-02a6-4cd2-bf78-22ed25534301","Type":"ContainerDied","Data":"49d1730bcbf42067e54950304c081b37ef66fc485287100a48ff4fb4b23e7e86"} Dec 08 09:23:42 crc kubenswrapper[4776]: I1208 09:23:42.076450 4776 scope.go:117] "RemoveContainer" containerID="4d8193b912759b759044f4d0ed72d2b39076b00f490e90dab65610cd14758cc3" Dec 08 09:23:42 crc kubenswrapper[4776]: I1208 09:23:42.076406 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 08 09:23:42 crc kubenswrapper[4776]: I1208 09:23:42.104874 4776 scope.go:117] "RemoveContainer" containerID="33df0a20f59c354f005d663c841d3064254836a3cc1ba39fa5380890fba6d42d" Dec 08 09:23:42 crc kubenswrapper[4776]: I1208 09:23:42.165063 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 08 09:23:42 crc kubenswrapper[4776]: I1208 09:23:42.181615 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 08 09:23:42 crc kubenswrapper[4776]: I1208 09:23:42.191282 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 08 09:23:42 crc kubenswrapper[4776]: E1208 09:23:42.191722 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fab03865-02a6-4cd2-bf78-22ed25534301" containerName="proxy-httpd" Dec 08 09:23:42 crc kubenswrapper[4776]: I1208 09:23:42.191734 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="fab03865-02a6-4cd2-bf78-22ed25534301" containerName="proxy-httpd" Dec 08 09:23:42 crc kubenswrapper[4776]: E1208 09:23:42.191778 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fab03865-02a6-4cd2-bf78-22ed25534301" containerName="sg-core" Dec 08 09:23:42 crc kubenswrapper[4776]: I1208 09:23:42.191784 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="fab03865-02a6-4cd2-bf78-22ed25534301" containerName="sg-core" Dec 08 09:23:42 crc kubenswrapper[4776]: E1208 09:23:42.191799 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fab03865-02a6-4cd2-bf78-22ed25534301" containerName="ceilometer-notification-agent" Dec 08 09:23:42 crc kubenswrapper[4776]: I1208 09:23:42.191806 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="fab03865-02a6-4cd2-bf78-22ed25534301" containerName="ceilometer-notification-agent" Dec 08 09:23:42 crc kubenswrapper[4776]: I1208 09:23:42.191978 4776 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="fab03865-02a6-4cd2-bf78-22ed25534301" containerName="proxy-httpd" Dec 08 09:23:42 crc kubenswrapper[4776]: I1208 09:23:42.192940 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="fab03865-02a6-4cd2-bf78-22ed25534301" containerName="sg-core" Dec 08 09:23:42 crc kubenswrapper[4776]: I1208 09:23:42.192968 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="fab03865-02a6-4cd2-bf78-22ed25534301" containerName="ceilometer-notification-agent" Dec 08 09:23:42 crc kubenswrapper[4776]: I1208 09:23:42.194887 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 08 09:23:42 crc kubenswrapper[4776]: I1208 09:23:42.197823 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 08 09:23:42 crc kubenswrapper[4776]: I1208 09:23:42.198017 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 08 09:23:42 crc kubenswrapper[4776]: I1208 09:23:42.222565 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 08 09:23:42 crc kubenswrapper[4776]: I1208 09:23:42.231361 4776 scope.go:117] "RemoveContainer" containerID="007fff4bf45758cb30afb7d09d5451695d3f321328fa25d9b157ce8986e3268e" Dec 08 09:23:42 crc kubenswrapper[4776]: I1208 09:23:42.332375 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e\") " pod="openstack/ceilometer-0" Dec 08 09:23:42 crc kubenswrapper[4776]: I1208 09:23:42.332718 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e-log-httpd\") pod \"ceilometer-0\" (UID: 
\"76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e\") " pod="openstack/ceilometer-0" Dec 08 09:23:42 crc kubenswrapper[4776]: I1208 09:23:42.332810 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e\") " pod="openstack/ceilometer-0" Dec 08 09:23:42 crc kubenswrapper[4776]: I1208 09:23:42.332905 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbn7z\" (UniqueName: \"kubernetes.io/projected/76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e-kube-api-access-kbn7z\") pod \"ceilometer-0\" (UID: \"76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e\") " pod="openstack/ceilometer-0" Dec 08 09:23:42 crc kubenswrapper[4776]: I1208 09:23:42.332980 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e-scripts\") pod \"ceilometer-0\" (UID: \"76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e\") " pod="openstack/ceilometer-0" Dec 08 09:23:42 crc kubenswrapper[4776]: I1208 09:23:42.333045 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e-config-data\") pod \"ceilometer-0\" (UID: \"76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e\") " pod="openstack/ceilometer-0" Dec 08 09:23:42 crc kubenswrapper[4776]: I1208 09:23:42.333130 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e-run-httpd\") pod \"ceilometer-0\" (UID: \"76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e\") " pod="openstack/ceilometer-0" Dec 08 09:23:42 crc kubenswrapper[4776]: I1208 09:23:42.357333 
4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fab03865-02a6-4cd2-bf78-22ed25534301" path="/var/lib/kubelet/pods/fab03865-02a6-4cd2-bf78-22ed25534301/volumes" Dec 08 09:23:42 crc kubenswrapper[4776]: I1208 09:23:42.436344 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e\") " pod="openstack/ceilometer-0" Dec 08 09:23:42 crc kubenswrapper[4776]: I1208 09:23:42.437880 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e-log-httpd\") pod \"ceilometer-0\" (UID: \"76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e\") " pod="openstack/ceilometer-0" Dec 08 09:23:42 crc kubenswrapper[4776]: I1208 09:23:42.437918 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e\") " pod="openstack/ceilometer-0" Dec 08 09:23:42 crc kubenswrapper[4776]: I1208 09:23:42.437973 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbn7z\" (UniqueName: \"kubernetes.io/projected/76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e-kube-api-access-kbn7z\") pod \"ceilometer-0\" (UID: \"76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e\") " pod="openstack/ceilometer-0" Dec 08 09:23:42 crc kubenswrapper[4776]: I1208 09:23:42.438009 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e-scripts\") pod \"ceilometer-0\" (UID: \"76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e\") " pod="openstack/ceilometer-0" Dec 08 09:23:42 crc 
kubenswrapper[4776]: I1208 09:23:42.438023 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e-config-data\") pod \"ceilometer-0\" (UID: \"76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e\") " pod="openstack/ceilometer-0" Dec 08 09:23:42 crc kubenswrapper[4776]: I1208 09:23:42.438046 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e-run-httpd\") pod \"ceilometer-0\" (UID: \"76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e\") " pod="openstack/ceilometer-0" Dec 08 09:23:42 crc kubenswrapper[4776]: I1208 09:23:42.439241 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e-log-httpd\") pod \"ceilometer-0\" (UID: \"76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e\") " pod="openstack/ceilometer-0" Dec 08 09:23:42 crc kubenswrapper[4776]: I1208 09:23:42.441586 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e-run-httpd\") pod \"ceilometer-0\" (UID: \"76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e\") " pod="openstack/ceilometer-0" Dec 08 09:23:42 crc kubenswrapper[4776]: I1208 09:23:42.443397 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e\") " pod="openstack/ceilometer-0" Dec 08 09:23:42 crc kubenswrapper[4776]: I1208 09:23:42.443963 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e\") " pod="openstack/ceilometer-0" Dec 08 09:23:42 crc kubenswrapper[4776]: I1208 09:23:42.456624 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e-scripts\") pod \"ceilometer-0\" (UID: \"76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e\") " pod="openstack/ceilometer-0" Dec 08 09:23:42 crc kubenswrapper[4776]: I1208 09:23:42.461014 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e-config-data\") pod \"ceilometer-0\" (UID: \"76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e\") " pod="openstack/ceilometer-0" Dec 08 09:23:42 crc kubenswrapper[4776]: I1208 09:23:42.465831 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbn7z\" (UniqueName: \"kubernetes.io/projected/76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e-kube-api-access-kbn7z\") pod \"ceilometer-0\" (UID: \"76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e\") " pod="openstack/ceilometer-0" Dec 08 09:23:42 crc kubenswrapper[4776]: I1208 09:23:42.526950 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 08 09:23:43 crc kubenswrapper[4776]: I1208 09:23:43.097798 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e","Type":"ContainerStarted","Data":"99a6c242b2e92451488eab3c92eebe401be3f908bed1e1b3f1c5788a1a224095"} Dec 08 09:23:43 crc kubenswrapper[4776]: I1208 09:23:43.112131 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 08 09:23:43 crc kubenswrapper[4776]: I1208 09:23:43.675871 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7ccc58565d-s4sc8" Dec 08 09:23:44 crc kubenswrapper[4776]: I1208 09:23:44.109486 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"289a9d84-e76a-42e5-9524-7e9b244b8743","Type":"ContainerStarted","Data":"3b5899e329ab4a7c7cc7f85fd2efb72c65527c78d27367fa92b3e4e40c4e2acb"} Dec 08 09:23:44 crc kubenswrapper[4776]: I1208 09:23:44.133573 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.133555594 podStartE2EDuration="4.133555594s" podCreationTimestamp="2025-12-08 09:23:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:23:44.128963621 +0000 UTC m=+1500.392188643" watchObservedRunningTime="2025-12-08 09:23:44.133555594 +0000 UTC m=+1500.396780616" Dec 08 09:23:45 crc kubenswrapper[4776]: I1208 09:23:45.072894 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5fd69d7-r446k" Dec 08 09:23:45 crc kubenswrapper[4776]: I1208 09:23:45.160427 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-775f79cd-lq4qd"] Dec 08 09:23:45 crc kubenswrapper[4776]: I1208 09:23:45.160743 4776 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/neutron-775f79cd-lq4qd" podUID="84196301-9fa2-4acb-9a49-d87fdb571dfe" containerName="neutron-api" containerID="cri-o://b6e086130db2fd5957987439f44ccd67cfe0b9761b9f5b4c3f382d70be798431" gracePeriod=30 Dec 08 09:23:45 crc kubenswrapper[4776]: I1208 09:23:45.166151 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-775f79cd-lq4qd" podUID="84196301-9fa2-4acb-9a49-d87fdb571dfe" containerName="neutron-httpd" containerID="cri-o://daea7072fefcc23dac165f473d3f4cb1359947754ada16949fc11a82d8c6acac" gracePeriod=30 Dec 08 09:23:45 crc kubenswrapper[4776]: I1208 09:23:45.206910 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e","Type":"ContainerStarted","Data":"dab52e6e5c2b5cc7b58c79164315a06f1a8dc3f148813955e44361b3d0e9b8c7"} Dec 08 09:23:45 crc kubenswrapper[4776]: I1208 09:23:45.206949 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e","Type":"ContainerStarted","Data":"32afd0bfc8d7e6053ea28bb4cc93d2025748cb9ca639ba221f71097cc567be80"} Dec 08 09:23:45 crc kubenswrapper[4776]: I1208 09:23:45.334512 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5d7dd8bd8b-9z2p2" Dec 08 09:23:45 crc kubenswrapper[4776]: I1208 09:23:45.512355 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-lpj75" Dec 08 09:23:45 crc kubenswrapper[4776]: I1208 09:23:45.521497 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 08 09:23:45 crc kubenswrapper[4776]: I1208 09:23:45.627369 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fb745b69-9lrkl"] Dec 08 09:23:45 crc kubenswrapper[4776]: I1208 09:23:45.627807 4776 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/dnsmasq-dns-fb745b69-9lrkl" podUID="001c9704-cf27-4a31-8a61-3e5ce2e272eb" containerName="dnsmasq-dns" containerID="cri-o://6ac391ca9825822a2c218f1f30e3333d2154ab78fce7cfd6da43a7327afe0a0d" gracePeriod=10 Dec 08 09:23:46 crc kubenswrapper[4776]: I1208 09:23:46.233902 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e","Type":"ContainerStarted","Data":"d7ba3e07bcf3270a0c6f2a6ee1806badf11ce50e5d636109f9f5dd2af97378ea"} Dec 08 09:23:46 crc kubenswrapper[4776]: I1208 09:23:46.235483 4776 generic.go:334] "Generic (PLEG): container finished" podID="84196301-9fa2-4acb-9a49-d87fdb571dfe" containerID="daea7072fefcc23dac165f473d3f4cb1359947754ada16949fc11a82d8c6acac" exitCode=0 Dec 08 09:23:46 crc kubenswrapper[4776]: I1208 09:23:46.235537 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-775f79cd-lq4qd" event={"ID":"84196301-9fa2-4acb-9a49-d87fdb571dfe","Type":"ContainerDied","Data":"daea7072fefcc23dac165f473d3f4cb1359947754ada16949fc11a82d8c6acac"} Dec 08 09:23:46 crc kubenswrapper[4776]: I1208 09:23:46.237628 4776 generic.go:334] "Generic (PLEG): container finished" podID="001c9704-cf27-4a31-8a61-3e5ce2e272eb" containerID="6ac391ca9825822a2c218f1f30e3333d2154ab78fce7cfd6da43a7327afe0a0d" exitCode=0 Dec 08 09:23:46 crc kubenswrapper[4776]: I1208 09:23:46.237706 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fb745b69-9lrkl" event={"ID":"001c9704-cf27-4a31-8a61-3e5ce2e272eb","Type":"ContainerDied","Data":"6ac391ca9825822a2c218f1f30e3333d2154ab78fce7cfd6da43a7327afe0a0d"} Dec 08 09:23:46 crc kubenswrapper[4776]: I1208 09:23:46.237735 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fb745b69-9lrkl" event={"ID":"001c9704-cf27-4a31-8a61-3e5ce2e272eb","Type":"ContainerDied","Data":"81720cb0fb50ed1af9365af7b03959b40b1482af143ff9ec6760350a32cde2cd"} Dec 08 
09:23:46 crc kubenswrapper[4776]: I1208 09:23:46.237747 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81720cb0fb50ed1af9365af7b03959b40b1482af143ff9ec6760350a32cde2cd" Dec 08 09:23:46 crc kubenswrapper[4776]: I1208 09:23:46.280723 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fb745b69-9lrkl" Dec 08 09:23:46 crc kubenswrapper[4776]: I1208 09:23:46.331129 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/001c9704-cf27-4a31-8a61-3e5ce2e272eb-ovsdbserver-sb\") pod \"001c9704-cf27-4a31-8a61-3e5ce2e272eb\" (UID: \"001c9704-cf27-4a31-8a61-3e5ce2e272eb\") " Dec 08 09:23:46 crc kubenswrapper[4776]: I1208 09:23:46.331193 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v898k\" (UniqueName: \"kubernetes.io/projected/001c9704-cf27-4a31-8a61-3e5ce2e272eb-kube-api-access-v898k\") pod \"001c9704-cf27-4a31-8a61-3e5ce2e272eb\" (UID: \"001c9704-cf27-4a31-8a61-3e5ce2e272eb\") " Dec 08 09:23:46 crc kubenswrapper[4776]: I1208 09:23:46.331221 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/001c9704-cf27-4a31-8a61-3e5ce2e272eb-dns-svc\") pod \"001c9704-cf27-4a31-8a61-3e5ce2e272eb\" (UID: \"001c9704-cf27-4a31-8a61-3e5ce2e272eb\") " Dec 08 09:23:46 crc kubenswrapper[4776]: I1208 09:23:46.331319 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/001c9704-cf27-4a31-8a61-3e5ce2e272eb-config\") pod \"001c9704-cf27-4a31-8a61-3e5ce2e272eb\" (UID: \"001c9704-cf27-4a31-8a61-3e5ce2e272eb\") " Dec 08 09:23:46 crc kubenswrapper[4776]: I1208 09:23:46.331485 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/001c9704-cf27-4a31-8a61-3e5ce2e272eb-ovsdbserver-nb\") pod \"001c9704-cf27-4a31-8a61-3e5ce2e272eb\" (UID: \"001c9704-cf27-4a31-8a61-3e5ce2e272eb\") " Dec 08 09:23:46 crc kubenswrapper[4776]: I1208 09:23:46.339677 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/001c9704-cf27-4a31-8a61-3e5ce2e272eb-kube-api-access-v898k" (OuterVolumeSpecName: "kube-api-access-v898k") pod "001c9704-cf27-4a31-8a61-3e5ce2e272eb" (UID: "001c9704-cf27-4a31-8a61-3e5ce2e272eb"). InnerVolumeSpecName "kube-api-access-v898k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:23:46 crc kubenswrapper[4776]: I1208 09:23:46.408392 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/001c9704-cf27-4a31-8a61-3e5ce2e272eb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "001c9704-cf27-4a31-8a61-3e5ce2e272eb" (UID: "001c9704-cf27-4a31-8a61-3e5ce2e272eb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:23:46 crc kubenswrapper[4776]: I1208 09:23:46.434469 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/001c9704-cf27-4a31-8a61-3e5ce2e272eb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:46 crc kubenswrapper[4776]: I1208 09:23:46.434498 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v898k\" (UniqueName: \"kubernetes.io/projected/001c9704-cf27-4a31-8a61-3e5ce2e272eb-kube-api-access-v898k\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:46 crc kubenswrapper[4776]: I1208 09:23:46.435483 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/001c9704-cf27-4a31-8a61-3e5ce2e272eb-config" (OuterVolumeSpecName: "config") pod "001c9704-cf27-4a31-8a61-3e5ce2e272eb" (UID: "001c9704-cf27-4a31-8a61-3e5ce2e272eb"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:23:46 crc kubenswrapper[4776]: I1208 09:23:46.448730 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/001c9704-cf27-4a31-8a61-3e5ce2e272eb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "001c9704-cf27-4a31-8a61-3e5ce2e272eb" (UID: "001c9704-cf27-4a31-8a61-3e5ce2e272eb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:23:46 crc kubenswrapper[4776]: I1208 09:23:46.473146 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/001c9704-cf27-4a31-8a61-3e5ce2e272eb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "001c9704-cf27-4a31-8a61-3e5ce2e272eb" (UID: "001c9704-cf27-4a31-8a61-3e5ce2e272eb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:23:46 crc kubenswrapper[4776]: I1208 09:23:46.536121 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/001c9704-cf27-4a31-8a61-3e5ce2e272eb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:46 crc kubenswrapper[4776]: I1208 09:23:46.536152 4776 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/001c9704-cf27-4a31-8a61-3e5ce2e272eb-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:46 crc kubenswrapper[4776]: I1208 09:23:46.536162 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/001c9704-cf27-4a31-8a61-3e5ce2e272eb-config\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:47 crc kubenswrapper[4776]: I1208 09:23:47.258138 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fb745b69-9lrkl" Dec 08 09:23:47 crc kubenswrapper[4776]: I1208 09:23:47.258877 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e","Type":"ContainerStarted","Data":"1a3f1a063a0f591306ead18f7f9eb4f72ba127ba9a8eab4f177ba65a881cf158"} Dec 08 09:23:47 crc kubenswrapper[4776]: I1208 09:23:47.259966 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 08 09:23:47 crc kubenswrapper[4776]: I1208 09:23:47.296244 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.430421658 podStartE2EDuration="5.296226322s" podCreationTimestamp="2025-12-08 09:23:42 +0000 UTC" firstStartedPulling="2025-12-08 09:23:43.078333831 +0000 UTC m=+1499.341558863" lastFinishedPulling="2025-12-08 09:23:46.944138505 +0000 UTC m=+1503.207363527" observedRunningTime="2025-12-08 09:23:47.287293932 +0000 UTC m=+1503.550518954" watchObservedRunningTime="2025-12-08 09:23:47.296226322 +0000 UTC m=+1503.559451344" Dec 08 09:23:47 crc kubenswrapper[4776]: I1208 09:23:47.313450 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5d7dd8bd8b-9z2p2" Dec 08 09:23:47 crc kubenswrapper[4776]: I1208 09:23:47.314237 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fb745b69-9lrkl"] Dec 08 09:23:47 crc kubenswrapper[4776]: I1208 09:23:47.323943 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-fb745b69-9lrkl"] Dec 08 09:23:47 crc kubenswrapper[4776]: I1208 09:23:47.362772 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7ccc58565d-s4sc8"] Dec 08 09:23:47 crc kubenswrapper[4776]: I1208 09:23:47.363301 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7ccc58565d-s4sc8" 
podUID="f5472c33-1c77-4bde-a438-924aa6a53a78" containerName="barbican-api-log" containerID="cri-o://7e118c7e59dda3b43e33829c1d67a5fa9cf831f05c4577a876715fe7efaf21b0" gracePeriod=30 Dec 08 09:23:47 crc kubenswrapper[4776]: I1208 09:23:47.364189 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7ccc58565d-s4sc8" podUID="f5472c33-1c77-4bde-a438-924aa6a53a78" containerName="barbican-api" containerID="cri-o://adca6abc9e27046c0e2a0229c71f12d80d1562e6c9d2cc08662fd7a3b828e352" gracePeriod=30 Dec 08 09:23:48 crc kubenswrapper[4776]: I1208 09:23:48.269099 4776 generic.go:334] "Generic (PLEG): container finished" podID="f5472c33-1c77-4bde-a438-924aa6a53a78" containerID="7e118c7e59dda3b43e33829c1d67a5fa9cf831f05c4577a876715fe7efaf21b0" exitCode=143 Dec 08 09:23:48 crc kubenswrapper[4776]: I1208 09:23:48.270390 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7ccc58565d-s4sc8" event={"ID":"f5472c33-1c77-4bde-a438-924aa6a53a78","Type":"ContainerDied","Data":"7e118c7e59dda3b43e33829c1d67a5fa9cf831f05c4577a876715fe7efaf21b0"} Dec 08 09:23:48 crc kubenswrapper[4776]: I1208 09:23:48.354969 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="001c9704-cf27-4a31-8a61-3e5ce2e272eb" path="/var/lib/kubelet/pods/001c9704-cf27-4a31-8a61-3e5ce2e272eb/volumes" Dec 08 09:23:49 crc kubenswrapper[4776]: I1208 09:23:49.546211 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-77496dd4f7-8gxmg" Dec 08 09:23:50 crc kubenswrapper[4776]: I1208 09:23:50.561812 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7ccc58565d-s4sc8" podUID="f5472c33-1c77-4bde-a438-924aa6a53a78" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.199:9311/healthcheck\": read tcp 10.217.0.2:50556->10.217.0.199:9311: read: connection reset by peer" Dec 08 09:23:50 crc kubenswrapper[4776]: I1208 
09:23:50.561833 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7ccc58565d-s4sc8" podUID="f5472c33-1c77-4bde-a438-924aa6a53a78" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.199:9311/healthcheck\": read tcp 10.217.0.2:50552->10.217.0.199:9311: read: connection reset by peer" Dec 08 09:23:50 crc kubenswrapper[4776]: I1208 09:23:50.804357 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 08 09:23:51 crc kubenswrapper[4776]: I1208 09:23:51.139934 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7ccc58565d-s4sc8" Dec 08 09:23:51 crc kubenswrapper[4776]: I1208 09:23:51.238594 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8k4bq\" (UniqueName: \"kubernetes.io/projected/f5472c33-1c77-4bde-a438-924aa6a53a78-kube-api-access-8k4bq\") pod \"f5472c33-1c77-4bde-a438-924aa6a53a78\" (UID: \"f5472c33-1c77-4bde-a438-924aa6a53a78\") " Dec 08 09:23:51 crc kubenswrapper[4776]: I1208 09:23:51.238782 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5472c33-1c77-4bde-a438-924aa6a53a78-logs\") pod \"f5472c33-1c77-4bde-a438-924aa6a53a78\" (UID: \"f5472c33-1c77-4bde-a438-924aa6a53a78\") " Dec 08 09:23:51 crc kubenswrapper[4776]: I1208 09:23:51.238877 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5472c33-1c77-4bde-a438-924aa6a53a78-config-data\") pod \"f5472c33-1c77-4bde-a438-924aa6a53a78\" (UID: \"f5472c33-1c77-4bde-a438-924aa6a53a78\") " Dec 08 09:23:51 crc kubenswrapper[4776]: I1208 09:23:51.239047 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f5472c33-1c77-4bde-a438-924aa6a53a78-combined-ca-bundle\") pod \"f5472c33-1c77-4bde-a438-924aa6a53a78\" (UID: \"f5472c33-1c77-4bde-a438-924aa6a53a78\") " Dec 08 09:23:51 crc kubenswrapper[4776]: I1208 09:23:51.239071 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f5472c33-1c77-4bde-a438-924aa6a53a78-config-data-custom\") pod \"f5472c33-1c77-4bde-a438-924aa6a53a78\" (UID: \"f5472c33-1c77-4bde-a438-924aa6a53a78\") " Dec 08 09:23:51 crc kubenswrapper[4776]: I1208 09:23:51.239473 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5472c33-1c77-4bde-a438-924aa6a53a78-logs" (OuterVolumeSpecName: "logs") pod "f5472c33-1c77-4bde-a438-924aa6a53a78" (UID: "f5472c33-1c77-4bde-a438-924aa6a53a78"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:23:51 crc kubenswrapper[4776]: I1208 09:23:51.239710 4776 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5472c33-1c77-4bde-a438-924aa6a53a78-logs\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:51 crc kubenswrapper[4776]: I1208 09:23:51.245284 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5472c33-1c77-4bde-a438-924aa6a53a78-kube-api-access-8k4bq" (OuterVolumeSpecName: "kube-api-access-8k4bq") pod "f5472c33-1c77-4bde-a438-924aa6a53a78" (UID: "f5472c33-1c77-4bde-a438-924aa6a53a78"). InnerVolumeSpecName "kube-api-access-8k4bq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:23:51 crc kubenswrapper[4776]: I1208 09:23:51.246139 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5472c33-1c77-4bde-a438-924aa6a53a78-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f5472c33-1c77-4bde-a438-924aa6a53a78" (UID: "f5472c33-1c77-4bde-a438-924aa6a53a78"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:23:51 crc kubenswrapper[4776]: I1208 09:23:51.276261 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5472c33-1c77-4bde-a438-924aa6a53a78-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f5472c33-1c77-4bde-a438-924aa6a53a78" (UID: "f5472c33-1c77-4bde-a438-924aa6a53a78"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:23:51 crc kubenswrapper[4776]: I1208 09:23:51.304074 4776 generic.go:334] "Generic (PLEG): container finished" podID="f5472c33-1c77-4bde-a438-924aa6a53a78" containerID="adca6abc9e27046c0e2a0229c71f12d80d1562e6c9d2cc08662fd7a3b828e352" exitCode=0 Dec 08 09:23:51 crc kubenswrapper[4776]: I1208 09:23:51.304118 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7ccc58565d-s4sc8" event={"ID":"f5472c33-1c77-4bde-a438-924aa6a53a78","Type":"ContainerDied","Data":"adca6abc9e27046c0e2a0229c71f12d80d1562e6c9d2cc08662fd7a3b828e352"} Dec 08 09:23:51 crc kubenswrapper[4776]: I1208 09:23:51.304144 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7ccc58565d-s4sc8" event={"ID":"f5472c33-1c77-4bde-a438-924aa6a53a78","Type":"ContainerDied","Data":"fb41ff30a5752bdcb5a27fce5ea4df31b70151ed9494ed8e575dc6eb6ed93925"} Dec 08 09:23:51 crc kubenswrapper[4776]: I1208 09:23:51.304165 4776 scope.go:117] "RemoveContainer" 
containerID="adca6abc9e27046c0e2a0229c71f12d80d1562e6c9d2cc08662fd7a3b828e352" Dec 08 09:23:51 crc kubenswrapper[4776]: I1208 09:23:51.304333 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7ccc58565d-s4sc8" Dec 08 09:23:51 crc kubenswrapper[4776]: I1208 09:23:51.312296 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5472c33-1c77-4bde-a438-924aa6a53a78-config-data" (OuterVolumeSpecName: "config-data") pod "f5472c33-1c77-4bde-a438-924aa6a53a78" (UID: "f5472c33-1c77-4bde-a438-924aa6a53a78"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:23:51 crc kubenswrapper[4776]: I1208 09:23:51.341979 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5472c33-1c77-4bde-a438-924aa6a53a78-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:51 crc kubenswrapper[4776]: I1208 09:23:51.342012 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5472c33-1c77-4bde-a438-924aa6a53a78-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:51 crc kubenswrapper[4776]: I1208 09:23:51.342023 4776 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f5472c33-1c77-4bde-a438-924aa6a53a78-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:51 crc kubenswrapper[4776]: I1208 09:23:51.342033 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8k4bq\" (UniqueName: \"kubernetes.io/projected/f5472c33-1c77-4bde-a438-924aa6a53a78-kube-api-access-8k4bq\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:51 crc kubenswrapper[4776]: I1208 09:23:51.390641 4776 scope.go:117] "RemoveContainer" containerID="7e118c7e59dda3b43e33829c1d67a5fa9cf831f05c4577a876715fe7efaf21b0" Dec 08 09:23:51 crc 
kubenswrapper[4776]: I1208 09:23:51.414999 4776 scope.go:117] "RemoveContainer" containerID="adca6abc9e27046c0e2a0229c71f12d80d1562e6c9d2cc08662fd7a3b828e352" Dec 08 09:23:51 crc kubenswrapper[4776]: E1208 09:23:51.415605 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adca6abc9e27046c0e2a0229c71f12d80d1562e6c9d2cc08662fd7a3b828e352\": container with ID starting with adca6abc9e27046c0e2a0229c71f12d80d1562e6c9d2cc08662fd7a3b828e352 not found: ID does not exist" containerID="adca6abc9e27046c0e2a0229c71f12d80d1562e6c9d2cc08662fd7a3b828e352" Dec 08 09:23:51 crc kubenswrapper[4776]: I1208 09:23:51.415669 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adca6abc9e27046c0e2a0229c71f12d80d1562e6c9d2cc08662fd7a3b828e352"} err="failed to get container status \"adca6abc9e27046c0e2a0229c71f12d80d1562e6c9d2cc08662fd7a3b828e352\": rpc error: code = NotFound desc = could not find container \"adca6abc9e27046c0e2a0229c71f12d80d1562e6c9d2cc08662fd7a3b828e352\": container with ID starting with adca6abc9e27046c0e2a0229c71f12d80d1562e6c9d2cc08662fd7a3b828e352 not found: ID does not exist" Dec 08 09:23:51 crc kubenswrapper[4776]: I1208 09:23:51.415692 4776 scope.go:117] "RemoveContainer" containerID="7e118c7e59dda3b43e33829c1d67a5fa9cf831f05c4577a876715fe7efaf21b0" Dec 08 09:23:51 crc kubenswrapper[4776]: E1208 09:23:51.416075 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e118c7e59dda3b43e33829c1d67a5fa9cf831f05c4577a876715fe7efaf21b0\": container with ID starting with 7e118c7e59dda3b43e33829c1d67a5fa9cf831f05c4577a876715fe7efaf21b0 not found: ID does not exist" containerID="7e118c7e59dda3b43e33829c1d67a5fa9cf831f05c4577a876715fe7efaf21b0" Dec 08 09:23:51 crc kubenswrapper[4776]: I1208 09:23:51.416095 4776 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7e118c7e59dda3b43e33829c1d67a5fa9cf831f05c4577a876715fe7efaf21b0"} err="failed to get container status \"7e118c7e59dda3b43e33829c1d67a5fa9cf831f05c4577a876715fe7efaf21b0\": rpc error: code = NotFound desc = could not find container \"7e118c7e59dda3b43e33829c1d67a5fa9cf831f05c4577a876715fe7efaf21b0\": container with ID starting with 7e118c7e59dda3b43e33829c1d67a5fa9cf831f05c4577a876715fe7efaf21b0 not found: ID does not exist" Dec 08 09:23:51 crc kubenswrapper[4776]: I1208 09:23:51.719549 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7ccc58565d-s4sc8"] Dec 08 09:23:51 crc kubenswrapper[4776]: I1208 09:23:51.728985 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7ccc58565d-s4sc8"] Dec 08 09:23:52 crc kubenswrapper[4776]: I1208 09:23:52.355980 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5472c33-1c77-4bde-a438-924aa6a53a78" path="/var/lib/kubelet/pods/f5472c33-1c77-4bde-a438-924aa6a53a78/volumes" Dec 08 09:23:53 crc kubenswrapper[4776]: I1208 09:23:53.404596 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-5cfbc4c5f-hhnf9"] Dec 08 09:23:53 crc kubenswrapper[4776]: E1208 09:23:53.405578 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5472c33-1c77-4bde-a438-924aa6a53a78" containerName="barbican-api" Dec 08 09:23:53 crc kubenswrapper[4776]: I1208 09:23:53.405592 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5472c33-1c77-4bde-a438-924aa6a53a78" containerName="barbican-api" Dec 08 09:23:53 crc kubenswrapper[4776]: E1208 09:23:53.405659 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="001c9704-cf27-4a31-8a61-3e5ce2e272eb" containerName="init" Dec 08 09:23:53 crc kubenswrapper[4776]: I1208 09:23:53.405668 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="001c9704-cf27-4a31-8a61-3e5ce2e272eb" containerName="init" Dec 08 09:23:53 crc 
kubenswrapper[4776]: E1208 09:23:53.405682 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="001c9704-cf27-4a31-8a61-3e5ce2e272eb" containerName="dnsmasq-dns" Dec 08 09:23:53 crc kubenswrapper[4776]: I1208 09:23:53.405689 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="001c9704-cf27-4a31-8a61-3e5ce2e272eb" containerName="dnsmasq-dns" Dec 08 09:23:53 crc kubenswrapper[4776]: E1208 09:23:53.405699 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5472c33-1c77-4bde-a438-924aa6a53a78" containerName="barbican-api-log" Dec 08 09:23:53 crc kubenswrapper[4776]: I1208 09:23:53.405705 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5472c33-1c77-4bde-a438-924aa6a53a78" containerName="barbican-api-log" Dec 08 09:23:53 crc kubenswrapper[4776]: I1208 09:23:53.405997 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5472c33-1c77-4bde-a438-924aa6a53a78" containerName="barbican-api-log" Dec 08 09:23:53 crc kubenswrapper[4776]: I1208 09:23:53.406016 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="001c9704-cf27-4a31-8a61-3e5ce2e272eb" containerName="dnsmasq-dns" Dec 08 09:23:53 crc kubenswrapper[4776]: I1208 09:23:53.406061 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5472c33-1c77-4bde-a438-924aa6a53a78" containerName="barbican-api" Dec 08 09:23:53 crc kubenswrapper[4776]: I1208 09:23:53.407492 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-5cfbc4c5f-hhnf9" Dec 08 09:23:53 crc kubenswrapper[4776]: I1208 09:23:53.411247 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 08 09:23:53 crc kubenswrapper[4776]: I1208 09:23:53.411612 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 08 09:23:53 crc kubenswrapper[4776]: I1208 09:23:53.411763 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 08 09:23:53 crc kubenswrapper[4776]: I1208 09:23:53.426433 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5cfbc4c5f-hhnf9"] Dec 08 09:23:53 crc kubenswrapper[4776]: I1208 09:23:53.488372 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa5389fb-4ae8-45b1-baaf-18f2fea3f61c-internal-tls-certs\") pod \"swift-proxy-5cfbc4c5f-hhnf9\" (UID: \"aa5389fb-4ae8-45b1-baaf-18f2fea3f61c\") " pod="openstack/swift-proxy-5cfbc4c5f-hhnf9" Dec 08 09:23:53 crc kubenswrapper[4776]: I1208 09:23:53.488444 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa5389fb-4ae8-45b1-baaf-18f2fea3f61c-combined-ca-bundle\") pod \"swift-proxy-5cfbc4c5f-hhnf9\" (UID: \"aa5389fb-4ae8-45b1-baaf-18f2fea3f61c\") " pod="openstack/swift-proxy-5cfbc4c5f-hhnf9" Dec 08 09:23:53 crc kubenswrapper[4776]: I1208 09:23:53.488538 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa5389fb-4ae8-45b1-baaf-18f2fea3f61c-run-httpd\") pod \"swift-proxy-5cfbc4c5f-hhnf9\" (UID: \"aa5389fb-4ae8-45b1-baaf-18f2fea3f61c\") " pod="openstack/swift-proxy-5cfbc4c5f-hhnf9" Dec 08 09:23:53 crc kubenswrapper[4776]: I1208 
09:23:53.488604 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa5389fb-4ae8-45b1-baaf-18f2fea3f61c-public-tls-certs\") pod \"swift-proxy-5cfbc4c5f-hhnf9\" (UID: \"aa5389fb-4ae8-45b1-baaf-18f2fea3f61c\") " pod="openstack/swift-proxy-5cfbc4c5f-hhnf9" Dec 08 09:23:53 crc kubenswrapper[4776]: I1208 09:23:53.488672 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl95r\" (UniqueName: \"kubernetes.io/projected/aa5389fb-4ae8-45b1-baaf-18f2fea3f61c-kube-api-access-fl95r\") pod \"swift-proxy-5cfbc4c5f-hhnf9\" (UID: \"aa5389fb-4ae8-45b1-baaf-18f2fea3f61c\") " pod="openstack/swift-proxy-5cfbc4c5f-hhnf9" Dec 08 09:23:53 crc kubenswrapper[4776]: I1208 09:23:53.488842 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa5389fb-4ae8-45b1-baaf-18f2fea3f61c-config-data\") pod \"swift-proxy-5cfbc4c5f-hhnf9\" (UID: \"aa5389fb-4ae8-45b1-baaf-18f2fea3f61c\") " pod="openstack/swift-proxy-5cfbc4c5f-hhnf9" Dec 08 09:23:53 crc kubenswrapper[4776]: I1208 09:23:53.488886 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/aa5389fb-4ae8-45b1-baaf-18f2fea3f61c-etc-swift\") pod \"swift-proxy-5cfbc4c5f-hhnf9\" (UID: \"aa5389fb-4ae8-45b1-baaf-18f2fea3f61c\") " pod="openstack/swift-proxy-5cfbc4c5f-hhnf9" Dec 08 09:23:53 crc kubenswrapper[4776]: I1208 09:23:53.489029 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa5389fb-4ae8-45b1-baaf-18f2fea3f61c-log-httpd\") pod \"swift-proxy-5cfbc4c5f-hhnf9\" (UID: \"aa5389fb-4ae8-45b1-baaf-18f2fea3f61c\") " pod="openstack/swift-proxy-5cfbc4c5f-hhnf9" Dec 08 09:23:53 crc 
kubenswrapper[4776]: I1208 09:23:53.590335 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa5389fb-4ae8-45b1-baaf-18f2fea3f61c-run-httpd\") pod \"swift-proxy-5cfbc4c5f-hhnf9\" (UID: \"aa5389fb-4ae8-45b1-baaf-18f2fea3f61c\") " pod="openstack/swift-proxy-5cfbc4c5f-hhnf9" Dec 08 09:23:53 crc kubenswrapper[4776]: I1208 09:23:53.590400 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa5389fb-4ae8-45b1-baaf-18f2fea3f61c-public-tls-certs\") pod \"swift-proxy-5cfbc4c5f-hhnf9\" (UID: \"aa5389fb-4ae8-45b1-baaf-18f2fea3f61c\") " pod="openstack/swift-proxy-5cfbc4c5f-hhnf9" Dec 08 09:23:53 crc kubenswrapper[4776]: I1208 09:23:53.590450 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl95r\" (UniqueName: \"kubernetes.io/projected/aa5389fb-4ae8-45b1-baaf-18f2fea3f61c-kube-api-access-fl95r\") pod \"swift-proxy-5cfbc4c5f-hhnf9\" (UID: \"aa5389fb-4ae8-45b1-baaf-18f2fea3f61c\") " pod="openstack/swift-proxy-5cfbc4c5f-hhnf9" Dec 08 09:23:53 crc kubenswrapper[4776]: I1208 09:23:53.590473 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa5389fb-4ae8-45b1-baaf-18f2fea3f61c-config-data\") pod \"swift-proxy-5cfbc4c5f-hhnf9\" (UID: \"aa5389fb-4ae8-45b1-baaf-18f2fea3f61c\") " pod="openstack/swift-proxy-5cfbc4c5f-hhnf9" Dec 08 09:23:53 crc kubenswrapper[4776]: I1208 09:23:53.590487 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/aa5389fb-4ae8-45b1-baaf-18f2fea3f61c-etc-swift\") pod \"swift-proxy-5cfbc4c5f-hhnf9\" (UID: \"aa5389fb-4ae8-45b1-baaf-18f2fea3f61c\") " pod="openstack/swift-proxy-5cfbc4c5f-hhnf9" Dec 08 09:23:53 crc kubenswrapper[4776]: I1208 09:23:53.590534 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa5389fb-4ae8-45b1-baaf-18f2fea3f61c-log-httpd\") pod \"swift-proxy-5cfbc4c5f-hhnf9\" (UID: \"aa5389fb-4ae8-45b1-baaf-18f2fea3f61c\") " pod="openstack/swift-proxy-5cfbc4c5f-hhnf9" Dec 08 09:23:53 crc kubenswrapper[4776]: I1208 09:23:53.590601 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa5389fb-4ae8-45b1-baaf-18f2fea3f61c-internal-tls-certs\") pod \"swift-proxy-5cfbc4c5f-hhnf9\" (UID: \"aa5389fb-4ae8-45b1-baaf-18f2fea3f61c\") " pod="openstack/swift-proxy-5cfbc4c5f-hhnf9" Dec 08 09:23:53 crc kubenswrapper[4776]: I1208 09:23:53.590629 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa5389fb-4ae8-45b1-baaf-18f2fea3f61c-combined-ca-bundle\") pod \"swift-proxy-5cfbc4c5f-hhnf9\" (UID: \"aa5389fb-4ae8-45b1-baaf-18f2fea3f61c\") " pod="openstack/swift-proxy-5cfbc4c5f-hhnf9" Dec 08 09:23:53 crc kubenswrapper[4776]: I1208 09:23:53.590969 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa5389fb-4ae8-45b1-baaf-18f2fea3f61c-run-httpd\") pod \"swift-proxy-5cfbc4c5f-hhnf9\" (UID: \"aa5389fb-4ae8-45b1-baaf-18f2fea3f61c\") " pod="openstack/swift-proxy-5cfbc4c5f-hhnf9" Dec 08 09:23:53 crc kubenswrapper[4776]: I1208 09:23:53.591408 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa5389fb-4ae8-45b1-baaf-18f2fea3f61c-log-httpd\") pod \"swift-proxy-5cfbc4c5f-hhnf9\" (UID: \"aa5389fb-4ae8-45b1-baaf-18f2fea3f61c\") " pod="openstack/swift-proxy-5cfbc4c5f-hhnf9" Dec 08 09:23:53 crc kubenswrapper[4776]: I1208 09:23:53.606469 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/aa5389fb-4ae8-45b1-baaf-18f2fea3f61c-config-data\") pod \"swift-proxy-5cfbc4c5f-hhnf9\" (UID: \"aa5389fb-4ae8-45b1-baaf-18f2fea3f61c\") " pod="openstack/swift-proxy-5cfbc4c5f-hhnf9" Dec 08 09:23:53 crc kubenswrapper[4776]: I1208 09:23:53.609194 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa5389fb-4ae8-45b1-baaf-18f2fea3f61c-public-tls-certs\") pod \"swift-proxy-5cfbc4c5f-hhnf9\" (UID: \"aa5389fb-4ae8-45b1-baaf-18f2fea3f61c\") " pod="openstack/swift-proxy-5cfbc4c5f-hhnf9" Dec 08 09:23:53 crc kubenswrapper[4776]: I1208 09:23:53.610162 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/aa5389fb-4ae8-45b1-baaf-18f2fea3f61c-etc-swift\") pod \"swift-proxy-5cfbc4c5f-hhnf9\" (UID: \"aa5389fb-4ae8-45b1-baaf-18f2fea3f61c\") " pod="openstack/swift-proxy-5cfbc4c5f-hhnf9" Dec 08 09:23:53 crc kubenswrapper[4776]: I1208 09:23:53.611721 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl95r\" (UniqueName: \"kubernetes.io/projected/aa5389fb-4ae8-45b1-baaf-18f2fea3f61c-kube-api-access-fl95r\") pod \"swift-proxy-5cfbc4c5f-hhnf9\" (UID: \"aa5389fb-4ae8-45b1-baaf-18f2fea3f61c\") " pod="openstack/swift-proxy-5cfbc4c5f-hhnf9" Dec 08 09:23:53 crc kubenswrapper[4776]: I1208 09:23:53.616311 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa5389fb-4ae8-45b1-baaf-18f2fea3f61c-combined-ca-bundle\") pod \"swift-proxy-5cfbc4c5f-hhnf9\" (UID: \"aa5389fb-4ae8-45b1-baaf-18f2fea3f61c\") " pod="openstack/swift-proxy-5cfbc4c5f-hhnf9" Dec 08 09:23:53 crc kubenswrapper[4776]: I1208 09:23:53.618471 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa5389fb-4ae8-45b1-baaf-18f2fea3f61c-internal-tls-certs\") pod 
\"swift-proxy-5cfbc4c5f-hhnf9\" (UID: \"aa5389fb-4ae8-45b1-baaf-18f2fea3f61c\") " pod="openstack/swift-proxy-5cfbc4c5f-hhnf9" Dec 08 09:23:53 crc kubenswrapper[4776]: I1208 09:23:53.735655 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5cfbc4c5f-hhnf9" Dec 08 09:23:54 crc kubenswrapper[4776]: I1208 09:23:54.288411 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5cfbc4c5f-hhnf9"] Dec 08 09:23:54 crc kubenswrapper[4776]: I1208 09:23:54.366778 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 08 09:23:54 crc kubenswrapper[4776]: I1208 09:23:54.370515 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5cfbc4c5f-hhnf9" event={"ID":"aa5389fb-4ae8-45b1-baaf-18f2fea3f61c","Type":"ContainerStarted","Data":"9e051d21095ff38b57a8927f330d5fb8d6f24ab27982eafe44679a44599ccad8"} Dec 08 09:23:54 crc kubenswrapper[4776]: I1208 09:23:54.370869 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 08 09:23:54 crc kubenswrapper[4776]: I1208 09:23:54.375485 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 08 09:23:54 crc kubenswrapper[4776]: I1208 09:23:54.375552 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-shw5j" Dec 08 09:23:54 crc kubenswrapper[4776]: I1208 09:23:54.375635 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 08 09:23:54 crc kubenswrapper[4776]: I1208 09:23:54.388824 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 08 09:23:54 crc kubenswrapper[4776]: I1208 09:23:54.417001 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d51b4c0-93a1-4af4-95ed-a9b169ebb077-combined-ca-bundle\") pod \"openstackclient\" (UID: \"2d51b4c0-93a1-4af4-95ed-a9b169ebb077\") " pod="openstack/openstackclient" Dec 08 09:23:54 crc kubenswrapper[4776]: I1208 09:23:54.417066 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2d51b4c0-93a1-4af4-95ed-a9b169ebb077-openstack-config-secret\") pod \"openstackclient\" (UID: \"2d51b4c0-93a1-4af4-95ed-a9b169ebb077\") " pod="openstack/openstackclient" Dec 08 09:23:54 crc kubenswrapper[4776]: I1208 09:23:54.417238 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2d51b4c0-93a1-4af4-95ed-a9b169ebb077-openstack-config\") pod \"openstackclient\" (UID: \"2d51b4c0-93a1-4af4-95ed-a9b169ebb077\") " pod="openstack/openstackclient" Dec 08 09:23:54 crc kubenswrapper[4776]: I1208 09:23:54.417264 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhltm\" (UniqueName: \"kubernetes.io/projected/2d51b4c0-93a1-4af4-95ed-a9b169ebb077-kube-api-access-mhltm\") pod \"openstackclient\" (UID: \"2d51b4c0-93a1-4af4-95ed-a9b169ebb077\") " pod="openstack/openstackclient" Dec 08 09:23:54 crc kubenswrapper[4776]: I1208 09:23:54.518622 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2d51b4c0-93a1-4af4-95ed-a9b169ebb077-openstack-config-secret\") pod \"openstackclient\" (UID: \"2d51b4c0-93a1-4af4-95ed-a9b169ebb077\") " pod="openstack/openstackclient" Dec 08 09:23:54 crc kubenswrapper[4776]: I1208 09:23:54.518769 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2d51b4c0-93a1-4af4-95ed-a9b169ebb077-openstack-config\") pod \"openstackclient\" (UID: \"2d51b4c0-93a1-4af4-95ed-a9b169ebb077\") " pod="openstack/openstackclient" Dec 08 09:23:54 crc kubenswrapper[4776]: I1208 09:23:54.518806 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhltm\" (UniqueName: \"kubernetes.io/projected/2d51b4c0-93a1-4af4-95ed-a9b169ebb077-kube-api-access-mhltm\") pod \"openstackclient\" (UID: \"2d51b4c0-93a1-4af4-95ed-a9b169ebb077\") " pod="openstack/openstackclient" Dec 08 09:23:54 crc kubenswrapper[4776]: I1208 09:23:54.518871 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d51b4c0-93a1-4af4-95ed-a9b169ebb077-combined-ca-bundle\") pod \"openstackclient\" (UID: \"2d51b4c0-93a1-4af4-95ed-a9b169ebb077\") " pod="openstack/openstackclient" Dec 08 09:23:54 crc kubenswrapper[4776]: I1208 09:23:54.520102 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/2d51b4c0-93a1-4af4-95ed-a9b169ebb077-openstack-config\") pod \"openstackclient\" (UID: \"2d51b4c0-93a1-4af4-95ed-a9b169ebb077\") " pod="openstack/openstackclient" Dec 08 09:23:54 crc kubenswrapper[4776]: I1208 09:23:54.523667 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2d51b4c0-93a1-4af4-95ed-a9b169ebb077-openstack-config-secret\") pod \"openstackclient\" (UID: \"2d51b4c0-93a1-4af4-95ed-a9b169ebb077\") " pod="openstack/openstackclient" Dec 08 09:23:54 crc kubenswrapper[4776]: I1208 09:23:54.526724 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d51b4c0-93a1-4af4-95ed-a9b169ebb077-combined-ca-bundle\") pod \"openstackclient\" (UID: \"2d51b4c0-93a1-4af4-95ed-a9b169ebb077\") " pod="openstack/openstackclient" Dec 08 09:23:54 crc kubenswrapper[4776]: I1208 09:23:54.534256 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhltm\" (UniqueName: \"kubernetes.io/projected/2d51b4c0-93a1-4af4-95ed-a9b169ebb077-kube-api-access-mhltm\") pod \"openstackclient\" (UID: \"2d51b4c0-93a1-4af4-95ed-a9b169ebb077\") " pod="openstack/openstackclient" Dec 08 09:23:54 crc kubenswrapper[4776]: I1208 09:23:54.601400 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 08 09:23:54 crc kubenswrapper[4776]: I1208 09:23:54.602325 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 08 09:23:54 crc kubenswrapper[4776]: I1208 09:23:54.635222 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 08 09:23:54 crc kubenswrapper[4776]: I1208 09:23:54.653223 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 08 09:23:54 crc kubenswrapper[4776]: I1208 09:23:54.654711 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 08 09:23:54 crc kubenswrapper[4776]: I1208 09:23:54.667195 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 08 09:23:54 crc kubenswrapper[4776]: I1208 09:23:54.723056 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8606b034-7364-4dce-bea0-7c0e2067ee95-openstack-config-secret\") pod \"openstackclient\" (UID: \"8606b034-7364-4dce-bea0-7c0e2067ee95\") " pod="openstack/openstackclient" Dec 08 09:23:54 crc kubenswrapper[4776]: I1208 09:23:54.723246 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8606b034-7364-4dce-bea0-7c0e2067ee95-openstack-config\") pod \"openstackclient\" (UID: \"8606b034-7364-4dce-bea0-7c0e2067ee95\") " pod="openstack/openstackclient" Dec 08 09:23:54 crc kubenswrapper[4776]: I1208 09:23:54.723288 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8606b034-7364-4dce-bea0-7c0e2067ee95-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8606b034-7364-4dce-bea0-7c0e2067ee95\") " pod="openstack/openstackclient" Dec 08 09:23:54 crc kubenswrapper[4776]: I1208 09:23:54.723314 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-7gjk8\" (UniqueName: \"kubernetes.io/projected/8606b034-7364-4dce-bea0-7c0e2067ee95-kube-api-access-7gjk8\") pod \"openstackclient\" (UID: \"8606b034-7364-4dce-bea0-7c0e2067ee95\") " pod="openstack/openstackclient" Dec 08 09:23:54 crc kubenswrapper[4776]: E1208 09:23:54.760688 4776 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 08 09:23:54 crc kubenswrapper[4776]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_2d51b4c0-93a1-4af4-95ed-a9b169ebb077_0(5151d894601210c9a361f426d624c6a2c8d63ca52814e90cf16c152c91f678dd): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"5151d894601210c9a361f426d624c6a2c8d63ca52814e90cf16c152c91f678dd" Netns:"/var/run/netns/ddebf63d-d6f5-4d3a-abdf-cfee2ba386b6" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=5151d894601210c9a361f426d624c6a2c8d63ca52814e90cf16c152c91f678dd;K8S_POD_UID=2d51b4c0-93a1-4af4-95ed-a9b169ebb077" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/2d51b4c0-93a1-4af4-95ed-a9b169ebb077]: expected pod UID "2d51b4c0-93a1-4af4-95ed-a9b169ebb077" but got "8606b034-7364-4dce-bea0-7c0e2067ee95" from Kube API Dec 08 09:23:54 crc kubenswrapper[4776]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 08 09:23:54 crc kubenswrapper[4776]: > Dec 08 09:23:54 crc kubenswrapper[4776]: E1208 09:23:54.760755 4776 
kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 08 09:23:54 crc kubenswrapper[4776]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_2d51b4c0-93a1-4af4-95ed-a9b169ebb077_0(5151d894601210c9a361f426d624c6a2c8d63ca52814e90cf16c152c91f678dd): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"5151d894601210c9a361f426d624c6a2c8d63ca52814e90cf16c152c91f678dd" Netns:"/var/run/netns/ddebf63d-d6f5-4d3a-abdf-cfee2ba386b6" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=5151d894601210c9a361f426d624c6a2c8d63ca52814e90cf16c152c91f678dd;K8S_POD_UID=2d51b4c0-93a1-4af4-95ed-a9b169ebb077" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/2d51b4c0-93a1-4af4-95ed-a9b169ebb077]: expected pod UID "2d51b4c0-93a1-4af4-95ed-a9b169ebb077" but got "8606b034-7364-4dce-bea0-7c0e2067ee95" from Kube API Dec 08 09:23:54 crc kubenswrapper[4776]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 08 09:23:54 crc kubenswrapper[4776]: > pod="openstack/openstackclient" Dec 08 09:23:54 crc kubenswrapper[4776]: I1208 09:23:54.826764 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8606b034-7364-4dce-bea0-7c0e2067ee95-openstack-config-secret\") pod \"openstackclient\" (UID: \"8606b034-7364-4dce-bea0-7c0e2067ee95\") " 
pod="openstack/openstackclient" Dec 08 09:23:54 crc kubenswrapper[4776]: I1208 09:23:54.826932 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8606b034-7364-4dce-bea0-7c0e2067ee95-openstack-config\") pod \"openstackclient\" (UID: \"8606b034-7364-4dce-bea0-7c0e2067ee95\") " pod="openstack/openstackclient" Dec 08 09:23:54 crc kubenswrapper[4776]: I1208 09:23:54.826976 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8606b034-7364-4dce-bea0-7c0e2067ee95-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8606b034-7364-4dce-bea0-7c0e2067ee95\") " pod="openstack/openstackclient" Dec 08 09:23:54 crc kubenswrapper[4776]: I1208 09:23:54.827109 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gjk8\" (UniqueName: \"kubernetes.io/projected/8606b034-7364-4dce-bea0-7c0e2067ee95-kube-api-access-7gjk8\") pod \"openstackclient\" (UID: \"8606b034-7364-4dce-bea0-7c0e2067ee95\") " pod="openstack/openstackclient" Dec 08 09:23:54 crc kubenswrapper[4776]: I1208 09:23:54.828742 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8606b034-7364-4dce-bea0-7c0e2067ee95-openstack-config\") pod \"openstackclient\" (UID: \"8606b034-7364-4dce-bea0-7c0e2067ee95\") " pod="openstack/openstackclient" Dec 08 09:23:54 crc kubenswrapper[4776]: I1208 09:23:54.830733 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8606b034-7364-4dce-bea0-7c0e2067ee95-openstack-config-secret\") pod \"openstackclient\" (UID: \"8606b034-7364-4dce-bea0-7c0e2067ee95\") " pod="openstack/openstackclient" Dec 08 09:23:54 crc kubenswrapper[4776]: I1208 09:23:54.842524 4776 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-7gjk8\" (UniqueName: \"kubernetes.io/projected/8606b034-7364-4dce-bea0-7c0e2067ee95-kube-api-access-7gjk8\") pod \"openstackclient\" (UID: \"8606b034-7364-4dce-bea0-7c0e2067ee95\") " pod="openstack/openstackclient" Dec 08 09:23:54 crc kubenswrapper[4776]: I1208 09:23:54.843085 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8606b034-7364-4dce-bea0-7c0e2067ee95-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8606b034-7364-4dce-bea0-7c0e2067ee95\") " pod="openstack/openstackclient" Dec 08 09:23:54 crc kubenswrapper[4776]: I1208 09:23:54.866053 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 08 09:23:55 crc kubenswrapper[4776]: I1208 09:23:55.344712 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 08 09:23:55 crc kubenswrapper[4776]: I1208 09:23:55.367106 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 08 09:23:55 crc kubenswrapper[4776]: I1208 09:23:55.367451 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e" containerName="ceilometer-central-agent" containerID="cri-o://32afd0bfc8d7e6053ea28bb4cc93d2025748cb9ca639ba221f71097cc567be80" gracePeriod=30 Dec 08 09:23:55 crc kubenswrapper[4776]: I1208 09:23:55.367782 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e" containerName="sg-core" containerID="cri-o://d7ba3e07bcf3270a0c6f2a6ee1806badf11ce50e5d636109f9f5dd2af97378ea" gracePeriod=30 Dec 08 09:23:55 crc kubenswrapper[4776]: I1208 09:23:55.367825 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e" 
containerName="ceilometer-notification-agent" containerID="cri-o://dab52e6e5c2b5cc7b58c79164315a06f1a8dc3f148813955e44361b3d0e9b8c7" gracePeriod=30 Dec 08 09:23:55 crc kubenswrapper[4776]: I1208 09:23:55.367830 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e" containerName="proxy-httpd" containerID="cri-o://1a3f1a063a0f591306ead18f7f9eb4f72ba127ba9a8eab4f177ba65a881cf158" gracePeriod=30 Dec 08 09:23:55 crc kubenswrapper[4776]: I1208 09:23:55.391537 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"8606b034-7364-4dce-bea0-7c0e2067ee95","Type":"ContainerStarted","Data":"7b2e04f31075d86bfc889adb46889382636ef70e730d46a1ff6a7b073285bc42"} Dec 08 09:23:55 crc kubenswrapper[4776]: I1208 09:23:55.394785 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 08 09:23:55 crc kubenswrapper[4776]: I1208 09:23:55.394854 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5cfbc4c5f-hhnf9" event={"ID":"aa5389fb-4ae8-45b1-baaf-18f2fea3f61c","Type":"ContainerStarted","Data":"9109299a3426a1239b975486e8e2d8b0e1e8d0841865fe7048c683f6d182feab"} Dec 08 09:23:55 crc kubenswrapper[4776]: I1208 09:23:55.394892 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5cfbc4c5f-hhnf9" event={"ID":"aa5389fb-4ae8-45b1-baaf-18f2fea3f61c","Type":"ContainerStarted","Data":"2b5eb112ebb34891fa5c92f9a00d347f446605df07f7ddf0d00a30f63fb40a4f"} Dec 08 09:23:55 crc kubenswrapper[4776]: I1208 09:23:55.406161 4776 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="2d51b4c0-93a1-4af4-95ed-a9b169ebb077" podUID="8606b034-7364-4dce-bea0-7c0e2067ee95" Dec 08 09:23:55 crc kubenswrapper[4776]: I1208 09:23:55.406619 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 08 09:23:55 crc kubenswrapper[4776]: I1208 09:23:55.435968 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-5cfbc4c5f-hhnf9" podStartSLOduration=2.43594985 podStartE2EDuration="2.43594985s" podCreationTimestamp="2025-12-08 09:23:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:23:55.428973704 +0000 UTC m=+1511.692198726" watchObservedRunningTime="2025-12-08 09:23:55.43594985 +0000 UTC m=+1511.699174862" Dec 08 09:23:55 crc kubenswrapper[4776]: I1208 09:23:55.440528 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d51b4c0-93a1-4af4-95ed-a9b169ebb077-combined-ca-bundle\") pod \"2d51b4c0-93a1-4af4-95ed-a9b169ebb077\" (UID: \"2d51b4c0-93a1-4af4-95ed-a9b169ebb077\") " Dec 08 09:23:55 crc kubenswrapper[4776]: I1208 09:23:55.440592 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2d51b4c0-93a1-4af4-95ed-a9b169ebb077-openstack-config\") pod \"2d51b4c0-93a1-4af4-95ed-a9b169ebb077\" (UID: \"2d51b4c0-93a1-4af4-95ed-a9b169ebb077\") " Dec 08 09:23:55 crc kubenswrapper[4776]: I1208 09:23:55.444146 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d51b4c0-93a1-4af4-95ed-a9b169ebb077-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "2d51b4c0-93a1-4af4-95ed-a9b169ebb077" (UID: "2d51b4c0-93a1-4af4-95ed-a9b169ebb077"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:23:55 crc kubenswrapper[4776]: I1208 09:23:55.445685 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d51b4c0-93a1-4af4-95ed-a9b169ebb077-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2d51b4c0-93a1-4af4-95ed-a9b169ebb077" (UID: "2d51b4c0-93a1-4af4-95ed-a9b169ebb077"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:23:55 crc kubenswrapper[4776]: I1208 09:23:55.542167 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2d51b4c0-93a1-4af4-95ed-a9b169ebb077-openstack-config-secret\") pod \"2d51b4c0-93a1-4af4-95ed-a9b169ebb077\" (UID: \"2d51b4c0-93a1-4af4-95ed-a9b169ebb077\") " Dec 08 09:23:55 crc kubenswrapper[4776]: I1208 09:23:55.542321 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhltm\" (UniqueName: \"kubernetes.io/projected/2d51b4c0-93a1-4af4-95ed-a9b169ebb077-kube-api-access-mhltm\") pod \"2d51b4c0-93a1-4af4-95ed-a9b169ebb077\" (UID: \"2d51b4c0-93a1-4af4-95ed-a9b169ebb077\") " Dec 08 09:23:55 crc kubenswrapper[4776]: I1208 09:23:55.543082 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d51b4c0-93a1-4af4-95ed-a9b169ebb077-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:55 crc kubenswrapper[4776]: I1208 09:23:55.543105 4776 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2d51b4c0-93a1-4af4-95ed-a9b169ebb077-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:55 crc kubenswrapper[4776]: I1208 09:23:55.548220 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/2d51b4c0-93a1-4af4-95ed-a9b169ebb077-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "2d51b4c0-93a1-4af4-95ed-a9b169ebb077" (UID: "2d51b4c0-93a1-4af4-95ed-a9b169ebb077"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:23:55 crc kubenswrapper[4776]: I1208 09:23:55.548277 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d51b4c0-93a1-4af4-95ed-a9b169ebb077-kube-api-access-mhltm" (OuterVolumeSpecName: "kube-api-access-mhltm") pod "2d51b4c0-93a1-4af4-95ed-a9b169ebb077" (UID: "2d51b4c0-93a1-4af4-95ed-a9b169ebb077"). InnerVolumeSpecName "kube-api-access-mhltm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:23:55 crc kubenswrapper[4776]: I1208 09:23:55.665661 4776 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2d51b4c0-93a1-4af4-95ed-a9b169ebb077-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:55 crc kubenswrapper[4776]: I1208 09:23:55.665706 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhltm\" (UniqueName: \"kubernetes.io/projected/2d51b4c0-93a1-4af4-95ed-a9b169ebb077-kube-api-access-mhltm\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:56 crc kubenswrapper[4776]: I1208 09:23:56.357832 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d51b4c0-93a1-4af4-95ed-a9b169ebb077" path="/var/lib/kubelet/pods/2d51b4c0-93a1-4af4-95ed-a9b169ebb077/volumes" Dec 08 09:23:56 crc kubenswrapper[4776]: I1208 09:23:56.410363 4776 generic.go:334] "Generic (PLEG): container finished" podID="76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e" containerID="1a3f1a063a0f591306ead18f7f9eb4f72ba127ba9a8eab4f177ba65a881cf158" exitCode=0 Dec 08 09:23:56 crc kubenswrapper[4776]: I1208 09:23:56.410408 4776 generic.go:334] "Generic (PLEG): container finished" 
podID="76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e" containerID="d7ba3e07bcf3270a0c6f2a6ee1806badf11ce50e5d636109f9f5dd2af97378ea" exitCode=2 Dec 08 09:23:56 crc kubenswrapper[4776]: I1208 09:23:56.410421 4776 generic.go:334] "Generic (PLEG): container finished" podID="76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e" containerID="32afd0bfc8d7e6053ea28bb4cc93d2025748cb9ca639ba221f71097cc567be80" exitCode=0 Dec 08 09:23:56 crc kubenswrapper[4776]: I1208 09:23:56.410459 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e","Type":"ContainerDied","Data":"1a3f1a063a0f591306ead18f7f9eb4f72ba127ba9a8eab4f177ba65a881cf158"} Dec 08 09:23:56 crc kubenswrapper[4776]: I1208 09:23:56.410519 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 08 09:23:56 crc kubenswrapper[4776]: I1208 09:23:56.410532 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e","Type":"ContainerDied","Data":"d7ba3e07bcf3270a0c6f2a6ee1806badf11ce50e5d636109f9f5dd2af97378ea"} Dec 08 09:23:56 crc kubenswrapper[4776]: I1208 09:23:56.410549 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e","Type":"ContainerDied","Data":"32afd0bfc8d7e6053ea28bb4cc93d2025748cb9ca639ba221f71097cc567be80"} Dec 08 09:23:56 crc kubenswrapper[4776]: I1208 09:23:56.411198 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5cfbc4c5f-hhnf9" Dec 08 09:23:56 crc kubenswrapper[4776]: I1208 09:23:56.411320 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5cfbc4c5f-hhnf9" Dec 08 09:23:56 crc kubenswrapper[4776]: I1208 09:23:56.416953 4776 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" 
oldPodUID="2d51b4c0-93a1-4af4-95ed-a9b169ebb077" podUID="8606b034-7364-4dce-bea0-7c0e2067ee95" Dec 08 09:23:57 crc kubenswrapper[4776]: I1208 09:23:57.426065 4776 generic.go:334] "Generic (PLEG): container finished" podID="76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e" containerID="dab52e6e5c2b5cc7b58c79164315a06f1a8dc3f148813955e44361b3d0e9b8c7" exitCode=0 Dec 08 09:23:57 crc kubenswrapper[4776]: I1208 09:23:57.426100 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e","Type":"ContainerDied","Data":"dab52e6e5c2b5cc7b58c79164315a06f1a8dc3f148813955e44361b3d0e9b8c7"} Dec 08 09:23:57 crc kubenswrapper[4776]: I1208 09:23:57.431969 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-775f79cd-lq4qd" event={"ID":"84196301-9fa2-4acb-9a49-d87fdb571dfe","Type":"ContainerDied","Data":"b6e086130db2fd5957987439f44ccd67cfe0b9761b9f5b4c3f382d70be798431"} Dec 08 09:23:57 crc kubenswrapper[4776]: I1208 09:23:57.431158 4776 generic.go:334] "Generic (PLEG): container finished" podID="84196301-9fa2-4acb-9a49-d87fdb571dfe" containerID="b6e086130db2fd5957987439f44ccd67cfe0b9761b9f5b4c3f382d70be798431" exitCode=0 Dec 08 09:23:57 crc kubenswrapper[4776]: I1208 09:23:57.610081 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-775f79cd-lq4qd" Dec 08 09:23:57 crc kubenswrapper[4776]: I1208 09:23:57.631749 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 08 09:23:57 crc kubenswrapper[4776]: I1208 09:23:57.714233 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84196301-9fa2-4acb-9a49-d87fdb571dfe-combined-ca-bundle\") pod \"84196301-9fa2-4acb-9a49-d87fdb571dfe\" (UID: \"84196301-9fa2-4acb-9a49-d87fdb571dfe\") " Dec 08 09:23:57 crc kubenswrapper[4776]: I1208 09:23:57.714335 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/84196301-9fa2-4acb-9a49-d87fdb571dfe-ovndb-tls-certs\") pod \"84196301-9fa2-4acb-9a49-d87fdb571dfe\" (UID: \"84196301-9fa2-4acb-9a49-d87fdb571dfe\") " Dec 08 09:23:57 crc kubenswrapper[4776]: I1208 09:23:57.714363 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e-config-data\") pod \"76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e\" (UID: \"76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e\") " Dec 08 09:23:57 crc kubenswrapper[4776]: I1208 09:23:57.714382 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e-run-httpd\") pod \"76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e\" (UID: \"76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e\") " Dec 08 09:23:57 crc kubenswrapper[4776]: I1208 09:23:57.715127 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e-combined-ca-bundle\") pod \"76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e\" (UID: \"76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e\") " Dec 08 09:23:57 crc kubenswrapper[4776]: I1208 09:23:57.715053 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e" (UID: "76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:23:57 crc kubenswrapper[4776]: I1208 09:23:57.715365 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbn7z\" (UniqueName: \"kubernetes.io/projected/76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e-kube-api-access-kbn7z\") pod \"76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e\" (UID: \"76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e\") " Dec 08 09:23:57 crc kubenswrapper[4776]: I1208 09:23:57.715390 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e-log-httpd\") pod \"76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e\" (UID: \"76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e\") " Dec 08 09:23:57 crc kubenswrapper[4776]: I1208 09:23:57.715410 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e-sg-core-conf-yaml\") pod \"76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e\" (UID: \"76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e\") " Dec 08 09:23:57 crc kubenswrapper[4776]: I1208 09:23:57.715441 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/84196301-9fa2-4acb-9a49-d87fdb571dfe-httpd-config\") pod \"84196301-9fa2-4acb-9a49-d87fdb571dfe\" (UID: \"84196301-9fa2-4acb-9a49-d87fdb571dfe\") " Dec 08 09:23:57 crc kubenswrapper[4776]: I1208 09:23:57.715493 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/84196301-9fa2-4acb-9a49-d87fdb571dfe-config\") pod \"84196301-9fa2-4acb-9a49-d87fdb571dfe\" (UID: 
\"84196301-9fa2-4acb-9a49-d87fdb571dfe\") " Dec 08 09:23:57 crc kubenswrapper[4776]: I1208 09:23:57.715519 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wgjl\" (UniqueName: \"kubernetes.io/projected/84196301-9fa2-4acb-9a49-d87fdb571dfe-kube-api-access-9wgjl\") pod \"84196301-9fa2-4acb-9a49-d87fdb571dfe\" (UID: \"84196301-9fa2-4acb-9a49-d87fdb571dfe\") " Dec 08 09:23:57 crc kubenswrapper[4776]: I1208 09:23:57.715565 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e-scripts\") pod \"76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e\" (UID: \"76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e\") " Dec 08 09:23:57 crc kubenswrapper[4776]: I1208 09:23:57.715997 4776 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:57 crc kubenswrapper[4776]: I1208 09:23:57.721309 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e" (UID: "76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:23:57 crc kubenswrapper[4776]: I1208 09:23:57.721461 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e-scripts" (OuterVolumeSpecName: "scripts") pod "76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e" (UID: "76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:23:57 crc kubenswrapper[4776]: I1208 09:23:57.723479 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84196301-9fa2-4acb-9a49-d87fdb571dfe-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "84196301-9fa2-4acb-9a49-d87fdb571dfe" (UID: "84196301-9fa2-4acb-9a49-d87fdb571dfe"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:23:57 crc kubenswrapper[4776]: I1208 09:23:57.724563 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e-kube-api-access-kbn7z" (OuterVolumeSpecName: "kube-api-access-kbn7z") pod "76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e" (UID: "76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e"). InnerVolumeSpecName "kube-api-access-kbn7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:23:57 crc kubenswrapper[4776]: I1208 09:23:57.736056 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84196301-9fa2-4acb-9a49-d87fdb571dfe-kube-api-access-9wgjl" (OuterVolumeSpecName: "kube-api-access-9wgjl") pod "84196301-9fa2-4acb-9a49-d87fdb571dfe" (UID: "84196301-9fa2-4acb-9a49-d87fdb571dfe"). InnerVolumeSpecName "kube-api-access-9wgjl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:23:57 crc kubenswrapper[4776]: I1208 09:23:57.756758 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e" (UID: "76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:23:57 crc kubenswrapper[4776]: I1208 09:23:57.781146 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84196301-9fa2-4acb-9a49-d87fdb571dfe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84196301-9fa2-4acb-9a49-d87fdb571dfe" (UID: "84196301-9fa2-4acb-9a49-d87fdb571dfe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:23:57 crc kubenswrapper[4776]: I1208 09:23:57.786038 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84196301-9fa2-4acb-9a49-d87fdb571dfe-config" (OuterVolumeSpecName: "config") pod "84196301-9fa2-4acb-9a49-d87fdb571dfe" (UID: "84196301-9fa2-4acb-9a49-d87fdb571dfe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:23:57 crc kubenswrapper[4776]: I1208 09:23:57.810907 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84196301-9fa2-4acb-9a49-d87fdb571dfe-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "84196301-9fa2-4acb-9a49-d87fdb571dfe" (UID: "84196301-9fa2-4acb-9a49-d87fdb571dfe"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:23:57 crc kubenswrapper[4776]: I1208 09:23:57.817068 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e" (UID: "76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:23:57 crc kubenswrapper[4776]: I1208 09:23:57.818821 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84196301-9fa2-4acb-9a49-d87fdb571dfe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:57 crc kubenswrapper[4776]: I1208 09:23:57.818851 4776 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/84196301-9fa2-4acb-9a49-d87fdb571dfe-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:57 crc kubenswrapper[4776]: I1208 09:23:57.818861 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:57 crc kubenswrapper[4776]: I1208 09:23:57.818873 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbn7z\" (UniqueName: \"kubernetes.io/projected/76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e-kube-api-access-kbn7z\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:57 crc kubenswrapper[4776]: I1208 09:23:57.818887 4776 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:57 crc kubenswrapper[4776]: I1208 09:23:57.818896 4776 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:57 crc kubenswrapper[4776]: I1208 09:23:57.818907 4776 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/84196301-9fa2-4acb-9a49-d87fdb571dfe-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:57 crc kubenswrapper[4776]: I1208 
09:23:57.818918 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/84196301-9fa2-4acb-9a49-d87fdb571dfe-config\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:57 crc kubenswrapper[4776]: I1208 09:23:57.818927 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wgjl\" (UniqueName: \"kubernetes.io/projected/84196301-9fa2-4acb-9a49-d87fdb571dfe-kube-api-access-9wgjl\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:57 crc kubenswrapper[4776]: I1208 09:23:57.818939 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:57 crc kubenswrapper[4776]: I1208 09:23:57.843380 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e-config-data" (OuterVolumeSpecName: "config-data") pod "76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e" (UID: "76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:23:57 crc kubenswrapper[4776]: I1208 09:23:57.921365 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 09:23:58 crc kubenswrapper[4776]: I1208 09:23:58.456546 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-775f79cd-lq4qd" event={"ID":"84196301-9fa2-4acb-9a49-d87fdb571dfe","Type":"ContainerDied","Data":"64fc18eda76c24ac93cec50dfdf9324c0d1be82adbe44a3cfcf13acd8a740bb6"} Dec 08 09:23:58 crc kubenswrapper[4776]: I1208 09:23:58.456601 4776 scope.go:117] "RemoveContainer" containerID="daea7072fefcc23dac165f473d3f4cb1359947754ada16949fc11a82d8c6acac" Dec 08 09:23:58 crc kubenswrapper[4776]: I1208 09:23:58.456713 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-775f79cd-lq4qd" Dec 08 09:23:58 crc kubenswrapper[4776]: I1208 09:23:58.463301 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e","Type":"ContainerDied","Data":"99a6c242b2e92451488eab3c92eebe401be3f908bed1e1b3f1c5788a1a224095"} Dec 08 09:23:58 crc kubenswrapper[4776]: I1208 09:23:58.463406 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 08 09:23:58 crc kubenswrapper[4776]: I1208 09:23:58.500776 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-775f79cd-lq4qd"] Dec 08 09:23:58 crc kubenswrapper[4776]: I1208 09:23:58.501326 4776 scope.go:117] "RemoveContainer" containerID="b6e086130db2fd5957987439f44ccd67cfe0b9761b9f5b4c3f382d70be798431" Dec 08 09:23:58 crc kubenswrapper[4776]: I1208 09:23:58.520123 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-775f79cd-lq4qd"] Dec 08 09:23:58 crc kubenswrapper[4776]: I1208 09:23:58.534224 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 08 09:23:58 crc kubenswrapper[4776]: I1208 09:23:58.539395 4776 scope.go:117] "RemoveContainer" containerID="1a3f1a063a0f591306ead18f7f9eb4f72ba127ba9a8eab4f177ba65a881cf158" Dec 08 09:23:58 crc kubenswrapper[4776]: I1208 09:23:58.547111 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 08 09:23:58 crc kubenswrapper[4776]: I1208 09:23:58.566764 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 08 09:23:58 crc kubenswrapper[4776]: E1208 09:23:58.567274 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84196301-9fa2-4acb-9a49-d87fdb571dfe" containerName="neutron-httpd" Dec 08 09:23:58 crc kubenswrapper[4776]: I1208 09:23:58.567294 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="84196301-9fa2-4acb-9a49-d87fdb571dfe" containerName="neutron-httpd" Dec 08 09:23:58 crc kubenswrapper[4776]: E1208 09:23:58.567303 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84196301-9fa2-4acb-9a49-d87fdb571dfe" containerName="neutron-api" Dec 08 09:23:58 crc kubenswrapper[4776]: I1208 09:23:58.567311 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="84196301-9fa2-4acb-9a49-d87fdb571dfe" containerName="neutron-api" Dec 08 09:23:58 crc kubenswrapper[4776]: E1208 
09:23:58.567336 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e" containerName="ceilometer-central-agent" Dec 08 09:23:58 crc kubenswrapper[4776]: I1208 09:23:58.567343 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e" containerName="ceilometer-central-agent" Dec 08 09:23:58 crc kubenswrapper[4776]: E1208 09:23:58.567374 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e" containerName="ceilometer-notification-agent" Dec 08 09:23:58 crc kubenswrapper[4776]: I1208 09:23:58.567380 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e" containerName="ceilometer-notification-agent" Dec 08 09:23:58 crc kubenswrapper[4776]: E1208 09:23:58.567393 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e" containerName="proxy-httpd" Dec 08 09:23:58 crc kubenswrapper[4776]: I1208 09:23:58.567399 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e" containerName="proxy-httpd" Dec 08 09:23:58 crc kubenswrapper[4776]: E1208 09:23:58.567408 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e" containerName="sg-core" Dec 08 09:23:58 crc kubenswrapper[4776]: I1208 09:23:58.567414 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e" containerName="sg-core" Dec 08 09:23:58 crc kubenswrapper[4776]: I1208 09:23:58.567614 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e" containerName="sg-core" Dec 08 09:23:58 crc kubenswrapper[4776]: I1208 09:23:58.567630 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="84196301-9fa2-4acb-9a49-d87fdb571dfe" containerName="neutron-api" Dec 08 09:23:58 crc kubenswrapper[4776]: 
I1208 09:23:58.567648 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e" containerName="proxy-httpd" Dec 08 09:23:58 crc kubenswrapper[4776]: I1208 09:23:58.567664 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="84196301-9fa2-4acb-9a49-d87fdb571dfe" containerName="neutron-httpd" Dec 08 09:23:58 crc kubenswrapper[4776]: I1208 09:23:58.567675 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e" containerName="ceilometer-central-agent" Dec 08 09:23:58 crc kubenswrapper[4776]: I1208 09:23:58.567687 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e" containerName="ceilometer-notification-agent" Dec 08 09:23:58 crc kubenswrapper[4776]: I1208 09:23:58.569572 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 08 09:23:58 crc kubenswrapper[4776]: I1208 09:23:58.571727 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 08 09:23:58 crc kubenswrapper[4776]: I1208 09:23:58.574501 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 08 09:23:58 crc kubenswrapper[4776]: I1208 09:23:58.575490 4776 scope.go:117] "RemoveContainer" containerID="d7ba3e07bcf3270a0c6f2a6ee1806badf11ce50e5d636109f9f5dd2af97378ea" Dec 08 09:23:58 crc kubenswrapper[4776]: I1208 09:23:58.580164 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 08 09:23:58 crc kubenswrapper[4776]: I1208 09:23:58.611472 4776 scope.go:117] "RemoveContainer" containerID="dab52e6e5c2b5cc7b58c79164315a06f1a8dc3f148813955e44361b3d0e9b8c7" Dec 08 09:23:58 crc kubenswrapper[4776]: I1208 09:23:58.639226 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/41881b6c-dcfb-4d67-ad0c-f0e003837c8e-config-data\") pod \"ceilometer-0\" (UID: \"41881b6c-dcfb-4d67-ad0c-f0e003837c8e\") " pod="openstack/ceilometer-0" Dec 08 09:23:58 crc kubenswrapper[4776]: I1208 09:23:58.639293 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41881b6c-dcfb-4d67-ad0c-f0e003837c8e-run-httpd\") pod \"ceilometer-0\" (UID: \"41881b6c-dcfb-4d67-ad0c-f0e003837c8e\") " pod="openstack/ceilometer-0" Dec 08 09:23:58 crc kubenswrapper[4776]: I1208 09:23:58.639312 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41881b6c-dcfb-4d67-ad0c-f0e003837c8e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"41881b6c-dcfb-4d67-ad0c-f0e003837c8e\") " pod="openstack/ceilometer-0" Dec 08 09:23:58 crc kubenswrapper[4776]: I1208 09:23:58.639739 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trcxh\" (UniqueName: \"kubernetes.io/projected/41881b6c-dcfb-4d67-ad0c-f0e003837c8e-kube-api-access-trcxh\") pod \"ceilometer-0\" (UID: \"41881b6c-dcfb-4d67-ad0c-f0e003837c8e\") " pod="openstack/ceilometer-0" Dec 08 09:23:58 crc kubenswrapper[4776]: I1208 09:23:58.639807 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/41881b6c-dcfb-4d67-ad0c-f0e003837c8e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"41881b6c-dcfb-4d67-ad0c-f0e003837c8e\") " pod="openstack/ceilometer-0" Dec 08 09:23:58 crc kubenswrapper[4776]: I1208 09:23:58.639838 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41881b6c-dcfb-4d67-ad0c-f0e003837c8e-log-httpd\") pod \"ceilometer-0\" (UID: 
\"41881b6c-dcfb-4d67-ad0c-f0e003837c8e\") " pod="openstack/ceilometer-0" Dec 08 09:23:58 crc kubenswrapper[4776]: I1208 09:23:58.640080 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41881b6c-dcfb-4d67-ad0c-f0e003837c8e-scripts\") pod \"ceilometer-0\" (UID: \"41881b6c-dcfb-4d67-ad0c-f0e003837c8e\") " pod="openstack/ceilometer-0" Dec 08 09:23:58 crc kubenswrapper[4776]: I1208 09:23:58.642025 4776 scope.go:117] "RemoveContainer" containerID="32afd0bfc8d7e6053ea28bb4cc93d2025748cb9ca639ba221f71097cc567be80" Dec 08 09:23:58 crc kubenswrapper[4776]: I1208 09:23:58.742280 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41881b6c-dcfb-4d67-ad0c-f0e003837c8e-run-httpd\") pod \"ceilometer-0\" (UID: \"41881b6c-dcfb-4d67-ad0c-f0e003837c8e\") " pod="openstack/ceilometer-0" Dec 08 09:23:58 crc kubenswrapper[4776]: I1208 09:23:58.742326 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41881b6c-dcfb-4d67-ad0c-f0e003837c8e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"41881b6c-dcfb-4d67-ad0c-f0e003837c8e\") " pod="openstack/ceilometer-0" Dec 08 09:23:58 crc kubenswrapper[4776]: I1208 09:23:58.742437 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trcxh\" (UniqueName: \"kubernetes.io/projected/41881b6c-dcfb-4d67-ad0c-f0e003837c8e-kube-api-access-trcxh\") pod \"ceilometer-0\" (UID: \"41881b6c-dcfb-4d67-ad0c-f0e003837c8e\") " pod="openstack/ceilometer-0" Dec 08 09:23:58 crc kubenswrapper[4776]: I1208 09:23:58.742464 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/41881b6c-dcfb-4d67-ad0c-f0e003837c8e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"41881b6c-dcfb-4d67-ad0c-f0e003837c8e\") " pod="openstack/ceilometer-0" Dec 08 09:23:58 crc kubenswrapper[4776]: I1208 09:23:58.742483 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41881b6c-dcfb-4d67-ad0c-f0e003837c8e-log-httpd\") pod \"ceilometer-0\" (UID: \"41881b6c-dcfb-4d67-ad0c-f0e003837c8e\") " pod="openstack/ceilometer-0" Dec 08 09:23:58 crc kubenswrapper[4776]: I1208 09:23:58.742524 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41881b6c-dcfb-4d67-ad0c-f0e003837c8e-scripts\") pod \"ceilometer-0\" (UID: \"41881b6c-dcfb-4d67-ad0c-f0e003837c8e\") " pod="openstack/ceilometer-0" Dec 08 09:23:58 crc kubenswrapper[4776]: I1208 09:23:58.742589 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41881b6c-dcfb-4d67-ad0c-f0e003837c8e-config-data\") pod \"ceilometer-0\" (UID: \"41881b6c-dcfb-4d67-ad0c-f0e003837c8e\") " pod="openstack/ceilometer-0" Dec 08 09:23:58 crc kubenswrapper[4776]: I1208 09:23:58.742782 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41881b6c-dcfb-4d67-ad0c-f0e003837c8e-run-httpd\") pod \"ceilometer-0\" (UID: \"41881b6c-dcfb-4d67-ad0c-f0e003837c8e\") " pod="openstack/ceilometer-0" Dec 08 09:23:58 crc kubenswrapper[4776]: I1208 09:23:58.742956 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41881b6c-dcfb-4d67-ad0c-f0e003837c8e-log-httpd\") pod \"ceilometer-0\" (UID: \"41881b6c-dcfb-4d67-ad0c-f0e003837c8e\") " pod="openstack/ceilometer-0" Dec 08 09:23:58 crc kubenswrapper[4776]: I1208 09:23:58.748779 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/41881b6c-dcfb-4d67-ad0c-f0e003837c8e-config-data\") pod \"ceilometer-0\" (UID: \"41881b6c-dcfb-4d67-ad0c-f0e003837c8e\") " pod="openstack/ceilometer-0" Dec 08 09:23:58 crc kubenswrapper[4776]: I1208 09:23:58.749327 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/41881b6c-dcfb-4d67-ad0c-f0e003837c8e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"41881b6c-dcfb-4d67-ad0c-f0e003837c8e\") " pod="openstack/ceilometer-0" Dec 08 09:23:58 crc kubenswrapper[4776]: I1208 09:23:58.750846 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41881b6c-dcfb-4d67-ad0c-f0e003837c8e-scripts\") pod \"ceilometer-0\" (UID: \"41881b6c-dcfb-4d67-ad0c-f0e003837c8e\") " pod="openstack/ceilometer-0" Dec 08 09:23:58 crc kubenswrapper[4776]: I1208 09:23:58.754403 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41881b6c-dcfb-4d67-ad0c-f0e003837c8e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"41881b6c-dcfb-4d67-ad0c-f0e003837c8e\") " pod="openstack/ceilometer-0" Dec 08 09:23:58 crc kubenswrapper[4776]: I1208 09:23:58.759194 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trcxh\" (UniqueName: \"kubernetes.io/projected/41881b6c-dcfb-4d67-ad0c-f0e003837c8e-kube-api-access-trcxh\") pod \"ceilometer-0\" (UID: \"41881b6c-dcfb-4d67-ad0c-f0e003837c8e\") " pod="openstack/ceilometer-0" Dec 08 09:23:58 crc kubenswrapper[4776]: I1208 09:23:58.886218 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 08 09:23:59 crc kubenswrapper[4776]: I1208 09:23:59.386139 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 08 09:23:59 crc kubenswrapper[4776]: I1208 09:23:59.476882 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41881b6c-dcfb-4d67-ad0c-f0e003837c8e","Type":"ContainerStarted","Data":"9d91170523c7bc798339025e32cbdd6c44ea3babde4927847739101e04630e5b"} Dec 08 09:23:59 crc kubenswrapper[4776]: W1208 09:23:59.844989 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76bbd39f_d9a0_4cdc_a97d_67aa5345ca7e.slice/crio-32afd0bfc8d7e6053ea28bb4cc93d2025748cb9ca639ba221f71097cc567be80.scope WatchSource:0}: Error finding container 32afd0bfc8d7e6053ea28bb4cc93d2025748cb9ca639ba221f71097cc567be80: Status 404 returned error can't find the container with id 32afd0bfc8d7e6053ea28bb4cc93d2025748cb9ca639ba221f71097cc567be80 Dec 08 09:23:59 crc kubenswrapper[4776]: W1208 09:23:59.849283 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76bbd39f_d9a0_4cdc_a97d_67aa5345ca7e.slice/crio-dab52e6e5c2b5cc7b58c79164315a06f1a8dc3f148813955e44361b3d0e9b8c7.scope WatchSource:0}: Error finding container dab52e6e5c2b5cc7b58c79164315a06f1a8dc3f148813955e44361b3d0e9b8c7: Status 404 returned error can't find the container with id dab52e6e5c2b5cc7b58c79164315a06f1a8dc3f148813955e44361b3d0e9b8c7 Dec 08 09:23:59 crc kubenswrapper[4776]: W1208 09:23:59.850516 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76bbd39f_d9a0_4cdc_a97d_67aa5345ca7e.slice/crio-d7ba3e07bcf3270a0c6f2a6ee1806badf11ce50e5d636109f9f5dd2af97378ea.scope WatchSource:0}: Error finding container 
d7ba3e07bcf3270a0c6f2a6ee1806badf11ce50e5d636109f9f5dd2af97378ea: Status 404 returned error can't find the container with id d7ba3e07bcf3270a0c6f2a6ee1806badf11ce50e5d636109f9f5dd2af97378ea Dec 08 09:23:59 crc kubenswrapper[4776]: W1208 09:23:59.850733 4776 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d51b4c0_93a1_4af4_95ed_a9b169ebb077.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d51b4c0_93a1_4af4_95ed_a9b169ebb077.slice: no such file or directory Dec 08 09:23:59 crc kubenswrapper[4776]: W1208 09:23:59.851946 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76bbd39f_d9a0_4cdc_a97d_67aa5345ca7e.slice/crio-1a3f1a063a0f591306ead18f7f9eb4f72ba127ba9a8eab4f177ba65a881cf158.scope WatchSource:0}: Error finding container 1a3f1a063a0f591306ead18f7f9eb4f72ba127ba9a8eab4f177ba65a881cf158: Status 404 returned error can't find the container with id 1a3f1a063a0f591306ead18f7f9eb4f72ba127ba9a8eab4f177ba65a881cf158 Dec 08 09:23:59 crc kubenswrapper[4776]: E1208 09:23:59.947712 4776 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5472c33_1c77_4bde_a438_924aa6a53a78.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod001c9704_cf27_4a31_8a61_3e5ce2e272eb.slice/crio-81720cb0fb50ed1af9365af7b03959b40b1482af143ff9ec6760350a32cde2cd\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84196301_9fa2_4acb_9a49_d87fdb571dfe.slice/crio-daea7072fefcc23dac165f473d3f4cb1359947754ada16949fc11a82d8c6acac.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5472c33_1c77_4bde_a438_924aa6a53a78.slice/crio-conmon-adca6abc9e27046c0e2a0229c71f12d80d1562e6c9d2cc08662fd7a3b828e352.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84196301_9fa2_4acb_9a49_d87fdb571dfe.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84196301_9fa2_4acb_9a49_d87fdb571dfe.slice/crio-conmon-b6e086130db2fd5957987439f44ccd67cfe0b9761b9f5b4c3f382d70be798431.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76bbd39f_d9a0_4cdc_a97d_67aa5345ca7e.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84196301_9fa2_4acb_9a49_d87fdb571dfe.slice/crio-b6e086130db2fd5957987439f44ccd67cfe0b9761b9f5b4c3f382d70be798431.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84196301_9fa2_4acb_9a49_d87fdb571dfe.slice/crio-64fc18eda76c24ac93cec50dfdf9324c0d1be82adbe44a3cfcf13acd8a740bb6\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f88f901_e706_4892_aa2a_48a97c28a699.slice/crio-conmon-de627a7daaf25c6012f0a1601130543b8d18df127d0afe55ac58892054ed5967.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5472c33_1c77_4bde_a438_924aa6a53a78.slice/crio-conmon-7e118c7e59dda3b43e33829c1d67a5fa9cf831f05c4577a876715fe7efaf21b0.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5472c33_1c77_4bde_a438_924aa6a53a78.slice/crio-7e118c7e59dda3b43e33829c1d67a5fa9cf831f05c4577a876715fe7efaf21b0.scope\": 
RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f88f901_e706_4892_aa2a_48a97c28a699.slice/crio-de627a7daaf25c6012f0a1601130543b8d18df127d0afe55ac58892054ed5967.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84196301_9fa2_4acb_9a49_d87fdb571dfe.slice/crio-conmon-daea7072fefcc23dac165f473d3f4cb1359947754ada16949fc11a82d8c6acac.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5472c33_1c77_4bde_a438_924aa6a53a78.slice/crio-adca6abc9e27046c0e2a0229c71f12d80d1562e6c9d2cc08662fd7a3b828e352.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod001c9704_cf27_4a31_8a61_3e5ce2e272eb.slice/crio-6ac391ca9825822a2c218f1f30e3333d2154ab78fce7cfd6da43a7327afe0a0d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5472c33_1c77_4bde_a438_924aa6a53a78.slice/crio-fb41ff30a5752bdcb5a27fce5ea4df31b70151ed9494ed8e575dc6eb6ed93925\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod001c9704_cf27_4a31_8a61_3e5ce2e272eb.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod001c9704_cf27_4a31_8a61_3e5ce2e272eb.slice/crio-conmon-6ac391ca9825822a2c218f1f30e3333d2154ab78fce7cfd6da43a7327afe0a0d.scope\": RecentStats: unable to find data in memory cache]" Dec 08 09:23:59 crc kubenswrapper[4776]: E1208 09:23:59.948361 4776 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f88f901_e706_4892_aa2a_48a97c28a699.slice/crio-208f5cbf07a076a4511458bc65162a21690159a96f54085dc6f850c10989918f\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76bbd39f_d9a0_4cdc_a97d_67aa5345ca7e.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84196301_9fa2_4acb_9a49_d87fdb571dfe.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76bbd39f_d9a0_4cdc_a97d_67aa5345ca7e.slice/crio-99a6c242b2e92451488eab3c92eebe401be3f908bed1e1b3f1c5788a1a224095\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod001c9704_cf27_4a31_8a61_3e5ce2e272eb.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5472c33_1c77_4bde_a438_924aa6a53a78.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5472c33_1c77_4bde_a438_924aa6a53a78.slice/crio-conmon-7e118c7e59dda3b43e33829c1d67a5fa9cf831f05c4577a876715fe7efaf21b0.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod001c9704_cf27_4a31_8a61_3e5ce2e272eb.slice/crio-6ac391ca9825822a2c218f1f30e3333d2154ab78fce7cfd6da43a7327afe0a0d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod001c9704_cf27_4a31_8a61_3e5ce2e272eb.slice/crio-conmon-6ac391ca9825822a2c218f1f30e3333d2154ab78fce7cfd6da43a7327afe0a0d.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5472c33_1c77_4bde_a438_924aa6a53a78.slice/crio-7e118c7e59dda3b43e33829c1d67a5fa9cf831f05c4577a876715fe7efaf21b0.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84196301_9fa2_4acb_9a49_d87fdb571dfe.slice/crio-daea7072fefcc23dac165f473d3f4cb1359947754ada16949fc11a82d8c6acac.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5472c33_1c77_4bde_a438_924aa6a53a78.slice/crio-adca6abc9e27046c0e2a0229c71f12d80d1562e6c9d2cc08662fd7a3b828e352.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84196301_9fa2_4acb_9a49_d87fdb571dfe.slice/crio-64fc18eda76c24ac93cec50dfdf9324c0d1be82adbe44a3cfcf13acd8a740bb6\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5472c33_1c77_4bde_a438_924aa6a53a78.slice/crio-fb41ff30a5752bdcb5a27fce5ea4df31b70151ed9494ed8e575dc6eb6ed93925\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5472c33_1c77_4bde_a438_924aa6a53a78.slice/crio-conmon-adca6abc9e27046c0e2a0229c71f12d80d1562e6c9d2cc08662fd7a3b828e352.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84196301_9fa2_4acb_9a49_d87fdb571dfe.slice/crio-b6e086130db2fd5957987439f44ccd67cfe0b9761b9f5b4c3f382d70be798431.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f88f901_e706_4892_aa2a_48a97c28a699.slice/crio-conmon-de627a7daaf25c6012f0a1601130543b8d18df127d0afe55ac58892054ed5967.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84196301_9fa2_4acb_9a49_d87fdb571dfe.slice/crio-conmon-daea7072fefcc23dac165f473d3f4cb1359947754ada16949fc11a82d8c6acac.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84196301_9fa2_4acb_9a49_d87fdb571dfe.slice/crio-conmon-b6e086130db2fd5957987439f44ccd67cfe0b9761b9f5b4c3f382d70be798431.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f88f901_e706_4892_aa2a_48a97c28a699.slice/crio-de627a7daaf25c6012f0a1601130543b8d18df127d0afe55ac58892054ed5967.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod001c9704_cf27_4a31_8a61_3e5ce2e272eb.slice/crio-81720cb0fb50ed1af9365af7b03959b40b1482af143ff9ec6760350a32cde2cd\": RecentStats: unable to find data in memory cache]" Dec 08 09:23:59 crc kubenswrapper[4776]: E1208 09:23:59.947779 4776 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f88f901_e706_4892_aa2a_48a97c28a699.slice/crio-208f5cbf07a076a4511458bc65162a21690159a96f54085dc6f850c10989918f\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84196301_9fa2_4acb_9a49_d87fdb571dfe.slice/crio-64fc18eda76c24ac93cec50dfdf9324c0d1be82adbe44a3cfcf13acd8a740bb6\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f88f901_e706_4892_aa2a_48a97c28a699.slice/crio-conmon-de627a7daaf25c6012f0a1601130543b8d18df127d0afe55ac58892054ed5967.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84196301_9fa2_4acb_9a49_d87fdb571dfe.slice/crio-conmon-daea7072fefcc23dac165f473d3f4cb1359947754ada16949fc11a82d8c6acac.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84196301_9fa2_4acb_9a49_d87fdb571dfe.slice/crio-b6e086130db2fd5957987439f44ccd67cfe0b9761b9f5b4c3f382d70be798431.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod001c9704_cf27_4a31_8a61_3e5ce2e272eb.slice/crio-conmon-6ac391ca9825822a2c218f1f30e3333d2154ab78fce7cfd6da43a7327afe0a0d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f88f901_e706_4892_aa2a_48a97c28a699.slice/crio-de627a7daaf25c6012f0a1601130543b8d18df127d0afe55ac58892054ed5967.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5472c33_1c77_4bde_a438_924aa6a53a78.slice/crio-fb41ff30a5752bdcb5a27fce5ea4df31b70151ed9494ed8e575dc6eb6ed93925\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84196301_9fa2_4acb_9a49_d87fdb571dfe.slice/crio-daea7072fefcc23dac165f473d3f4cb1359947754ada16949fc11a82d8c6acac.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5472c33_1c77_4bde_a438_924aa6a53a78.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5472c33_1c77_4bde_a438_924aa6a53a78.slice/crio-conmon-7e118c7e59dda3b43e33829c1d67a5fa9cf831f05c4577a876715fe7efaf21b0.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod001c9704_cf27_4a31_8a61_3e5ce2e272eb.slice/crio-81720cb0fb50ed1af9365af7b03959b40b1482af143ff9ec6760350a32cde2cd\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76bbd39f_d9a0_4cdc_a97d_67aa5345ca7e.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod001c9704_cf27_4a31_8a61_3e5ce2e272eb.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76bbd39f_d9a0_4cdc_a97d_67aa5345ca7e.slice/crio-99a6c242b2e92451488eab3c92eebe401be3f908bed1e1b3f1c5788a1a224095\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod001c9704_cf27_4a31_8a61_3e5ce2e272eb.slice/crio-6ac391ca9825822a2c218f1f30e3333d2154ab78fce7cfd6da43a7327afe0a0d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5472c33_1c77_4bde_a438_924aa6a53a78.slice/crio-conmon-adca6abc9e27046c0e2a0229c71f12d80d1562e6c9d2cc08662fd7a3b828e352.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5472c33_1c77_4bde_a438_924aa6a53a78.slice/crio-7e118c7e59dda3b43e33829c1d67a5fa9cf831f05c4577a876715fe7efaf21b0.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84196301_9fa2_4acb_9a49_d87fdb571dfe.slice/crio-conmon-b6e086130db2fd5957987439f44ccd67cfe0b9761b9f5b4c3f382d70be798431.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5472c33_1c77_4bde_a438_924aa6a53a78.slice/crio-adca6abc9e27046c0e2a0229c71f12d80d1562e6c9d2cc08662fd7a3b828e352.scope\": RecentStats: unable to find 
data in memory cache]" Dec 08 09:24:00 crc kubenswrapper[4776]: I1208 09:24:00.359230 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e" path="/var/lib/kubelet/pods/76bbd39f-d9a0-4cdc-a97d-67aa5345ca7e/volumes" Dec 08 09:24:00 crc kubenswrapper[4776]: I1208 09:24:00.362025 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84196301-9fa2-4acb-9a49-d87fdb571dfe" path="/var/lib/kubelet/pods/84196301-9fa2-4acb-9a49-d87fdb571dfe/volumes" Dec 08 09:24:00 crc kubenswrapper[4776]: I1208 09:24:00.495419 4776 generic.go:334] "Generic (PLEG): container finished" podID="8f88f901-e706-4892-aa2a-48a97c28a699" containerID="de627a7daaf25c6012f0a1601130543b8d18df127d0afe55ac58892054ed5967" exitCode=137 Dec 08 09:24:00 crc kubenswrapper[4776]: I1208 09:24:00.495503 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8f88f901-e706-4892-aa2a-48a97c28a699","Type":"ContainerDied","Data":"de627a7daaf25c6012f0a1601130543b8d18df127d0afe55ac58892054ed5967"} Dec 08 09:24:01 crc kubenswrapper[4776]: I1208 09:24:01.429230 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="8f88f901-e706-4892-aa2a-48a97c28a699" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.193:8776/healthcheck\": dial tcp 10.217.0.193:8776: connect: connection refused" Dec 08 09:24:01 crc kubenswrapper[4776]: I1208 09:24:01.499199 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-75876fb99b-xnbd7" Dec 08 09:24:02 crc kubenswrapper[4776]: I1208 09:24:02.508635 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-75876fb99b-xnbd7" Dec 08 09:24:03 crc kubenswrapper[4776]: I1208 09:24:03.746579 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5cfbc4c5f-hhnf9" Dec 08 09:24:03 crc 
kubenswrapper[4776]: I1208 09:24:03.748269 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5cfbc4c5f-hhnf9" Dec 08 09:24:04 crc kubenswrapper[4776]: I1208 09:24:04.296443 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.083381 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.177736 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8f88f901-e706-4892-aa2a-48a97c28a699-config-data-custom\") pod \"8f88f901-e706-4892-aa2a-48a97c28a699\" (UID: \"8f88f901-e706-4892-aa2a-48a97c28a699\") " Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.177788 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5lgb\" (UniqueName: \"kubernetes.io/projected/8f88f901-e706-4892-aa2a-48a97c28a699-kube-api-access-q5lgb\") pod \"8f88f901-e706-4892-aa2a-48a97c28a699\" (UID: \"8f88f901-e706-4892-aa2a-48a97c28a699\") " Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.177845 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f88f901-e706-4892-aa2a-48a97c28a699-combined-ca-bundle\") pod \"8f88f901-e706-4892-aa2a-48a97c28a699\" (UID: \"8f88f901-e706-4892-aa2a-48a97c28a699\") " Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.177875 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8f88f901-e706-4892-aa2a-48a97c28a699-etc-machine-id\") pod \"8f88f901-e706-4892-aa2a-48a97c28a699\" (UID: \"8f88f901-e706-4892-aa2a-48a97c28a699\") " Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.177993 4776 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f88f901-e706-4892-aa2a-48a97c28a699-config-data\") pod \"8f88f901-e706-4892-aa2a-48a97c28a699\" (UID: \"8f88f901-e706-4892-aa2a-48a97c28a699\") " Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.178063 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f88f901-e706-4892-aa2a-48a97c28a699-scripts\") pod \"8f88f901-e706-4892-aa2a-48a97c28a699\" (UID: \"8f88f901-e706-4892-aa2a-48a97c28a699\") " Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.178110 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f88f901-e706-4892-aa2a-48a97c28a699-logs\") pod \"8f88f901-e706-4892-aa2a-48a97c28a699\" (UID: \"8f88f901-e706-4892-aa2a-48a97c28a699\") " Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.179264 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f88f901-e706-4892-aa2a-48a97c28a699-logs" (OuterVolumeSpecName: "logs") pod "8f88f901-e706-4892-aa2a-48a97c28a699" (UID: "8f88f901-e706-4892-aa2a-48a97c28a699"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.180403 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f88f901-e706-4892-aa2a-48a97c28a699-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "8f88f901-e706-4892-aa2a-48a97c28a699" (UID: "8f88f901-e706-4892-aa2a-48a97c28a699"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.185666 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f88f901-e706-4892-aa2a-48a97c28a699-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8f88f901-e706-4892-aa2a-48a97c28a699" (UID: "8f88f901-e706-4892-aa2a-48a97c28a699"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.185974 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f88f901-e706-4892-aa2a-48a97c28a699-kube-api-access-q5lgb" (OuterVolumeSpecName: "kube-api-access-q5lgb") pod "8f88f901-e706-4892-aa2a-48a97c28a699" (UID: "8f88f901-e706-4892-aa2a-48a97c28a699"). InnerVolumeSpecName "kube-api-access-q5lgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.187376 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f88f901-e706-4892-aa2a-48a97c28a699-scripts" (OuterVolumeSpecName: "scripts") pod "8f88f901-e706-4892-aa2a-48a97c28a699" (UID: "8f88f901-e706-4892-aa2a-48a97c28a699"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.230387 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f88f901-e706-4892-aa2a-48a97c28a699-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8f88f901-e706-4892-aa2a-48a97c28a699" (UID: "8f88f901-e706-4892-aa2a-48a97c28a699"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.250688 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f88f901-e706-4892-aa2a-48a97c28a699-config-data" (OuterVolumeSpecName: "config-data") pod "8f88f901-e706-4892-aa2a-48a97c28a699" (UID: "8f88f901-e706-4892-aa2a-48a97c28a699"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.281499 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f88f901-e706-4892-aa2a-48a97c28a699-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.281530 4776 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f88f901-e706-4892-aa2a-48a97c28a699-logs\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.281540 4776 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8f88f901-e706-4892-aa2a-48a97c28a699-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.281551 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5lgb\" (UniqueName: \"kubernetes.io/projected/8f88f901-e706-4892-aa2a-48a97c28a699-kube-api-access-q5lgb\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.281561 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f88f901-e706-4892-aa2a-48a97c28a699-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.281569 4776 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/8f88f901-e706-4892-aa2a-48a97c28a699-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.281576 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f88f901-e706-4892-aa2a-48a97c28a699-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.608771 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.608773 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8f88f901-e706-4892-aa2a-48a97c28a699","Type":"ContainerDied","Data":"208f5cbf07a076a4511458bc65162a21690159a96f54085dc6f850c10989918f"} Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.608978 4776 scope.go:117] "RemoveContainer" containerID="de627a7daaf25c6012f0a1601130543b8d18df127d0afe55ac58892054ed5967" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.610709 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"8606b034-7364-4dce-bea0-7c0e2067ee95","Type":"ContainerStarted","Data":"0c887ed0b57bf87af70caeae8133775d25462730fb2cee828ecba2d25efec7e9"} Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.613918 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41881b6c-dcfb-4d67-ad0c-f0e003837c8e","Type":"ContainerStarted","Data":"ca7646b3e4cd7851105e1b476bc4c75e76c7bad3c112fbae7638b166ccb69971"} Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.636535 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.204768818 podStartE2EDuration="12.636515404s" podCreationTimestamp="2025-12-08 09:23:54 +0000 UTC" firstStartedPulling="2025-12-08 09:23:55.346607683 +0000 UTC m=+1511.609832705" 
lastFinishedPulling="2025-12-08 09:24:05.778354259 +0000 UTC m=+1522.041579291" observedRunningTime="2025-12-08 09:24:06.635277382 +0000 UTC m=+1522.898502404" watchObservedRunningTime="2025-12-08 09:24:06.636515404 +0000 UTC m=+1522.899740426" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.653386 4776 scope.go:117] "RemoveContainer" containerID="c22380e48b4e8549b299b571cd0e4721ac50057cdad3d69c654f575cdd1eb07e" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.659562 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.673375 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.697983 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 08 09:24:06 crc kubenswrapper[4776]: E1208 09:24:06.707488 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f88f901-e706-4892-aa2a-48a97c28a699" containerName="cinder-api" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.708605 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f88f901-e706-4892-aa2a-48a97c28a699" containerName="cinder-api" Dec 08 09:24:06 crc kubenswrapper[4776]: E1208 09:24:06.708695 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f88f901-e706-4892-aa2a-48a97c28a699" containerName="cinder-api-log" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.708899 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f88f901-e706-4892-aa2a-48a97c28a699" containerName="cinder-api-log" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.709812 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f88f901-e706-4892-aa2a-48a97c28a699" containerName="cinder-api" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.710416 4776 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8f88f901-e706-4892-aa2a-48a97c28a699" containerName="cinder-api-log" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.717544 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.717787 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.721784 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.722394 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.722526 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.795855 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dcb1d701-bc05-4d4b-8794-ebc4af6da8ba-config-data-custom\") pod \"cinder-api-0\" (UID: \"dcb1d701-bc05-4d4b-8794-ebc4af6da8ba\") " pod="openstack/cinder-api-0" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.795966 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcb1d701-bc05-4d4b-8794-ebc4af6da8ba-public-tls-certs\") pod \"cinder-api-0\" (UID: \"dcb1d701-bc05-4d4b-8794-ebc4af6da8ba\") " pod="openstack/cinder-api-0" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.795998 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dcb1d701-bc05-4d4b-8794-ebc4af6da8ba-scripts\") pod \"cinder-api-0\" (UID: \"dcb1d701-bc05-4d4b-8794-ebc4af6da8ba\") " 
pod="openstack/cinder-api-0" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.796026 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcb1d701-bc05-4d4b-8794-ebc4af6da8ba-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"dcb1d701-bc05-4d4b-8794-ebc4af6da8ba\") " pod="openstack/cinder-api-0" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.796053 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcb1d701-bc05-4d4b-8794-ebc4af6da8ba-config-data\") pod \"cinder-api-0\" (UID: \"dcb1d701-bc05-4d4b-8794-ebc4af6da8ba\") " pod="openstack/cinder-api-0" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.796108 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcb1d701-bc05-4d4b-8794-ebc4af6da8ba-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"dcb1d701-bc05-4d4b-8794-ebc4af6da8ba\") " pod="openstack/cinder-api-0" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.796227 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t28z9\" (UniqueName: \"kubernetes.io/projected/dcb1d701-bc05-4d4b-8794-ebc4af6da8ba-kube-api-access-t28z9\") pod \"cinder-api-0\" (UID: \"dcb1d701-bc05-4d4b-8794-ebc4af6da8ba\") " pod="openstack/cinder-api-0" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.796285 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dcb1d701-bc05-4d4b-8794-ebc4af6da8ba-etc-machine-id\") pod \"cinder-api-0\" (UID: \"dcb1d701-bc05-4d4b-8794-ebc4af6da8ba\") " pod="openstack/cinder-api-0" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.796316 4776 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcb1d701-bc05-4d4b-8794-ebc4af6da8ba-logs\") pod \"cinder-api-0\" (UID: \"dcb1d701-bc05-4d4b-8794-ebc4af6da8ba\") " pod="openstack/cinder-api-0" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.898118 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcb1d701-bc05-4d4b-8794-ebc4af6da8ba-logs\") pod \"cinder-api-0\" (UID: \"dcb1d701-bc05-4d4b-8794-ebc4af6da8ba\") " pod="openstack/cinder-api-0" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.898548 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dcb1d701-bc05-4d4b-8794-ebc4af6da8ba-config-data-custom\") pod \"cinder-api-0\" (UID: \"dcb1d701-bc05-4d4b-8794-ebc4af6da8ba\") " pod="openstack/cinder-api-0" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.898748 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcb1d701-bc05-4d4b-8794-ebc4af6da8ba-public-tls-certs\") pod \"cinder-api-0\" (UID: \"dcb1d701-bc05-4d4b-8794-ebc4af6da8ba\") " pod="openstack/cinder-api-0" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.898864 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dcb1d701-bc05-4d4b-8794-ebc4af6da8ba-scripts\") pod \"cinder-api-0\" (UID: \"dcb1d701-bc05-4d4b-8794-ebc4af6da8ba\") " pod="openstack/cinder-api-0" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.898988 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcb1d701-bc05-4d4b-8794-ebc4af6da8ba-combined-ca-bundle\") pod \"cinder-api-0\" (UID: 
\"dcb1d701-bc05-4d4b-8794-ebc4af6da8ba\") " pod="openstack/cinder-api-0" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.899088 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcb1d701-bc05-4d4b-8794-ebc4af6da8ba-config-data\") pod \"cinder-api-0\" (UID: \"dcb1d701-bc05-4d4b-8794-ebc4af6da8ba\") " pod="openstack/cinder-api-0" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.899210 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcb1d701-bc05-4d4b-8794-ebc4af6da8ba-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"dcb1d701-bc05-4d4b-8794-ebc4af6da8ba\") " pod="openstack/cinder-api-0" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.899383 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t28z9\" (UniqueName: \"kubernetes.io/projected/dcb1d701-bc05-4d4b-8794-ebc4af6da8ba-kube-api-access-t28z9\") pod \"cinder-api-0\" (UID: \"dcb1d701-bc05-4d4b-8794-ebc4af6da8ba\") " pod="openstack/cinder-api-0" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.899545 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dcb1d701-bc05-4d4b-8794-ebc4af6da8ba-etc-machine-id\") pod \"cinder-api-0\" (UID: \"dcb1d701-bc05-4d4b-8794-ebc4af6da8ba\") " pod="openstack/cinder-api-0" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.899677 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dcb1d701-bc05-4d4b-8794-ebc4af6da8ba-etc-machine-id\") pod \"cinder-api-0\" (UID: \"dcb1d701-bc05-4d4b-8794-ebc4af6da8ba\") " pod="openstack/cinder-api-0" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.898760 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcb1d701-bc05-4d4b-8794-ebc4af6da8ba-logs\") pod \"cinder-api-0\" (UID: \"dcb1d701-bc05-4d4b-8794-ebc4af6da8ba\") " pod="openstack/cinder-api-0" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.903898 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dcb1d701-bc05-4d4b-8794-ebc4af6da8ba-config-data-custom\") pod \"cinder-api-0\" (UID: \"dcb1d701-bc05-4d4b-8794-ebc4af6da8ba\") " pod="openstack/cinder-api-0" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.904377 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dcb1d701-bc05-4d4b-8794-ebc4af6da8ba-scripts\") pod \"cinder-api-0\" (UID: \"dcb1d701-bc05-4d4b-8794-ebc4af6da8ba\") " pod="openstack/cinder-api-0" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.905125 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcb1d701-bc05-4d4b-8794-ebc4af6da8ba-public-tls-certs\") pod \"cinder-api-0\" (UID: \"dcb1d701-bc05-4d4b-8794-ebc4af6da8ba\") " pod="openstack/cinder-api-0" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.909757 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcb1d701-bc05-4d4b-8794-ebc4af6da8ba-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"dcb1d701-bc05-4d4b-8794-ebc4af6da8ba\") " pod="openstack/cinder-api-0" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.919948 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcb1d701-bc05-4d4b-8794-ebc4af6da8ba-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"dcb1d701-bc05-4d4b-8794-ebc4af6da8ba\") " pod="openstack/cinder-api-0" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.920245 
4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcb1d701-bc05-4d4b-8794-ebc4af6da8ba-config-data\") pod \"cinder-api-0\" (UID: \"dcb1d701-bc05-4d4b-8794-ebc4af6da8ba\") " pod="openstack/cinder-api-0" Dec 08 09:24:06 crc kubenswrapper[4776]: I1208 09:24:06.923754 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t28z9\" (UniqueName: \"kubernetes.io/projected/dcb1d701-bc05-4d4b-8794-ebc4af6da8ba-kube-api-access-t28z9\") pod \"cinder-api-0\" (UID: \"dcb1d701-bc05-4d4b-8794-ebc4af6da8ba\") " pod="openstack/cinder-api-0" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.075829 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.127576 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-55647645f8-9xvpq"] Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.129029 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-55647645f8-9xvpq" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.132813 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.133196 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.133355 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-srzvg" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.144836 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-55647645f8-9xvpq"] Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.205931 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0cea35b3-0412-490a-9d71-2c5e10e85c51-config-data-custom\") pod \"heat-engine-55647645f8-9xvpq\" (UID: \"0cea35b3-0412-490a-9d71-2c5e10e85c51\") " pod="openstack/heat-engine-55647645f8-9xvpq" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.206004 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cea35b3-0412-490a-9d71-2c5e10e85c51-combined-ca-bundle\") pod \"heat-engine-55647645f8-9xvpq\" (UID: \"0cea35b3-0412-490a-9d71-2c5e10e85c51\") " pod="openstack/heat-engine-55647645f8-9xvpq" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.206089 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5b4jh\" (UniqueName: \"kubernetes.io/projected/0cea35b3-0412-490a-9d71-2c5e10e85c51-kube-api-access-5b4jh\") pod \"heat-engine-55647645f8-9xvpq\" (UID: \"0cea35b3-0412-490a-9d71-2c5e10e85c51\") " pod="openstack/heat-engine-55647645f8-9xvpq" Dec 08 09:24:07 
crc kubenswrapper[4776]: I1208 09:24:07.206188 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cea35b3-0412-490a-9d71-2c5e10e85c51-config-data\") pod \"heat-engine-55647645f8-9xvpq\" (UID: \"0cea35b3-0412-490a-9d71-2c5e10e85c51\") " pod="openstack/heat-engine-55647645f8-9xvpq" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.298961 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-788bcdcb6b-kpzht"] Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.300723 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-788bcdcb6b-kpzht" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.318479 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.319256 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cea35b3-0412-490a-9d71-2c5e10e85c51-config-data\") pod \"heat-engine-55647645f8-9xvpq\" (UID: \"0cea35b3-0412-490a-9d71-2c5e10e85c51\") " pod="openstack/heat-engine-55647645f8-9xvpq" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.319325 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0cea35b3-0412-490a-9d71-2c5e10e85c51-config-data-custom\") pod \"heat-engine-55647645f8-9xvpq\" (UID: \"0cea35b3-0412-490a-9d71-2c5e10e85c51\") " pod="openstack/heat-engine-55647645f8-9xvpq" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.319368 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cea35b3-0412-490a-9d71-2c5e10e85c51-combined-ca-bundle\") pod \"heat-engine-55647645f8-9xvpq\" (UID: 
\"0cea35b3-0412-490a-9d71-2c5e10e85c51\") " pod="openstack/heat-engine-55647645f8-9xvpq" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.319446 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5b4jh\" (UniqueName: \"kubernetes.io/projected/0cea35b3-0412-490a-9d71-2c5e10e85c51-kube-api-access-5b4jh\") pod \"heat-engine-55647645f8-9xvpq\" (UID: \"0cea35b3-0412-490a-9d71-2c5e10e85c51\") " pod="openstack/heat-engine-55647645f8-9xvpq" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.326973 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cea35b3-0412-490a-9d71-2c5e10e85c51-config-data\") pod \"heat-engine-55647645f8-9xvpq\" (UID: \"0cea35b3-0412-490a-9d71-2c5e10e85c51\") " pod="openstack/heat-engine-55647645f8-9xvpq" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.329480 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0cea35b3-0412-490a-9d71-2c5e10e85c51-config-data-custom\") pod \"heat-engine-55647645f8-9xvpq\" (UID: \"0cea35b3-0412-490a-9d71-2c5e10e85c51\") " pod="openstack/heat-engine-55647645f8-9xvpq" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.331159 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cea35b3-0412-490a-9d71-2c5e10e85c51-combined-ca-bundle\") pod \"heat-engine-55647645f8-9xvpq\" (UID: \"0cea35b3-0412-490a-9d71-2c5e10e85c51\") " pod="openstack/heat-engine-55647645f8-9xvpq" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.343364 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-mj9ps"] Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.345463 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-mj9ps" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.424476 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5b4jh\" (UniqueName: \"kubernetes.io/projected/0cea35b3-0412-490a-9d71-2c5e10e85c51-kube-api-access-5b4jh\") pod \"heat-engine-55647645f8-9xvpq\" (UID: \"0cea35b3-0412-490a-9d71-2c5e10e85c51\") " pod="openstack/heat-engine-55647645f8-9xvpq" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.438640 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dc053d70-b785-4b45-91be-49cbd27952d9-ovsdbserver-nb\") pod \"dnsmasq-dns-7756b9d78c-mj9ps\" (UID: \"dc053d70-b785-4b45-91be-49cbd27952d9\") " pod="openstack/dnsmasq-dns-7756b9d78c-mj9ps" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.438729 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/39ad8a82-0a3f-4f21-bf0f-a158bd903618-config-data-custom\") pod \"heat-cfnapi-788bcdcb6b-kpzht\" (UID: \"39ad8a82-0a3f-4f21-bf0f-a158bd903618\") " pod="openstack/heat-cfnapi-788bcdcb6b-kpzht" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.440262 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dc053d70-b785-4b45-91be-49cbd27952d9-ovsdbserver-sb\") pod \"dnsmasq-dns-7756b9d78c-mj9ps\" (UID: \"dc053d70-b785-4b45-91be-49cbd27952d9\") " pod="openstack/dnsmasq-dns-7756b9d78c-mj9ps" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.440745 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc053d70-b785-4b45-91be-49cbd27952d9-config\") pod \"dnsmasq-dns-7756b9d78c-mj9ps\" (UID: 
\"dc053d70-b785-4b45-91be-49cbd27952d9\") " pod="openstack/dnsmasq-dns-7756b9d78c-mj9ps" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.440801 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgsgl\" (UniqueName: \"kubernetes.io/projected/dc053d70-b785-4b45-91be-49cbd27952d9-kube-api-access-kgsgl\") pod \"dnsmasq-dns-7756b9d78c-mj9ps\" (UID: \"dc053d70-b785-4b45-91be-49cbd27952d9\") " pod="openstack/dnsmasq-dns-7756b9d78c-mj9ps" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.440866 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc053d70-b785-4b45-91be-49cbd27952d9-dns-svc\") pod \"dnsmasq-dns-7756b9d78c-mj9ps\" (UID: \"dc053d70-b785-4b45-91be-49cbd27952d9\") " pod="openstack/dnsmasq-dns-7756b9d78c-mj9ps" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.440901 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39ad8a82-0a3f-4f21-bf0f-a158bd903618-combined-ca-bundle\") pod \"heat-cfnapi-788bcdcb6b-kpzht\" (UID: \"39ad8a82-0a3f-4f21-bf0f-a158bd903618\") " pod="openstack/heat-cfnapi-788bcdcb6b-kpzht" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.441006 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2b7vf\" (UniqueName: \"kubernetes.io/projected/39ad8a82-0a3f-4f21-bf0f-a158bd903618-kube-api-access-2b7vf\") pod \"heat-cfnapi-788bcdcb6b-kpzht\" (UID: \"39ad8a82-0a3f-4f21-bf0f-a158bd903618\") " pod="openstack/heat-cfnapi-788bcdcb6b-kpzht" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.441052 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/39ad8a82-0a3f-4f21-bf0f-a158bd903618-config-data\") pod \"heat-cfnapi-788bcdcb6b-kpzht\" (UID: \"39ad8a82-0a3f-4f21-bf0f-a158bd903618\") " pod="openstack/heat-cfnapi-788bcdcb6b-kpzht" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.441082 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dc053d70-b785-4b45-91be-49cbd27952d9-dns-swift-storage-0\") pod \"dnsmasq-dns-7756b9d78c-mj9ps\" (UID: \"dc053d70-b785-4b45-91be-49cbd27952d9\") " pod="openstack/dnsmasq-dns-7756b9d78c-mj9ps" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.451670 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-mj9ps"] Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.472367 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-788bcdcb6b-kpzht"] Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.491047 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-5577b84758-k9tb2"] Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.494260 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5577b84758-k9tb2" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.499091 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.528006 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-55647645f8-9xvpq" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.530757 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5577b84758-k9tb2"] Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.543050 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dc053d70-b785-4b45-91be-49cbd27952d9-ovsdbserver-nb\") pod \"dnsmasq-dns-7756b9d78c-mj9ps\" (UID: \"dc053d70-b785-4b45-91be-49cbd27952d9\") " pod="openstack/dnsmasq-dns-7756b9d78c-mj9ps" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.543183 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/39ad8a82-0a3f-4f21-bf0f-a158bd903618-config-data-custom\") pod \"heat-cfnapi-788bcdcb6b-kpzht\" (UID: \"39ad8a82-0a3f-4f21-bf0f-a158bd903618\") " pod="openstack/heat-cfnapi-788bcdcb6b-kpzht" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.543268 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dc053d70-b785-4b45-91be-49cbd27952d9-ovsdbserver-sb\") pod \"dnsmasq-dns-7756b9d78c-mj9ps\" (UID: \"dc053d70-b785-4b45-91be-49cbd27952d9\") " pod="openstack/dnsmasq-dns-7756b9d78c-mj9ps" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.543357 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc053d70-b785-4b45-91be-49cbd27952d9-config\") pod \"dnsmasq-dns-7756b9d78c-mj9ps\" (UID: \"dc053d70-b785-4b45-91be-49cbd27952d9\") " pod="openstack/dnsmasq-dns-7756b9d78c-mj9ps" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.543497 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgsgl\" (UniqueName: 
\"kubernetes.io/projected/dc053d70-b785-4b45-91be-49cbd27952d9-kube-api-access-kgsgl\") pod \"dnsmasq-dns-7756b9d78c-mj9ps\" (UID: \"dc053d70-b785-4b45-91be-49cbd27952d9\") " pod="openstack/dnsmasq-dns-7756b9d78c-mj9ps" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.543644 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc053d70-b785-4b45-91be-49cbd27952d9-dns-svc\") pod \"dnsmasq-dns-7756b9d78c-mj9ps\" (UID: \"dc053d70-b785-4b45-91be-49cbd27952d9\") " pod="openstack/dnsmasq-dns-7756b9d78c-mj9ps" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.543756 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39ad8a82-0a3f-4f21-bf0f-a158bd903618-combined-ca-bundle\") pod \"heat-cfnapi-788bcdcb6b-kpzht\" (UID: \"39ad8a82-0a3f-4f21-bf0f-a158bd903618\") " pod="openstack/heat-cfnapi-788bcdcb6b-kpzht" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.543977 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2b7vf\" (UniqueName: \"kubernetes.io/projected/39ad8a82-0a3f-4f21-bf0f-a158bd903618-kube-api-access-2b7vf\") pod \"heat-cfnapi-788bcdcb6b-kpzht\" (UID: \"39ad8a82-0a3f-4f21-bf0f-a158bd903618\") " pod="openstack/heat-cfnapi-788bcdcb6b-kpzht" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.544130 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39ad8a82-0a3f-4f21-bf0f-a158bd903618-config-data\") pod \"heat-cfnapi-788bcdcb6b-kpzht\" (UID: \"39ad8a82-0a3f-4f21-bf0f-a158bd903618\") " pod="openstack/heat-cfnapi-788bcdcb6b-kpzht" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.544290 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/dc053d70-b785-4b45-91be-49cbd27952d9-dns-swift-storage-0\") pod \"dnsmasq-dns-7756b9d78c-mj9ps\" (UID: \"dc053d70-b785-4b45-91be-49cbd27952d9\") " pod="openstack/dnsmasq-dns-7756b9d78c-mj9ps" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.545787 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dc053d70-b785-4b45-91be-49cbd27952d9-dns-swift-storage-0\") pod \"dnsmasq-dns-7756b9d78c-mj9ps\" (UID: \"dc053d70-b785-4b45-91be-49cbd27952d9\") " pod="openstack/dnsmasq-dns-7756b9d78c-mj9ps" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.546418 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dc053d70-b785-4b45-91be-49cbd27952d9-ovsdbserver-nb\") pod \"dnsmasq-dns-7756b9d78c-mj9ps\" (UID: \"dc053d70-b785-4b45-91be-49cbd27952d9\") " pod="openstack/dnsmasq-dns-7756b9d78c-mj9ps" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.548993 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dc053d70-b785-4b45-91be-49cbd27952d9-ovsdbserver-sb\") pod \"dnsmasq-dns-7756b9d78c-mj9ps\" (UID: \"dc053d70-b785-4b45-91be-49cbd27952d9\") " pod="openstack/dnsmasq-dns-7756b9d78c-mj9ps" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.549957 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc053d70-b785-4b45-91be-49cbd27952d9-config\") pod \"dnsmasq-dns-7756b9d78c-mj9ps\" (UID: \"dc053d70-b785-4b45-91be-49cbd27952d9\") " pod="openstack/dnsmasq-dns-7756b9d78c-mj9ps" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.551582 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc053d70-b785-4b45-91be-49cbd27952d9-dns-svc\") pod 
\"dnsmasq-dns-7756b9d78c-mj9ps\" (UID: \"dc053d70-b785-4b45-91be-49cbd27952d9\") " pod="openstack/dnsmasq-dns-7756b9d78c-mj9ps" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.559059 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39ad8a82-0a3f-4f21-bf0f-a158bd903618-combined-ca-bundle\") pod \"heat-cfnapi-788bcdcb6b-kpzht\" (UID: \"39ad8a82-0a3f-4f21-bf0f-a158bd903618\") " pod="openstack/heat-cfnapi-788bcdcb6b-kpzht" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.559828 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/39ad8a82-0a3f-4f21-bf0f-a158bd903618-config-data-custom\") pod \"heat-cfnapi-788bcdcb6b-kpzht\" (UID: \"39ad8a82-0a3f-4f21-bf0f-a158bd903618\") " pod="openstack/heat-cfnapi-788bcdcb6b-kpzht" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.560489 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39ad8a82-0a3f-4f21-bf0f-a158bd903618-config-data\") pod \"heat-cfnapi-788bcdcb6b-kpzht\" (UID: \"39ad8a82-0a3f-4f21-bf0f-a158bd903618\") " pod="openstack/heat-cfnapi-788bcdcb6b-kpzht" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.572568 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2b7vf\" (UniqueName: \"kubernetes.io/projected/39ad8a82-0a3f-4f21-bf0f-a158bd903618-kube-api-access-2b7vf\") pod \"heat-cfnapi-788bcdcb6b-kpzht\" (UID: \"39ad8a82-0a3f-4f21-bf0f-a158bd903618\") " pod="openstack/heat-cfnapi-788bcdcb6b-kpzht" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.577441 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgsgl\" (UniqueName: \"kubernetes.io/projected/dc053d70-b785-4b45-91be-49cbd27952d9-kube-api-access-kgsgl\") pod \"dnsmasq-dns-7756b9d78c-mj9ps\" (UID: 
\"dc053d70-b785-4b45-91be-49cbd27952d9\") " pod="openstack/dnsmasq-dns-7756b9d78c-mj9ps" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.649875 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c99fb0c4-9b67-42b9-87d6-13ae72903740-config-data-custom\") pod \"heat-api-5577b84758-k9tb2\" (UID: \"c99fb0c4-9b67-42b9-87d6-13ae72903740\") " pod="openstack/heat-api-5577b84758-k9tb2" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.649975 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c99fb0c4-9b67-42b9-87d6-13ae72903740-config-data\") pod \"heat-api-5577b84758-k9tb2\" (UID: \"c99fb0c4-9b67-42b9-87d6-13ae72903740\") " pod="openstack/heat-api-5577b84758-k9tb2" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.651285 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdvrk\" (UniqueName: \"kubernetes.io/projected/c99fb0c4-9b67-42b9-87d6-13ae72903740-kube-api-access-zdvrk\") pod \"heat-api-5577b84758-k9tb2\" (UID: \"c99fb0c4-9b67-42b9-87d6-13ae72903740\") " pod="openstack/heat-api-5577b84758-k9tb2" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.651369 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c99fb0c4-9b67-42b9-87d6-13ae72903740-combined-ca-bundle\") pod \"heat-api-5577b84758-k9tb2\" (UID: \"c99fb0c4-9b67-42b9-87d6-13ae72903740\") " pod="openstack/heat-api-5577b84758-k9tb2" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.678327 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"41881b6c-dcfb-4d67-ad0c-f0e003837c8e","Type":"ContainerStarted","Data":"91642be302f5531e66f9c1e5ce36663bb8a9a07641f46cc0e66d481153bc4238"} Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.678387 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41881b6c-dcfb-4d67-ad0c-f0e003837c8e","Type":"ContainerStarted","Data":"2170701a28be7b605cdcde1013dd35b1fa0ea062119d4370a269937c5c0b27ec"} Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.727744 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-788bcdcb6b-kpzht" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.744281 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.744583 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0" containerName="glance-log" containerID="cri-o://66a083d4a17692f828c222f667901a8c63cbc10353a50fcabcdc1b39b5aadae0" gracePeriod=30 Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.745045 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0" containerName="glance-httpd" containerID="cri-o://ee9ecef0b47f7d54e0dbe04ffcfe752ccb4a3af5961fd3824353e635925dcff3" gracePeriod=30 Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.757202 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c99fb0c4-9b67-42b9-87d6-13ae72903740-config-data-custom\") pod \"heat-api-5577b84758-k9tb2\" (UID: \"c99fb0c4-9b67-42b9-87d6-13ae72903740\") " pod="openstack/heat-api-5577b84758-k9tb2" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.757302 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c99fb0c4-9b67-42b9-87d6-13ae72903740-config-data\") pod \"heat-api-5577b84758-k9tb2\" (UID: \"c99fb0c4-9b67-42b9-87d6-13ae72903740\") " pod="openstack/heat-api-5577b84758-k9tb2" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.757371 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdvrk\" (UniqueName: \"kubernetes.io/projected/c99fb0c4-9b67-42b9-87d6-13ae72903740-kube-api-access-zdvrk\") pod \"heat-api-5577b84758-k9tb2\" (UID: \"c99fb0c4-9b67-42b9-87d6-13ae72903740\") " pod="openstack/heat-api-5577b84758-k9tb2" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.757425 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c99fb0c4-9b67-42b9-87d6-13ae72903740-combined-ca-bundle\") pod \"heat-api-5577b84758-k9tb2\" (UID: \"c99fb0c4-9b67-42b9-87d6-13ae72903740\") " pod="openstack/heat-api-5577b84758-k9tb2" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.757796 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-mj9ps" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.782107 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c99fb0c4-9b67-42b9-87d6-13ae72903740-combined-ca-bundle\") pod \"heat-api-5577b84758-k9tb2\" (UID: \"c99fb0c4-9b67-42b9-87d6-13ae72903740\") " pod="openstack/heat-api-5577b84758-k9tb2" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.784264 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c99fb0c4-9b67-42b9-87d6-13ae72903740-config-data\") pod \"heat-api-5577b84758-k9tb2\" (UID: \"c99fb0c4-9b67-42b9-87d6-13ae72903740\") " pod="openstack/heat-api-5577b84758-k9tb2" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.793425 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdvrk\" (UniqueName: \"kubernetes.io/projected/c99fb0c4-9b67-42b9-87d6-13ae72903740-kube-api-access-zdvrk\") pod \"heat-api-5577b84758-k9tb2\" (UID: \"c99fb0c4-9b67-42b9-87d6-13ae72903740\") " pod="openstack/heat-api-5577b84758-k9tb2" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.825235 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c99fb0c4-9b67-42b9-87d6-13ae72903740-config-data-custom\") pod \"heat-api-5577b84758-k9tb2\" (UID: \"c99fb0c4-9b67-42b9-87d6-13ae72903740\") " pod="openstack/heat-api-5577b84758-k9tb2" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.840034 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-5577b84758-k9tb2" Dec 08 09:24:07 crc kubenswrapper[4776]: I1208 09:24:07.916347 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 08 09:24:08 crc kubenswrapper[4776]: I1208 09:24:08.121576 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-55647645f8-9xvpq"] Dec 08 09:24:08 crc kubenswrapper[4776]: I1208 09:24:08.388408 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f88f901-e706-4892-aa2a-48a97c28a699" path="/var/lib/kubelet/pods/8f88f901-e706-4892-aa2a-48a97c28a699/volumes" Dec 08 09:24:08 crc kubenswrapper[4776]: I1208 09:24:08.390082 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-788bcdcb6b-kpzht"] Dec 08 09:24:08 crc kubenswrapper[4776]: I1208 09:24:08.594104 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-mj9ps"] Dec 08 09:24:08 crc kubenswrapper[4776]: I1208 09:24:08.727701 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5577b84758-k9tb2"] Dec 08 09:24:08 crc kubenswrapper[4776]: I1208 09:24:08.748938 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-mj9ps" event={"ID":"dc053d70-b785-4b45-91be-49cbd27952d9","Type":"ContainerStarted","Data":"8939b4776c486ec837a2d88d9d011a3784a3ebce21c71b3c742e0a23388732a5"} Dec 08 09:24:08 crc kubenswrapper[4776]: I1208 09:24:08.754638 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-55647645f8-9xvpq" event={"ID":"0cea35b3-0412-490a-9d71-2c5e10e85c51","Type":"ContainerStarted","Data":"25231c45a67c8f3c3d317e3772e895cd938a660b465520775f6d63b54d0600fd"} Dec 08 09:24:08 crc kubenswrapper[4776]: I1208 09:24:08.754744 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-55647645f8-9xvpq" 
event={"ID":"0cea35b3-0412-490a-9d71-2c5e10e85c51","Type":"ContainerStarted","Data":"911edfa85a60dbb496cc30806f6261f958fa9aea33c46967c8deb243782a981c"} Dec 08 09:24:08 crc kubenswrapper[4776]: I1208 09:24:08.756001 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-55647645f8-9xvpq" Dec 08 09:24:08 crc kubenswrapper[4776]: I1208 09:24:08.777045 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-788bcdcb6b-kpzht" event={"ID":"39ad8a82-0a3f-4f21-bf0f-a158bd903618","Type":"ContainerStarted","Data":"5eaae633aa8440750387eadc0f0b2be7cb8e80657502c2c036b055bea45ab711"} Dec 08 09:24:08 crc kubenswrapper[4776]: I1208 09:24:08.785353 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-55647645f8-9xvpq" podStartSLOduration=1.7853097789999999 podStartE2EDuration="1.785309779s" podCreationTimestamp="2025-12-08 09:24:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:24:08.78497238 +0000 UTC m=+1525.048197412" watchObservedRunningTime="2025-12-08 09:24:08.785309779 +0000 UTC m=+1525.048534801" Dec 08 09:24:08 crc kubenswrapper[4776]: I1208 09:24:08.785783 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dcb1d701-bc05-4d4b-8794-ebc4af6da8ba","Type":"ContainerStarted","Data":"c9054c46390c0deacc47790932c3097f18c076d39ffd8266f819b7b145d3497a"} Dec 08 09:24:08 crc kubenswrapper[4776]: I1208 09:24:08.823533 4776 generic.go:334] "Generic (PLEG): container finished" podID="8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0" containerID="66a083d4a17692f828c222f667901a8c63cbc10353a50fcabcdc1b39b5aadae0" exitCode=143 Dec 08 09:24:08 crc kubenswrapper[4776]: I1208 09:24:08.823606 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0","Type":"ContainerDied","Data":"66a083d4a17692f828c222f667901a8c63cbc10353a50fcabcdc1b39b5aadae0"} Dec 08 09:24:09 crc kubenswrapper[4776]: I1208 09:24:09.856426 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5577b84758-k9tb2" event={"ID":"c99fb0c4-9b67-42b9-87d6-13ae72903740","Type":"ContainerStarted","Data":"d8b2ab21242d1c0c73e5111cbcf771aaa60ed8e73381314b307ef97da9c746c5"} Dec 08 09:24:09 crc kubenswrapper[4776]: I1208 09:24:09.866461 4776 generic.go:334] "Generic (PLEG): container finished" podID="dc053d70-b785-4b45-91be-49cbd27952d9" containerID="8682930b1b479cacb27372ed25dc1d31db5c639561dc0106988ecdc8f469a90d" exitCode=0 Dec 08 09:24:09 crc kubenswrapper[4776]: I1208 09:24:09.867074 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-mj9ps" event={"ID":"dc053d70-b785-4b45-91be-49cbd27952d9","Type":"ContainerDied","Data":"8682930b1b479cacb27372ed25dc1d31db5c639561dc0106988ecdc8f469a90d"} Dec 08 09:24:09 crc kubenswrapper[4776]: I1208 09:24:09.891383 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dcb1d701-bc05-4d4b-8794-ebc4af6da8ba","Type":"ContainerStarted","Data":"2e42ac178200c2545e908d1f5d59ef889387c6e2c04f236fb6fbd3751550552b"} Dec 08 09:24:09 crc kubenswrapper[4776]: I1208 09:24:09.898587 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="41881b6c-dcfb-4d67-ad0c-f0e003837c8e" containerName="ceilometer-central-agent" containerID="cri-o://ca7646b3e4cd7851105e1b476bc4c75e76c7bad3c112fbae7638b166ccb69971" gracePeriod=30 Dec 08 09:24:09 crc kubenswrapper[4776]: I1208 09:24:09.899126 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41881b6c-dcfb-4d67-ad0c-f0e003837c8e","Type":"ContainerStarted","Data":"bc6b0c69d05e7a2d352f35926981925451c42706a2be9d9182d422c4ef4f2cd5"} Dec 08 09:24:09 
crc kubenswrapper[4776]: I1208 09:24:09.899267 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="41881b6c-dcfb-4d67-ad0c-f0e003837c8e" containerName="proxy-httpd" containerID="cri-o://bc6b0c69d05e7a2d352f35926981925451c42706a2be9d9182d422c4ef4f2cd5" gracePeriod=30 Dec 08 09:24:09 crc kubenswrapper[4776]: I1208 09:24:09.899378 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="41881b6c-dcfb-4d67-ad0c-f0e003837c8e" containerName="sg-core" containerID="cri-o://91642be302f5531e66f9c1e5ce36663bb8a9a07641f46cc0e66d481153bc4238" gracePeriod=30 Dec 08 09:24:09 crc kubenswrapper[4776]: I1208 09:24:09.899535 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="41881b6c-dcfb-4d67-ad0c-f0e003837c8e" containerName="ceilometer-notification-agent" containerID="cri-o://2170701a28be7b605cdcde1013dd35b1fa0ea062119d4370a269937c5c0b27ec" gracePeriod=30 Dec 08 09:24:09 crc kubenswrapper[4776]: I1208 09:24:09.899294 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 08 09:24:09 crc kubenswrapper[4776]: I1208 09:24:09.951051 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.488249031 podStartE2EDuration="11.951032098s" podCreationTimestamp="2025-12-08 09:23:58 +0000 UTC" firstStartedPulling="2025-12-08 09:23:59.386886229 +0000 UTC m=+1515.650111251" lastFinishedPulling="2025-12-08 09:24:08.849669296 +0000 UTC m=+1525.112894318" observedRunningTime="2025-12-08 09:24:09.934708249 +0000 UTC m=+1526.197933271" watchObservedRunningTime="2025-12-08 09:24:09.951032098 +0000 UTC m=+1526.214257120" Dec 08 09:24:10 crc kubenswrapper[4776]: E1208 09:24:10.339367 4776 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41881b6c_dcfb_4d67_ad0c_f0e003837c8e.slice/crio-bc6b0c69d05e7a2d352f35926981925451c42706a2be9d9182d422c4ef4f2cd5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41881b6c_dcfb_4d67_ad0c_f0e003837c8e.slice/crio-2170701a28be7b605cdcde1013dd35b1fa0ea062119d4370a269937c5c0b27ec.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41881b6c_dcfb_4d67_ad0c_f0e003837c8e.slice/crio-conmon-bc6b0c69d05e7a2d352f35926981925451c42706a2be9d9182d422c4ef4f2cd5.scope\": RecentStats: unable to find data in memory cache]" Dec 08 09:24:10 crc kubenswrapper[4776]: I1208 09:24:10.476571 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-nngf6"] Dec 08 09:24:10 crc kubenswrapper[4776]: I1208 09:24:10.479753 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-nngf6" Dec 08 09:24:10 crc kubenswrapper[4776]: I1208 09:24:10.514545 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-nngf6"] Dec 08 09:24:10 crc kubenswrapper[4776]: I1208 09:24:10.591496 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7frr6\" (UniqueName: \"kubernetes.io/projected/5d642614-8b52-4d92-ae93-d281a37339df-kube-api-access-7frr6\") pod \"nova-api-db-create-nngf6\" (UID: \"5d642614-8b52-4d92-ae93-d281a37339df\") " pod="openstack/nova-api-db-create-nngf6" Dec 08 09:24:10 crc kubenswrapper[4776]: I1208 09:24:10.591647 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d642614-8b52-4d92-ae93-d281a37339df-operator-scripts\") pod \"nova-api-db-create-nngf6\" (UID: 
\"5d642614-8b52-4d92-ae93-d281a37339df\") " pod="openstack/nova-api-db-create-nngf6" Dec 08 09:24:10 crc kubenswrapper[4776]: I1208 09:24:10.683611 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-98lsp"] Dec 08 09:24:10 crc kubenswrapper[4776]: I1208 09:24:10.685103 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-98lsp" Dec 08 09:24:10 crc kubenswrapper[4776]: I1208 09:24:10.694710 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d642614-8b52-4d92-ae93-d281a37339df-operator-scripts\") pod \"nova-api-db-create-nngf6\" (UID: \"5d642614-8b52-4d92-ae93-d281a37339df\") " pod="openstack/nova-api-db-create-nngf6" Dec 08 09:24:10 crc kubenswrapper[4776]: I1208 09:24:10.695022 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7frr6\" (UniqueName: \"kubernetes.io/projected/5d642614-8b52-4d92-ae93-d281a37339df-kube-api-access-7frr6\") pod \"nova-api-db-create-nngf6\" (UID: \"5d642614-8b52-4d92-ae93-d281a37339df\") " pod="openstack/nova-api-db-create-nngf6" Dec 08 09:24:10 crc kubenswrapper[4776]: I1208 09:24:10.695518 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d642614-8b52-4d92-ae93-d281a37339df-operator-scripts\") pod \"nova-api-db-create-nngf6\" (UID: \"5d642614-8b52-4d92-ae93-d281a37339df\") " pod="openstack/nova-api-db-create-nngf6" Dec 08 09:24:10 crc kubenswrapper[4776]: I1208 09:24:10.723333 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-98lsp"] Dec 08 09:24:10 crc kubenswrapper[4776]: I1208 09:24:10.740031 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7frr6\" (UniqueName: 
\"kubernetes.io/projected/5d642614-8b52-4d92-ae93-d281a37339df-kube-api-access-7frr6\") pod \"nova-api-db-create-nngf6\" (UID: \"5d642614-8b52-4d92-ae93-d281a37339df\") " pod="openstack/nova-api-db-create-nngf6" Dec 08 09:24:10 crc kubenswrapper[4776]: I1208 09:24:10.798683 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/694530fd-2851-4a16-b642-071a7fca1ec0-operator-scripts\") pod \"nova-cell0-db-create-98lsp\" (UID: \"694530fd-2851-4a16-b642-071a7fca1ec0\") " pod="openstack/nova-cell0-db-create-98lsp" Dec 08 09:24:10 crc kubenswrapper[4776]: I1208 09:24:10.798738 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d5gs\" (UniqueName: \"kubernetes.io/projected/694530fd-2851-4a16-b642-071a7fca1ec0-kube-api-access-5d5gs\") pod \"nova-cell0-db-create-98lsp\" (UID: \"694530fd-2851-4a16-b642-071a7fca1ec0\") " pod="openstack/nova-cell0-db-create-98lsp" Dec 08 09:24:10 crc kubenswrapper[4776]: I1208 09:24:10.806389 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-nngf6" Dec 08 09:24:10 crc kubenswrapper[4776]: I1208 09:24:10.832284 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-1a23-account-create-update-cldmj"] Dec 08 09:24:10 crc kubenswrapper[4776]: I1208 09:24:10.834507 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-1a23-account-create-update-cldmj" Dec 08 09:24:10 crc kubenswrapper[4776]: I1208 09:24:10.841764 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 08 09:24:10 crc kubenswrapper[4776]: I1208 09:24:10.911572 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/694530fd-2851-4a16-b642-071a7fca1ec0-operator-scripts\") pod \"nova-cell0-db-create-98lsp\" (UID: \"694530fd-2851-4a16-b642-071a7fca1ec0\") " pod="openstack/nova-cell0-db-create-98lsp" Dec 08 09:24:10 crc kubenswrapper[4776]: I1208 09:24:10.911622 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5d5gs\" (UniqueName: \"kubernetes.io/projected/694530fd-2851-4a16-b642-071a7fca1ec0-kube-api-access-5d5gs\") pod \"nova-cell0-db-create-98lsp\" (UID: \"694530fd-2851-4a16-b642-071a7fca1ec0\") " pod="openstack/nova-cell0-db-create-98lsp" Dec 08 09:24:10 crc kubenswrapper[4776]: I1208 09:24:10.912718 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/694530fd-2851-4a16-b642-071a7fca1ec0-operator-scripts\") pod \"nova-cell0-db-create-98lsp\" (UID: \"694530fd-2851-4a16-b642-071a7fca1ec0\") " pod="openstack/nova-cell0-db-create-98lsp" Dec 08 09:24:10 crc kubenswrapper[4776]: I1208 09:24:10.965069 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-1a23-account-create-update-cldmj"] Dec 08 09:24:10 crc kubenswrapper[4776]: I1208 09:24:10.983866 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5d5gs\" (UniqueName: \"kubernetes.io/projected/694530fd-2851-4a16-b642-071a7fca1ec0-kube-api-access-5d5gs\") pod \"nova-cell0-db-create-98lsp\" (UID: \"694530fd-2851-4a16-b642-071a7fca1ec0\") " pod="openstack/nova-cell0-db-create-98lsp" Dec 08 09:24:11 
crc kubenswrapper[4776]: I1208 09:24:11.006868 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-98lsp" Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.011662 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-jr6cc"] Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.012958 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9drl4\" (UniqueName: \"kubernetes.io/projected/f6727a0d-5792-4bf3-9d9b-a84ad470ba82-kube-api-access-9drl4\") pod \"nova-api-1a23-account-create-update-cldmj\" (UID: \"f6727a0d-5792-4bf3-9d9b-a84ad470ba82\") " pod="openstack/nova-api-1a23-account-create-update-cldmj" Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.013319 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-jr6cc" Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.013629 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6727a0d-5792-4bf3-9d9b-a84ad470ba82-operator-scripts\") pod \"nova-api-1a23-account-create-update-cldmj\" (UID: \"f6727a0d-5792-4bf3-9d9b-a84ad470ba82\") " pod="openstack/nova-api-1a23-account-create-update-cldmj" Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.051980 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-jr6cc"] Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.055599 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dcb1d701-bc05-4d4b-8794-ebc4af6da8ba","Type":"ContainerStarted","Data":"14177fd47f3feec55290586d18b306c69d43fa374d784aa4045dbbb2c9c48f9e"} Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.057004 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/cinder-api-0" Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.081040 4776 generic.go:334] "Generic (PLEG): container finished" podID="41881b6c-dcfb-4d67-ad0c-f0e003837c8e" containerID="bc6b0c69d05e7a2d352f35926981925451c42706a2be9d9182d422c4ef4f2cd5" exitCode=0 Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.081085 4776 generic.go:334] "Generic (PLEG): container finished" podID="41881b6c-dcfb-4d67-ad0c-f0e003837c8e" containerID="91642be302f5531e66f9c1e5ce36663bb8a9a07641f46cc0e66d481153bc4238" exitCode=2 Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.081095 4776 generic.go:334] "Generic (PLEG): container finished" podID="41881b6c-dcfb-4d67-ad0c-f0e003837c8e" containerID="2170701a28be7b605cdcde1013dd35b1fa0ea062119d4370a269937c5c0b27ec" exitCode=0 Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.084166 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41881b6c-dcfb-4d67-ad0c-f0e003837c8e","Type":"ContainerDied","Data":"bc6b0c69d05e7a2d352f35926981925451c42706a2be9d9182d422c4ef4f2cd5"} Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.084236 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41881b6c-dcfb-4d67-ad0c-f0e003837c8e","Type":"ContainerDied","Data":"91642be302f5531e66f9c1e5ce36663bb8a9a07641f46cc0e66d481153bc4238"} Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.084250 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41881b6c-dcfb-4d67-ad0c-f0e003837c8e","Type":"ContainerDied","Data":"2170701a28be7b605cdcde1013dd35b1fa0ea062119d4370a269937c5c0b27ec"} Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.089239 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-45e9-account-create-update-8lsk4"] Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.091432 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-45e9-account-create-update-8lsk4" Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.093907 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.103318 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-45e9-account-create-update-8lsk4"] Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.116365 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6727a0d-5792-4bf3-9d9b-a84ad470ba82-operator-scripts\") pod \"nova-api-1a23-account-create-update-cldmj\" (UID: \"f6727a0d-5792-4bf3-9d9b-a84ad470ba82\") " pod="openstack/nova-api-1a23-account-create-update-cldmj" Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.116418 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfxxh\" (UniqueName: \"kubernetes.io/projected/349ea677-2d9d-4506-b515-d5b03946ea88-kube-api-access-zfxxh\") pod \"nova-cell1-db-create-jr6cc\" (UID: \"349ea677-2d9d-4506-b515-d5b03946ea88\") " pod="openstack/nova-cell1-db-create-jr6cc" Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.116496 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9drl4\" (UniqueName: \"kubernetes.io/projected/f6727a0d-5792-4bf3-9d9b-a84ad470ba82-kube-api-access-9drl4\") pod \"nova-api-1a23-account-create-update-cldmj\" (UID: \"f6727a0d-5792-4bf3-9d9b-a84ad470ba82\") " pod="openstack/nova-api-1a23-account-create-update-cldmj" Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.116619 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/349ea677-2d9d-4506-b515-d5b03946ea88-operator-scripts\") pod 
\"nova-cell1-db-create-jr6cc\" (UID: \"349ea677-2d9d-4506-b515-d5b03946ea88\") " pod="openstack/nova-cell1-db-create-jr6cc" Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.117828 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6727a0d-5792-4bf3-9d9b-a84ad470ba82-operator-scripts\") pod \"nova-api-1a23-account-create-update-cldmj\" (UID: \"f6727a0d-5792-4bf3-9d9b-a84ad470ba82\") " pod="openstack/nova-api-1a23-account-create-update-cldmj" Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.123377 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.123359352 podStartE2EDuration="5.123359352s" podCreationTimestamp="2025-12-08 09:24:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:24:11.084077938 +0000 UTC m=+1527.347302960" watchObservedRunningTime="2025-12-08 09:24:11.123359352 +0000 UTC m=+1527.386584374" Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.144283 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9drl4\" (UniqueName: \"kubernetes.io/projected/f6727a0d-5792-4bf3-9d9b-a84ad470ba82-kube-api-access-9drl4\") pod \"nova-api-1a23-account-create-update-cldmj\" (UID: \"f6727a0d-5792-4bf3-9d9b-a84ad470ba82\") " pod="openstack/nova-api-1a23-account-create-update-cldmj" Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.196363 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-60a1-account-create-update-6j76b"] Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.197778 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-60a1-account-create-update-6j76b" Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.201255 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.212353 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-60a1-account-create-update-6j76b"] Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.219509 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6923b91-d6e2-4672-be62-2531342086e1-operator-scripts\") pod \"nova-cell0-45e9-account-create-update-8lsk4\" (UID: \"c6923b91-d6e2-4672-be62-2531342086e1\") " pod="openstack/nova-cell0-45e9-account-create-update-8lsk4" Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.219598 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfxxh\" (UniqueName: \"kubernetes.io/projected/349ea677-2d9d-4506-b515-d5b03946ea88-kube-api-access-zfxxh\") pod \"nova-cell1-db-create-jr6cc\" (UID: \"349ea677-2d9d-4506-b515-d5b03946ea88\") " pod="openstack/nova-cell1-db-create-jr6cc" Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.219730 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z25qg\" (UniqueName: \"kubernetes.io/projected/c6923b91-d6e2-4672-be62-2531342086e1-kube-api-access-z25qg\") pod \"nova-cell0-45e9-account-create-update-8lsk4\" (UID: \"c6923b91-d6e2-4672-be62-2531342086e1\") " pod="openstack/nova-cell0-45e9-account-create-update-8lsk4" Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.219865 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/349ea677-2d9d-4506-b515-d5b03946ea88-operator-scripts\") pod 
\"nova-cell1-db-create-jr6cc\" (UID: \"349ea677-2d9d-4506-b515-d5b03946ea88\") " pod="openstack/nova-cell1-db-create-jr6cc" Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.223685 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/349ea677-2d9d-4506-b515-d5b03946ea88-operator-scripts\") pod \"nova-cell1-db-create-jr6cc\" (UID: \"349ea677-2d9d-4506-b515-d5b03946ea88\") " pod="openstack/nova-cell1-db-create-jr6cc" Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.242054 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfxxh\" (UniqueName: \"kubernetes.io/projected/349ea677-2d9d-4506-b515-d5b03946ea88-kube-api-access-zfxxh\") pod \"nova-cell1-db-create-jr6cc\" (UID: \"349ea677-2d9d-4506-b515-d5b03946ea88\") " pod="openstack/nova-cell1-db-create-jr6cc" Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.321803 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5bb00ac-bb46-44ce-b2b1-537573b86f6e-operator-scripts\") pod \"nova-cell1-60a1-account-create-update-6j76b\" (UID: \"e5bb00ac-bb46-44ce-b2b1-537573b86f6e\") " pod="openstack/nova-cell1-60a1-account-create-update-6j76b" Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.321864 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzqlt\" (UniqueName: \"kubernetes.io/projected/e5bb00ac-bb46-44ce-b2b1-537573b86f6e-kube-api-access-bzqlt\") pod \"nova-cell1-60a1-account-create-update-6j76b\" (UID: \"e5bb00ac-bb46-44ce-b2b1-537573b86f6e\") " pod="openstack/nova-cell1-60a1-account-create-update-6j76b" Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.322039 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/c6923b91-d6e2-4672-be62-2531342086e1-operator-scripts\") pod \"nova-cell0-45e9-account-create-update-8lsk4\" (UID: \"c6923b91-d6e2-4672-be62-2531342086e1\") " pod="openstack/nova-cell0-45e9-account-create-update-8lsk4" Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.322139 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z25qg\" (UniqueName: \"kubernetes.io/projected/c6923b91-d6e2-4672-be62-2531342086e1-kube-api-access-z25qg\") pod \"nova-cell0-45e9-account-create-update-8lsk4\" (UID: \"c6923b91-d6e2-4672-be62-2531342086e1\") " pod="openstack/nova-cell0-45e9-account-create-update-8lsk4" Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.323188 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6923b91-d6e2-4672-be62-2531342086e1-operator-scripts\") pod \"nova-cell0-45e9-account-create-update-8lsk4\" (UID: \"c6923b91-d6e2-4672-be62-2531342086e1\") " pod="openstack/nova-cell0-45e9-account-create-update-8lsk4" Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.339229 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z25qg\" (UniqueName: \"kubernetes.io/projected/c6923b91-d6e2-4672-be62-2531342086e1-kube-api-access-z25qg\") pod \"nova-cell0-45e9-account-create-update-8lsk4\" (UID: \"c6923b91-d6e2-4672-be62-2531342086e1\") " pod="openstack/nova-cell0-45e9-account-create-update-8lsk4" Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.399147 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.399482 4776 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.424354 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5bb00ac-bb46-44ce-b2b1-537573b86f6e-operator-scripts\") pod \"nova-cell1-60a1-account-create-update-6j76b\" (UID: \"e5bb00ac-bb46-44ce-b2b1-537573b86f6e\") " pod="openstack/nova-cell1-60a1-account-create-update-6j76b" Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.424396 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzqlt\" (UniqueName: \"kubernetes.io/projected/e5bb00ac-bb46-44ce-b2b1-537573b86f6e-kube-api-access-bzqlt\") pod \"nova-cell1-60a1-account-create-update-6j76b\" (UID: \"e5bb00ac-bb46-44ce-b2b1-537573b86f6e\") " pod="openstack/nova-cell1-60a1-account-create-update-6j76b" Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.425063 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5bb00ac-bb46-44ce-b2b1-537573b86f6e-operator-scripts\") pod \"nova-cell1-60a1-account-create-update-6j76b\" (UID: \"e5bb00ac-bb46-44ce-b2b1-537573b86f6e\") " pod="openstack/nova-cell1-60a1-account-create-update-6j76b" Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.427286 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-1a23-account-create-update-cldmj" Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.440285 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzqlt\" (UniqueName: \"kubernetes.io/projected/e5bb00ac-bb46-44ce-b2b1-537573b86f6e-kube-api-access-bzqlt\") pod \"nova-cell1-60a1-account-create-update-6j76b\" (UID: \"e5bb00ac-bb46-44ce-b2b1-537573b86f6e\") " pod="openstack/nova-cell1-60a1-account-create-update-6j76b" Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.449653 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-jr6cc" Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.476813 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-45e9-account-create-update-8lsk4" Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.528724 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-60a1-account-create-update-6j76b" Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.552407 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.552644 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="cd19b615-0c04-4fd9-968c-dceb17256b34" containerName="glance-log" containerID="cri-o://837a93d35bb23bc5d540f3124197ef476a37dccdb8f26d3ac00a99933f14c4d7" gracePeriod=30 Dec 08 09:24:11 crc kubenswrapper[4776]: I1208 09:24:11.552786 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="cd19b615-0c04-4fd9-968c-dceb17256b34" containerName="glance-httpd" containerID="cri-o://81217682482f4d408f75b1fcb5a2c470b36a8dbb841bba1d1db1e01277eabbc3" gracePeriod=30 Dec 08 09:24:12 crc kubenswrapper[4776]: I1208 09:24:12.143334 4776 generic.go:334] "Generic (PLEG): container finished" podID="8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0" containerID="ee9ecef0b47f7d54e0dbe04ffcfe752ccb4a3af5961fd3824353e635925dcff3" exitCode=0 Dec 08 09:24:12 crc kubenswrapper[4776]: I1208 09:24:12.147186 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0","Type":"ContainerDied","Data":"ee9ecef0b47f7d54e0dbe04ffcfe752ccb4a3af5961fd3824353e635925dcff3"} Dec 08 09:24:12 crc kubenswrapper[4776]: I1208 09:24:12.173277 4776 generic.go:334] "Generic (PLEG): container finished" podID="cd19b615-0c04-4fd9-968c-dceb17256b34" containerID="837a93d35bb23bc5d540f3124197ef476a37dccdb8f26d3ac00a99933f14c4d7" exitCode=143 Dec 08 09:24:12 crc kubenswrapper[4776]: I1208 09:24:12.174437 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"cd19b615-0c04-4fd9-968c-dceb17256b34","Type":"ContainerDied","Data":"837a93d35bb23bc5d540f3124197ef476a37dccdb8f26d3ac00a99933f14c4d7"} Dec 08 09:24:12 crc kubenswrapper[4776]: I1208 09:24:12.432895 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 08 09:24:12 crc kubenswrapper[4776]: I1208 09:24:12.573058 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0-combined-ca-bundle\") pod \"8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0\" (UID: \"8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0\") " Dec 08 09:24:12 crc kubenswrapper[4776]: I1208 09:24:12.573472 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0-public-tls-certs\") pod \"8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0\" (UID: \"8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0\") " Dec 08 09:24:12 crc kubenswrapper[4776]: I1208 09:24:12.573514 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0-config-data\") pod \"8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0\" (UID: \"8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0\") " Dec 08 09:24:12 crc kubenswrapper[4776]: I1208 09:24:12.573563 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0-scripts\") pod \"8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0\" (UID: \"8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0\") " Dec 08 09:24:12 crc kubenswrapper[4776]: I1208 09:24:12.573614 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod 
\"8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0\" (UID: \"8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0\") " Dec 08 09:24:12 crc kubenswrapper[4776]: I1208 09:24:12.574092 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0-logs\") pod \"8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0\" (UID: \"8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0\") " Dec 08 09:24:12 crc kubenswrapper[4776]: I1208 09:24:12.574249 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0-httpd-run\") pod \"8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0\" (UID: \"8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0\") " Dec 08 09:24:12 crc kubenswrapper[4776]: I1208 09:24:12.574352 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmzxr\" (UniqueName: \"kubernetes.io/projected/8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0-kube-api-access-qmzxr\") pod \"8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0\" (UID: \"8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0\") " Dec 08 09:24:12 crc kubenswrapper[4776]: I1208 09:24:12.574919 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0" (UID: "8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:24:12 crc kubenswrapper[4776]: I1208 09:24:12.575385 4776 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:12 crc kubenswrapper[4776]: I1208 09:24:12.575518 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0-logs" (OuterVolumeSpecName: "logs") pod "8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0" (UID: "8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:24:12 crc kubenswrapper[4776]: I1208 09:24:12.584702 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0" (UID: "8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 08 09:24:12 crc kubenswrapper[4776]: I1208 09:24:12.585682 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0-scripts" (OuterVolumeSpecName: "scripts") pod "8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0" (UID: "8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:24:12 crc kubenswrapper[4776]: I1208 09:24:12.591618 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0-kube-api-access-qmzxr" (OuterVolumeSpecName: "kube-api-access-qmzxr") pod "8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0" (UID: "8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0"). InnerVolumeSpecName "kube-api-access-qmzxr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:24:12 crc kubenswrapper[4776]: I1208 09:24:12.649424 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0" (UID: "8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:24:12 crc kubenswrapper[4776]: I1208 09:24:12.682723 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:12 crc kubenswrapper[4776]: I1208 09:24:12.682764 4776 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 08 09:24:12 crc kubenswrapper[4776]: I1208 09:24:12.682774 4776 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0-logs\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:12 crc kubenswrapper[4776]: I1208 09:24:12.682782 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmzxr\" (UniqueName: \"kubernetes.io/projected/8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0-kube-api-access-qmzxr\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:12 crc kubenswrapper[4776]: I1208 09:24:12.682792 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:12 crc kubenswrapper[4776]: I1208 09:24:12.702384 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0" (UID: "8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:24:12 crc kubenswrapper[4776]: I1208 09:24:12.709022 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-nngf6"] Dec 08 09:24:12 crc kubenswrapper[4776]: I1208 09:24:12.719565 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0-config-data" (OuterVolumeSpecName: "config-data") pod "8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0" (UID: "8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:24:12 crc kubenswrapper[4776]: I1208 09:24:12.745417 4776 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 08 09:24:12 crc kubenswrapper[4776]: I1208 09:24:12.784643 4776 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:12 crc kubenswrapper[4776]: I1208 09:24:12.784999 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:12 crc kubenswrapper[4776]: I1208 09:24:12.785013 4776 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.148845 4776 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/nova-cell0-db-create-98lsp"] Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.175255 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-45e9-account-create-update-8lsk4"] Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.200925 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-jr6cc"] Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.209335 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-45e9-account-create-update-8lsk4" event={"ID":"c6923b91-d6e2-4672-be62-2531342086e1","Type":"ContainerStarted","Data":"1b4a5280b1f76247fa2664d0ec72fb676cebc6ed901a517619c5cd8278a37c2a"} Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.213476 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-98lsp" event={"ID":"694530fd-2851-4a16-b642-071a7fca1ec0","Type":"ContainerStarted","Data":"7e67b0b1a10b82a775462228ca261b8c0130aa238a27370c3d6a9c427ad7bf2e"} Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.221033 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-60a1-account-create-update-6j76b"] Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.222725 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-mj9ps" event={"ID":"dc053d70-b785-4b45-91be-49cbd27952d9","Type":"ContainerStarted","Data":"4907dcf9f2faa7f2cc873138ae497143d0e9422dae53538c76829fc51214ebef"} Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.229967 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7756b9d78c-mj9ps" Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.240566 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-nngf6" 
event={"ID":"5d642614-8b52-4d92-ae93-d281a37339df","Type":"ContainerStarted","Data":"84a61abbbed46adc7d811a570f5178ad218c9fa9fd7845d2b2933812c18fcde8"} Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.240610 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-nngf6" event={"ID":"5d642614-8b52-4d92-ae93-d281a37339df","Type":"ContainerStarted","Data":"b0481a7643f101fdf8e0c88f199857bf23ea5d8d262738186ba07eb1b2e42775"} Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.243451 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-788bcdcb6b-kpzht" event={"ID":"39ad8a82-0a3f-4f21-bf0f-a158bd903618","Type":"ContainerStarted","Data":"d1a34192616e1e482d67bb8764abc9f5a5fd6e2697c8c86df334f238a15a4bd4"} Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.243969 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-788bcdcb6b-kpzht" Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.263050 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.263318 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0","Type":"ContainerDied","Data":"2cb3fde0c23ccea5ad93577a7da59eddd9809012304cc3df660f7425236846c3"} Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.263366 4776 scope.go:117] "RemoveContainer" containerID="ee9ecef0b47f7d54e0dbe04ffcfe752ccb4a3af5961fd3824353e635925dcff3" Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.271508 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5577b84758-k9tb2" event={"ID":"c99fb0c4-9b67-42b9-87d6-13ae72903740","Type":"ContainerStarted","Data":"44910220aaf6fd61dafa8f059ae0cf4d984506ec8cfd361a11dc0b75451b76a1"} Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.272474 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-5577b84758-k9tb2" Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.275558 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-1a23-account-create-update-cldmj"] Dec 08 09:24:13 crc kubenswrapper[4776]: W1208 09:24:13.291216 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6727a0d_5792_4bf3_9d9b_a84ad470ba82.slice/crio-bc7805b99b906b10b65c01174402cd504a2e4ca4e36bf23884b239f80c134d08 WatchSource:0}: Error finding container bc7805b99b906b10b65c01174402cd504a2e4ca4e36bf23884b239f80c134d08: Status 404 returned error can't find the container with id bc7805b99b906b10b65c01174402cd504a2e4ca4e36bf23884b239f80c134d08 Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.324399 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7756b9d78c-mj9ps" podStartSLOduration=6.324376648 
podStartE2EDuration="6.324376648s" podCreationTimestamp="2025-12-08 09:24:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:24:13.253769294 +0000 UTC m=+1529.516994316" watchObservedRunningTime="2025-12-08 09:24:13.324376648 +0000 UTC m=+1529.587601670" Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.326719 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-nngf6" podStartSLOduration=3.326709741 podStartE2EDuration="3.326709741s" podCreationTimestamp="2025-12-08 09:24:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:24:13.274548401 +0000 UTC m=+1529.537773423" watchObservedRunningTime="2025-12-08 09:24:13.326709741 +0000 UTC m=+1529.589934773" Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.352392 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-788bcdcb6b-kpzht" podStartSLOduration=2.923664454 podStartE2EDuration="6.35237256s" podCreationTimestamp="2025-12-08 09:24:07 +0000 UTC" firstStartedPulling="2025-12-08 09:24:08.395288685 +0000 UTC m=+1524.658513707" lastFinishedPulling="2025-12-08 09:24:11.823996791 +0000 UTC m=+1528.087221813" observedRunningTime="2025-12-08 09:24:13.299892411 +0000 UTC m=+1529.563117433" watchObservedRunningTime="2025-12-08 09:24:13.35237256 +0000 UTC m=+1529.615597582" Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.366114 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-5577b84758-k9tb2" podStartSLOduration=3.291143404 podStartE2EDuration="6.366092818s" podCreationTimestamp="2025-12-08 09:24:07 +0000 UTC" firstStartedPulling="2025-12-08 09:24:08.752661044 +0000 UTC m=+1525.015886066" lastFinishedPulling="2025-12-08 09:24:11.827610458 +0000 UTC 
m=+1528.090835480" observedRunningTime="2025-12-08 09:24:13.316159238 +0000 UTC m=+1529.579384260" watchObservedRunningTime="2025-12-08 09:24:13.366092818 +0000 UTC m=+1529.629317830" Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.410339 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.420479 4776 scope.go:117] "RemoveContainer" containerID="66a083d4a17692f828c222f667901a8c63cbc10353a50fcabcdc1b39b5aadae0" Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.440463 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.472222 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 08 09:24:13 crc kubenswrapper[4776]: E1208 09:24:13.472899 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0" containerName="glance-httpd" Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.472920 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0" containerName="glance-httpd" Dec 08 09:24:13 crc kubenswrapper[4776]: E1208 09:24:13.472943 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0" containerName="glance-log" Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.472951 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0" containerName="glance-log" Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.473222 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0" containerName="glance-log" Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.473245 4776 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0" containerName="glance-httpd" Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.474513 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.478260 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.478468 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.509096 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.614462 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f9758b1-4ae1-47ae-8a45-14b0df4c8632-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8f9758b1-4ae1-47ae-8a45-14b0df4c8632\") " pod="openstack/glance-default-external-api-0" Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.614601 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f9758b1-4ae1-47ae-8a45-14b0df4c8632-logs\") pod \"glance-default-external-api-0\" (UID: \"8f9758b1-4ae1-47ae-8a45-14b0df4c8632\") " pod="openstack/glance-default-external-api-0" Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.614642 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f9758b1-4ae1-47ae-8a45-14b0df4c8632-scripts\") pod \"glance-default-external-api-0\" (UID: \"8f9758b1-4ae1-47ae-8a45-14b0df4c8632\") " pod="openstack/glance-default-external-api-0" Dec 08 09:24:13 crc 
kubenswrapper[4776]: I1208 09:24:13.614988 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"8f9758b1-4ae1-47ae-8a45-14b0df4c8632\") " pod="openstack/glance-default-external-api-0" Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.615043 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8f9758b1-4ae1-47ae-8a45-14b0df4c8632-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8f9758b1-4ae1-47ae-8a45-14b0df4c8632\") " pod="openstack/glance-default-external-api-0" Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.615451 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f9758b1-4ae1-47ae-8a45-14b0df4c8632-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8f9758b1-4ae1-47ae-8a45-14b0df4c8632\") " pod="openstack/glance-default-external-api-0" Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.615675 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f9758b1-4ae1-47ae-8a45-14b0df4c8632-config-data\") pod \"glance-default-external-api-0\" (UID: \"8f9758b1-4ae1-47ae-8a45-14b0df4c8632\") " pod="openstack/glance-default-external-api-0" Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.616724 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksxx8\" (UniqueName: \"kubernetes.io/projected/8f9758b1-4ae1-47ae-8a45-14b0df4c8632-kube-api-access-ksxx8\") pod \"glance-default-external-api-0\" (UID: \"8f9758b1-4ae1-47ae-8a45-14b0df4c8632\") " 
pod="openstack/glance-default-external-api-0" Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.719787 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f9758b1-4ae1-47ae-8a45-14b0df4c8632-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8f9758b1-4ae1-47ae-8a45-14b0df4c8632\") " pod="openstack/glance-default-external-api-0" Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.719848 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f9758b1-4ae1-47ae-8a45-14b0df4c8632-logs\") pod \"glance-default-external-api-0\" (UID: \"8f9758b1-4ae1-47ae-8a45-14b0df4c8632\") " pod="openstack/glance-default-external-api-0" Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.719879 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f9758b1-4ae1-47ae-8a45-14b0df4c8632-scripts\") pod \"glance-default-external-api-0\" (UID: \"8f9758b1-4ae1-47ae-8a45-14b0df4c8632\") " pod="openstack/glance-default-external-api-0" Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.719904 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"8f9758b1-4ae1-47ae-8a45-14b0df4c8632\") " pod="openstack/glance-default-external-api-0" Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.719934 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8f9758b1-4ae1-47ae-8a45-14b0df4c8632-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8f9758b1-4ae1-47ae-8a45-14b0df4c8632\") " pod="openstack/glance-default-external-api-0" Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.720336 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f9758b1-4ae1-47ae-8a45-14b0df4c8632-logs\") pod \"glance-default-external-api-0\" (UID: \"8f9758b1-4ae1-47ae-8a45-14b0df4c8632\") " pod="openstack/glance-default-external-api-0" Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.720355 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"8f9758b1-4ae1-47ae-8a45-14b0df4c8632\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.720516 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8f9758b1-4ae1-47ae-8a45-14b0df4c8632-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8f9758b1-4ae1-47ae-8a45-14b0df4c8632\") " pod="openstack/glance-default-external-api-0" Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.720539 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f9758b1-4ae1-47ae-8a45-14b0df4c8632-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8f9758b1-4ae1-47ae-8a45-14b0df4c8632\") " pod="openstack/glance-default-external-api-0" Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.720565 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f9758b1-4ae1-47ae-8a45-14b0df4c8632-config-data\") pod \"glance-default-external-api-0\" (UID: \"8f9758b1-4ae1-47ae-8a45-14b0df4c8632\") " pod="openstack/glance-default-external-api-0" Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.720627 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ksxx8\" (UniqueName: \"kubernetes.io/projected/8f9758b1-4ae1-47ae-8a45-14b0df4c8632-kube-api-access-ksxx8\") pod \"glance-default-external-api-0\" (UID: \"8f9758b1-4ae1-47ae-8a45-14b0df4c8632\") " pod="openstack/glance-default-external-api-0" Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.726044 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f9758b1-4ae1-47ae-8a45-14b0df4c8632-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8f9758b1-4ae1-47ae-8a45-14b0df4c8632\") " pod="openstack/glance-default-external-api-0" Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.727083 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f9758b1-4ae1-47ae-8a45-14b0df4c8632-config-data\") pod \"glance-default-external-api-0\" (UID: \"8f9758b1-4ae1-47ae-8a45-14b0df4c8632\") " pod="openstack/glance-default-external-api-0" Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.729719 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f9758b1-4ae1-47ae-8a45-14b0df4c8632-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8f9758b1-4ae1-47ae-8a45-14b0df4c8632\") " pod="openstack/glance-default-external-api-0" Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.734300 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f9758b1-4ae1-47ae-8a45-14b0df4c8632-scripts\") pod \"glance-default-external-api-0\" (UID: \"8f9758b1-4ae1-47ae-8a45-14b0df4c8632\") " pod="openstack/glance-default-external-api-0" Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.737142 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksxx8\" (UniqueName: 
\"kubernetes.io/projected/8f9758b1-4ae1-47ae-8a45-14b0df4c8632-kube-api-access-ksxx8\") pod \"glance-default-external-api-0\" (UID: \"8f9758b1-4ae1-47ae-8a45-14b0df4c8632\") " pod="openstack/glance-default-external-api-0" Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.783527 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"8f9758b1-4ae1-47ae-8a45-14b0df4c8632\") " pod="openstack/glance-default-external-api-0" Dec 08 09:24:13 crc kubenswrapper[4776]: I1208 09:24:13.870792 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 08 09:24:14 crc kubenswrapper[4776]: I1208 09:24:14.305826 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-60a1-account-create-update-6j76b" event={"ID":"e5bb00ac-bb46-44ce-b2b1-537573b86f6e","Type":"ContainerStarted","Data":"84333277d15abd7eb0d265bb97fe6f0126f6e880dcaf9f91bed4b0c84924c642"} Dec 08 09:24:14 crc kubenswrapper[4776]: I1208 09:24:14.307214 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-60a1-account-create-update-6j76b" event={"ID":"e5bb00ac-bb46-44ce-b2b1-537573b86f6e","Type":"ContainerStarted","Data":"d2f6bdf26ba9f32895fa9531e1b26e63d9a34ccb37d8b6c141adb09d65093023"} Dec 08 09:24:14 crc kubenswrapper[4776]: I1208 09:24:14.316714 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-45e9-account-create-update-8lsk4" event={"ID":"c6923b91-d6e2-4672-be62-2531342086e1","Type":"ContainerStarted","Data":"73991a1a2851bba40093ac2a4e54caa630682b8f3818c2abd7fd856925f66b75"} Dec 08 09:24:14 crc kubenswrapper[4776]: I1208 09:24:14.327844 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-98lsp" 
event={"ID":"694530fd-2851-4a16-b642-071a7fca1ec0","Type":"ContainerStarted","Data":"0f2480a5437008e2df0d4f7b21b8d4db2fc224e3fe12a6cd0d32952ed3dd83b0"} Dec 08 09:24:14 crc kubenswrapper[4776]: I1208 09:24:14.328717 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-60a1-account-create-update-6j76b" podStartSLOduration=3.328699555 podStartE2EDuration="3.328699555s" podCreationTimestamp="2025-12-08 09:24:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:24:14.327157364 +0000 UTC m=+1530.590382386" watchObservedRunningTime="2025-12-08 09:24:14.328699555 +0000 UTC m=+1530.591924577" Dec 08 09:24:14 crc kubenswrapper[4776]: I1208 09:24:14.334561 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1a23-account-create-update-cldmj" event={"ID":"f6727a0d-5792-4bf3-9d9b-a84ad470ba82","Type":"ContainerStarted","Data":"169847b89dab5b9ed6111372c125a1d8e045c63c643ef8c2308b0723a3d90f5b"} Dec 08 09:24:14 crc kubenswrapper[4776]: I1208 09:24:14.334604 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1a23-account-create-update-cldmj" event={"ID":"f6727a0d-5792-4bf3-9d9b-a84ad470ba82","Type":"ContainerStarted","Data":"bc7805b99b906b10b65c01174402cd504a2e4ca4e36bf23884b239f80c134d08"} Dec 08 09:24:14 crc kubenswrapper[4776]: I1208 09:24:14.352728 4776 generic.go:334] "Generic (PLEG): container finished" podID="5d642614-8b52-4d92-ae93-d281a37339df" containerID="84a61abbbed46adc7d811a570f5178ad218c9fa9fd7845d2b2933812c18fcde8" exitCode=0 Dec 08 09:24:14 crc kubenswrapper[4776]: I1208 09:24:14.378213 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0" path="/var/lib/kubelet/pods/8c7f76a4-e7c3-4534-ad0a-69ea872fa9d0/volumes" Dec 08 09:24:14 crc kubenswrapper[4776]: I1208 09:24:14.393934 4776 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-api-db-create-nngf6" event={"ID":"5d642614-8b52-4d92-ae93-d281a37339df","Type":"ContainerDied","Data":"84a61abbbed46adc7d811a570f5178ad218c9fa9fd7845d2b2933812c18fcde8"} Dec 08 09:24:14 crc kubenswrapper[4776]: I1208 09:24:14.394067 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-jr6cc" event={"ID":"349ea677-2d9d-4506-b515-d5b03946ea88","Type":"ContainerStarted","Data":"cc0c566bd52ec36e88d85048dbc79441de63453e7cb604cd8b86a985de6b084e"} Dec 08 09:24:14 crc kubenswrapper[4776]: I1208 09:24:14.394146 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-jr6cc" event={"ID":"349ea677-2d9d-4506-b515-d5b03946ea88","Type":"ContainerStarted","Data":"911e44307f584abfcb4ad0f9eed5683fc6431dd3ea00536cd41780e9bc73d675"} Dec 08 09:24:14 crc kubenswrapper[4776]: I1208 09:24:14.396807 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-45e9-account-create-update-8lsk4" podStartSLOduration=4.396795973 podStartE2EDuration="4.396795973s" podCreationTimestamp="2025-12-08 09:24:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:24:14.339279439 +0000 UTC m=+1530.602504461" watchObservedRunningTime="2025-12-08 09:24:14.396795973 +0000 UTC m=+1530.660020995" Dec 08 09:24:14 crc kubenswrapper[4776]: I1208 09:24:14.427572 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-98lsp" podStartSLOduration=4.427549558 podStartE2EDuration="4.427549558s" podCreationTimestamp="2025-12-08 09:24:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:24:14.35757124 +0000 UTC m=+1530.620796262" watchObservedRunningTime="2025-12-08 09:24:14.427549558 +0000 UTC m=+1530.690774580" Dec 08 
09:24:14 crc kubenswrapper[4776]: I1208 09:24:14.450640 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-1a23-account-create-update-cldmj" podStartSLOduration=4.450619427 podStartE2EDuration="4.450619427s" podCreationTimestamp="2025-12-08 09:24:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:24:14.372156611 +0000 UTC m=+1530.635381633" watchObservedRunningTime="2025-12-08 09:24:14.450619427 +0000 UTC m=+1530.713844449" Dec 08 09:24:14 crc kubenswrapper[4776]: I1208 09:24:14.463966 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-jr6cc" podStartSLOduration=4.463946945 podStartE2EDuration="4.463946945s" podCreationTimestamp="2025-12-08 09:24:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:24:14.387581415 +0000 UTC m=+1530.650806437" watchObservedRunningTime="2025-12-08 09:24:14.463946945 +0000 UTC m=+1530.727171957" Dec 08 09:24:14 crc kubenswrapper[4776]: I1208 09:24:14.487434 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.109063 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-5ccd9d555d-m9chd"] Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.111215 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-5ccd9d555d-m9chd" Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.126195 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-5ccd9d555d-m9chd"] Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.149798 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-886cc84d4-qdcjh"] Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.151415 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-886cc84d4-qdcjh" Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.194230 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-886cc84d4-qdcjh"] Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.214637 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-785fc66866-ns7kd"] Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.217427 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-785fc66866-ns7kd" Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.229681 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-785fc66866-ns7kd"] Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.258769 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qw7v\" (UniqueName: \"kubernetes.io/projected/40b6ce41-e108-47bf-bc38-34e8c475b413-kube-api-access-9qw7v\") pod \"heat-engine-5ccd9d555d-m9chd\" (UID: \"40b6ce41-e108-47bf-bc38-34e8c475b413\") " pod="openstack/heat-engine-5ccd9d555d-m9chd" Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.258850 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/773f52b6-826d-4179-8777-96d795b10c5d-config-data\") pod \"heat-api-886cc84d4-qdcjh\" (UID: \"773f52b6-826d-4179-8777-96d795b10c5d\") " pod="openstack/heat-api-886cc84d4-qdcjh" Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.258907 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/40b6ce41-e108-47bf-bc38-34e8c475b413-config-data-custom\") pod \"heat-engine-5ccd9d555d-m9chd\" (UID: \"40b6ce41-e108-47bf-bc38-34e8c475b413\") " pod="openstack/heat-engine-5ccd9d555d-m9chd" Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.258930 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/773f52b6-826d-4179-8777-96d795b10c5d-combined-ca-bundle\") pod \"heat-api-886cc84d4-qdcjh\" (UID: \"773f52b6-826d-4179-8777-96d795b10c5d\") " pod="openstack/heat-api-886cc84d4-qdcjh" Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.259256 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40b6ce41-e108-47bf-bc38-34e8c475b413-combined-ca-bundle\") pod \"heat-engine-5ccd9d555d-m9chd\" (UID: \"40b6ce41-e108-47bf-bc38-34e8c475b413\") " pod="openstack/heat-engine-5ccd9d555d-m9chd" Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.259310 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40b6ce41-e108-47bf-bc38-34e8c475b413-config-data\") pod \"heat-engine-5ccd9d555d-m9chd\" (UID: \"40b6ce41-e108-47bf-bc38-34e8c475b413\") " pod="openstack/heat-engine-5ccd9d555d-m9chd" Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.259431 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92bgk\" (UniqueName: \"kubernetes.io/projected/773f52b6-826d-4179-8777-96d795b10c5d-kube-api-access-92bgk\") pod \"heat-api-886cc84d4-qdcjh\" (UID: \"773f52b6-826d-4179-8777-96d795b10c5d\") " pod="openstack/heat-api-886cc84d4-qdcjh" Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.259524 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/773f52b6-826d-4179-8777-96d795b10c5d-config-data-custom\") pod \"heat-api-886cc84d4-qdcjh\" (UID: \"773f52b6-826d-4179-8777-96d795b10c5d\") " pod="openstack/heat-api-886cc84d4-qdcjh" Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.363481 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/773f52b6-826d-4179-8777-96d795b10c5d-config-data\") pod \"heat-api-886cc84d4-qdcjh\" (UID: \"773f52b6-826d-4179-8777-96d795b10c5d\") " pod="openstack/heat-api-886cc84d4-qdcjh" Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.363542 4776 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/edc53406-6d30-4981-aa38-cd183ebf1b7d-config-data-custom\") pod \"heat-cfnapi-785fc66866-ns7kd\" (UID: \"edc53406-6d30-4981-aa38-cd183ebf1b7d\") " pod="openstack/heat-cfnapi-785fc66866-ns7kd" Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.363579 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt5pf\" (UniqueName: \"kubernetes.io/projected/edc53406-6d30-4981-aa38-cd183ebf1b7d-kube-api-access-rt5pf\") pod \"heat-cfnapi-785fc66866-ns7kd\" (UID: \"edc53406-6d30-4981-aa38-cd183ebf1b7d\") " pod="openstack/heat-cfnapi-785fc66866-ns7kd" Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.363634 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/40b6ce41-e108-47bf-bc38-34e8c475b413-config-data-custom\") pod \"heat-engine-5ccd9d555d-m9chd\" (UID: \"40b6ce41-e108-47bf-bc38-34e8c475b413\") " pod="openstack/heat-engine-5ccd9d555d-m9chd" Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.363663 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/773f52b6-826d-4179-8777-96d795b10c5d-combined-ca-bundle\") pod \"heat-api-886cc84d4-qdcjh\" (UID: \"773f52b6-826d-4179-8777-96d795b10c5d\") " pod="openstack/heat-api-886cc84d4-qdcjh" Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.363769 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40b6ce41-e108-47bf-bc38-34e8c475b413-combined-ca-bundle\") pod \"heat-engine-5ccd9d555d-m9chd\" (UID: \"40b6ce41-e108-47bf-bc38-34e8c475b413\") " pod="openstack/heat-engine-5ccd9d555d-m9chd" Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 
09:24:15.363794 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40b6ce41-e108-47bf-bc38-34e8c475b413-config-data\") pod \"heat-engine-5ccd9d555d-m9chd\" (UID: \"40b6ce41-e108-47bf-bc38-34e8c475b413\") " pod="openstack/heat-engine-5ccd9d555d-m9chd" Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.363827 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92bgk\" (UniqueName: \"kubernetes.io/projected/773f52b6-826d-4179-8777-96d795b10c5d-kube-api-access-92bgk\") pod \"heat-api-886cc84d4-qdcjh\" (UID: \"773f52b6-826d-4179-8777-96d795b10c5d\") " pod="openstack/heat-api-886cc84d4-qdcjh" Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.363869 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/773f52b6-826d-4179-8777-96d795b10c5d-config-data-custom\") pod \"heat-api-886cc84d4-qdcjh\" (UID: \"773f52b6-826d-4179-8777-96d795b10c5d\") " pod="openstack/heat-api-886cc84d4-qdcjh" Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.363908 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edc53406-6d30-4981-aa38-cd183ebf1b7d-config-data\") pod \"heat-cfnapi-785fc66866-ns7kd\" (UID: \"edc53406-6d30-4981-aa38-cd183ebf1b7d\") " pod="openstack/heat-cfnapi-785fc66866-ns7kd" Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.363943 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edc53406-6d30-4981-aa38-cd183ebf1b7d-combined-ca-bundle\") pod \"heat-cfnapi-785fc66866-ns7kd\" (UID: \"edc53406-6d30-4981-aa38-cd183ebf1b7d\") " pod="openstack/heat-cfnapi-785fc66866-ns7kd" Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.363981 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qw7v\" (UniqueName: \"kubernetes.io/projected/40b6ce41-e108-47bf-bc38-34e8c475b413-kube-api-access-9qw7v\") pod \"heat-engine-5ccd9d555d-m9chd\" (UID: \"40b6ce41-e108-47bf-bc38-34e8c475b413\") " pod="openstack/heat-engine-5ccd9d555d-m9chd" Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.374509 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40b6ce41-e108-47bf-bc38-34e8c475b413-config-data\") pod \"heat-engine-5ccd9d555d-m9chd\" (UID: \"40b6ce41-e108-47bf-bc38-34e8c475b413\") " pod="openstack/heat-engine-5ccd9d555d-m9chd" Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.378215 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/773f52b6-826d-4179-8777-96d795b10c5d-combined-ca-bundle\") pod \"heat-api-886cc84d4-qdcjh\" (UID: \"773f52b6-826d-4179-8777-96d795b10c5d\") " pod="openstack/heat-api-886cc84d4-qdcjh" Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.378991 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/773f52b6-826d-4179-8777-96d795b10c5d-config-data\") pod \"heat-api-886cc84d4-qdcjh\" (UID: \"773f52b6-826d-4179-8777-96d795b10c5d\") " pod="openstack/heat-api-886cc84d4-qdcjh" Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.379995 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40b6ce41-e108-47bf-bc38-34e8c475b413-combined-ca-bundle\") pod \"heat-engine-5ccd9d555d-m9chd\" (UID: \"40b6ce41-e108-47bf-bc38-34e8c475b413\") " pod="openstack/heat-engine-5ccd9d555d-m9chd" Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.385790 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/40b6ce41-e108-47bf-bc38-34e8c475b413-config-data-custom\") pod \"heat-engine-5ccd9d555d-m9chd\" (UID: \"40b6ce41-e108-47bf-bc38-34e8c475b413\") " pod="openstack/heat-engine-5ccd9d555d-m9chd" Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.385805 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92bgk\" (UniqueName: \"kubernetes.io/projected/773f52b6-826d-4179-8777-96d795b10c5d-kube-api-access-92bgk\") pod \"heat-api-886cc84d4-qdcjh\" (UID: \"773f52b6-826d-4179-8777-96d795b10c5d\") " pod="openstack/heat-api-886cc84d4-qdcjh" Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.385867 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qw7v\" (UniqueName: \"kubernetes.io/projected/40b6ce41-e108-47bf-bc38-34e8c475b413-kube-api-access-9qw7v\") pod \"heat-engine-5ccd9d555d-m9chd\" (UID: \"40b6ce41-e108-47bf-bc38-34e8c475b413\") " pod="openstack/heat-engine-5ccd9d555d-m9chd" Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.386403 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/773f52b6-826d-4179-8777-96d795b10c5d-config-data-custom\") pod \"heat-api-886cc84d4-qdcjh\" (UID: \"773f52b6-826d-4179-8777-96d795b10c5d\") " pod="openstack/heat-api-886cc84d4-qdcjh" Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.396547 4776 generic.go:334] "Generic (PLEG): container finished" podID="694530fd-2851-4a16-b642-071a7fca1ec0" containerID="0f2480a5437008e2df0d4f7b21b8d4db2fc224e3fe12a6cd0d32952ed3dd83b0" exitCode=0 Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.396616 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-98lsp" event={"ID":"694530fd-2851-4a16-b642-071a7fca1ec0","Type":"ContainerDied","Data":"0f2480a5437008e2df0d4f7b21b8d4db2fc224e3fe12a6cd0d32952ed3dd83b0"} Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 
09:24:15.399543 4776 generic.go:334] "Generic (PLEG): container finished" podID="cd19b615-0c04-4fd9-968c-dceb17256b34" containerID="81217682482f4d408f75b1fcb5a2c470b36a8dbb841bba1d1db1e01277eabbc3" exitCode=0 Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.399590 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cd19b615-0c04-4fd9-968c-dceb17256b34","Type":"ContainerDied","Data":"81217682482f4d408f75b1fcb5a2c470b36a8dbb841bba1d1db1e01277eabbc3"} Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.400932 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8f9758b1-4ae1-47ae-8a45-14b0df4c8632","Type":"ContainerStarted","Data":"e49ed8acd10403c14a9ec77dc16c831b892bc82c97ffd4015fdbe53df3ad049e"} Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.402708 4776 generic.go:334] "Generic (PLEG): container finished" podID="349ea677-2d9d-4506-b515-d5b03946ea88" containerID="cc0c566bd52ec36e88d85048dbc79441de63453e7cb604cd8b86a985de6b084e" exitCode=0 Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.402753 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-jr6cc" event={"ID":"349ea677-2d9d-4506-b515-d5b03946ea88","Type":"ContainerDied","Data":"cc0c566bd52ec36e88d85048dbc79441de63453e7cb604cd8b86a985de6b084e"} Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.405038 4776 generic.go:334] "Generic (PLEG): container finished" podID="e5bb00ac-bb46-44ce-b2b1-537573b86f6e" containerID="84333277d15abd7eb0d265bb97fe6f0126f6e880dcaf9f91bed4b0c84924c642" exitCode=0 Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.406478 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-60a1-account-create-update-6j76b" event={"ID":"e5bb00ac-bb46-44ce-b2b1-537573b86f6e","Type":"ContainerDied","Data":"84333277d15abd7eb0d265bb97fe6f0126f6e880dcaf9f91bed4b0c84924c642"} Dec 08 
09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.434916 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-5ccd9d555d-m9chd" Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.466708 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edc53406-6d30-4981-aa38-cd183ebf1b7d-config-data\") pod \"heat-cfnapi-785fc66866-ns7kd\" (UID: \"edc53406-6d30-4981-aa38-cd183ebf1b7d\") " pod="openstack/heat-cfnapi-785fc66866-ns7kd" Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.466767 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edc53406-6d30-4981-aa38-cd183ebf1b7d-combined-ca-bundle\") pod \"heat-cfnapi-785fc66866-ns7kd\" (UID: \"edc53406-6d30-4981-aa38-cd183ebf1b7d\") " pod="openstack/heat-cfnapi-785fc66866-ns7kd" Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.466869 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/edc53406-6d30-4981-aa38-cd183ebf1b7d-config-data-custom\") pod \"heat-cfnapi-785fc66866-ns7kd\" (UID: \"edc53406-6d30-4981-aa38-cd183ebf1b7d\") " pod="openstack/heat-cfnapi-785fc66866-ns7kd" Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.466918 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rt5pf\" (UniqueName: \"kubernetes.io/projected/edc53406-6d30-4981-aa38-cd183ebf1b7d-kube-api-access-rt5pf\") pod \"heat-cfnapi-785fc66866-ns7kd\" (UID: \"edc53406-6d30-4981-aa38-cd183ebf1b7d\") " pod="openstack/heat-cfnapi-785fc66866-ns7kd" Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.474665 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/edc53406-6d30-4981-aa38-cd183ebf1b7d-config-data-custom\") pod \"heat-cfnapi-785fc66866-ns7kd\" (UID: \"edc53406-6d30-4981-aa38-cd183ebf1b7d\") " pod="openstack/heat-cfnapi-785fc66866-ns7kd" Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.475334 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edc53406-6d30-4981-aa38-cd183ebf1b7d-combined-ca-bundle\") pod \"heat-cfnapi-785fc66866-ns7kd\" (UID: \"edc53406-6d30-4981-aa38-cd183ebf1b7d\") " pod="openstack/heat-cfnapi-785fc66866-ns7kd" Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.478071 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edc53406-6d30-4981-aa38-cd183ebf1b7d-config-data\") pod \"heat-cfnapi-785fc66866-ns7kd\" (UID: \"edc53406-6d30-4981-aa38-cd183ebf1b7d\") " pod="openstack/heat-cfnapi-785fc66866-ns7kd" Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.489749 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt5pf\" (UniqueName: \"kubernetes.io/projected/edc53406-6d30-4981-aa38-cd183ebf1b7d-kube-api-access-rt5pf\") pod \"heat-cfnapi-785fc66866-ns7kd\" (UID: \"edc53406-6d30-4981-aa38-cd183ebf1b7d\") " pod="openstack/heat-cfnapi-785fc66866-ns7kd" Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.508142 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-886cc84d4-qdcjh" Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.541242 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-785fc66866-ns7kd" Dec 08 09:24:15 crc kubenswrapper[4776]: I1208 09:24:15.973971 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-nngf6" Dec 08 09:24:16 crc kubenswrapper[4776]: I1208 09:24:16.093057 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d642614-8b52-4d92-ae93-d281a37339df-operator-scripts\") pod \"5d642614-8b52-4d92-ae93-d281a37339df\" (UID: \"5d642614-8b52-4d92-ae93-d281a37339df\") " Dec 08 09:24:16 crc kubenswrapper[4776]: I1208 09:24:16.093326 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7frr6\" (UniqueName: \"kubernetes.io/projected/5d642614-8b52-4d92-ae93-d281a37339df-kube-api-access-7frr6\") pod \"5d642614-8b52-4d92-ae93-d281a37339df\" (UID: \"5d642614-8b52-4d92-ae93-d281a37339df\") " Dec 08 09:24:16 crc kubenswrapper[4776]: I1208 09:24:16.095455 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d642614-8b52-4d92-ae93-d281a37339df-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5d642614-8b52-4d92-ae93-d281a37339df" (UID: "5d642614-8b52-4d92-ae93-d281a37339df"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:24:16 crc kubenswrapper[4776]: I1208 09:24:16.099443 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d642614-8b52-4d92-ae93-d281a37339df-kube-api-access-7frr6" (OuterVolumeSpecName: "kube-api-access-7frr6") pod "5d642614-8b52-4d92-ae93-d281a37339df" (UID: "5d642614-8b52-4d92-ae93-d281a37339df"). InnerVolumeSpecName "kube-api-access-7frr6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:24:16 crc kubenswrapper[4776]: I1208 09:24:16.196102 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d642614-8b52-4d92-ae93-d281a37339df-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:16 crc kubenswrapper[4776]: I1208 09:24:16.196138 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7frr6\" (UniqueName: \"kubernetes.io/projected/5d642614-8b52-4d92-ae93-d281a37339df-kube-api-access-7frr6\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:16 crc kubenswrapper[4776]: I1208 09:24:16.263802 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-5ccd9d555d-m9chd"] Dec 08 09:24:16 crc kubenswrapper[4776]: I1208 09:24:16.462258 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-886cc84d4-qdcjh"] Dec 08 09:24:16 crc kubenswrapper[4776]: I1208 09:24:16.573568 4776 generic.go:334] "Generic (PLEG): container finished" podID="f6727a0d-5792-4bf3-9d9b-a84ad470ba82" containerID="169847b89dab5b9ed6111372c125a1d8e045c63c643ef8c2308b0723a3d90f5b" exitCode=0 Dec 08 09:24:16 crc kubenswrapper[4776]: I1208 09:24:16.573652 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1a23-account-create-update-cldmj" event={"ID":"f6727a0d-5792-4bf3-9d9b-a84ad470ba82","Type":"ContainerDied","Data":"169847b89dab5b9ed6111372c125a1d8e045c63c643ef8c2308b0723a3d90f5b"} Dec 08 09:24:16 crc kubenswrapper[4776]: I1208 09:24:16.588032 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-nngf6" event={"ID":"5d642614-8b52-4d92-ae93-d281a37339df","Type":"ContainerDied","Data":"b0481a7643f101fdf8e0c88f199857bf23ea5d8d262738186ba07eb1b2e42775"} Dec 08 09:24:16 crc kubenswrapper[4776]: I1208 09:24:16.588210 4776 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="b0481a7643f101fdf8e0c88f199857bf23ea5d8d262738186ba07eb1b2e42775" Dec 08 09:24:16 crc kubenswrapper[4776]: I1208 09:24:16.588320 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-nngf6" Dec 08 09:24:16 crc kubenswrapper[4776]: I1208 09:24:16.640471 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 08 09:24:16 crc kubenswrapper[4776]: I1208 09:24:16.641113 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8f9758b1-4ae1-47ae-8a45-14b0df4c8632","Type":"ContainerStarted","Data":"55ed037d71a5ed1bc0f42bd89ab558f85402c0d5dd34a7b20d273bbc528a5da3"} Dec 08 09:24:16 crc kubenswrapper[4776]: I1208 09:24:16.645077 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-785fc66866-ns7kd"] Dec 08 09:24:16 crc kubenswrapper[4776]: I1208 09:24:16.645945 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5ccd9d555d-m9chd" event={"ID":"40b6ce41-e108-47bf-bc38-34e8c475b413","Type":"ContainerStarted","Data":"7bfdc9e480299420f87cbe5b57642b47aa8cc68291d1cae955a6e9e78baa4707"} Dec 08 09:24:16 crc kubenswrapper[4776]: I1208 09:24:16.655223 4776 generic.go:334] "Generic (PLEG): container finished" podID="c6923b91-d6e2-4672-be62-2531342086e1" containerID="73991a1a2851bba40093ac2a4e54caa630682b8f3818c2abd7fd856925f66b75" exitCode=0 Dec 08 09:24:16 crc kubenswrapper[4776]: I1208 09:24:16.656110 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-45e9-account-create-update-8lsk4" event={"ID":"c6923b91-d6e2-4672-be62-2531342086e1","Type":"ContainerDied","Data":"73991a1a2851bba40093ac2a4e54caa630682b8f3818c2abd7fd856925f66b75"} Dec 08 09:24:16 crc kubenswrapper[4776]: I1208 09:24:16.726791 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd19b615-0c04-4fd9-968c-dceb17256b34-combined-ca-bundle\") pod \"cd19b615-0c04-4fd9-968c-dceb17256b34\" (UID: \"cd19b615-0c04-4fd9-968c-dceb17256b34\") " Dec 08 09:24:16 crc kubenswrapper[4776]: I1208 09:24:16.727023 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd19b615-0c04-4fd9-968c-dceb17256b34-internal-tls-certs\") pod \"cd19b615-0c04-4fd9-968c-dceb17256b34\" (UID: \"cd19b615-0c04-4fd9-968c-dceb17256b34\") " Dec 08 09:24:16 crc kubenswrapper[4776]: I1208 09:24:16.727059 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd19b615-0c04-4fd9-968c-dceb17256b34-logs\") pod \"cd19b615-0c04-4fd9-968c-dceb17256b34\" (UID: \"cd19b615-0c04-4fd9-968c-dceb17256b34\") " Dec 08 09:24:16 crc kubenswrapper[4776]: I1208 09:24:16.727091 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cd19b615-0c04-4fd9-968c-dceb17256b34-httpd-run\") pod \"cd19b615-0c04-4fd9-968c-dceb17256b34\" (UID: \"cd19b615-0c04-4fd9-968c-dceb17256b34\") " Dec 08 09:24:16 crc kubenswrapper[4776]: I1208 09:24:16.727115 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8b2pg\" (UniqueName: \"kubernetes.io/projected/cd19b615-0c04-4fd9-968c-dceb17256b34-kube-api-access-8b2pg\") pod \"cd19b615-0c04-4fd9-968c-dceb17256b34\" (UID: \"cd19b615-0c04-4fd9-968c-dceb17256b34\") " Dec 08 09:24:16 crc kubenswrapper[4776]: I1208 09:24:16.727155 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd19b615-0c04-4fd9-968c-dceb17256b34-scripts\") pod \"cd19b615-0c04-4fd9-968c-dceb17256b34\" (UID: \"cd19b615-0c04-4fd9-968c-dceb17256b34\") " Dec 08 09:24:16 crc kubenswrapper[4776]: 
I1208 09:24:16.727191 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"cd19b615-0c04-4fd9-968c-dceb17256b34\" (UID: \"cd19b615-0c04-4fd9-968c-dceb17256b34\") " Dec 08 09:24:16 crc kubenswrapper[4776]: I1208 09:24:16.727275 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd19b615-0c04-4fd9-968c-dceb17256b34-config-data\") pod \"cd19b615-0c04-4fd9-968c-dceb17256b34\" (UID: \"cd19b615-0c04-4fd9-968c-dceb17256b34\") " Dec 08 09:24:16 crc kubenswrapper[4776]: I1208 09:24:16.727738 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd19b615-0c04-4fd9-968c-dceb17256b34-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "cd19b615-0c04-4fd9-968c-dceb17256b34" (UID: "cd19b615-0c04-4fd9-968c-dceb17256b34"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:24:16 crc kubenswrapper[4776]: I1208 09:24:16.729452 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd19b615-0c04-4fd9-968c-dceb17256b34-logs" (OuterVolumeSpecName: "logs") pod "cd19b615-0c04-4fd9-968c-dceb17256b34" (UID: "cd19b615-0c04-4fd9-968c-dceb17256b34"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:24:16 crc kubenswrapper[4776]: I1208 09:24:16.754698 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd19b615-0c04-4fd9-968c-dceb17256b34-kube-api-access-8b2pg" (OuterVolumeSpecName: "kube-api-access-8b2pg") pod "cd19b615-0c04-4fd9-968c-dceb17256b34" (UID: "cd19b615-0c04-4fd9-968c-dceb17256b34"). InnerVolumeSpecName "kube-api-access-8b2pg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:24:16 crc kubenswrapper[4776]: I1208 09:24:16.754780 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd19b615-0c04-4fd9-968c-dceb17256b34-scripts" (OuterVolumeSpecName: "scripts") pod "cd19b615-0c04-4fd9-968c-dceb17256b34" (UID: "cd19b615-0c04-4fd9-968c-dceb17256b34"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:24:16 crc kubenswrapper[4776]: I1208 09:24:16.760005 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "cd19b615-0c04-4fd9-968c-dceb17256b34" (UID: "cd19b615-0c04-4fd9-968c-dceb17256b34"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 08 09:24:16 crc kubenswrapper[4776]: I1208 09:24:16.809061 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd19b615-0c04-4fd9-968c-dceb17256b34-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cd19b615-0c04-4fd9-968c-dceb17256b34" (UID: "cd19b615-0c04-4fd9-968c-dceb17256b34"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:24:16 crc kubenswrapper[4776]: I1208 09:24:16.833602 4776 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd19b615-0c04-4fd9-968c-dceb17256b34-logs\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:16 crc kubenswrapper[4776]: I1208 09:24:16.833631 4776 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cd19b615-0c04-4fd9-968c-dceb17256b34-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:16 crc kubenswrapper[4776]: I1208 09:24:16.833645 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8b2pg\" (UniqueName: \"kubernetes.io/projected/cd19b615-0c04-4fd9-968c-dceb17256b34-kube-api-access-8b2pg\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:16 crc kubenswrapper[4776]: I1208 09:24:16.833656 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd19b615-0c04-4fd9-968c-dceb17256b34-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:16 crc kubenswrapper[4776]: I1208 09:24:16.833681 4776 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 08 09:24:16 crc kubenswrapper[4776]: I1208 09:24:16.833696 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd19b615-0c04-4fd9-968c-dceb17256b34-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.138201 4776 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.146516 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/cd19b615-0c04-4fd9-968c-dceb17256b34-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "cd19b615-0c04-4fd9-968c-dceb17256b34" (UID: "cd19b615-0c04-4fd9-968c-dceb17256b34"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.148280 4776 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.148355 4776 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd19b615-0c04-4fd9-968c-dceb17256b34-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.163324 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd19b615-0c04-4fd9-968c-dceb17256b34-config-data" (OuterVolumeSpecName: "config-data") pod "cd19b615-0c04-4fd9-968c-dceb17256b34" (UID: "cd19b615-0c04-4fd9-968c-dceb17256b34"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.250914 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd19b615-0c04-4fd9-968c-dceb17256b34-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.411469 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-jr6cc" Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.563836 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfxxh\" (UniqueName: \"kubernetes.io/projected/349ea677-2d9d-4506-b515-d5b03946ea88-kube-api-access-zfxxh\") pod \"349ea677-2d9d-4506-b515-d5b03946ea88\" (UID: \"349ea677-2d9d-4506-b515-d5b03946ea88\") " Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.564340 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/349ea677-2d9d-4506-b515-d5b03946ea88-operator-scripts\") pod \"349ea677-2d9d-4506-b515-d5b03946ea88\" (UID: \"349ea677-2d9d-4506-b515-d5b03946ea88\") " Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.567025 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/349ea677-2d9d-4506-b515-d5b03946ea88-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "349ea677-2d9d-4506-b515-d5b03946ea88" (UID: "349ea677-2d9d-4506-b515-d5b03946ea88"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.592006 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/349ea677-2d9d-4506-b515-d5b03946ea88-kube-api-access-zfxxh" (OuterVolumeSpecName: "kube-api-access-zfxxh") pod "349ea677-2d9d-4506-b515-d5b03946ea88" (UID: "349ea677-2d9d-4506-b515-d5b03946ea88"). InnerVolumeSpecName "kube-api-access-zfxxh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.612109 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-98lsp" Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.620238 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-60a1-account-create-update-6j76b" Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.667773 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfxxh\" (UniqueName: \"kubernetes.io/projected/349ea677-2d9d-4506-b515-d5b03946ea88-kube-api-access-zfxxh\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.667810 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/349ea677-2d9d-4506-b515-d5b03946ea88-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.690779 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8f9758b1-4ae1-47ae-8a45-14b0df4c8632","Type":"ContainerStarted","Data":"dde03feca0ec4f33c1bf7269ec2f2f28bc2dfdc3ab525f039180d0941e1f4fb8"} Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.707928 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-jr6cc" event={"ID":"349ea677-2d9d-4506-b515-d5b03946ea88","Type":"ContainerDied","Data":"911e44307f584abfcb4ad0f9eed5683fc6431dd3ea00536cd41780e9bc73d675"} Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.707967 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="911e44307f584abfcb4ad0f9eed5683fc6431dd3ea00536cd41780e9bc73d675" Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.708033 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-jr6cc" Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.725420 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-60a1-account-create-update-6j76b" event={"ID":"e5bb00ac-bb46-44ce-b2b1-537573b86f6e","Type":"ContainerDied","Data":"d2f6bdf26ba9f32895fa9531e1b26e63d9a34ccb37d8b6c141adb09d65093023"} Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.725459 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2f6bdf26ba9f32895fa9531e1b26e63d9a34ccb37d8b6c141adb09d65093023" Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.725463 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-60a1-account-create-update-6j76b" Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.730861 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.730845079 podStartE2EDuration="4.730845079s" podCreationTimestamp="2025-12-08 09:24:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:24:17.722080574 +0000 UTC m=+1533.985305596" watchObservedRunningTime="2025-12-08 09:24:17.730845079 +0000 UTC m=+1533.994070101" Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.731953 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-886cc84d4-qdcjh" event={"ID":"773f52b6-826d-4179-8777-96d795b10c5d","Type":"ContainerStarted","Data":"b9345d39e1a66e29cb6052c1d6c7b43edd9dc2df0f1afe69e91fbc71033257f5"} Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.731983 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-886cc84d4-qdcjh" 
event={"ID":"773f52b6-826d-4179-8777-96d795b10c5d","Type":"ContainerStarted","Data":"86ab0714f014f83b10dbce52721676a6817ee1acb4b610c35795629d6a2001ec"} Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.732035 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-886cc84d4-qdcjh" Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.739818 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5ccd9d555d-m9chd" event={"ID":"40b6ce41-e108-47bf-bc38-34e8c475b413","Type":"ContainerStarted","Data":"6aa12b16012116b86c2b0c97e985bec0ebaeb9a366799a44caf810d29f286456"} Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.740609 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-5ccd9d555d-m9chd" Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.754774 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-785fc66866-ns7kd" event={"ID":"edc53406-6d30-4981-aa38-cd183ebf1b7d","Type":"ContainerStarted","Data":"52ca12eabecbad5612d04acd98993d5c242c5c40ca59257c1d760002b46ce967"} Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.754818 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-785fc66866-ns7kd" event={"ID":"edc53406-6d30-4981-aa38-cd183ebf1b7d","Type":"ContainerStarted","Data":"aa242ccfedd7c7b9408244e973e1a5cba13591ecb21bd9bb73275ca86c10063b"} Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.755718 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-785fc66866-ns7kd" Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.762605 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-886cc84d4-qdcjh" podStartSLOduration=2.76258287 podStartE2EDuration="2.76258287s" podCreationTimestamp="2025-12-08 09:24:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:24:17.755436838 +0000 UTC m=+1534.018661860" watchObservedRunningTime="2025-12-08 09:24:17.76258287 +0000 UTC m=+1534.025807892" Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.764502 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7756b9d78c-mj9ps" Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.768939 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzqlt\" (UniqueName: \"kubernetes.io/projected/e5bb00ac-bb46-44ce-b2b1-537573b86f6e-kube-api-access-bzqlt\") pod \"e5bb00ac-bb46-44ce-b2b1-537573b86f6e\" (UID: \"e5bb00ac-bb46-44ce-b2b1-537573b86f6e\") " Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.769030 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5d5gs\" (UniqueName: \"kubernetes.io/projected/694530fd-2851-4a16-b642-071a7fca1ec0-kube-api-access-5d5gs\") pod \"694530fd-2851-4a16-b642-071a7fca1ec0\" (UID: \"694530fd-2851-4a16-b642-071a7fca1ec0\") " Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.769139 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5bb00ac-bb46-44ce-b2b1-537573b86f6e-operator-scripts\") pod \"e5bb00ac-bb46-44ce-b2b1-537573b86f6e\" (UID: \"e5bb00ac-bb46-44ce-b2b1-537573b86f6e\") " Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.769363 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/694530fd-2851-4a16-b642-071a7fca1ec0-operator-scripts\") pod \"694530fd-2851-4a16-b642-071a7fca1ec0\" (UID: \"694530fd-2851-4a16-b642-071a7fca1ec0\") " Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.771721 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/e5bb00ac-bb46-44ce-b2b1-537573b86f6e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e5bb00ac-bb46-44ce-b2b1-537573b86f6e" (UID: "e5bb00ac-bb46-44ce-b2b1-537573b86f6e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.772245 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/694530fd-2851-4a16-b642-071a7fca1ec0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "694530fd-2851-4a16-b642-071a7fca1ec0" (UID: "694530fd-2851-4a16-b642-071a7fca1ec0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.772957 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-98lsp" event={"ID":"694530fd-2851-4a16-b642-071a7fca1ec0","Type":"ContainerDied","Data":"7e67b0b1a10b82a775462228ca261b8c0130aa238a27370c3d6a9c427ad7bf2e"} Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.772983 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e67b0b1a10b82a775462228ca261b8c0130aa238a27370c3d6a9c427ad7bf2e" Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.773038 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-98lsp" Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.778013 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5bb00ac-bb46-44ce-b2b1-537573b86f6e-kube-api-access-bzqlt" (OuterVolumeSpecName: "kube-api-access-bzqlt") pod "e5bb00ac-bb46-44ce-b2b1-537573b86f6e" (UID: "e5bb00ac-bb46-44ce-b2b1-537573b86f6e"). InnerVolumeSpecName "kube-api-access-bzqlt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.780255 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/694530fd-2851-4a16-b642-071a7fca1ec0-kube-api-access-5d5gs" (OuterVolumeSpecName: "kube-api-access-5d5gs") pod "694530fd-2851-4a16-b642-071a7fca1ec0" (UID: "694530fd-2851-4a16-b642-071a7fca1ec0"). InnerVolumeSpecName "kube-api-access-5d5gs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.783605 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.784017 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cd19b615-0c04-4fd9-968c-dceb17256b34","Type":"ContainerDied","Data":"472711f1218e52f1d5eb139a050b8e3b93a5a484b0306bb71630afd4e047488a"} Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.784090 4776 scope.go:117] "RemoveContainer" containerID="81217682482f4d408f75b1fcb5a2c470b36a8dbb841bba1d1db1e01277eabbc3" Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.785125 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-785fc66866-ns7kd" podStartSLOduration=2.785110665 podStartE2EDuration="2.785110665s" podCreationTimestamp="2025-12-08 09:24:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:24:17.784122209 +0000 UTC m=+1534.047347231" watchObservedRunningTime="2025-12-08 09:24:17.785110665 +0000 UTC m=+1534.048335687" Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.826783 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-5ccd9d555d-m9chd" podStartSLOduration=2.826762242 
podStartE2EDuration="2.826762242s" podCreationTimestamp="2025-12-08 09:24:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:24:17.807525706 +0000 UTC m=+1534.070750728" watchObservedRunningTime="2025-12-08 09:24:17.826762242 +0000 UTC m=+1534.089987254" Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.871772 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzqlt\" (UniqueName: \"kubernetes.io/projected/e5bb00ac-bb46-44ce-b2b1-537573b86f6e-kube-api-access-bzqlt\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.871807 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5d5gs\" (UniqueName: \"kubernetes.io/projected/694530fd-2851-4a16-b642-071a7fca1ec0-kube-api-access-5d5gs\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.871819 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5bb00ac-bb46-44ce-b2b1-537573b86f6e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.871827 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/694530fd-2851-4a16-b642-071a7fca1ec0-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.897929 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-lpj75"] Dec 08 09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.898227 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-lpj75" podUID="5097e9b7-c005-4e68-bc20-bbd6f8b8a290" containerName="dnsmasq-dns" containerID="cri-o://41725b711dd1839ad256cdbebf464427d14539155fc5d5914b3ae781490b9040" gracePeriod=10 Dec 08 
09:24:17 crc kubenswrapper[4776]: I1208 09:24:17.978587 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.015302 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.021802 4776 scope.go:117] "RemoveContainer" containerID="837a93d35bb23bc5d540f3124197ef476a37dccdb8f26d3ac00a99933f14c4d7" Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.033489 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 08 09:24:18 crc kubenswrapper[4776]: E1208 09:24:18.034663 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5bb00ac-bb46-44ce-b2b1-537573b86f6e" containerName="mariadb-account-create-update" Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.034679 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5bb00ac-bb46-44ce-b2b1-537573b86f6e" containerName="mariadb-account-create-update" Dec 08 09:24:18 crc kubenswrapper[4776]: E1208 09:24:18.034690 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd19b615-0c04-4fd9-968c-dceb17256b34" containerName="glance-httpd" Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.034696 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd19b615-0c04-4fd9-968c-dceb17256b34" containerName="glance-httpd" Dec 08 09:24:18 crc kubenswrapper[4776]: E1208 09:24:18.034719 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d642614-8b52-4d92-ae93-d281a37339df" containerName="mariadb-database-create" Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.034726 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d642614-8b52-4d92-ae93-d281a37339df" containerName="mariadb-database-create" Dec 08 09:24:18 crc kubenswrapper[4776]: E1208 09:24:18.034751 4776 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="cd19b615-0c04-4fd9-968c-dceb17256b34" containerName="glance-log" Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.034758 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd19b615-0c04-4fd9-968c-dceb17256b34" containerName="glance-log" Dec 08 09:24:18 crc kubenswrapper[4776]: E1208 09:24:18.034771 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="349ea677-2d9d-4506-b515-d5b03946ea88" containerName="mariadb-database-create" Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.034776 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="349ea677-2d9d-4506-b515-d5b03946ea88" containerName="mariadb-database-create" Dec 08 09:24:18 crc kubenswrapper[4776]: E1208 09:24:18.034785 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="694530fd-2851-4a16-b642-071a7fca1ec0" containerName="mariadb-database-create" Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.034791 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="694530fd-2851-4a16-b642-071a7fca1ec0" containerName="mariadb-database-create" Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.035003 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5bb00ac-bb46-44ce-b2b1-537573b86f6e" containerName="mariadb-account-create-update" Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.035021 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="694530fd-2851-4a16-b642-071a7fca1ec0" containerName="mariadb-database-create" Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.035034 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd19b615-0c04-4fd9-968c-dceb17256b34" containerName="glance-httpd" Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.035043 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d642614-8b52-4d92-ae93-d281a37339df" containerName="mariadb-database-create" Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 
09:24:18.035060 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="349ea677-2d9d-4506-b515-d5b03946ea88" containerName="mariadb-database-create" Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.035068 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd19b615-0c04-4fd9-968c-dceb17256b34" containerName="glance-log" Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.039303 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.044461 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.044619 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.062641 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.076328 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/28ffab6e-5596-4c63-b58a-4417489fc47b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"28ffab6e-5596-4c63-b58a-4417489fc47b\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.076392 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/28ffab6e-5596-4c63-b58a-4417489fc47b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"28ffab6e-5596-4c63-b58a-4417489fc47b\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.076434 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28ffab6e-5596-4c63-b58a-4417489fc47b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"28ffab6e-5596-4c63-b58a-4417489fc47b\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.076465 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28ffab6e-5596-4c63-b58a-4417489fc47b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"28ffab6e-5596-4c63-b58a-4417489fc47b\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.076483 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28ffab6e-5596-4c63-b58a-4417489fc47b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"28ffab6e-5596-4c63-b58a-4417489fc47b\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.076537 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt9lr\" (UniqueName: \"kubernetes.io/projected/28ffab6e-5596-4c63-b58a-4417489fc47b-kube-api-access-nt9lr\") pod \"glance-default-internal-api-0\" (UID: \"28ffab6e-5596-4c63-b58a-4417489fc47b\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.076592 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"28ffab6e-5596-4c63-b58a-4417489fc47b\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.076651 4776 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28ffab6e-5596-4c63-b58a-4417489fc47b-logs\") pod \"glance-default-internal-api-0\" (UID: \"28ffab6e-5596-4c63-b58a-4417489fc47b\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.183046 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28ffab6e-5596-4c63-b58a-4417489fc47b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"28ffab6e-5596-4c63-b58a-4417489fc47b\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.183107 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28ffab6e-5596-4c63-b58a-4417489fc47b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"28ffab6e-5596-4c63-b58a-4417489fc47b\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.183131 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28ffab6e-5596-4c63-b58a-4417489fc47b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"28ffab6e-5596-4c63-b58a-4417489fc47b\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.183163 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nt9lr\" (UniqueName: \"kubernetes.io/projected/28ffab6e-5596-4c63-b58a-4417489fc47b-kube-api-access-nt9lr\") pod \"glance-default-internal-api-0\" (UID: \"28ffab6e-5596-4c63-b58a-4417489fc47b\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.183245 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"28ffab6e-5596-4c63-b58a-4417489fc47b\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.183319 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28ffab6e-5596-4c63-b58a-4417489fc47b-logs\") pod \"glance-default-internal-api-0\" (UID: \"28ffab6e-5596-4c63-b58a-4417489fc47b\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.183378 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/28ffab6e-5596-4c63-b58a-4417489fc47b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"28ffab6e-5596-4c63-b58a-4417489fc47b\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.183424 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/28ffab6e-5596-4c63-b58a-4417489fc47b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"28ffab6e-5596-4c63-b58a-4417489fc47b\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.188883 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/28ffab6e-5596-4c63-b58a-4417489fc47b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"28ffab6e-5596-4c63-b58a-4417489fc47b\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.189257 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"28ffab6e-5596-4c63-b58a-4417489fc47b\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.193275 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/28ffab6e-5596-4c63-b58a-4417489fc47b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"28ffab6e-5596-4c63-b58a-4417489fc47b\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.195674 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28ffab6e-5596-4c63-b58a-4417489fc47b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"28ffab6e-5596-4c63-b58a-4417489fc47b\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.197216 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28ffab6e-5596-4c63-b58a-4417489fc47b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"28ffab6e-5596-4c63-b58a-4417489fc47b\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.197461 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28ffab6e-5596-4c63-b58a-4417489fc47b-logs\") pod \"glance-default-internal-api-0\" (UID: \"28ffab6e-5596-4c63-b58a-4417489fc47b\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.205469 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nt9lr\" (UniqueName: \"kubernetes.io/projected/28ffab6e-5596-4c63-b58a-4417489fc47b-kube-api-access-nt9lr\") pod 
\"glance-default-internal-api-0\" (UID: \"28ffab6e-5596-4c63-b58a-4417489fc47b\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.207365 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28ffab6e-5596-4c63-b58a-4417489fc47b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"28ffab6e-5596-4c63-b58a-4417489fc47b\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.237400 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"28ffab6e-5596-4c63-b58a-4417489fc47b\") " pod="openstack/glance-default-internal-api-0" Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.368006 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd19b615-0c04-4fd9-968c-dceb17256b34" path="/var/lib/kubelet/pods/cd19b615-0c04-4fd9-968c-dceb17256b34/volumes" Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.416396 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.663669 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-45e9-account-create-update-8lsk4" Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.713957 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z25qg\" (UniqueName: \"kubernetes.io/projected/c6923b91-d6e2-4672-be62-2531342086e1-kube-api-access-z25qg\") pod \"c6923b91-d6e2-4672-be62-2531342086e1\" (UID: \"c6923b91-d6e2-4672-be62-2531342086e1\") " Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.714379 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6923b91-d6e2-4672-be62-2531342086e1-operator-scripts\") pod \"c6923b91-d6e2-4672-be62-2531342086e1\" (UID: \"c6923b91-d6e2-4672-be62-2531342086e1\") " Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.717034 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6923b91-d6e2-4672-be62-2531342086e1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c6923b91-d6e2-4672-be62-2531342086e1" (UID: "c6923b91-d6e2-4672-be62-2531342086e1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.725408 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6923b91-d6e2-4672-be62-2531342086e1-kube-api-access-z25qg" (OuterVolumeSpecName: "kube-api-access-z25qg") pod "c6923b91-d6e2-4672-be62-2531342086e1" (UID: "c6923b91-d6e2-4672-be62-2531342086e1"). InnerVolumeSpecName "kube-api-access-z25qg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.816879 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6923b91-d6e2-4672-be62-2531342086e1-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.816906 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z25qg\" (UniqueName: \"kubernetes.io/projected/c6923b91-d6e2-4672-be62-2531342086e1-kube-api-access-z25qg\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.919349 4776 generic.go:334] "Generic (PLEG): container finished" podID="773f52b6-826d-4179-8777-96d795b10c5d" containerID="b9345d39e1a66e29cb6052c1d6c7b43edd9dc2df0f1afe69e91fbc71033257f5" exitCode=1 Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.919420 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-886cc84d4-qdcjh" event={"ID":"773f52b6-826d-4179-8777-96d795b10c5d","Type":"ContainerDied","Data":"b9345d39e1a66e29cb6052c1d6c7b43edd9dc2df0f1afe69e91fbc71033257f5"} Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.920120 4776 scope.go:117] "RemoveContainer" containerID="b9345d39e1a66e29cb6052c1d6c7b43edd9dc2df0f1afe69e91fbc71033257f5" Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.945494 4776 generic.go:334] "Generic (PLEG): container finished" podID="edc53406-6d30-4981-aa38-cd183ebf1b7d" containerID="52ca12eabecbad5612d04acd98993d5c242c5c40ca59257c1d760002b46ce967" exitCode=1 Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.945603 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-785fc66866-ns7kd" event={"ID":"edc53406-6d30-4981-aa38-cd183ebf1b7d","Type":"ContainerDied","Data":"52ca12eabecbad5612d04acd98993d5c242c5c40ca59257c1d760002b46ce967"} Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.946484 4776 
scope.go:117] "RemoveContainer" containerID="52ca12eabecbad5612d04acd98993d5c242c5c40ca59257c1d760002b46ce967" Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.974830 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-45e9-account-create-update-8lsk4" event={"ID":"c6923b91-d6e2-4672-be62-2531342086e1","Type":"ContainerDied","Data":"1b4a5280b1f76247fa2664d0ec72fb676cebc6ed901a517619c5cd8278a37c2a"} Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.974866 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b4a5280b1f76247fa2664d0ec72fb676cebc6ed901a517619c5cd8278a37c2a" Dec 08 09:24:18 crc kubenswrapper[4776]: I1208 09:24:18.974939 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-45e9-account-create-update-8lsk4" Dec 08 09:24:19 crc kubenswrapper[4776]: I1208 09:24:19.034397 4776 generic.go:334] "Generic (PLEG): container finished" podID="5097e9b7-c005-4e68-bc20-bbd6f8b8a290" containerID="41725b711dd1839ad256cdbebf464427d14539155fc5d5914b3ae781490b9040" exitCode=0 Dec 08 09:24:19 crc kubenswrapper[4776]: I1208 09:24:19.034479 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-lpj75" event={"ID":"5097e9b7-c005-4e68-bc20-bbd6f8b8a290","Type":"ContainerDied","Data":"41725b711dd1839ad256cdbebf464427d14539155fc5d5914b3ae781490b9040"} Dec 08 09:24:19 crc kubenswrapper[4776]: I1208 09:24:19.605631 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 08 09:24:19 crc kubenswrapper[4776]: I1208 09:24:19.769925 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-1a23-account-create-update-cldmj" Dec 08 09:24:19 crc kubenswrapper[4776]: I1208 09:24:19.839555 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-lpj75" Dec 08 09:24:19 crc kubenswrapper[4776]: I1208 09:24:19.912972 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9drl4\" (UniqueName: \"kubernetes.io/projected/f6727a0d-5792-4bf3-9d9b-a84ad470ba82-kube-api-access-9drl4\") pod \"f6727a0d-5792-4bf3-9d9b-a84ad470ba82\" (UID: \"f6727a0d-5792-4bf3-9d9b-a84ad470ba82\") " Dec 08 09:24:19 crc kubenswrapper[4776]: I1208 09:24:19.923442 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6727a0d-5792-4bf3-9d9b-a84ad470ba82-operator-scripts\") pod \"f6727a0d-5792-4bf3-9d9b-a84ad470ba82\" (UID: \"f6727a0d-5792-4bf3-9d9b-a84ad470ba82\") " Dec 08 09:24:19 crc kubenswrapper[4776]: I1208 09:24:19.927386 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6727a0d-5792-4bf3-9d9b-a84ad470ba82-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f6727a0d-5792-4bf3-9d9b-a84ad470ba82" (UID: "f6727a0d-5792-4bf3-9d9b-a84ad470ba82"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:24:19 crc kubenswrapper[4776]: I1208 09:24:19.948442 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6727a0d-5792-4bf3-9d9b-a84ad470ba82-kube-api-access-9drl4" (OuterVolumeSpecName: "kube-api-access-9drl4") pod "f6727a0d-5792-4bf3-9d9b-a84ad470ba82" (UID: "f6727a0d-5792-4bf3-9d9b-a84ad470ba82"). InnerVolumeSpecName "kube-api-access-9drl4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.025555 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5097e9b7-c005-4e68-bc20-bbd6f8b8a290-ovsdbserver-nb\") pod \"5097e9b7-c005-4e68-bc20-bbd6f8b8a290\" (UID: \"5097e9b7-c005-4e68-bc20-bbd6f8b8a290\") " Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.025611 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5097e9b7-c005-4e68-bc20-bbd6f8b8a290-config\") pod \"5097e9b7-c005-4e68-bc20-bbd6f8b8a290\" (UID: \"5097e9b7-c005-4e68-bc20-bbd6f8b8a290\") " Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.025844 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5097e9b7-c005-4e68-bc20-bbd6f8b8a290-dns-swift-storage-0\") pod \"5097e9b7-c005-4e68-bc20-bbd6f8b8a290\" (UID: \"5097e9b7-c005-4e68-bc20-bbd6f8b8a290\") " Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.025925 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5097e9b7-c005-4e68-bc20-bbd6f8b8a290-ovsdbserver-sb\") pod \"5097e9b7-c005-4e68-bc20-bbd6f8b8a290\" (UID: \"5097e9b7-c005-4e68-bc20-bbd6f8b8a290\") " Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.026016 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5097e9b7-c005-4e68-bc20-bbd6f8b8a290-dns-svc\") pod \"5097e9b7-c005-4e68-bc20-bbd6f8b8a290\" (UID: \"5097e9b7-c005-4e68-bc20-bbd6f8b8a290\") " Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.026062 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hffl9\" (UniqueName: 
\"kubernetes.io/projected/5097e9b7-c005-4e68-bc20-bbd6f8b8a290-kube-api-access-hffl9\") pod \"5097e9b7-c005-4e68-bc20-bbd6f8b8a290\" (UID: \"5097e9b7-c005-4e68-bc20-bbd6f8b8a290\") " Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.027345 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9drl4\" (UniqueName: \"kubernetes.io/projected/f6727a0d-5792-4bf3-9d9b-a84ad470ba82-kube-api-access-9drl4\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.027363 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6727a0d-5792-4bf3-9d9b-a84ad470ba82-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.055039 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5097e9b7-c005-4e68-bc20-bbd6f8b8a290-kube-api-access-hffl9" (OuterVolumeSpecName: "kube-api-access-hffl9") pod "5097e9b7-c005-4e68-bc20-bbd6f8b8a290" (UID: "5097e9b7-c005-4e68-bc20-bbd6f8b8a290"). InnerVolumeSpecName "kube-api-access-hffl9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.084603 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-886cc84d4-qdcjh" event={"ID":"773f52b6-826d-4179-8777-96d795b10c5d","Type":"ContainerStarted","Data":"b3d4237d2a4255a817185daec7a7b389fc523fd448f635c8acf86fe4c83e9927"} Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.084871 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-886cc84d4-qdcjh" Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.104098 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-785fc66866-ns7kd" event={"ID":"edc53406-6d30-4981-aa38-cd183ebf1b7d","Type":"ContainerStarted","Data":"2caff5be80429340ce3fa5f5cad2f9ee571a68d4d8f2e2b0c94ae42de7b116b9"} Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.105488 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-785fc66866-ns7kd" Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.129471 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"28ffab6e-5596-4c63-b58a-4417489fc47b","Type":"ContainerStarted","Data":"bb2ac61f6c9f9b97c19779054391582067f647663a27b0ffea554d3207645f69"} Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.142809 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hffl9\" (UniqueName: \"kubernetes.io/projected/5097e9b7-c005-4e68-bc20-bbd6f8b8a290-kube-api-access-hffl9\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.150753 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-lpj75" event={"ID":"5097e9b7-c005-4e68-bc20-bbd6f8b8a290","Type":"ContainerDied","Data":"1fcb3450aab63ec94c4d26b51f42fe4caef6b950f98b83e005cdc04fc9beb12c"} Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 
09:24:20.150805 4776 scope.go:117] "RemoveContainer" containerID="41725b711dd1839ad256cdbebf464427d14539155fc5d5914b3ae781490b9040" Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.150909 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5097e9b7-c005-4e68-bc20-bbd6f8b8a290-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5097e9b7-c005-4e68-bc20-bbd6f8b8a290" (UID: "5097e9b7-c005-4e68-bc20-bbd6f8b8a290"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.150931 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-lpj75" Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.158789 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1a23-account-create-update-cldmj" event={"ID":"f6727a0d-5792-4bf3-9d9b-a84ad470ba82","Type":"ContainerDied","Data":"bc7805b99b906b10b65c01174402cd504a2e4ca4e36bf23884b239f80c134d08"} Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.158851 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc7805b99b906b10b65c01174402cd504a2e4ca4e36bf23884b239f80c134d08" Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.158909 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-1a23-account-create-update-cldmj" Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.198375 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5097e9b7-c005-4e68-bc20-bbd6f8b8a290-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5097e9b7-c005-4e68-bc20-bbd6f8b8a290" (UID: "5097e9b7-c005-4e68-bc20-bbd6f8b8a290"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.210647 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5097e9b7-c005-4e68-bc20-bbd6f8b8a290-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5097e9b7-c005-4e68-bc20-bbd6f8b8a290" (UID: "5097e9b7-c005-4e68-bc20-bbd6f8b8a290"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.246756 4776 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5097e9b7-c005-4e68-bc20-bbd6f8b8a290-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.246787 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5097e9b7-c005-4e68-bc20-bbd6f8b8a290-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.246796 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5097e9b7-c005-4e68-bc20-bbd6f8b8a290-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.261496 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5097e9b7-c005-4e68-bc20-bbd6f8b8a290-config" (OuterVolumeSpecName: "config") pod "5097e9b7-c005-4e68-bc20-bbd6f8b8a290" (UID: "5097e9b7-c005-4e68-bc20-bbd6f8b8a290"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.275064 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5097e9b7-c005-4e68-bc20-bbd6f8b8a290-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5097e9b7-c005-4e68-bc20-bbd6f8b8a290" (UID: "5097e9b7-c005-4e68-bc20-bbd6f8b8a290"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.302128 4776 scope.go:117] "RemoveContainer" containerID="641d43bd51a47d050badbac4cbf44680b1e65636ade4a76fef3a08cba745fb00" Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.348641 4776 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5097e9b7-c005-4e68-bc20-bbd6f8b8a290-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.348670 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5097e9b7-c005-4e68-bc20-bbd6f8b8a290-config\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.555690 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-lpj75"] Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.578388 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-lpj75"] Dec 08 09:24:20 crc kubenswrapper[4776]: E1208 09:24:20.758137 4776 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podedc53406_6d30_4981_aa38_cd183ebf1b7d.slice/crio-conmon-2caff5be80429340ce3fa5f5cad2f9ee571a68d4d8f2e2b0c94ae42de7b116b9.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5097e9b7_c005_4e68_bc20_bbd6f8b8a290.slice/crio-1fcb3450aab63ec94c4d26b51f42fe4caef6b950f98b83e005cdc04fc9beb12c\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podedc53406_6d30_4981_aa38_cd183ebf1b7d.slice/crio-2caff5be80429340ce3fa5f5cad2f9ee571a68d4d8f2e2b0c94ae42de7b116b9.scope\": RecentStats: unable to find data in memory cache]" Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.900881 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-5577b84758-k9tb2"] Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.901236 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-5577b84758-k9tb2" podUID="c99fb0c4-9b67-42b9-87d6-13ae72903740" containerName="heat-api" containerID="cri-o://44910220aaf6fd61dafa8f059ae0cf4d984506ec8cfd361a11dc0b75451b76a1" gracePeriod=60 Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.925333 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-788bcdcb6b-kpzht"] Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.926100 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-788bcdcb6b-kpzht" podUID="39ad8a82-0a3f-4f21-bf0f-a158bd903618" containerName="heat-cfnapi" containerID="cri-o://d1a34192616e1e482d67bb8764abc9f5a5fd6e2697c8c86df334f238a15a4bd4" gracePeriod=60 Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.937567 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/heat-cfnapi-788bcdcb6b-kpzht" podUID="39ad8a82-0a3f-4f21-bf0f-a158bd903618" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.0.209:8000/healthcheck\": EOF" Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.937698 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-788bcdcb6b-kpzht" 
podUID="39ad8a82-0a3f-4f21-bf0f-a158bd903618" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.0.209:8000/healthcheck\": EOF" Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.951730 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-7b4dd4bc68-bwx4q"] Dec 08 09:24:20 crc kubenswrapper[4776]: E1208 09:24:20.952349 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5097e9b7-c005-4e68-bc20-bbd6f8b8a290" containerName="init" Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.952362 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="5097e9b7-c005-4e68-bc20-bbd6f8b8a290" containerName="init" Dec 08 09:24:20 crc kubenswrapper[4776]: E1208 09:24:20.952374 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5097e9b7-c005-4e68-bc20-bbd6f8b8a290" containerName="dnsmasq-dns" Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.952392 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="5097e9b7-c005-4e68-bc20-bbd6f8b8a290" containerName="dnsmasq-dns" Dec 08 09:24:20 crc kubenswrapper[4776]: E1208 09:24:20.952428 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6923b91-d6e2-4672-be62-2531342086e1" containerName="mariadb-account-create-update" Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.952436 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6923b91-d6e2-4672-be62-2531342086e1" containerName="mariadb-account-create-update" Dec 08 09:24:20 crc kubenswrapper[4776]: E1208 09:24:20.952448 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6727a0d-5792-4bf3-9d9b-a84ad470ba82" containerName="mariadb-account-create-update" Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.952453 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6727a0d-5792-4bf3-9d9b-a84ad470ba82" containerName="mariadb-account-create-update" Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.952699 4776 
memory_manager.go:354] "RemoveStaleState removing state" podUID="5097e9b7-c005-4e68-bc20-bbd6f8b8a290" containerName="dnsmasq-dns" Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.952718 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6923b91-d6e2-4672-be62-2531342086e1" containerName="mariadb-account-create-update" Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.952735 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6727a0d-5792-4bf3-9d9b-a84ad470ba82" containerName="mariadb-account-create-update" Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.953701 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7b4dd4bc68-bwx4q" Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.956085 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc" Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.958666 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc" Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.983206 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-5577b84758-k9tb2" podUID="c99fb0c4-9b67-42b9-87d6-13ae72903740" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.0.211:8004/healthcheck\": EOF" Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.983536 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/heat-api-5577b84758-k9tb2" podUID="c99fb0c4-9b67-42b9-87d6-13ae72903740" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.0.211:8004/healthcheck\": EOF" Dec 08 09:24:20 crc kubenswrapper[4776]: I1208 09:24:20.991413 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-58d55cd687-srfgj"] Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:20.992921 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-58d55cd687-srfgj" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.006893 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.007284 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.054462 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7b4dd4bc68-bwx4q"] Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.080596 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcbcd000-25be-4f44-8114-7602c348b58d-combined-ca-bundle\") pod \"heat-api-7b4dd4bc68-bwx4q\" (UID: \"dcbcd000-25be-4f44-8114-7602c348b58d\") " pod="openstack/heat-api-7b4dd4bc68-bwx4q" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.081370 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dcbcd000-25be-4f44-8114-7602c348b58d-config-data-custom\") pod \"heat-api-7b4dd4bc68-bwx4q\" (UID: \"dcbcd000-25be-4f44-8114-7602c348b58d\") " pod="openstack/heat-api-7b4dd4bc68-bwx4q" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.081400 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c51b16c8-d0c8-4109-8cb3-f1799ce5f996-public-tls-certs\") pod \"heat-cfnapi-58d55cd687-srfgj\" (UID: \"c51b16c8-d0c8-4109-8cb3-f1799ce5f996\") " pod="openstack/heat-cfnapi-58d55cd687-srfgj" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.081602 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c51b16c8-d0c8-4109-8cb3-f1799ce5f996-config-data\") pod \"heat-cfnapi-58d55cd687-srfgj\" (UID: \"c51b16c8-d0c8-4109-8cb3-f1799ce5f996\") " pod="openstack/heat-cfnapi-58d55cd687-srfgj" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.081688 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c51b16c8-d0c8-4109-8cb3-f1799ce5f996-internal-tls-certs\") pod \"heat-cfnapi-58d55cd687-srfgj\" (UID: \"c51b16c8-d0c8-4109-8cb3-f1799ce5f996\") " pod="openstack/heat-cfnapi-58d55cd687-srfgj" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.081721 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c51b16c8-d0c8-4109-8cb3-f1799ce5f996-config-data-custom\") pod \"heat-cfnapi-58d55cd687-srfgj\" (UID: \"c51b16c8-d0c8-4109-8cb3-f1799ce5f996\") " pod="openstack/heat-cfnapi-58d55cd687-srfgj" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.081761 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c51b16c8-d0c8-4109-8cb3-f1799ce5f996-combined-ca-bundle\") pod \"heat-cfnapi-58d55cd687-srfgj\" (UID: \"c51b16c8-d0c8-4109-8cb3-f1799ce5f996\") " pod="openstack/heat-cfnapi-58d55cd687-srfgj" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.081956 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcbcd000-25be-4f44-8114-7602c348b58d-config-data\") pod \"heat-api-7b4dd4bc68-bwx4q\" (UID: \"dcbcd000-25be-4f44-8114-7602c348b58d\") " pod="openstack/heat-api-7b4dd4bc68-bwx4q" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.082038 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-4qhmd\" (UniqueName: \"kubernetes.io/projected/c51b16c8-d0c8-4109-8cb3-f1799ce5f996-kube-api-access-4qhmd\") pod \"heat-cfnapi-58d55cd687-srfgj\" (UID: \"c51b16c8-d0c8-4109-8cb3-f1799ce5f996\") " pod="openstack/heat-cfnapi-58d55cd687-srfgj" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.082067 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcbcd000-25be-4f44-8114-7602c348b58d-internal-tls-certs\") pod \"heat-api-7b4dd4bc68-bwx4q\" (UID: \"dcbcd000-25be-4f44-8114-7602c348b58d\") " pod="openstack/heat-api-7b4dd4bc68-bwx4q" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.082128 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcbcd000-25be-4f44-8114-7602c348b58d-public-tls-certs\") pod \"heat-api-7b4dd4bc68-bwx4q\" (UID: \"dcbcd000-25be-4f44-8114-7602c348b58d\") " pod="openstack/heat-api-7b4dd4bc68-bwx4q" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.082158 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx6p7\" (UniqueName: \"kubernetes.io/projected/dcbcd000-25be-4f44-8114-7602c348b58d-kube-api-access-bx6p7\") pod \"heat-api-7b4dd4bc68-bwx4q\" (UID: \"dcbcd000-25be-4f44-8114-7602c348b58d\") " pod="openstack/heat-api-7b4dd4bc68-bwx4q" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.095113 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-api-0" podUID="dcb1d701-bc05-4d4b-8794-ebc4af6da8ba" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.207:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.105575 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/heat-cfnapi-58d55cd687-srfgj"] Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.197598 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"28ffab6e-5596-4c63-b58a-4417489fc47b","Type":"ContainerStarted","Data":"8802bb9807628a744e61f314edf39f1ad9b5560cf3090282d17bdf2fa23752ae"} Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.202459 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcbcd000-25be-4f44-8114-7602c348b58d-public-tls-certs\") pod \"heat-api-7b4dd4bc68-bwx4q\" (UID: \"dcbcd000-25be-4f44-8114-7602c348b58d\") " pod="openstack/heat-api-7b4dd4bc68-bwx4q" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.202529 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx6p7\" (UniqueName: \"kubernetes.io/projected/dcbcd000-25be-4f44-8114-7602c348b58d-kube-api-access-bx6p7\") pod \"heat-api-7b4dd4bc68-bwx4q\" (UID: \"dcbcd000-25be-4f44-8114-7602c348b58d\") " pod="openstack/heat-api-7b4dd4bc68-bwx4q" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.202754 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcbcd000-25be-4f44-8114-7602c348b58d-combined-ca-bundle\") pod \"heat-api-7b4dd4bc68-bwx4q\" (UID: \"dcbcd000-25be-4f44-8114-7602c348b58d\") " pod="openstack/heat-api-7b4dd4bc68-bwx4q" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.207779 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dcbcd000-25be-4f44-8114-7602c348b58d-config-data-custom\") pod \"heat-api-7b4dd4bc68-bwx4q\" (UID: \"dcbcd000-25be-4f44-8114-7602c348b58d\") " pod="openstack/heat-api-7b4dd4bc68-bwx4q" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.207810 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c51b16c8-d0c8-4109-8cb3-f1799ce5f996-public-tls-certs\") pod \"heat-cfnapi-58d55cd687-srfgj\" (UID: \"c51b16c8-d0c8-4109-8cb3-f1799ce5f996\") " pod="openstack/heat-cfnapi-58d55cd687-srfgj" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.207847 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c51b16c8-d0c8-4109-8cb3-f1799ce5f996-config-data\") pod \"heat-cfnapi-58d55cd687-srfgj\" (UID: \"c51b16c8-d0c8-4109-8cb3-f1799ce5f996\") " pod="openstack/heat-cfnapi-58d55cd687-srfgj" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.207884 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c51b16c8-d0c8-4109-8cb3-f1799ce5f996-internal-tls-certs\") pod \"heat-cfnapi-58d55cd687-srfgj\" (UID: \"c51b16c8-d0c8-4109-8cb3-f1799ce5f996\") " pod="openstack/heat-cfnapi-58d55cd687-srfgj" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.207915 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c51b16c8-d0c8-4109-8cb3-f1799ce5f996-config-data-custom\") pod \"heat-cfnapi-58d55cd687-srfgj\" (UID: \"c51b16c8-d0c8-4109-8cb3-f1799ce5f996\") " pod="openstack/heat-cfnapi-58d55cd687-srfgj" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.207952 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c51b16c8-d0c8-4109-8cb3-f1799ce5f996-combined-ca-bundle\") pod \"heat-cfnapi-58d55cd687-srfgj\" (UID: \"c51b16c8-d0c8-4109-8cb3-f1799ce5f996\") " pod="openstack/heat-cfnapi-58d55cd687-srfgj" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.208108 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcbcd000-25be-4f44-8114-7602c348b58d-config-data\") pod \"heat-api-7b4dd4bc68-bwx4q\" (UID: \"dcbcd000-25be-4f44-8114-7602c348b58d\") " pod="openstack/heat-api-7b4dd4bc68-bwx4q" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.211907 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcbcd000-25be-4f44-8114-7602c348b58d-combined-ca-bundle\") pod \"heat-api-7b4dd4bc68-bwx4q\" (UID: \"dcbcd000-25be-4f44-8114-7602c348b58d\") " pod="openstack/heat-api-7b4dd4bc68-bwx4q" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.214828 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qhmd\" (UniqueName: \"kubernetes.io/projected/c51b16c8-d0c8-4109-8cb3-f1799ce5f996-kube-api-access-4qhmd\") pod \"heat-cfnapi-58d55cd687-srfgj\" (UID: \"c51b16c8-d0c8-4109-8cb3-f1799ce5f996\") " pod="openstack/heat-cfnapi-58d55cd687-srfgj" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.216362 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcbcd000-25be-4f44-8114-7602c348b58d-internal-tls-certs\") pod \"heat-api-7b4dd4bc68-bwx4q\" (UID: \"dcbcd000-25be-4f44-8114-7602c348b58d\") " pod="openstack/heat-api-7b4dd4bc68-bwx4q" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.217254 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c51b16c8-d0c8-4109-8cb3-f1799ce5f996-config-data\") pod \"heat-cfnapi-58d55cd687-srfgj\" (UID: \"c51b16c8-d0c8-4109-8cb3-f1799ce5f996\") " pod="openstack/heat-cfnapi-58d55cd687-srfgj" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.220710 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/dcbcd000-25be-4f44-8114-7602c348b58d-public-tls-certs\") pod \"heat-api-7b4dd4bc68-bwx4q\" (UID: \"dcbcd000-25be-4f44-8114-7602c348b58d\") " pod="openstack/heat-api-7b4dd4bc68-bwx4q" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.226785 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcbcd000-25be-4f44-8114-7602c348b58d-config-data\") pod \"heat-api-7b4dd4bc68-bwx4q\" (UID: \"dcbcd000-25be-4f44-8114-7602c348b58d\") " pod="openstack/heat-api-7b4dd4bc68-bwx4q" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.230033 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c51b16c8-d0c8-4109-8cb3-f1799ce5f996-public-tls-certs\") pod \"heat-cfnapi-58d55cd687-srfgj\" (UID: \"c51b16c8-d0c8-4109-8cb3-f1799ce5f996\") " pod="openstack/heat-cfnapi-58d55cd687-srfgj" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.236896 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dcbcd000-25be-4f44-8114-7602c348b58d-config-data-custom\") pod \"heat-api-7b4dd4bc68-bwx4q\" (UID: \"dcbcd000-25be-4f44-8114-7602c348b58d\") " pod="openstack/heat-api-7b4dd4bc68-bwx4q" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.237632 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c51b16c8-d0c8-4109-8cb3-f1799ce5f996-internal-tls-certs\") pod \"heat-cfnapi-58d55cd687-srfgj\" (UID: \"c51b16c8-d0c8-4109-8cb3-f1799ce5f996\") " pod="openstack/heat-cfnapi-58d55cd687-srfgj" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.238404 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcbcd000-25be-4f44-8114-7602c348b58d-internal-tls-certs\") pod 
\"heat-api-7b4dd4bc68-bwx4q\" (UID: \"dcbcd000-25be-4f44-8114-7602c348b58d\") " pod="openstack/heat-api-7b4dd4bc68-bwx4q" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.238603 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c51b16c8-d0c8-4109-8cb3-f1799ce5f996-config-data-custom\") pod \"heat-cfnapi-58d55cd687-srfgj\" (UID: \"c51b16c8-d0c8-4109-8cb3-f1799ce5f996\") " pod="openstack/heat-cfnapi-58d55cd687-srfgj" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.239593 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx6p7\" (UniqueName: \"kubernetes.io/projected/dcbcd000-25be-4f44-8114-7602c348b58d-kube-api-access-bx6p7\") pod \"heat-api-7b4dd4bc68-bwx4q\" (UID: \"dcbcd000-25be-4f44-8114-7602c348b58d\") " pod="openstack/heat-api-7b4dd4bc68-bwx4q" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.239713 4776 generic.go:334] "Generic (PLEG): container finished" podID="773f52b6-826d-4179-8777-96d795b10c5d" containerID="b3d4237d2a4255a817185daec7a7b389fc523fd448f635c8acf86fe4c83e9927" exitCode=1 Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.239759 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-886cc84d4-qdcjh" event={"ID":"773f52b6-826d-4179-8777-96d795b10c5d","Type":"ContainerDied","Data":"b3d4237d2a4255a817185daec7a7b389fc523fd448f635c8acf86fe4c83e9927"} Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.239790 4776 scope.go:117] "RemoveContainer" containerID="b9345d39e1a66e29cb6052c1d6c7b43edd9dc2df0f1afe69e91fbc71033257f5" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.244471 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c51b16c8-d0c8-4109-8cb3-f1799ce5f996-combined-ca-bundle\") pod \"heat-cfnapi-58d55cd687-srfgj\" (UID: \"c51b16c8-d0c8-4109-8cb3-f1799ce5f996\") " 
pod="openstack/heat-cfnapi-58d55cd687-srfgj" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.245191 4776 scope.go:117] "RemoveContainer" containerID="b3d4237d2a4255a817185daec7a7b389fc523fd448f635c8acf86fe4c83e9927" Dec 08 09:24:21 crc kubenswrapper[4776]: E1208 09:24:21.245529 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-886cc84d4-qdcjh_openstack(773f52b6-826d-4179-8777-96d795b10c5d)\"" pod="openstack/heat-api-886cc84d4-qdcjh" podUID="773f52b6-826d-4179-8777-96d795b10c5d" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.247873 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qhmd\" (UniqueName: \"kubernetes.io/projected/c51b16c8-d0c8-4109-8cb3-f1799ce5f996-kube-api-access-4qhmd\") pod \"heat-cfnapi-58d55cd687-srfgj\" (UID: \"c51b16c8-d0c8-4109-8cb3-f1799ce5f996\") " pod="openstack/heat-cfnapi-58d55cd687-srfgj" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.282020 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-7b4dd4bc68-bwx4q" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.283301 4776 generic.go:334] "Generic (PLEG): container finished" podID="edc53406-6d30-4981-aa38-cd183ebf1b7d" containerID="2caff5be80429340ce3fa5f5cad2f9ee571a68d4d8f2e2b0c94ae42de7b116b9" exitCode=1 Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.283392 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-785fc66866-ns7kd" event={"ID":"edc53406-6d30-4981-aa38-cd183ebf1b7d","Type":"ContainerDied","Data":"2caff5be80429340ce3fa5f5cad2f9ee571a68d4d8f2e2b0c94ae42de7b116b9"} Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.283785 4776 scope.go:117] "RemoveContainer" containerID="2caff5be80429340ce3fa5f5cad2f9ee571a68d4d8f2e2b0c94ae42de7b116b9" Dec 08 09:24:21 crc kubenswrapper[4776]: E1208 09:24:21.284069 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-785fc66866-ns7kd_openstack(edc53406-6d30-4981-aa38-cd183ebf1b7d)\"" pod="openstack/heat-cfnapi-785fc66866-ns7kd" podUID="edc53406-6d30-4981-aa38-cd183ebf1b7d" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.373671 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-58d55cd687-srfgj" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.382237 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zw8wj"] Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.383753 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zw8wj" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.388562 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.399718 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zw8wj"] Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.402640 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.402806 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-frkls" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.435665 4776 scope.go:117] "RemoveContainer" containerID="52ca12eabecbad5612d04acd98993d5c242c5c40ca59257c1d760002b46ce967" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.536289 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2c03694-357a-4838-8202-7e3d3196f9ca-config-data\") pod \"nova-cell0-conductor-db-sync-zw8wj\" (UID: \"c2c03694-357a-4838-8202-7e3d3196f9ca\") " pod="openstack/nova-cell0-conductor-db-sync-zw8wj" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.536634 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2c03694-357a-4838-8202-7e3d3196f9ca-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-zw8wj\" (UID: \"c2c03694-357a-4838-8202-7e3d3196f9ca\") " pod="openstack/nova-cell0-conductor-db-sync-zw8wj" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.536725 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7tnb\" (UniqueName: 
\"kubernetes.io/projected/c2c03694-357a-4838-8202-7e3d3196f9ca-kube-api-access-d7tnb\") pod \"nova-cell0-conductor-db-sync-zw8wj\" (UID: \"c2c03694-357a-4838-8202-7e3d3196f9ca\") " pod="openstack/nova-cell0-conductor-db-sync-zw8wj" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.536764 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2c03694-357a-4838-8202-7e3d3196f9ca-scripts\") pod \"nova-cell0-conductor-db-sync-zw8wj\" (UID: \"c2c03694-357a-4838-8202-7e3d3196f9ca\") " pod="openstack/nova-cell0-conductor-db-sync-zw8wj" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.639044 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2c03694-357a-4838-8202-7e3d3196f9ca-scripts\") pod \"nova-cell0-conductor-db-sync-zw8wj\" (UID: \"c2c03694-357a-4838-8202-7e3d3196f9ca\") " pod="openstack/nova-cell0-conductor-db-sync-zw8wj" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.650811 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2c03694-357a-4838-8202-7e3d3196f9ca-scripts\") pod \"nova-cell0-conductor-db-sync-zw8wj\" (UID: \"c2c03694-357a-4838-8202-7e3d3196f9ca\") " pod="openstack/nova-cell0-conductor-db-sync-zw8wj" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.651048 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2c03694-357a-4838-8202-7e3d3196f9ca-config-data\") pod \"nova-cell0-conductor-db-sync-zw8wj\" (UID: \"c2c03694-357a-4838-8202-7e3d3196f9ca\") " pod="openstack/nova-cell0-conductor-db-sync-zw8wj" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.651516 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c2c03694-357a-4838-8202-7e3d3196f9ca-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-zw8wj\" (UID: \"c2c03694-357a-4838-8202-7e3d3196f9ca\") " pod="openstack/nova-cell0-conductor-db-sync-zw8wj" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.651676 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7tnb\" (UniqueName: \"kubernetes.io/projected/c2c03694-357a-4838-8202-7e3d3196f9ca-kube-api-access-d7tnb\") pod \"nova-cell0-conductor-db-sync-zw8wj\" (UID: \"c2c03694-357a-4838-8202-7e3d3196f9ca\") " pod="openstack/nova-cell0-conductor-db-sync-zw8wj" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.657275 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2c03694-357a-4838-8202-7e3d3196f9ca-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-zw8wj\" (UID: \"c2c03694-357a-4838-8202-7e3d3196f9ca\") " pod="openstack/nova-cell0-conductor-db-sync-zw8wj" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.659484 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2c03694-357a-4838-8202-7e3d3196f9ca-config-data\") pod \"nova-cell0-conductor-db-sync-zw8wj\" (UID: \"c2c03694-357a-4838-8202-7e3d3196f9ca\") " pod="openstack/nova-cell0-conductor-db-sync-zw8wj" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.678710 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7tnb\" (UniqueName: \"kubernetes.io/projected/c2c03694-357a-4838-8202-7e3d3196f9ca-kube-api-access-d7tnb\") pod \"nova-cell0-conductor-db-sync-zw8wj\" (UID: \"c2c03694-357a-4838-8202-7e3d3196f9ca\") " pod="openstack/nova-cell0-conductor-db-sync-zw8wj" Dec 08 09:24:21 crc kubenswrapper[4776]: I1208 09:24:21.755941 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zw8wj" Dec 08 09:24:22 crc kubenswrapper[4776]: I1208 09:24:22.020758 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7b4dd4bc68-bwx4q"] Dec 08 09:24:22 crc kubenswrapper[4776]: I1208 09:24:22.087435 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="dcb1d701-bc05-4d4b-8794-ebc4af6da8ba" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.207:8776/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 08 09:24:22 crc kubenswrapper[4776]: I1208 09:24:22.269617 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-58d55cd687-srfgj"] Dec 08 09:24:22 crc kubenswrapper[4776]: W1208 09:24:22.281142 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc51b16c8_d0c8_4109_8cb3_f1799ce5f996.slice/crio-8349833288b381f00a77a298518f5ada646e391d674bd52054cfc0ccdf689c08 WatchSource:0}: Error finding container 8349833288b381f00a77a298518f5ada646e391d674bd52054cfc0ccdf689c08: Status 404 returned error can't find the container with id 8349833288b381f00a77a298518f5ada646e391d674bd52054cfc0ccdf689c08 Dec 08 09:24:22 crc kubenswrapper[4776]: I1208 09:24:22.311654 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7b4dd4bc68-bwx4q" event={"ID":"dcbcd000-25be-4f44-8114-7602c348b58d","Type":"ContainerStarted","Data":"31815974efb56de87ec196dc1a7062ef90234e7777529c7e62e57038509af189"} Dec 08 09:24:22 crc kubenswrapper[4776]: I1208 09:24:22.327124 4776 scope.go:117] "RemoveContainer" containerID="b3d4237d2a4255a817185daec7a7b389fc523fd448f635c8acf86fe4c83e9927" Dec 08 09:24:22 crc kubenswrapper[4776]: E1208 09:24:22.327412 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with 
CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-886cc84d4-qdcjh_openstack(773f52b6-826d-4179-8777-96d795b10c5d)\"" pod="openstack/heat-api-886cc84d4-qdcjh" podUID="773f52b6-826d-4179-8777-96d795b10c5d" Dec 08 09:24:22 crc kubenswrapper[4776]: I1208 09:24:22.350706 4776 scope.go:117] "RemoveContainer" containerID="2caff5be80429340ce3fa5f5cad2f9ee571a68d4d8f2e2b0c94ae42de7b116b9" Dec 08 09:24:22 crc kubenswrapper[4776]: E1208 09:24:22.350937 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-785fc66866-ns7kd_openstack(edc53406-6d30-4981-aa38-cd183ebf1b7d)\"" pod="openstack/heat-cfnapi-785fc66866-ns7kd" podUID="edc53406-6d30-4981-aa38-cd183ebf1b7d" Dec 08 09:24:22 crc kubenswrapper[4776]: I1208 09:24:22.401800 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5097e9b7-c005-4e68-bc20-bbd6f8b8a290" path="/var/lib/kubelet/pods/5097e9b7-c005-4e68-bc20-bbd6f8b8a290/volumes" Dec 08 09:24:22 crc kubenswrapper[4776]: I1208 09:24:22.680308 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zw8wj"] Dec 08 09:24:23 crc kubenswrapper[4776]: I1208 09:24:23.429328 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-58d55cd687-srfgj" event={"ID":"c51b16c8-d0c8-4109-8cb3-f1799ce5f996","Type":"ContainerStarted","Data":"3b0ac0b8fd0ac6f80f7fc079e316e648e1a204fe4a69bd0cda6755200789a450"} Dec 08 09:24:23 crc kubenswrapper[4776]: I1208 09:24:23.429816 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-58d55cd687-srfgj" event={"ID":"c51b16c8-d0c8-4109-8cb3-f1799ce5f996","Type":"ContainerStarted","Data":"8349833288b381f00a77a298518f5ada646e391d674bd52054cfc0ccdf689c08"} Dec 08 09:24:23 crc kubenswrapper[4776]: I1208 09:24:23.431235 4776 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/heat-cfnapi-58d55cd687-srfgj" Dec 08 09:24:23 crc kubenswrapper[4776]: I1208 09:24:23.438955 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7b4dd4bc68-bwx4q" event={"ID":"dcbcd000-25be-4f44-8114-7602c348b58d","Type":"ContainerStarted","Data":"7ac51cbf9a132bb867cd71c7f2264fe5626030d540e35cb954106c984aeff1bf"} Dec 08 09:24:23 crc kubenswrapper[4776]: I1208 09:24:23.440052 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-7b4dd4bc68-bwx4q" Dec 08 09:24:23 crc kubenswrapper[4776]: I1208 09:24:23.448735 4776 generic.go:334] "Generic (PLEG): container finished" podID="41881b6c-dcfb-4d67-ad0c-f0e003837c8e" containerID="ca7646b3e4cd7851105e1b476bc4c75e76c7bad3c112fbae7638b166ccb69971" exitCode=0 Dec 08 09:24:23 crc kubenswrapper[4776]: I1208 09:24:23.448777 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41881b6c-dcfb-4d67-ad0c-f0e003837c8e","Type":"ContainerDied","Data":"ca7646b3e4cd7851105e1b476bc4c75e76c7bad3c112fbae7638b166ccb69971"} Dec 08 09:24:23 crc kubenswrapper[4776]: I1208 09:24:23.456056 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zw8wj" event={"ID":"c2c03694-357a-4838-8202-7e3d3196f9ca","Type":"ContainerStarted","Data":"1c5ae13ab7404f4aefec5858b674d29b544e7449cee1b009f71982d1e5089ee3"} Dec 08 09:24:23 crc kubenswrapper[4776]: I1208 09:24:23.462026 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"28ffab6e-5596-4c63-b58a-4417489fc47b","Type":"ContainerStarted","Data":"d3bd18210fe129ef7bc8e3c9d382c703953589872c8e987fc60caec180b13f63"} Dec 08 09:24:23 crc kubenswrapper[4776]: I1208 09:24:23.475908 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-58d55cd687-srfgj" podStartSLOduration=3.475879844 podStartE2EDuration="3.475879844s" 
podCreationTimestamp="2025-12-08 09:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:24:23.463046981 +0000 UTC m=+1539.726272003" watchObservedRunningTime="2025-12-08 09:24:23.475879844 +0000 UTC m=+1539.739104886" Dec 08 09:24:23 crc kubenswrapper[4776]: I1208 09:24:23.497930 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-7b4dd4bc68-bwx4q" podStartSLOduration=3.497908776 podStartE2EDuration="3.497908776s" podCreationTimestamp="2025-12-08 09:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:24:23.480599711 +0000 UTC m=+1539.743824733" watchObservedRunningTime="2025-12-08 09:24:23.497908776 +0000 UTC m=+1539.761133798" Dec 08 09:24:23 crc kubenswrapper[4776]: I1208 09:24:23.556535 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.5565140280000005 podStartE2EDuration="6.556514028s" podCreationTimestamp="2025-12-08 09:24:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:24:23.52450501 +0000 UTC m=+1539.787730032" watchObservedRunningTime="2025-12-08 09:24:23.556514028 +0000 UTC m=+1539.819739050" Dec 08 09:24:23 crc kubenswrapper[4776]: I1208 09:24:23.639316 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 08 09:24:23 crc kubenswrapper[4776]: I1208 09:24:23.717404 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/41881b6c-dcfb-4d67-ad0c-f0e003837c8e-sg-core-conf-yaml\") pod \"41881b6c-dcfb-4d67-ad0c-f0e003837c8e\" (UID: \"41881b6c-dcfb-4d67-ad0c-f0e003837c8e\") " Dec 08 09:24:23 crc kubenswrapper[4776]: I1208 09:24:23.717459 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41881b6c-dcfb-4d67-ad0c-f0e003837c8e-config-data\") pod \"41881b6c-dcfb-4d67-ad0c-f0e003837c8e\" (UID: \"41881b6c-dcfb-4d67-ad0c-f0e003837c8e\") " Dec 08 09:24:23 crc kubenswrapper[4776]: I1208 09:24:23.717550 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41881b6c-dcfb-4d67-ad0c-f0e003837c8e-log-httpd\") pod \"41881b6c-dcfb-4d67-ad0c-f0e003837c8e\" (UID: \"41881b6c-dcfb-4d67-ad0c-f0e003837c8e\") " Dec 08 09:24:23 crc kubenswrapper[4776]: I1208 09:24:23.717571 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41881b6c-dcfb-4d67-ad0c-f0e003837c8e-run-httpd\") pod \"41881b6c-dcfb-4d67-ad0c-f0e003837c8e\" (UID: \"41881b6c-dcfb-4d67-ad0c-f0e003837c8e\") " Dec 08 09:24:23 crc kubenswrapper[4776]: I1208 09:24:23.717602 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41881b6c-dcfb-4d67-ad0c-f0e003837c8e-scripts\") pod \"41881b6c-dcfb-4d67-ad0c-f0e003837c8e\" (UID: \"41881b6c-dcfb-4d67-ad0c-f0e003837c8e\") " Dec 08 09:24:23 crc kubenswrapper[4776]: I1208 09:24:23.717718 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trcxh\" (UniqueName: 
\"kubernetes.io/projected/41881b6c-dcfb-4d67-ad0c-f0e003837c8e-kube-api-access-trcxh\") pod \"41881b6c-dcfb-4d67-ad0c-f0e003837c8e\" (UID: \"41881b6c-dcfb-4d67-ad0c-f0e003837c8e\") " Dec 08 09:24:23 crc kubenswrapper[4776]: I1208 09:24:23.717823 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41881b6c-dcfb-4d67-ad0c-f0e003837c8e-combined-ca-bundle\") pod \"41881b6c-dcfb-4d67-ad0c-f0e003837c8e\" (UID: \"41881b6c-dcfb-4d67-ad0c-f0e003837c8e\") " Dec 08 09:24:23 crc kubenswrapper[4776]: I1208 09:24:23.720359 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41881b6c-dcfb-4d67-ad0c-f0e003837c8e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "41881b6c-dcfb-4d67-ad0c-f0e003837c8e" (UID: "41881b6c-dcfb-4d67-ad0c-f0e003837c8e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:24:23 crc kubenswrapper[4776]: I1208 09:24:23.722479 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41881b6c-dcfb-4d67-ad0c-f0e003837c8e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "41881b6c-dcfb-4d67-ad0c-f0e003837c8e" (UID: "41881b6c-dcfb-4d67-ad0c-f0e003837c8e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:24:23 crc kubenswrapper[4776]: I1208 09:24:23.727792 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41881b6c-dcfb-4d67-ad0c-f0e003837c8e-scripts" (OuterVolumeSpecName: "scripts") pod "41881b6c-dcfb-4d67-ad0c-f0e003837c8e" (UID: "41881b6c-dcfb-4d67-ad0c-f0e003837c8e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:24:23 crc kubenswrapper[4776]: I1208 09:24:23.742301 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41881b6c-dcfb-4d67-ad0c-f0e003837c8e-kube-api-access-trcxh" (OuterVolumeSpecName: "kube-api-access-trcxh") pod "41881b6c-dcfb-4d67-ad0c-f0e003837c8e" (UID: "41881b6c-dcfb-4d67-ad0c-f0e003837c8e"). InnerVolumeSpecName "kube-api-access-trcxh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:24:23 crc kubenswrapper[4776]: I1208 09:24:23.774271 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41881b6c-dcfb-4d67-ad0c-f0e003837c8e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "41881b6c-dcfb-4d67-ad0c-f0e003837c8e" (UID: "41881b6c-dcfb-4d67-ad0c-f0e003837c8e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:24:23 crc kubenswrapper[4776]: I1208 09:24:23.822526 4776 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41881b6c-dcfb-4d67-ad0c-f0e003837c8e-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:23 crc kubenswrapper[4776]: I1208 09:24:23.822559 4776 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41881b6c-dcfb-4d67-ad0c-f0e003837c8e-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:23 crc kubenswrapper[4776]: I1208 09:24:23.822567 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41881b6c-dcfb-4d67-ad0c-f0e003837c8e-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:23 crc kubenswrapper[4776]: I1208 09:24:23.822575 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trcxh\" (UniqueName: \"kubernetes.io/projected/41881b6c-dcfb-4d67-ad0c-f0e003837c8e-kube-api-access-trcxh\") on node \"crc\" 
DevicePath \"\"" Dec 08 09:24:23 crc kubenswrapper[4776]: I1208 09:24:23.822585 4776 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/41881b6c-dcfb-4d67-ad0c-f0e003837c8e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:23 crc kubenswrapper[4776]: I1208 09:24:23.868515 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41881b6c-dcfb-4d67-ad0c-f0e003837c8e-config-data" (OuterVolumeSpecName: "config-data") pod "41881b6c-dcfb-4d67-ad0c-f0e003837c8e" (UID: "41881b6c-dcfb-4d67-ad0c-f0e003837c8e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:24:23 crc kubenswrapper[4776]: I1208 09:24:23.872276 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 08 09:24:23 crc kubenswrapper[4776]: I1208 09:24:23.872319 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 08 09:24:23 crc kubenswrapper[4776]: I1208 09:24:23.881413 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41881b6c-dcfb-4d67-ad0c-f0e003837c8e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "41881b6c-dcfb-4d67-ad0c-f0e003837c8e" (UID: "41881b6c-dcfb-4d67-ad0c-f0e003837c8e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:24:23 crc kubenswrapper[4776]: I1208 09:24:23.919908 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 08 09:24:23 crc kubenswrapper[4776]: I1208 09:24:23.926485 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 08 09:24:23 crc kubenswrapper[4776]: I1208 09:24:23.932243 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41881b6c-dcfb-4d67-ad0c-f0e003837c8e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:23 crc kubenswrapper[4776]: I1208 09:24:23.932276 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41881b6c-dcfb-4d67-ad0c-f0e003837c8e-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:24 crc kubenswrapper[4776]: I1208 09:24:24.474349 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41881b6c-dcfb-4d67-ad0c-f0e003837c8e","Type":"ContainerDied","Data":"9d91170523c7bc798339025e32cbdd6c44ea3babde4927847739101e04630e5b"} Dec 08 09:24:24 crc kubenswrapper[4776]: I1208 09:24:24.474631 4776 scope.go:117] "RemoveContainer" containerID="bc6b0c69d05e7a2d352f35926981925451c42706a2be9d9182d422c4ef4f2cd5" Dec 08 09:24:24 crc kubenswrapper[4776]: I1208 09:24:24.475372 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 08 09:24:24 crc kubenswrapper[4776]: I1208 09:24:24.475400 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 08 09:24:24 crc kubenswrapper[4776]: I1208 09:24:24.475426 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 08 09:24:24 crc kubenswrapper[4776]: I1208 09:24:24.520003 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 08 09:24:24 crc kubenswrapper[4776]: I1208 09:24:24.526317 4776 scope.go:117] "RemoveContainer" containerID="91642be302f5531e66f9c1e5ce36663bb8a9a07641f46cc0e66d481153bc4238" Dec 08 09:24:24 crc kubenswrapper[4776]: I1208 09:24:24.565320 4776 scope.go:117] "RemoveContainer" containerID="2170701a28be7b605cdcde1013dd35b1fa0ea062119d4370a269937c5c0b27ec" Dec 08 09:24:24 crc kubenswrapper[4776]: I1208 09:24:24.576296 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 08 09:24:24 crc kubenswrapper[4776]: I1208 09:24:24.586078 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 08 09:24:24 crc kubenswrapper[4776]: E1208 09:24:24.586639 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41881b6c-dcfb-4d67-ad0c-f0e003837c8e" containerName="proxy-httpd" Dec 08 09:24:24 crc kubenswrapper[4776]: I1208 09:24:24.586657 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="41881b6c-dcfb-4d67-ad0c-f0e003837c8e" containerName="proxy-httpd" Dec 08 09:24:24 crc kubenswrapper[4776]: E1208 09:24:24.586678 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41881b6c-dcfb-4d67-ad0c-f0e003837c8e" containerName="ceilometer-notification-agent" Dec 08 09:24:24 crc kubenswrapper[4776]: I1208 09:24:24.586687 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="41881b6c-dcfb-4d67-ad0c-f0e003837c8e" containerName="ceilometer-notification-agent" Dec 08 09:24:24 crc kubenswrapper[4776]: E1208 09:24:24.586705 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41881b6c-dcfb-4d67-ad0c-f0e003837c8e" containerName="sg-core" Dec 08 09:24:24 crc kubenswrapper[4776]: I1208 09:24:24.586713 4776 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="41881b6c-dcfb-4d67-ad0c-f0e003837c8e" containerName="sg-core" Dec 08 09:24:24 crc kubenswrapper[4776]: E1208 09:24:24.586729 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41881b6c-dcfb-4d67-ad0c-f0e003837c8e" containerName="ceilometer-central-agent" Dec 08 09:24:24 crc kubenswrapper[4776]: I1208 09:24:24.586735 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="41881b6c-dcfb-4d67-ad0c-f0e003837c8e" containerName="ceilometer-central-agent" Dec 08 09:24:24 crc kubenswrapper[4776]: I1208 09:24:24.586962 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="41881b6c-dcfb-4d67-ad0c-f0e003837c8e" containerName="ceilometer-central-agent" Dec 08 09:24:24 crc kubenswrapper[4776]: I1208 09:24:24.586987 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="41881b6c-dcfb-4d67-ad0c-f0e003837c8e" containerName="sg-core" Dec 08 09:24:24 crc kubenswrapper[4776]: I1208 09:24:24.587004 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="41881b6c-dcfb-4d67-ad0c-f0e003837c8e" containerName="ceilometer-notification-agent" Dec 08 09:24:24 crc kubenswrapper[4776]: I1208 09:24:24.587016 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="41881b6c-dcfb-4d67-ad0c-f0e003837c8e" containerName="proxy-httpd" Dec 08 09:24:24 crc kubenswrapper[4776]: I1208 09:24:24.589166 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 08 09:24:24 crc kubenswrapper[4776]: I1208 09:24:24.593667 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 08 09:24:24 crc kubenswrapper[4776]: I1208 09:24:24.593898 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 08 09:24:24 crc kubenswrapper[4776]: I1208 09:24:24.594500 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 08 09:24:24 crc kubenswrapper[4776]: I1208 09:24:24.598039 4776 scope.go:117] "RemoveContainer" containerID="ca7646b3e4cd7851105e1b476bc4c75e76c7bad3c112fbae7638b166ccb69971" Dec 08 09:24:24 crc kubenswrapper[4776]: I1208 09:24:24.748576 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfea15ec-7a58-4283-831a-fdfb2c03918c-run-httpd\") pod \"ceilometer-0\" (UID: \"cfea15ec-7a58-4283-831a-fdfb2c03918c\") " pod="openstack/ceilometer-0" Dec 08 09:24:24 crc kubenswrapper[4776]: I1208 09:24:24.748640 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfea15ec-7a58-4283-831a-fdfb2c03918c-log-httpd\") pod \"ceilometer-0\" (UID: \"cfea15ec-7a58-4283-831a-fdfb2c03918c\") " pod="openstack/ceilometer-0" Dec 08 09:24:24 crc kubenswrapper[4776]: I1208 09:24:24.748685 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfea15ec-7a58-4283-831a-fdfb2c03918c-scripts\") pod \"ceilometer-0\" (UID: \"cfea15ec-7a58-4283-831a-fdfb2c03918c\") " pod="openstack/ceilometer-0" Dec 08 09:24:24 crc kubenswrapper[4776]: I1208 09:24:24.748714 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/cfea15ec-7a58-4283-831a-fdfb2c03918c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cfea15ec-7a58-4283-831a-fdfb2c03918c\") " pod="openstack/ceilometer-0" Dec 08 09:24:24 crc kubenswrapper[4776]: I1208 09:24:24.748793 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfea15ec-7a58-4283-831a-fdfb2c03918c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cfea15ec-7a58-4283-831a-fdfb2c03918c\") " pod="openstack/ceilometer-0" Dec 08 09:24:24 crc kubenswrapper[4776]: I1208 09:24:24.749027 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfea15ec-7a58-4283-831a-fdfb2c03918c-config-data\") pod \"ceilometer-0\" (UID: \"cfea15ec-7a58-4283-831a-fdfb2c03918c\") " pod="openstack/ceilometer-0" Dec 08 09:24:24 crc kubenswrapper[4776]: I1208 09:24:24.749068 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbl79\" (UniqueName: \"kubernetes.io/projected/cfea15ec-7a58-4283-831a-fdfb2c03918c-kube-api-access-rbl79\") pod \"ceilometer-0\" (UID: \"cfea15ec-7a58-4283-831a-fdfb2c03918c\") " pod="openstack/ceilometer-0" Dec 08 09:24:24 crc kubenswrapper[4776]: I1208 09:24:24.851388 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfea15ec-7a58-4283-831a-fdfb2c03918c-scripts\") pod \"ceilometer-0\" (UID: \"cfea15ec-7a58-4283-831a-fdfb2c03918c\") " pod="openstack/ceilometer-0" Dec 08 09:24:24 crc kubenswrapper[4776]: I1208 09:24:24.851785 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cfea15ec-7a58-4283-831a-fdfb2c03918c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cfea15ec-7a58-4283-831a-fdfb2c03918c\") " 
pod="openstack/ceilometer-0" Dec 08 09:24:24 crc kubenswrapper[4776]: I1208 09:24:24.851859 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfea15ec-7a58-4283-831a-fdfb2c03918c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cfea15ec-7a58-4283-831a-fdfb2c03918c\") " pod="openstack/ceilometer-0" Dec 08 09:24:24 crc kubenswrapper[4776]: I1208 09:24:24.851933 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfea15ec-7a58-4283-831a-fdfb2c03918c-config-data\") pod \"ceilometer-0\" (UID: \"cfea15ec-7a58-4283-831a-fdfb2c03918c\") " pod="openstack/ceilometer-0" Dec 08 09:24:24 crc kubenswrapper[4776]: I1208 09:24:24.851957 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbl79\" (UniqueName: \"kubernetes.io/projected/cfea15ec-7a58-4283-831a-fdfb2c03918c-kube-api-access-rbl79\") pod \"ceilometer-0\" (UID: \"cfea15ec-7a58-4283-831a-fdfb2c03918c\") " pod="openstack/ceilometer-0" Dec 08 09:24:24 crc kubenswrapper[4776]: I1208 09:24:24.852061 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfea15ec-7a58-4283-831a-fdfb2c03918c-run-httpd\") pod \"ceilometer-0\" (UID: \"cfea15ec-7a58-4283-831a-fdfb2c03918c\") " pod="openstack/ceilometer-0" Dec 08 09:24:24 crc kubenswrapper[4776]: I1208 09:24:24.852101 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfea15ec-7a58-4283-831a-fdfb2c03918c-log-httpd\") pod \"ceilometer-0\" (UID: \"cfea15ec-7a58-4283-831a-fdfb2c03918c\") " pod="openstack/ceilometer-0" Dec 08 09:24:24 crc kubenswrapper[4776]: I1208 09:24:24.852561 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/cfea15ec-7a58-4283-831a-fdfb2c03918c-log-httpd\") pod \"ceilometer-0\" (UID: \"cfea15ec-7a58-4283-831a-fdfb2c03918c\") " pod="openstack/ceilometer-0" Dec 08 09:24:24 crc kubenswrapper[4776]: I1208 09:24:24.853920 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfea15ec-7a58-4283-831a-fdfb2c03918c-run-httpd\") pod \"ceilometer-0\" (UID: \"cfea15ec-7a58-4283-831a-fdfb2c03918c\") " pod="openstack/ceilometer-0" Dec 08 09:24:24 crc kubenswrapper[4776]: I1208 09:24:24.857693 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cfea15ec-7a58-4283-831a-fdfb2c03918c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cfea15ec-7a58-4283-831a-fdfb2c03918c\") " pod="openstack/ceilometer-0" Dec 08 09:24:24 crc kubenswrapper[4776]: I1208 09:24:24.857856 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfea15ec-7a58-4283-831a-fdfb2c03918c-scripts\") pod \"ceilometer-0\" (UID: \"cfea15ec-7a58-4283-831a-fdfb2c03918c\") " pod="openstack/ceilometer-0" Dec 08 09:24:24 crc kubenswrapper[4776]: I1208 09:24:24.858273 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfea15ec-7a58-4283-831a-fdfb2c03918c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cfea15ec-7a58-4283-831a-fdfb2c03918c\") " pod="openstack/ceilometer-0" Dec 08 09:24:24 crc kubenswrapper[4776]: I1208 09:24:24.867200 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfea15ec-7a58-4283-831a-fdfb2c03918c-config-data\") pod \"ceilometer-0\" (UID: \"cfea15ec-7a58-4283-831a-fdfb2c03918c\") " pod="openstack/ceilometer-0" Dec 08 09:24:24 crc kubenswrapper[4776]: I1208 09:24:24.869168 4776 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-rbl79\" (UniqueName: \"kubernetes.io/projected/cfea15ec-7a58-4283-831a-fdfb2c03918c-kube-api-access-rbl79\") pod \"ceilometer-0\" (UID: \"cfea15ec-7a58-4283-831a-fdfb2c03918c\") " pod="openstack/ceilometer-0" Dec 08 09:24:24 crc kubenswrapper[4776]: I1208 09:24:24.913505 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 08 09:24:24 crc kubenswrapper[4776]: I1208 09:24:24.976772 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 08 09:24:25 crc kubenswrapper[4776]: I1208 09:24:25.439154 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-788bcdcb6b-kpzht" podUID="39ad8a82-0a3f-4f21-bf0f-a158bd903618" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.0.209:8000/healthcheck\": read tcp 10.217.0.2:59376->10.217.0.209:8000: read: connection reset by peer" Dec 08 09:24:25 crc kubenswrapper[4776]: I1208 09:24:25.440101 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-788bcdcb6b-kpzht" podUID="39ad8a82-0a3f-4f21-bf0f-a158bd903618" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.0.209:8000/healthcheck\": dial tcp 10.217.0.209:8000: connect: connection refused" Dec 08 09:24:25 crc kubenswrapper[4776]: I1208 09:24:25.443663 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-5577b84758-k9tb2" podUID="c99fb0c4-9b67-42b9-87d6-13ae72903740" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.0.211:8004/healthcheck\": read tcp 10.217.0.2:45392->10.217.0.211:8004: read: connection reset by peer" Dec 08 09:24:25 crc kubenswrapper[4776]: I1208 09:24:25.444415 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-5577b84758-k9tb2" podUID="c99fb0c4-9b67-42b9-87d6-13ae72903740" containerName="heat-api" 
probeResult="failure" output="Get \"http://10.217.0.211:8004/healthcheck\": dial tcp 10.217.0.211:8004: connect: connection refused" Dec 08 09:24:25 crc kubenswrapper[4776]: I1208 09:24:25.508504 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-886cc84d4-qdcjh" Dec 08 09:24:25 crc kubenswrapper[4776]: I1208 09:24:25.509250 4776 scope.go:117] "RemoveContainer" containerID="b3d4237d2a4255a817185daec7a7b389fc523fd448f635c8acf86fe4c83e9927" Dec 08 09:24:25 crc kubenswrapper[4776]: E1208 09:24:25.509548 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-886cc84d4-qdcjh_openstack(773f52b6-826d-4179-8777-96d795b10c5d)\"" pod="openstack/heat-api-886cc84d4-qdcjh" podUID="773f52b6-826d-4179-8777-96d795b10c5d" Dec 08 09:24:25 crc kubenswrapper[4776]: I1208 09:24:25.541923 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-785fc66866-ns7kd" Dec 08 09:24:25 crc kubenswrapper[4776]: I1208 09:24:25.543219 4776 scope.go:117] "RemoveContainer" containerID="2caff5be80429340ce3fa5f5cad2f9ee571a68d4d8f2e2b0c94ae42de7b116b9" Dec 08 09:24:25 crc kubenswrapper[4776]: E1208 09:24:25.543799 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-785fc66866-ns7kd_openstack(edc53406-6d30-4981-aa38-cd183ebf1b7d)\"" pod="openstack/heat-cfnapi-785fc66866-ns7kd" podUID="edc53406-6d30-4981-aa38-cd183ebf1b7d" Dec 08 09:24:25 crc kubenswrapper[4776]: I1208 09:24:25.705137 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 08 09:24:26 crc kubenswrapper[4776]: I1208 09:24:26.246653 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-5577b84758-k9tb2" Dec 08 09:24:26 crc kubenswrapper[4776]: I1208 09:24:26.304841 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-788bcdcb6b-kpzht" Dec 08 09:24:26 crc kubenswrapper[4776]: I1208 09:24:26.366019 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41881b6c-dcfb-4d67-ad0c-f0e003837c8e" path="/var/lib/kubelet/pods/41881b6c-dcfb-4d67-ad0c-f0e003837c8e/volumes" Dec 08 09:24:26 crc kubenswrapper[4776]: I1208 09:24:26.372885 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 08 09:24:26 crc kubenswrapper[4776]: I1208 09:24:26.417060 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c99fb0c4-9b67-42b9-87d6-13ae72903740-combined-ca-bundle\") pod \"c99fb0c4-9b67-42b9-87d6-13ae72903740\" (UID: \"c99fb0c4-9b67-42b9-87d6-13ae72903740\") " Dec 08 09:24:26 crc kubenswrapper[4776]: I1208 09:24:26.417160 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39ad8a82-0a3f-4f21-bf0f-a158bd903618-config-data\") pod \"39ad8a82-0a3f-4f21-bf0f-a158bd903618\" (UID: \"39ad8a82-0a3f-4f21-bf0f-a158bd903618\") " Dec 08 09:24:26 crc kubenswrapper[4776]: I1208 09:24:26.417205 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/39ad8a82-0a3f-4f21-bf0f-a158bd903618-config-data-custom\") pod \"39ad8a82-0a3f-4f21-bf0f-a158bd903618\" (UID: \"39ad8a82-0a3f-4f21-bf0f-a158bd903618\") " Dec 08 09:24:26 crc kubenswrapper[4776]: I1208 09:24:26.417296 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdvrk\" (UniqueName: \"kubernetes.io/projected/c99fb0c4-9b67-42b9-87d6-13ae72903740-kube-api-access-zdvrk\") pod 
\"c99fb0c4-9b67-42b9-87d6-13ae72903740\" (UID: \"c99fb0c4-9b67-42b9-87d6-13ae72903740\") " Dec 08 09:24:26 crc kubenswrapper[4776]: I1208 09:24:26.417418 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c99fb0c4-9b67-42b9-87d6-13ae72903740-config-data\") pod \"c99fb0c4-9b67-42b9-87d6-13ae72903740\" (UID: \"c99fb0c4-9b67-42b9-87d6-13ae72903740\") " Dec 08 09:24:26 crc kubenswrapper[4776]: I1208 09:24:26.417453 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2b7vf\" (UniqueName: \"kubernetes.io/projected/39ad8a82-0a3f-4f21-bf0f-a158bd903618-kube-api-access-2b7vf\") pod \"39ad8a82-0a3f-4f21-bf0f-a158bd903618\" (UID: \"39ad8a82-0a3f-4f21-bf0f-a158bd903618\") " Dec 08 09:24:26 crc kubenswrapper[4776]: I1208 09:24:26.417545 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c99fb0c4-9b67-42b9-87d6-13ae72903740-config-data-custom\") pod \"c99fb0c4-9b67-42b9-87d6-13ae72903740\" (UID: \"c99fb0c4-9b67-42b9-87d6-13ae72903740\") " Dec 08 09:24:26 crc kubenswrapper[4776]: I1208 09:24:26.417635 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39ad8a82-0a3f-4f21-bf0f-a158bd903618-combined-ca-bundle\") pod \"39ad8a82-0a3f-4f21-bf0f-a158bd903618\" (UID: \"39ad8a82-0a3f-4f21-bf0f-a158bd903618\") " Dec 08 09:24:26 crc kubenswrapper[4776]: I1208 09:24:26.428320 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39ad8a82-0a3f-4f21-bf0f-a158bd903618-kube-api-access-2b7vf" (OuterVolumeSpecName: "kube-api-access-2b7vf") pod "39ad8a82-0a3f-4f21-bf0f-a158bd903618" (UID: "39ad8a82-0a3f-4f21-bf0f-a158bd903618"). InnerVolumeSpecName "kube-api-access-2b7vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:24:26 crc kubenswrapper[4776]: I1208 09:24:26.428391 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c99fb0c4-9b67-42b9-87d6-13ae72903740-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c99fb0c4-9b67-42b9-87d6-13ae72903740" (UID: "c99fb0c4-9b67-42b9-87d6-13ae72903740"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:24:26 crc kubenswrapper[4776]: I1208 09:24:26.430654 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39ad8a82-0a3f-4f21-bf0f-a158bd903618-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "39ad8a82-0a3f-4f21-bf0f-a158bd903618" (UID: "39ad8a82-0a3f-4f21-bf0f-a158bd903618"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:24:26 crc kubenswrapper[4776]: I1208 09:24:26.434610 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c99fb0c4-9b67-42b9-87d6-13ae72903740-kube-api-access-zdvrk" (OuterVolumeSpecName: "kube-api-access-zdvrk") pod "c99fb0c4-9b67-42b9-87d6-13ae72903740" (UID: "c99fb0c4-9b67-42b9-87d6-13ae72903740"). InnerVolumeSpecName "kube-api-access-zdvrk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:24:26 crc kubenswrapper[4776]: I1208 09:24:26.507646 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c99fb0c4-9b67-42b9-87d6-13ae72903740-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c99fb0c4-9b67-42b9-87d6-13ae72903740" (UID: "c99fb0c4-9b67-42b9-87d6-13ae72903740"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:24:26 crc kubenswrapper[4776]: I1208 09:24:26.508658 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39ad8a82-0a3f-4f21-bf0f-a158bd903618-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "39ad8a82-0a3f-4f21-bf0f-a158bd903618" (UID: "39ad8a82-0a3f-4f21-bf0f-a158bd903618"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:24:26 crc kubenswrapper[4776]: I1208 09:24:26.521968 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2b7vf\" (UniqueName: \"kubernetes.io/projected/39ad8a82-0a3f-4f21-bf0f-a158bd903618-kube-api-access-2b7vf\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:26 crc kubenswrapper[4776]: I1208 09:24:26.523808 4776 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c99fb0c4-9b67-42b9-87d6-13ae72903740-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:26 crc kubenswrapper[4776]: I1208 09:24:26.523896 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39ad8a82-0a3f-4f21-bf0f-a158bd903618-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:26 crc kubenswrapper[4776]: I1208 09:24:26.523959 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c99fb0c4-9b67-42b9-87d6-13ae72903740-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:26 crc kubenswrapper[4776]: I1208 09:24:26.524023 4776 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/39ad8a82-0a3f-4f21-bf0f-a158bd903618-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:26 crc kubenswrapper[4776]: I1208 09:24:26.524079 4776 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-zdvrk\" (UniqueName: \"kubernetes.io/projected/c99fb0c4-9b67-42b9-87d6-13ae72903740-kube-api-access-zdvrk\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:26 crc kubenswrapper[4776]: I1208 09:24:26.529301 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39ad8a82-0a3f-4f21-bf0f-a158bd903618-config-data" (OuterVolumeSpecName: "config-data") pod "39ad8a82-0a3f-4f21-bf0f-a158bd903618" (UID: "39ad8a82-0a3f-4f21-bf0f-a158bd903618"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:24:26 crc kubenswrapper[4776]: I1208 09:24:26.536497 4776 generic.go:334] "Generic (PLEG): container finished" podID="39ad8a82-0a3f-4f21-bf0f-a158bd903618" containerID="d1a34192616e1e482d67bb8764abc9f5a5fd6e2697c8c86df334f238a15a4bd4" exitCode=0 Dec 08 09:24:26 crc kubenswrapper[4776]: I1208 09:24:26.536562 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-788bcdcb6b-kpzht" event={"ID":"39ad8a82-0a3f-4f21-bf0f-a158bd903618","Type":"ContainerDied","Data":"d1a34192616e1e482d67bb8764abc9f5a5fd6e2697c8c86df334f238a15a4bd4"} Dec 08 09:24:26 crc kubenswrapper[4776]: I1208 09:24:26.536587 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-788bcdcb6b-kpzht" event={"ID":"39ad8a82-0a3f-4f21-bf0f-a158bd903618","Type":"ContainerDied","Data":"5eaae633aa8440750387eadc0f0b2be7cb8e80657502c2c036b055bea45ab711"} Dec 08 09:24:26 crc kubenswrapper[4776]: I1208 09:24:26.536603 4776 scope.go:117] "RemoveContainer" containerID="d1a34192616e1e482d67bb8764abc9f5a5fd6e2697c8c86df334f238a15a4bd4" Dec 08 09:24:26 crc kubenswrapper[4776]: I1208 09:24:26.536737 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-788bcdcb6b-kpzht" Dec 08 09:24:26 crc kubenswrapper[4776]: I1208 09:24:26.548477 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cfea15ec-7a58-4283-831a-fdfb2c03918c","Type":"ContainerStarted","Data":"b5fc4f7297ced84673c6a0f935b668a11655f90f300d50c9876bdf1e8b9217b3"} Dec 08 09:24:26 crc kubenswrapper[4776]: I1208 09:24:26.548639 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cfea15ec-7a58-4283-831a-fdfb2c03918c","Type":"ContainerStarted","Data":"c77951c107e18990ff33d002be46dcf0cd726f856d2efc98783f338e047f3ccc"} Dec 08 09:24:26 crc kubenswrapper[4776]: I1208 09:24:26.551896 4776 generic.go:334] "Generic (PLEG): container finished" podID="c99fb0c4-9b67-42b9-87d6-13ae72903740" containerID="44910220aaf6fd61dafa8f059ae0cf4d984506ec8cfd361a11dc0b75451b76a1" exitCode=0 Dec 08 09:24:26 crc kubenswrapper[4776]: I1208 09:24:26.551947 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5577b84758-k9tb2" event={"ID":"c99fb0c4-9b67-42b9-87d6-13ae72903740","Type":"ContainerDied","Data":"44910220aaf6fd61dafa8f059ae0cf4d984506ec8cfd361a11dc0b75451b76a1"} Dec 08 09:24:26 crc kubenswrapper[4776]: I1208 09:24:26.551977 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5577b84758-k9tb2" event={"ID":"c99fb0c4-9b67-42b9-87d6-13ae72903740","Type":"ContainerDied","Data":"d8b2ab21242d1c0c73e5111cbcf771aaa60ed8e73381314b307ef97da9c746c5"} Dec 08 09:24:26 crc kubenswrapper[4776]: I1208 09:24:26.552054 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-5577b84758-k9tb2" Dec 08 09:24:26 crc kubenswrapper[4776]: I1208 09:24:26.559152 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c99fb0c4-9b67-42b9-87d6-13ae72903740-config-data" (OuterVolumeSpecName: "config-data") pod "c99fb0c4-9b67-42b9-87d6-13ae72903740" (UID: "c99fb0c4-9b67-42b9-87d6-13ae72903740"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:24:26 crc kubenswrapper[4776]: I1208 09:24:26.584300 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-788bcdcb6b-kpzht"] Dec 08 09:24:26 crc kubenswrapper[4776]: I1208 09:24:26.595848 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-788bcdcb6b-kpzht"] Dec 08 09:24:26 crc kubenswrapper[4776]: I1208 09:24:26.602867 4776 scope.go:117] "RemoveContainer" containerID="d1a34192616e1e482d67bb8764abc9f5a5fd6e2697c8c86df334f238a15a4bd4" Dec 08 09:24:26 crc kubenswrapper[4776]: E1208 09:24:26.603442 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1a34192616e1e482d67bb8764abc9f5a5fd6e2697c8c86df334f238a15a4bd4\": container with ID starting with d1a34192616e1e482d67bb8764abc9f5a5fd6e2697c8c86df334f238a15a4bd4 not found: ID does not exist" containerID="d1a34192616e1e482d67bb8764abc9f5a5fd6e2697c8c86df334f238a15a4bd4" Dec 08 09:24:26 crc kubenswrapper[4776]: I1208 09:24:26.603480 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1a34192616e1e482d67bb8764abc9f5a5fd6e2697c8c86df334f238a15a4bd4"} err="failed to get container status \"d1a34192616e1e482d67bb8764abc9f5a5fd6e2697c8c86df334f238a15a4bd4\": rpc error: code = NotFound desc = could not find container \"d1a34192616e1e482d67bb8764abc9f5a5fd6e2697c8c86df334f238a15a4bd4\": container with ID starting with 
d1a34192616e1e482d67bb8764abc9f5a5fd6e2697c8c86df334f238a15a4bd4 not found: ID does not exist" Dec 08 09:24:26 crc kubenswrapper[4776]: I1208 09:24:26.603506 4776 scope.go:117] "RemoveContainer" containerID="44910220aaf6fd61dafa8f059ae0cf4d984506ec8cfd361a11dc0b75451b76a1" Dec 08 09:24:26 crc kubenswrapper[4776]: I1208 09:24:26.626505 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c99fb0c4-9b67-42b9-87d6-13ae72903740-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:26 crc kubenswrapper[4776]: I1208 09:24:26.626538 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39ad8a82-0a3f-4f21-bf0f-a158bd903618-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:26 crc kubenswrapper[4776]: I1208 09:24:26.635205 4776 scope.go:117] "RemoveContainer" containerID="44910220aaf6fd61dafa8f059ae0cf4d984506ec8cfd361a11dc0b75451b76a1" Dec 08 09:24:26 crc kubenswrapper[4776]: E1208 09:24:26.636138 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44910220aaf6fd61dafa8f059ae0cf4d984506ec8cfd361a11dc0b75451b76a1\": container with ID starting with 44910220aaf6fd61dafa8f059ae0cf4d984506ec8cfd361a11dc0b75451b76a1 not found: ID does not exist" containerID="44910220aaf6fd61dafa8f059ae0cf4d984506ec8cfd361a11dc0b75451b76a1" Dec 08 09:24:26 crc kubenswrapper[4776]: I1208 09:24:26.636256 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44910220aaf6fd61dafa8f059ae0cf4d984506ec8cfd361a11dc0b75451b76a1"} err="failed to get container status \"44910220aaf6fd61dafa8f059ae0cf4d984506ec8cfd361a11dc0b75451b76a1\": rpc error: code = NotFound desc = could not find container \"44910220aaf6fd61dafa8f059ae0cf4d984506ec8cfd361a11dc0b75451b76a1\": container with ID starting with 44910220aaf6fd61dafa8f059ae0cf4d984506ec8cfd361a11dc0b75451b76a1 
not found: ID does not exist" Dec 08 09:24:26 crc kubenswrapper[4776]: I1208 09:24:26.993530 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-5577b84758-k9tb2"] Dec 08 09:24:27 crc kubenswrapper[4776]: I1208 09:24:27.003952 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-5577b84758-k9tb2"] Dec 08 09:24:27 crc kubenswrapper[4776]: I1208 09:24:27.567995 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cfea15ec-7a58-4283-831a-fdfb2c03918c","Type":"ContainerStarted","Data":"d148e2dc18f91d85414476dee2f5a9fc857b69cbe2d63e25d9a31d47ba2264da"} Dec 08 09:24:27 crc kubenswrapper[4776]: I1208 09:24:27.589153 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-55647645f8-9xvpq" Dec 08 09:24:27 crc kubenswrapper[4776]: I1208 09:24:27.598140 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 08 09:24:27 crc kubenswrapper[4776]: I1208 09:24:27.598302 4776 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 08 09:24:27 crc kubenswrapper[4776]: I1208 09:24:27.599331 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 08 09:24:28 crc kubenswrapper[4776]: I1208 09:24:28.360114 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39ad8a82-0a3f-4f21-bf0f-a158bd903618" path="/var/lib/kubelet/pods/39ad8a82-0a3f-4f21-bf0f-a158bd903618/volumes" Dec 08 09:24:28 crc kubenswrapper[4776]: I1208 09:24:28.360927 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c99fb0c4-9b67-42b9-87d6-13ae72903740" path="/var/lib/kubelet/pods/c99fb0c4-9b67-42b9-87d6-13ae72903740/volumes" Dec 08 09:24:28 crc kubenswrapper[4776]: I1208 09:24:28.437520 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/glance-default-internal-api-0" Dec 08 09:24:28 crc kubenswrapper[4776]: I1208 09:24:28.437985 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 08 09:24:28 crc kubenswrapper[4776]: I1208 09:24:28.483152 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 08 09:24:28 crc kubenswrapper[4776]: I1208 09:24:28.491696 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 08 09:24:28 crc kubenswrapper[4776]: I1208 09:24:28.590137 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cfea15ec-7a58-4283-831a-fdfb2c03918c","Type":"ContainerStarted","Data":"a9fa0769ba2de6cc958985479f709ad8854021d01491c46212649412b74c487a"} Dec 08 09:24:28 crc kubenswrapper[4776]: I1208 09:24:28.590438 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 08 09:24:28 crc kubenswrapper[4776]: I1208 09:24:28.590464 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 08 09:24:30 crc kubenswrapper[4776]: I1208 09:24:30.611143 4776 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 08 09:24:30 crc kubenswrapper[4776]: I1208 09:24:30.613300 4776 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 08 09:24:30 crc kubenswrapper[4776]: I1208 09:24:30.628654 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 08 09:24:30 crc kubenswrapper[4776]: I1208 09:24:30.671924 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 08 09:24:33 crc kubenswrapper[4776]: I1208 09:24:33.075827 4776 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-58d55cd687-srfgj" Dec 08 09:24:33 crc kubenswrapper[4776]: I1208 09:24:33.151224 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-785fc66866-ns7kd"] Dec 08 09:24:33 crc kubenswrapper[4776]: I1208 09:24:33.339954 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-7b4dd4bc68-bwx4q" Dec 08 09:24:33 crc kubenswrapper[4776]: I1208 09:24:33.405972 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-886cc84d4-qdcjh"] Dec 08 09:24:35 crc kubenswrapper[4776]: I1208 09:24:35.381122 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-crwms"] Dec 08 09:24:35 crc kubenswrapper[4776]: E1208 09:24:35.383074 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39ad8a82-0a3f-4f21-bf0f-a158bd903618" containerName="heat-cfnapi" Dec 08 09:24:35 crc kubenswrapper[4776]: I1208 09:24:35.383092 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="39ad8a82-0a3f-4f21-bf0f-a158bd903618" containerName="heat-cfnapi" Dec 08 09:24:35 crc kubenswrapper[4776]: E1208 09:24:35.385837 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c99fb0c4-9b67-42b9-87d6-13ae72903740" containerName="heat-api" Dec 08 09:24:35 crc kubenswrapper[4776]: I1208 09:24:35.385848 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="c99fb0c4-9b67-42b9-87d6-13ae72903740" containerName="heat-api" Dec 08 09:24:35 crc kubenswrapper[4776]: I1208 09:24:35.386292 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="39ad8a82-0a3f-4f21-bf0f-a158bd903618" containerName="heat-cfnapi" Dec 08 09:24:35 crc kubenswrapper[4776]: I1208 09:24:35.386322 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="c99fb0c4-9b67-42b9-87d6-13ae72903740" containerName="heat-api" Dec 08 09:24:35 crc kubenswrapper[4776]: I1208 09:24:35.389928 4776 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-crwms" Dec 08 09:24:35 crc kubenswrapper[4776]: I1208 09:24:35.411711 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-crwms"] Dec 08 09:24:35 crc kubenswrapper[4776]: I1208 09:24:35.448017 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa1aa167-e177-4edb-8ab4-0d164d10c143-catalog-content\") pod \"redhat-marketplace-crwms\" (UID: \"aa1aa167-e177-4edb-8ab4-0d164d10c143\") " pod="openshift-marketplace/redhat-marketplace-crwms" Dec 08 09:24:35 crc kubenswrapper[4776]: I1208 09:24:35.448273 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa1aa167-e177-4edb-8ab4-0d164d10c143-utilities\") pod \"redhat-marketplace-crwms\" (UID: \"aa1aa167-e177-4edb-8ab4-0d164d10c143\") " pod="openshift-marketplace/redhat-marketplace-crwms" Dec 08 09:24:35 crc kubenswrapper[4776]: I1208 09:24:35.448385 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckqj7\" (UniqueName: \"kubernetes.io/projected/aa1aa167-e177-4edb-8ab4-0d164d10c143-kube-api-access-ckqj7\") pod \"redhat-marketplace-crwms\" (UID: \"aa1aa167-e177-4edb-8ab4-0d164d10c143\") " pod="openshift-marketplace/redhat-marketplace-crwms" Dec 08 09:24:35 crc kubenswrapper[4776]: I1208 09:24:35.470820 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-5ccd9d555d-m9chd" Dec 08 09:24:35 crc kubenswrapper[4776]: I1208 09:24:35.543334 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-55647645f8-9xvpq"] Dec 08 09:24:35 crc kubenswrapper[4776]: I1208 09:24:35.543597 4776 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/heat-engine-55647645f8-9xvpq" podUID="0cea35b3-0412-490a-9d71-2c5e10e85c51" containerName="heat-engine" containerID="cri-o://25231c45a67c8f3c3d317e3772e895cd938a660b465520775f6d63b54d0600fd" gracePeriod=60 Dec 08 09:24:35 crc kubenswrapper[4776]: I1208 09:24:35.550022 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa1aa167-e177-4edb-8ab4-0d164d10c143-utilities\") pod \"redhat-marketplace-crwms\" (UID: \"aa1aa167-e177-4edb-8ab4-0d164d10c143\") " pod="openshift-marketplace/redhat-marketplace-crwms" Dec 08 09:24:35 crc kubenswrapper[4776]: I1208 09:24:35.550133 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckqj7\" (UniqueName: \"kubernetes.io/projected/aa1aa167-e177-4edb-8ab4-0d164d10c143-kube-api-access-ckqj7\") pod \"redhat-marketplace-crwms\" (UID: \"aa1aa167-e177-4edb-8ab4-0d164d10c143\") " pod="openshift-marketplace/redhat-marketplace-crwms" Dec 08 09:24:35 crc kubenswrapper[4776]: I1208 09:24:35.550238 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa1aa167-e177-4edb-8ab4-0d164d10c143-catalog-content\") pod \"redhat-marketplace-crwms\" (UID: \"aa1aa167-e177-4edb-8ab4-0d164d10c143\") " pod="openshift-marketplace/redhat-marketplace-crwms" Dec 08 09:24:35 crc kubenswrapper[4776]: I1208 09:24:35.550654 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa1aa167-e177-4edb-8ab4-0d164d10c143-catalog-content\") pod \"redhat-marketplace-crwms\" (UID: \"aa1aa167-e177-4edb-8ab4-0d164d10c143\") " pod="openshift-marketplace/redhat-marketplace-crwms" Dec 08 09:24:35 crc kubenswrapper[4776]: I1208 09:24:35.550906 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/aa1aa167-e177-4edb-8ab4-0d164d10c143-utilities\") pod \"redhat-marketplace-crwms\" (UID: \"aa1aa167-e177-4edb-8ab4-0d164d10c143\") " pod="openshift-marketplace/redhat-marketplace-crwms" Dec 08 09:24:35 crc kubenswrapper[4776]: I1208 09:24:35.588035 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckqj7\" (UniqueName: \"kubernetes.io/projected/aa1aa167-e177-4edb-8ab4-0d164d10c143-kube-api-access-ckqj7\") pod \"redhat-marketplace-crwms\" (UID: \"aa1aa167-e177-4edb-8ab4-0d164d10c143\") " pod="openshift-marketplace/redhat-marketplace-crwms" Dec 08 09:24:35 crc kubenswrapper[4776]: I1208 09:24:35.713892 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-crwms" Dec 08 09:24:36 crc kubenswrapper[4776]: I1208 09:24:36.287304 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-886cc84d4-qdcjh" Dec 08 09:24:36 crc kubenswrapper[4776]: I1208 09:24:36.313230 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-785fc66866-ns7kd" Dec 08 09:24:36 crc kubenswrapper[4776]: I1208 09:24:36.380611 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/773f52b6-826d-4179-8777-96d795b10c5d-combined-ca-bundle\") pod \"773f52b6-826d-4179-8777-96d795b10c5d\" (UID: \"773f52b6-826d-4179-8777-96d795b10c5d\") " Dec 08 09:24:36 crc kubenswrapper[4776]: I1208 09:24:36.380757 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/773f52b6-826d-4179-8777-96d795b10c5d-config-data-custom\") pod \"773f52b6-826d-4179-8777-96d795b10c5d\" (UID: \"773f52b6-826d-4179-8777-96d795b10c5d\") " Dec 08 09:24:36 crc kubenswrapper[4776]: I1208 09:24:36.380798 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/edc53406-6d30-4981-aa38-cd183ebf1b7d-config-data-custom\") pod \"edc53406-6d30-4981-aa38-cd183ebf1b7d\" (UID: \"edc53406-6d30-4981-aa38-cd183ebf1b7d\") " Dec 08 09:24:36 crc kubenswrapper[4776]: I1208 09:24:36.380841 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edc53406-6d30-4981-aa38-cd183ebf1b7d-combined-ca-bundle\") pod \"edc53406-6d30-4981-aa38-cd183ebf1b7d\" (UID: \"edc53406-6d30-4981-aa38-cd183ebf1b7d\") " Dec 08 09:24:36 crc kubenswrapper[4776]: I1208 09:24:36.380990 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rt5pf\" (UniqueName: \"kubernetes.io/projected/edc53406-6d30-4981-aa38-cd183ebf1b7d-kube-api-access-rt5pf\") pod \"edc53406-6d30-4981-aa38-cd183ebf1b7d\" (UID: \"edc53406-6d30-4981-aa38-cd183ebf1b7d\") " Dec 08 09:24:36 crc kubenswrapper[4776]: I1208 09:24:36.381088 4776 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/773f52b6-826d-4179-8777-96d795b10c5d-config-data\") pod \"773f52b6-826d-4179-8777-96d795b10c5d\" (UID: \"773f52b6-826d-4179-8777-96d795b10c5d\") " Dec 08 09:24:36 crc kubenswrapper[4776]: I1208 09:24:36.381129 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92bgk\" (UniqueName: \"kubernetes.io/projected/773f52b6-826d-4179-8777-96d795b10c5d-kube-api-access-92bgk\") pod \"773f52b6-826d-4179-8777-96d795b10c5d\" (UID: \"773f52b6-826d-4179-8777-96d795b10c5d\") " Dec 08 09:24:36 crc kubenswrapper[4776]: I1208 09:24:36.381154 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edc53406-6d30-4981-aa38-cd183ebf1b7d-config-data\") pod \"edc53406-6d30-4981-aa38-cd183ebf1b7d\" (UID: \"edc53406-6d30-4981-aa38-cd183ebf1b7d\") " Dec 08 09:24:36 crc kubenswrapper[4776]: I1208 09:24:36.404514 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/773f52b6-826d-4179-8777-96d795b10c5d-kube-api-access-92bgk" (OuterVolumeSpecName: "kube-api-access-92bgk") pod "773f52b6-826d-4179-8777-96d795b10c5d" (UID: "773f52b6-826d-4179-8777-96d795b10c5d"). InnerVolumeSpecName "kube-api-access-92bgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:24:36 crc kubenswrapper[4776]: I1208 09:24:36.419930 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/773f52b6-826d-4179-8777-96d795b10c5d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "773f52b6-826d-4179-8777-96d795b10c5d" (UID: "773f52b6-826d-4179-8777-96d795b10c5d"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:24:36 crc kubenswrapper[4776]: I1208 09:24:36.420540 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edc53406-6d30-4981-aa38-cd183ebf1b7d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "edc53406-6d30-4981-aa38-cd183ebf1b7d" (UID: "edc53406-6d30-4981-aa38-cd183ebf1b7d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:24:36 crc kubenswrapper[4776]: I1208 09:24:36.423621 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edc53406-6d30-4981-aa38-cd183ebf1b7d-kube-api-access-rt5pf" (OuterVolumeSpecName: "kube-api-access-rt5pf") pod "edc53406-6d30-4981-aa38-cd183ebf1b7d" (UID: "edc53406-6d30-4981-aa38-cd183ebf1b7d"). InnerVolumeSpecName "kube-api-access-rt5pf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:24:36 crc kubenswrapper[4776]: I1208 09:24:36.487698 4776 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/773f52b6-826d-4179-8777-96d795b10c5d-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:36 crc kubenswrapper[4776]: I1208 09:24:36.487759 4776 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/edc53406-6d30-4981-aa38-cd183ebf1b7d-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:36 crc kubenswrapper[4776]: I1208 09:24:36.487772 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rt5pf\" (UniqueName: \"kubernetes.io/projected/edc53406-6d30-4981-aa38-cd183ebf1b7d-kube-api-access-rt5pf\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:36 crc kubenswrapper[4776]: I1208 09:24:36.487783 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92bgk\" (UniqueName: 
\"kubernetes.io/projected/773f52b6-826d-4179-8777-96d795b10c5d-kube-api-access-92bgk\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:36 crc kubenswrapper[4776]: I1208 09:24:36.506371 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/773f52b6-826d-4179-8777-96d795b10c5d-config-data" (OuterVolumeSpecName: "config-data") pod "773f52b6-826d-4179-8777-96d795b10c5d" (UID: "773f52b6-826d-4179-8777-96d795b10c5d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:24:36 crc kubenswrapper[4776]: I1208 09:24:36.565383 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edc53406-6d30-4981-aa38-cd183ebf1b7d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "edc53406-6d30-4981-aa38-cd183ebf1b7d" (UID: "edc53406-6d30-4981-aa38-cd183ebf1b7d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:24:36 crc kubenswrapper[4776]: I1208 09:24:36.580370 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/773f52b6-826d-4179-8777-96d795b10c5d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "773f52b6-826d-4179-8777-96d795b10c5d" (UID: "773f52b6-826d-4179-8777-96d795b10c5d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:24:36 crc kubenswrapper[4776]: I1208 09:24:36.592763 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/773f52b6-826d-4179-8777-96d795b10c5d-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:36 crc kubenswrapper[4776]: I1208 09:24:36.592788 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/773f52b6-826d-4179-8777-96d795b10c5d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:36 crc kubenswrapper[4776]: I1208 09:24:36.592799 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edc53406-6d30-4981-aa38-cd183ebf1b7d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:36 crc kubenswrapper[4776]: I1208 09:24:36.672605 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-crwms"] Dec 08 09:24:36 crc kubenswrapper[4776]: I1208 09:24:36.690313 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edc53406-6d30-4981-aa38-cd183ebf1b7d-config-data" (OuterVolumeSpecName: "config-data") pod "edc53406-6d30-4981-aa38-cd183ebf1b7d" (UID: "edc53406-6d30-4981-aa38-cd183ebf1b7d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:24:36 crc kubenswrapper[4776]: I1208 09:24:36.699198 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edc53406-6d30-4981-aa38-cd183ebf1b7d-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:36 crc kubenswrapper[4776]: I1208 09:24:36.699646 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cfea15ec-7a58-4283-831a-fdfb2c03918c","Type":"ContainerStarted","Data":"bfdbcca55a723020a0a48fb1fe9ea3cb82c4c7fba82729b0932cf82c1e82cdab"} Dec 08 09:24:36 crc kubenswrapper[4776]: I1208 09:24:36.699825 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cfea15ec-7a58-4283-831a-fdfb2c03918c" containerName="ceilometer-central-agent" containerID="cri-o://b5fc4f7297ced84673c6a0f935b668a11655f90f300d50c9876bdf1e8b9217b3" gracePeriod=30 Dec 08 09:24:36 crc kubenswrapper[4776]: I1208 09:24:36.700080 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 08 09:24:36 crc kubenswrapper[4776]: I1208 09:24:36.700412 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cfea15ec-7a58-4283-831a-fdfb2c03918c" containerName="proxy-httpd" containerID="cri-o://bfdbcca55a723020a0a48fb1fe9ea3cb82c4c7fba82729b0932cf82c1e82cdab" gracePeriod=30 Dec 08 09:24:36 crc kubenswrapper[4776]: I1208 09:24:36.700460 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cfea15ec-7a58-4283-831a-fdfb2c03918c" containerName="sg-core" containerID="cri-o://a9fa0769ba2de6cc958985479f709ad8854021d01491c46212649412b74c487a" gracePeriod=30 Dec 08 09:24:36 crc kubenswrapper[4776]: I1208 09:24:36.700491 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="cfea15ec-7a58-4283-831a-fdfb2c03918c" containerName="ceilometer-notification-agent" containerID="cri-o://d148e2dc18f91d85414476dee2f5a9fc857b69cbe2d63e25d9a31d47ba2264da" gracePeriod=30 Dec 08 09:24:36 crc kubenswrapper[4776]: I1208 09:24:36.732142 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-886cc84d4-qdcjh" event={"ID":"773f52b6-826d-4179-8777-96d795b10c5d","Type":"ContainerDied","Data":"86ab0714f014f83b10dbce52721676a6817ee1acb4b610c35795629d6a2001ec"} Dec 08 09:24:36 crc kubenswrapper[4776]: I1208 09:24:36.732211 4776 scope.go:117] "RemoveContainer" containerID="b3d4237d2a4255a817185daec7a7b389fc523fd448f635c8acf86fe4c83e9927" Dec 08 09:24:36 crc kubenswrapper[4776]: I1208 09:24:36.732356 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-886cc84d4-qdcjh" Dec 08 09:24:36 crc kubenswrapper[4776]: I1208 09:24:36.752747 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.292685133 podStartE2EDuration="12.752732819s" podCreationTimestamp="2025-12-08 09:24:24 +0000 UTC" firstStartedPulling="2025-12-08 09:24:25.719293788 +0000 UTC m=+1541.982518810" lastFinishedPulling="2025-12-08 09:24:36.179341474 +0000 UTC m=+1552.442566496" observedRunningTime="2025-12-08 09:24:36.734597942 +0000 UTC m=+1552.997822964" watchObservedRunningTime="2025-12-08 09:24:36.752732819 +0000 UTC m=+1553.015957841" Dec 08 09:24:36 crc kubenswrapper[4776]: I1208 09:24:36.768416 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-785fc66866-ns7kd" event={"ID":"edc53406-6d30-4981-aa38-cd183ebf1b7d","Type":"ContainerDied","Data":"aa242ccfedd7c7b9408244e973e1a5cba13591ecb21bd9bb73275ca86c10063b"} Dec 08 09:24:36 crc kubenswrapper[4776]: I1208 09:24:36.768537 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-785fc66866-ns7kd" Dec 08 09:24:36 crc kubenswrapper[4776]: I1208 09:24:36.798835 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zw8wj" event={"ID":"c2c03694-357a-4838-8202-7e3d3196f9ca","Type":"ContainerStarted","Data":"d1862aea8ebe37a2abba47d91654e83e009dcecd2a5485b55265dd485cc62a2f"} Dec 08 09:24:36 crc kubenswrapper[4776]: I1208 09:24:36.815082 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-crwms" event={"ID":"aa1aa167-e177-4edb-8ab4-0d164d10c143","Type":"ContainerStarted","Data":"1b654ce9b72a5efb69c7a029d7f4c86fc9a96034be149da1af54b4827ab0def8"} Dec 08 09:24:36 crc kubenswrapper[4776]: I1208 09:24:36.832878 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-zw8wj" podStartSLOduration=2.358876335 podStartE2EDuration="15.832790466s" podCreationTimestamp="2025-12-08 09:24:21 +0000 UTC" firstStartedPulling="2025-12-08 09:24:22.69264 +0000 UTC m=+1538.955865022" lastFinishedPulling="2025-12-08 09:24:36.166554131 +0000 UTC m=+1552.429779153" observedRunningTime="2025-12-08 09:24:36.822710316 +0000 UTC m=+1553.085935338" watchObservedRunningTime="2025-12-08 09:24:36.832790466 +0000 UTC m=+1553.096015478" Dec 08 09:24:36 crc kubenswrapper[4776]: I1208 09:24:36.933524 4776 scope.go:117] "RemoveContainer" containerID="2caff5be80429340ce3fa5f5cad2f9ee571a68d4d8f2e2b0c94ae42de7b116b9" Dec 08 09:24:36 crc kubenswrapper[4776]: I1208 09:24:36.994744 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-886cc84d4-qdcjh"] Dec 08 09:24:37 crc kubenswrapper[4776]: I1208 09:24:37.055766 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-886cc84d4-qdcjh"] Dec 08 09:24:37 crc kubenswrapper[4776]: I1208 09:24:37.055881 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/heat-cfnapi-785fc66866-ns7kd"] Dec 08 09:24:37 crc kubenswrapper[4776]: I1208 09:24:37.071720 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-785fc66866-ns7kd"] Dec 08 09:24:37 crc kubenswrapper[4776]: E1208 09:24:37.530503 4776 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="25231c45a67c8f3c3d317e3772e895cd938a660b465520775f6d63b54d0600fd" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 08 09:24:37 crc kubenswrapper[4776]: E1208 09:24:37.531816 4776 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="25231c45a67c8f3c3d317e3772e895cd938a660b465520775f6d63b54d0600fd" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 08 09:24:37 crc kubenswrapper[4776]: E1208 09:24:37.533167 4776 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="25231c45a67c8f3c3d317e3772e895cd938a660b465520775f6d63b54d0600fd" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 08 09:24:37 crc kubenswrapper[4776]: E1208 09:24:37.533224 4776 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-55647645f8-9xvpq" podUID="0cea35b3-0412-490a-9d71-2c5e10e85c51" containerName="heat-engine" Dec 08 09:24:37 crc kubenswrapper[4776]: I1208 09:24:37.827234 4776 generic.go:334] "Generic (PLEG): container finished" podID="cfea15ec-7a58-4283-831a-fdfb2c03918c" 
containerID="a9fa0769ba2de6cc958985479f709ad8854021d01491c46212649412b74c487a" exitCode=2 Dec 08 09:24:37 crc kubenswrapper[4776]: I1208 09:24:37.827262 4776 generic.go:334] "Generic (PLEG): container finished" podID="cfea15ec-7a58-4283-831a-fdfb2c03918c" containerID="d148e2dc18f91d85414476dee2f5a9fc857b69cbe2d63e25d9a31d47ba2264da" exitCode=0 Dec 08 09:24:37 crc kubenswrapper[4776]: I1208 09:24:37.827315 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cfea15ec-7a58-4283-831a-fdfb2c03918c","Type":"ContainerDied","Data":"a9fa0769ba2de6cc958985479f709ad8854021d01491c46212649412b74c487a"} Dec 08 09:24:37 crc kubenswrapper[4776]: I1208 09:24:37.827359 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cfea15ec-7a58-4283-831a-fdfb2c03918c","Type":"ContainerDied","Data":"d148e2dc18f91d85414476dee2f5a9fc857b69cbe2d63e25d9a31d47ba2264da"} Dec 08 09:24:37 crc kubenswrapper[4776]: I1208 09:24:37.831301 4776 generic.go:334] "Generic (PLEG): container finished" podID="aa1aa167-e177-4edb-8ab4-0d164d10c143" containerID="d386a6c13df86a7be59007fb670de68a31e9bc4011365ea6a68223f23dafa9aa" exitCode=0 Dec 08 09:24:37 crc kubenswrapper[4776]: I1208 09:24:37.831537 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-crwms" event={"ID":"aa1aa167-e177-4edb-8ab4-0d164d10c143","Type":"ContainerDied","Data":"d386a6c13df86a7be59007fb670de68a31e9bc4011365ea6a68223f23dafa9aa"} Dec 08 09:24:38 crc kubenswrapper[4776]: I1208 09:24:38.354959 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="773f52b6-826d-4179-8777-96d795b10c5d" path="/var/lib/kubelet/pods/773f52b6-826d-4179-8777-96d795b10c5d/volumes" Dec 08 09:24:38 crc kubenswrapper[4776]: I1208 09:24:38.355521 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edc53406-6d30-4981-aa38-cd183ebf1b7d" 
path="/var/lib/kubelet/pods/edc53406-6d30-4981-aa38-cd183ebf1b7d/volumes" Dec 08 09:24:39 crc kubenswrapper[4776]: I1208 09:24:39.859337 4776 generic.go:334] "Generic (PLEG): container finished" podID="aa1aa167-e177-4edb-8ab4-0d164d10c143" containerID="7e72a1a39f6c7cfec886b6d241f81a27d8038985e40bcec06b18ac547b3cefef" exitCode=0 Dec 08 09:24:39 crc kubenswrapper[4776]: I1208 09:24:39.859656 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-crwms" event={"ID":"aa1aa167-e177-4edb-8ab4-0d164d10c143","Type":"ContainerDied","Data":"7e72a1a39f6c7cfec886b6d241f81a27d8038985e40bcec06b18ac547b3cefef"} Dec 08 09:24:40 crc kubenswrapper[4776]: I1208 09:24:40.869764 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-crwms" event={"ID":"aa1aa167-e177-4edb-8ab4-0d164d10c143","Type":"ContainerStarted","Data":"4941ca9a84aaf6bd42e77c9c945331cdf95cc785f02f63b7d7780704ae78b304"} Dec 08 09:24:40 crc kubenswrapper[4776]: I1208 09:24:40.871789 4776 generic.go:334] "Generic (PLEG): container finished" podID="cfea15ec-7a58-4283-831a-fdfb2c03918c" containerID="b5fc4f7297ced84673c6a0f935b668a11655f90f300d50c9876bdf1e8b9217b3" exitCode=0 Dec 08 09:24:40 crc kubenswrapper[4776]: I1208 09:24:40.871826 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cfea15ec-7a58-4283-831a-fdfb2c03918c","Type":"ContainerDied","Data":"b5fc4f7297ced84673c6a0f935b668a11655f90f300d50c9876bdf1e8b9217b3"} Dec 08 09:24:40 crc kubenswrapper[4776]: I1208 09:24:40.897567 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-crwms" podStartSLOduration=3.450243514 podStartE2EDuration="5.897548318s" podCreationTimestamp="2025-12-08 09:24:35 +0000 UTC" firstStartedPulling="2025-12-08 09:24:37.832804968 +0000 UTC m=+1554.096029990" lastFinishedPulling="2025-12-08 09:24:40.280109772 +0000 UTC m=+1556.543334794" 
observedRunningTime="2025-12-08 09:24:40.887674874 +0000 UTC m=+1557.150899896" watchObservedRunningTime="2025-12-08 09:24:40.897548318 +0000 UTC m=+1557.160773340" Dec 08 09:24:41 crc kubenswrapper[4776]: I1208 09:24:41.399447 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 09:24:41 crc kubenswrapper[4776]: I1208 09:24:41.399815 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 09:24:44 crc kubenswrapper[4776]: I1208 09:24:44.806045 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-55647645f8-9xvpq" Dec 08 09:24:44 crc kubenswrapper[4776]: I1208 09:24:44.914097 4776 generic.go:334] "Generic (PLEG): container finished" podID="0cea35b3-0412-490a-9d71-2c5e10e85c51" containerID="25231c45a67c8f3c3d317e3772e895cd938a660b465520775f6d63b54d0600fd" exitCode=0 Dec 08 09:24:44 crc kubenswrapper[4776]: I1208 09:24:44.914140 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-55647645f8-9xvpq" event={"ID":"0cea35b3-0412-490a-9d71-2c5e10e85c51","Type":"ContainerDied","Data":"25231c45a67c8f3c3d317e3772e895cd938a660b465520775f6d63b54d0600fd"} Dec 08 09:24:44 crc kubenswrapper[4776]: I1208 09:24:44.914221 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-55647645f8-9xvpq" event={"ID":"0cea35b3-0412-490a-9d71-2c5e10e85c51","Type":"ContainerDied","Data":"911edfa85a60dbb496cc30806f6261f958fa9aea33c46967c8deb243782a981c"} Dec 08 09:24:44 crc kubenswrapper[4776]: I1208 09:24:44.914219 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-55647645f8-9xvpq" Dec 08 09:24:44 crc kubenswrapper[4776]: I1208 09:24:44.914237 4776 scope.go:117] "RemoveContainer" containerID="25231c45a67c8f3c3d317e3772e895cd938a660b465520775f6d63b54d0600fd" Dec 08 09:24:44 crc kubenswrapper[4776]: I1208 09:24:44.948948 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cea35b3-0412-490a-9d71-2c5e10e85c51-combined-ca-bundle\") pod \"0cea35b3-0412-490a-9d71-2c5e10e85c51\" (UID: \"0cea35b3-0412-490a-9d71-2c5e10e85c51\") " Dec 08 09:24:44 crc kubenswrapper[4776]: I1208 09:24:44.949191 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5b4jh\" (UniqueName: \"kubernetes.io/projected/0cea35b3-0412-490a-9d71-2c5e10e85c51-kube-api-access-5b4jh\") pod \"0cea35b3-0412-490a-9d71-2c5e10e85c51\" (UID: \"0cea35b3-0412-490a-9d71-2c5e10e85c51\") " Dec 08 09:24:44 crc kubenswrapper[4776]: I1208 09:24:44.949272 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0cea35b3-0412-490a-9d71-2c5e10e85c51-config-data-custom\") pod \"0cea35b3-0412-490a-9d71-2c5e10e85c51\" (UID: \"0cea35b3-0412-490a-9d71-2c5e10e85c51\") " Dec 08 09:24:44 crc kubenswrapper[4776]: I1208 09:24:44.949328 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cea35b3-0412-490a-9d71-2c5e10e85c51-config-data\") pod \"0cea35b3-0412-490a-9d71-2c5e10e85c51\" (UID: \"0cea35b3-0412-490a-9d71-2c5e10e85c51\") " Dec 08 09:24:44 crc kubenswrapper[4776]: I1208 09:24:44.950086 4776 scope.go:117] "RemoveContainer" containerID="25231c45a67c8f3c3d317e3772e895cd938a660b465520775f6d63b54d0600fd" Dec 08 09:24:44 crc kubenswrapper[4776]: E1208 09:24:44.951729 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"25231c45a67c8f3c3d317e3772e895cd938a660b465520775f6d63b54d0600fd\": container with ID starting with 25231c45a67c8f3c3d317e3772e895cd938a660b465520775f6d63b54d0600fd not found: ID does not exist" containerID="25231c45a67c8f3c3d317e3772e895cd938a660b465520775f6d63b54d0600fd" Dec 08 09:24:44 crc kubenswrapper[4776]: I1208 09:24:44.951762 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25231c45a67c8f3c3d317e3772e895cd938a660b465520775f6d63b54d0600fd"} err="failed to get container status \"25231c45a67c8f3c3d317e3772e895cd938a660b465520775f6d63b54d0600fd\": rpc error: code = NotFound desc = could not find container \"25231c45a67c8f3c3d317e3772e895cd938a660b465520775f6d63b54d0600fd\": container with ID starting with 25231c45a67c8f3c3d317e3772e895cd938a660b465520775f6d63b54d0600fd not found: ID does not exist" Dec 08 09:24:44 crc kubenswrapper[4776]: I1208 09:24:44.955213 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cea35b3-0412-490a-9d71-2c5e10e85c51-kube-api-access-5b4jh" (OuterVolumeSpecName: "kube-api-access-5b4jh") pod "0cea35b3-0412-490a-9d71-2c5e10e85c51" (UID: "0cea35b3-0412-490a-9d71-2c5e10e85c51"). InnerVolumeSpecName "kube-api-access-5b4jh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:24:44 crc kubenswrapper[4776]: I1208 09:24:44.957342 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cea35b3-0412-490a-9d71-2c5e10e85c51-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0cea35b3-0412-490a-9d71-2c5e10e85c51" (UID: "0cea35b3-0412-490a-9d71-2c5e10e85c51"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:24:45 crc kubenswrapper[4776]: I1208 09:24:45.007051 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cea35b3-0412-490a-9d71-2c5e10e85c51-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0cea35b3-0412-490a-9d71-2c5e10e85c51" (UID: "0cea35b3-0412-490a-9d71-2c5e10e85c51"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:24:45 crc kubenswrapper[4776]: I1208 09:24:45.028667 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cea35b3-0412-490a-9d71-2c5e10e85c51-config-data" (OuterVolumeSpecName: "config-data") pod "0cea35b3-0412-490a-9d71-2c5e10e85c51" (UID: "0cea35b3-0412-490a-9d71-2c5e10e85c51"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:24:45 crc kubenswrapper[4776]: I1208 09:24:45.052128 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5b4jh\" (UniqueName: \"kubernetes.io/projected/0cea35b3-0412-490a-9d71-2c5e10e85c51-kube-api-access-5b4jh\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:45 crc kubenswrapper[4776]: I1208 09:24:45.052166 4776 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0cea35b3-0412-490a-9d71-2c5e10e85c51-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:45 crc kubenswrapper[4776]: I1208 09:24:45.052200 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cea35b3-0412-490a-9d71-2c5e10e85c51-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:45 crc kubenswrapper[4776]: I1208 09:24:45.052212 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cea35b3-0412-490a-9d71-2c5e10e85c51-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Dec 08 09:24:45 crc kubenswrapper[4776]: E1208 09:24:45.229588 4776 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0cea35b3_0412_490a_9d71_2c5e10e85c51.slice/crio-911edfa85a60dbb496cc30806f6261f958fa9aea33c46967c8deb243782a981c\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0cea35b3_0412_490a_9d71_2c5e10e85c51.slice\": RecentStats: unable to find data in memory cache]" Dec 08 09:24:45 crc kubenswrapper[4776]: I1208 09:24:45.260713 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-55647645f8-9xvpq"] Dec 08 09:24:45 crc kubenswrapper[4776]: I1208 09:24:45.272458 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-55647645f8-9xvpq"] Dec 08 09:24:45 crc kubenswrapper[4776]: I1208 09:24:45.715740 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-crwms" Dec 08 09:24:45 crc kubenswrapper[4776]: I1208 09:24:45.715800 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-crwms" Dec 08 09:24:45 crc kubenswrapper[4776]: I1208 09:24:45.784304 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-crwms" Dec 08 09:24:45 crc kubenswrapper[4776]: I1208 09:24:45.973844 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-crwms" Dec 08 09:24:46 crc kubenswrapper[4776]: I1208 09:24:46.050450 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-crwms"] Dec 08 09:24:46 crc kubenswrapper[4776]: I1208 09:24:46.355347 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="0cea35b3-0412-490a-9d71-2c5e10e85c51" path="/var/lib/kubelet/pods/0cea35b3-0412-490a-9d71-2c5e10e85c51/volumes" Dec 08 09:24:47 crc kubenswrapper[4776]: I1208 09:24:47.954575 4776 generic.go:334] "Generic (PLEG): container finished" podID="c2c03694-357a-4838-8202-7e3d3196f9ca" containerID="d1862aea8ebe37a2abba47d91654e83e009dcecd2a5485b55265dd485cc62a2f" exitCode=0 Dec 08 09:24:47 crc kubenswrapper[4776]: I1208 09:24:47.954722 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zw8wj" event={"ID":"c2c03694-357a-4838-8202-7e3d3196f9ca","Type":"ContainerDied","Data":"d1862aea8ebe37a2abba47d91654e83e009dcecd2a5485b55265dd485cc62a2f"} Dec 08 09:24:47 crc kubenswrapper[4776]: I1208 09:24:47.955183 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-crwms" podUID="aa1aa167-e177-4edb-8ab4-0d164d10c143" containerName="registry-server" containerID="cri-o://4941ca9a84aaf6bd42e77c9c945331cdf95cc785f02f63b7d7780704ae78b304" gracePeriod=2 Dec 08 09:24:48 crc kubenswrapper[4776]: I1208 09:24:48.515731 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-crwms" Dec 08 09:24:48 crc kubenswrapper[4776]: I1208 09:24:48.539647 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa1aa167-e177-4edb-8ab4-0d164d10c143-catalog-content\") pod \"aa1aa167-e177-4edb-8ab4-0d164d10c143\" (UID: \"aa1aa167-e177-4edb-8ab4-0d164d10c143\") " Dec 08 09:24:48 crc kubenswrapper[4776]: I1208 09:24:48.539908 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckqj7\" (UniqueName: \"kubernetes.io/projected/aa1aa167-e177-4edb-8ab4-0d164d10c143-kube-api-access-ckqj7\") pod \"aa1aa167-e177-4edb-8ab4-0d164d10c143\" (UID: \"aa1aa167-e177-4edb-8ab4-0d164d10c143\") " Dec 08 09:24:48 crc kubenswrapper[4776]: I1208 09:24:48.539951 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa1aa167-e177-4edb-8ab4-0d164d10c143-utilities\") pod \"aa1aa167-e177-4edb-8ab4-0d164d10c143\" (UID: \"aa1aa167-e177-4edb-8ab4-0d164d10c143\") " Dec 08 09:24:48 crc kubenswrapper[4776]: I1208 09:24:48.540845 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa1aa167-e177-4edb-8ab4-0d164d10c143-utilities" (OuterVolumeSpecName: "utilities") pod "aa1aa167-e177-4edb-8ab4-0d164d10c143" (UID: "aa1aa167-e177-4edb-8ab4-0d164d10c143"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:24:48 crc kubenswrapper[4776]: I1208 09:24:48.552106 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa1aa167-e177-4edb-8ab4-0d164d10c143-kube-api-access-ckqj7" (OuterVolumeSpecName: "kube-api-access-ckqj7") pod "aa1aa167-e177-4edb-8ab4-0d164d10c143" (UID: "aa1aa167-e177-4edb-8ab4-0d164d10c143"). InnerVolumeSpecName "kube-api-access-ckqj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:24:48 crc kubenswrapper[4776]: I1208 09:24:48.561988 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa1aa167-e177-4edb-8ab4-0d164d10c143-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aa1aa167-e177-4edb-8ab4-0d164d10c143" (UID: "aa1aa167-e177-4edb-8ab4-0d164d10c143"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:24:48 crc kubenswrapper[4776]: I1208 09:24:48.644522 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckqj7\" (UniqueName: \"kubernetes.io/projected/aa1aa167-e177-4edb-8ab4-0d164d10c143-kube-api-access-ckqj7\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:48 crc kubenswrapper[4776]: I1208 09:24:48.644575 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa1aa167-e177-4edb-8ab4-0d164d10c143-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:48 crc kubenswrapper[4776]: I1208 09:24:48.644587 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa1aa167-e177-4edb-8ab4-0d164d10c143-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:48 crc kubenswrapper[4776]: I1208 09:24:48.975793 4776 generic.go:334] "Generic (PLEG): container finished" podID="aa1aa167-e177-4edb-8ab4-0d164d10c143" containerID="4941ca9a84aaf6bd42e77c9c945331cdf95cc785f02f63b7d7780704ae78b304" exitCode=0 Dec 08 09:24:48 crc kubenswrapper[4776]: I1208 09:24:48.977895 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-crwms" Dec 08 09:24:48 crc kubenswrapper[4776]: I1208 09:24:48.978080 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-crwms" event={"ID":"aa1aa167-e177-4edb-8ab4-0d164d10c143","Type":"ContainerDied","Data":"4941ca9a84aaf6bd42e77c9c945331cdf95cc785f02f63b7d7780704ae78b304"} Dec 08 09:24:48 crc kubenswrapper[4776]: I1208 09:24:48.978125 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-crwms" event={"ID":"aa1aa167-e177-4edb-8ab4-0d164d10c143","Type":"ContainerDied","Data":"1b654ce9b72a5efb69c7a029d7f4c86fc9a96034be149da1af54b4827ab0def8"} Dec 08 09:24:48 crc kubenswrapper[4776]: I1208 09:24:48.978145 4776 scope.go:117] "RemoveContainer" containerID="4941ca9a84aaf6bd42e77c9c945331cdf95cc785f02f63b7d7780704ae78b304" Dec 08 09:24:49 crc kubenswrapper[4776]: I1208 09:24:49.031941 4776 scope.go:117] "RemoveContainer" containerID="7e72a1a39f6c7cfec886b6d241f81a27d8038985e40bcec06b18ac547b3cefef" Dec 08 09:24:49 crc kubenswrapper[4776]: I1208 09:24:49.036242 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-crwms"] Dec 08 09:24:49 crc kubenswrapper[4776]: I1208 09:24:49.045930 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-crwms"] Dec 08 09:24:49 crc kubenswrapper[4776]: I1208 09:24:49.060364 4776 scope.go:117] "RemoveContainer" containerID="d386a6c13df86a7be59007fb670de68a31e9bc4011365ea6a68223f23dafa9aa" Dec 08 09:24:49 crc kubenswrapper[4776]: I1208 09:24:49.120956 4776 scope.go:117] "RemoveContainer" containerID="4941ca9a84aaf6bd42e77c9c945331cdf95cc785f02f63b7d7780704ae78b304" Dec 08 09:24:49 crc kubenswrapper[4776]: E1208 09:24:49.121496 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4941ca9a84aaf6bd42e77c9c945331cdf95cc785f02f63b7d7780704ae78b304\": container with ID starting with 4941ca9a84aaf6bd42e77c9c945331cdf95cc785f02f63b7d7780704ae78b304 not found: ID does not exist" containerID="4941ca9a84aaf6bd42e77c9c945331cdf95cc785f02f63b7d7780704ae78b304" Dec 08 09:24:49 crc kubenswrapper[4776]: I1208 09:24:49.121535 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4941ca9a84aaf6bd42e77c9c945331cdf95cc785f02f63b7d7780704ae78b304"} err="failed to get container status \"4941ca9a84aaf6bd42e77c9c945331cdf95cc785f02f63b7d7780704ae78b304\": rpc error: code = NotFound desc = could not find container \"4941ca9a84aaf6bd42e77c9c945331cdf95cc785f02f63b7d7780704ae78b304\": container with ID starting with 4941ca9a84aaf6bd42e77c9c945331cdf95cc785f02f63b7d7780704ae78b304 not found: ID does not exist" Dec 08 09:24:49 crc kubenswrapper[4776]: I1208 09:24:49.121562 4776 scope.go:117] "RemoveContainer" containerID="7e72a1a39f6c7cfec886b6d241f81a27d8038985e40bcec06b18ac547b3cefef" Dec 08 09:24:49 crc kubenswrapper[4776]: E1208 09:24:49.121876 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e72a1a39f6c7cfec886b6d241f81a27d8038985e40bcec06b18ac547b3cefef\": container with ID starting with 7e72a1a39f6c7cfec886b6d241f81a27d8038985e40bcec06b18ac547b3cefef not found: ID does not exist" containerID="7e72a1a39f6c7cfec886b6d241f81a27d8038985e40bcec06b18ac547b3cefef" Dec 08 09:24:49 crc kubenswrapper[4776]: I1208 09:24:49.121903 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e72a1a39f6c7cfec886b6d241f81a27d8038985e40bcec06b18ac547b3cefef"} err="failed to get container status \"7e72a1a39f6c7cfec886b6d241f81a27d8038985e40bcec06b18ac547b3cefef\": rpc error: code = NotFound desc = could not find container \"7e72a1a39f6c7cfec886b6d241f81a27d8038985e40bcec06b18ac547b3cefef\": container with ID 
starting with 7e72a1a39f6c7cfec886b6d241f81a27d8038985e40bcec06b18ac547b3cefef not found: ID does not exist" Dec 08 09:24:49 crc kubenswrapper[4776]: I1208 09:24:49.121919 4776 scope.go:117] "RemoveContainer" containerID="d386a6c13df86a7be59007fb670de68a31e9bc4011365ea6a68223f23dafa9aa" Dec 08 09:24:49 crc kubenswrapper[4776]: E1208 09:24:49.122161 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d386a6c13df86a7be59007fb670de68a31e9bc4011365ea6a68223f23dafa9aa\": container with ID starting with d386a6c13df86a7be59007fb670de68a31e9bc4011365ea6a68223f23dafa9aa not found: ID does not exist" containerID="d386a6c13df86a7be59007fb670de68a31e9bc4011365ea6a68223f23dafa9aa" Dec 08 09:24:49 crc kubenswrapper[4776]: I1208 09:24:49.122208 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d386a6c13df86a7be59007fb670de68a31e9bc4011365ea6a68223f23dafa9aa"} err="failed to get container status \"d386a6c13df86a7be59007fb670de68a31e9bc4011365ea6a68223f23dafa9aa\": rpc error: code = NotFound desc = could not find container \"d386a6c13df86a7be59007fb670de68a31e9bc4011365ea6a68223f23dafa9aa\": container with ID starting with d386a6c13df86a7be59007fb670de68a31e9bc4011365ea6a68223f23dafa9aa not found: ID does not exist" Dec 08 09:24:49 crc kubenswrapper[4776]: I1208 09:24:49.410317 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zw8wj" Dec 08 09:24:49 crc kubenswrapper[4776]: I1208 09:24:49.563706 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2c03694-357a-4838-8202-7e3d3196f9ca-config-data\") pod \"c2c03694-357a-4838-8202-7e3d3196f9ca\" (UID: \"c2c03694-357a-4838-8202-7e3d3196f9ca\") " Dec 08 09:24:49 crc kubenswrapper[4776]: I1208 09:24:49.563856 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2c03694-357a-4838-8202-7e3d3196f9ca-scripts\") pod \"c2c03694-357a-4838-8202-7e3d3196f9ca\" (UID: \"c2c03694-357a-4838-8202-7e3d3196f9ca\") " Dec 08 09:24:49 crc kubenswrapper[4776]: I1208 09:24:49.564030 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7tnb\" (UniqueName: \"kubernetes.io/projected/c2c03694-357a-4838-8202-7e3d3196f9ca-kube-api-access-d7tnb\") pod \"c2c03694-357a-4838-8202-7e3d3196f9ca\" (UID: \"c2c03694-357a-4838-8202-7e3d3196f9ca\") " Dec 08 09:24:49 crc kubenswrapper[4776]: I1208 09:24:49.564148 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2c03694-357a-4838-8202-7e3d3196f9ca-combined-ca-bundle\") pod \"c2c03694-357a-4838-8202-7e3d3196f9ca\" (UID: \"c2c03694-357a-4838-8202-7e3d3196f9ca\") " Dec 08 09:24:49 crc kubenswrapper[4776]: I1208 09:24:49.575373 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2c03694-357a-4838-8202-7e3d3196f9ca-scripts" (OuterVolumeSpecName: "scripts") pod "c2c03694-357a-4838-8202-7e3d3196f9ca" (UID: "c2c03694-357a-4838-8202-7e3d3196f9ca"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:24:49 crc kubenswrapper[4776]: I1208 09:24:49.587499 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2c03694-357a-4838-8202-7e3d3196f9ca-kube-api-access-d7tnb" (OuterVolumeSpecName: "kube-api-access-d7tnb") pod "c2c03694-357a-4838-8202-7e3d3196f9ca" (UID: "c2c03694-357a-4838-8202-7e3d3196f9ca"). InnerVolumeSpecName "kube-api-access-d7tnb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:24:49 crc kubenswrapper[4776]: I1208 09:24:49.617302 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2c03694-357a-4838-8202-7e3d3196f9ca-config-data" (OuterVolumeSpecName: "config-data") pod "c2c03694-357a-4838-8202-7e3d3196f9ca" (UID: "c2c03694-357a-4838-8202-7e3d3196f9ca"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:24:49 crc kubenswrapper[4776]: I1208 09:24:49.666029 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2c03694-357a-4838-8202-7e3d3196f9ca-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:49 crc kubenswrapper[4776]: I1208 09:24:49.666056 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2c03694-357a-4838-8202-7e3d3196f9ca-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:49 crc kubenswrapper[4776]: I1208 09:24:49.666067 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7tnb\" (UniqueName: \"kubernetes.io/projected/c2c03694-357a-4838-8202-7e3d3196f9ca-kube-api-access-d7tnb\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:49 crc kubenswrapper[4776]: I1208 09:24:49.675369 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2c03694-357a-4838-8202-7e3d3196f9ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") 
pod "c2c03694-357a-4838-8202-7e3d3196f9ca" (UID: "c2c03694-357a-4838-8202-7e3d3196f9ca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:24:49 crc kubenswrapper[4776]: I1208 09:24:49.768344 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2c03694-357a-4838-8202-7e3d3196f9ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:24:49 crc kubenswrapper[4776]: I1208 09:24:49.989278 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zw8wj" Dec 08 09:24:49 crc kubenswrapper[4776]: I1208 09:24:49.989300 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zw8wj" event={"ID":"c2c03694-357a-4838-8202-7e3d3196f9ca","Type":"ContainerDied","Data":"1c5ae13ab7404f4aefec5858b674d29b544e7449cee1b009f71982d1e5089ee3"} Dec 08 09:24:49 crc kubenswrapper[4776]: I1208 09:24:49.989694 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c5ae13ab7404f4aefec5858b674d29b544e7449cee1b009f71982d1e5089ee3" Dec 08 09:24:50 crc kubenswrapper[4776]: I1208 09:24:50.105018 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 08 09:24:50 crc kubenswrapper[4776]: E1208 09:24:50.105562 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="773f52b6-826d-4179-8777-96d795b10c5d" containerName="heat-api" Dec 08 09:24:50 crc kubenswrapper[4776]: I1208 09:24:50.105583 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="773f52b6-826d-4179-8777-96d795b10c5d" containerName="heat-api" Dec 08 09:24:50 crc kubenswrapper[4776]: E1208 09:24:50.105603 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edc53406-6d30-4981-aa38-cd183ebf1b7d" containerName="heat-cfnapi" Dec 08 09:24:50 crc kubenswrapper[4776]: I1208 09:24:50.105610 4776 
state_mem.go:107] "Deleted CPUSet assignment" podUID="edc53406-6d30-4981-aa38-cd183ebf1b7d" containerName="heat-cfnapi" Dec 08 09:24:50 crc kubenswrapper[4776]: E1208 09:24:50.105623 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="773f52b6-826d-4179-8777-96d795b10c5d" containerName="heat-api" Dec 08 09:24:50 crc kubenswrapper[4776]: I1208 09:24:50.105629 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="773f52b6-826d-4179-8777-96d795b10c5d" containerName="heat-api" Dec 08 09:24:50 crc kubenswrapper[4776]: E1208 09:24:50.105639 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa1aa167-e177-4edb-8ab4-0d164d10c143" containerName="extract-content" Dec 08 09:24:50 crc kubenswrapper[4776]: I1208 09:24:50.105645 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa1aa167-e177-4edb-8ab4-0d164d10c143" containerName="extract-content" Dec 08 09:24:50 crc kubenswrapper[4776]: E1208 09:24:50.105675 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cea35b3-0412-490a-9d71-2c5e10e85c51" containerName="heat-engine" Dec 08 09:24:50 crc kubenswrapper[4776]: I1208 09:24:50.105682 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cea35b3-0412-490a-9d71-2c5e10e85c51" containerName="heat-engine" Dec 08 09:24:50 crc kubenswrapper[4776]: E1208 09:24:50.105706 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2c03694-357a-4838-8202-7e3d3196f9ca" containerName="nova-cell0-conductor-db-sync" Dec 08 09:24:50 crc kubenswrapper[4776]: I1208 09:24:50.105712 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2c03694-357a-4838-8202-7e3d3196f9ca" containerName="nova-cell0-conductor-db-sync" Dec 08 09:24:50 crc kubenswrapper[4776]: E1208 09:24:50.105724 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa1aa167-e177-4edb-8ab4-0d164d10c143" containerName="registry-server" Dec 08 09:24:50 crc kubenswrapper[4776]: I1208 09:24:50.105730 4776 
state_mem.go:107] "Deleted CPUSet assignment" podUID="aa1aa167-e177-4edb-8ab4-0d164d10c143" containerName="registry-server" Dec 08 09:24:50 crc kubenswrapper[4776]: E1208 09:24:50.105751 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa1aa167-e177-4edb-8ab4-0d164d10c143" containerName="extract-utilities" Dec 08 09:24:50 crc kubenswrapper[4776]: I1208 09:24:50.105757 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa1aa167-e177-4edb-8ab4-0d164d10c143" containerName="extract-utilities" Dec 08 09:24:50 crc kubenswrapper[4776]: I1208 09:24:50.105951 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="773f52b6-826d-4179-8777-96d795b10c5d" containerName="heat-api" Dec 08 09:24:50 crc kubenswrapper[4776]: I1208 09:24:50.105965 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2c03694-357a-4838-8202-7e3d3196f9ca" containerName="nova-cell0-conductor-db-sync" Dec 08 09:24:50 crc kubenswrapper[4776]: I1208 09:24:50.105975 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cea35b3-0412-490a-9d71-2c5e10e85c51" containerName="heat-engine" Dec 08 09:24:50 crc kubenswrapper[4776]: I1208 09:24:50.105994 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa1aa167-e177-4edb-8ab4-0d164d10c143" containerName="registry-server" Dec 08 09:24:50 crc kubenswrapper[4776]: I1208 09:24:50.106010 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="edc53406-6d30-4981-aa38-cd183ebf1b7d" containerName="heat-cfnapi" Dec 08 09:24:50 crc kubenswrapper[4776]: I1208 09:24:50.106020 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="edc53406-6d30-4981-aa38-cd183ebf1b7d" containerName="heat-cfnapi" Dec 08 09:24:50 crc kubenswrapper[4776]: I1208 09:24:50.106868 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 08 09:24:50 crc kubenswrapper[4776]: I1208 09:24:50.110633 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-frkls" Dec 08 09:24:50 crc kubenswrapper[4776]: I1208 09:24:50.111116 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 08 09:24:50 crc kubenswrapper[4776]: I1208 09:24:50.126058 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 08 09:24:50 crc kubenswrapper[4776]: I1208 09:24:50.277755 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47f87\" (UniqueName: \"kubernetes.io/projected/ffbcb3b3-c4d6-461f-bae8-c1ae2de20050-kube-api-access-47f87\") pod \"nova-cell0-conductor-0\" (UID: \"ffbcb3b3-c4d6-461f-bae8-c1ae2de20050\") " pod="openstack/nova-cell0-conductor-0" Dec 08 09:24:50 crc kubenswrapper[4776]: I1208 09:24:50.277828 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffbcb3b3-c4d6-461f-bae8-c1ae2de20050-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ffbcb3b3-c4d6-461f-bae8-c1ae2de20050\") " pod="openstack/nova-cell0-conductor-0" Dec 08 09:24:50 crc kubenswrapper[4776]: I1208 09:24:50.277996 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffbcb3b3-c4d6-461f-bae8-c1ae2de20050-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ffbcb3b3-c4d6-461f-bae8-c1ae2de20050\") " pod="openstack/nova-cell0-conductor-0" Dec 08 09:24:50 crc kubenswrapper[4776]: I1208 09:24:50.357352 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa1aa167-e177-4edb-8ab4-0d164d10c143" 
path="/var/lib/kubelet/pods/aa1aa167-e177-4edb-8ab4-0d164d10c143/volumes" Dec 08 09:24:50 crc kubenswrapper[4776]: I1208 09:24:50.379741 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffbcb3b3-c4d6-461f-bae8-c1ae2de20050-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ffbcb3b3-c4d6-461f-bae8-c1ae2de20050\") " pod="openstack/nova-cell0-conductor-0" Dec 08 09:24:50 crc kubenswrapper[4776]: I1208 09:24:50.379922 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffbcb3b3-c4d6-461f-bae8-c1ae2de20050-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ffbcb3b3-c4d6-461f-bae8-c1ae2de20050\") " pod="openstack/nova-cell0-conductor-0" Dec 08 09:24:50 crc kubenswrapper[4776]: I1208 09:24:50.379996 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47f87\" (UniqueName: \"kubernetes.io/projected/ffbcb3b3-c4d6-461f-bae8-c1ae2de20050-kube-api-access-47f87\") pod \"nova-cell0-conductor-0\" (UID: \"ffbcb3b3-c4d6-461f-bae8-c1ae2de20050\") " pod="openstack/nova-cell0-conductor-0" Dec 08 09:24:50 crc kubenswrapper[4776]: I1208 09:24:50.384816 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffbcb3b3-c4d6-461f-bae8-c1ae2de20050-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ffbcb3b3-c4d6-461f-bae8-c1ae2de20050\") " pod="openstack/nova-cell0-conductor-0" Dec 08 09:24:50 crc kubenswrapper[4776]: I1208 09:24:50.387161 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffbcb3b3-c4d6-461f-bae8-c1ae2de20050-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ffbcb3b3-c4d6-461f-bae8-c1ae2de20050\") " pod="openstack/nova-cell0-conductor-0" Dec 08 09:24:50 crc kubenswrapper[4776]: I1208 
09:24:50.401722 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47f87\" (UniqueName: \"kubernetes.io/projected/ffbcb3b3-c4d6-461f-bae8-c1ae2de20050-kube-api-access-47f87\") pod \"nova-cell0-conductor-0\" (UID: \"ffbcb3b3-c4d6-461f-bae8-c1ae2de20050\") " pod="openstack/nova-cell0-conductor-0" Dec 08 09:24:50 crc kubenswrapper[4776]: I1208 09:24:50.491310 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 08 09:24:51 crc kubenswrapper[4776]: I1208 09:24:51.003823 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 08 09:24:51 crc kubenswrapper[4776]: W1208 09:24:51.006317 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podffbcb3b3_c4d6_461f_bae8_c1ae2de20050.slice/crio-fd46012d0dd1916170ad369630b20388a91d243f592f522ee543ce046c6b57da WatchSource:0}: Error finding container fd46012d0dd1916170ad369630b20388a91d243f592f522ee543ce046c6b57da: Status 404 returned error can't find the container with id fd46012d0dd1916170ad369630b20388a91d243f592f522ee543ce046c6b57da Dec 08 09:24:52 crc kubenswrapper[4776]: I1208 09:24:52.031713 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"ffbcb3b3-c4d6-461f-bae8-c1ae2de20050","Type":"ContainerStarted","Data":"fc10c274a532d74e1170ee3e13f54a82c6851672d91c6ee238faa31d522d7bf8"} Dec 08 09:24:52 crc kubenswrapper[4776]: I1208 09:24:52.032066 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"ffbcb3b3-c4d6-461f-bae8-c1ae2de20050","Type":"ContainerStarted","Data":"fd46012d0dd1916170ad369630b20388a91d243f592f522ee543ce046c6b57da"} Dec 08 09:24:52 crc kubenswrapper[4776]: I1208 09:24:52.032121 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-cell0-conductor-0" Dec 08 09:24:52 crc kubenswrapper[4776]: I1208 09:24:52.064979 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.064955895 podStartE2EDuration="2.064955895s" podCreationTimestamp="2025-12-08 09:24:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:24:52.053353893 +0000 UTC m=+1568.316578925" watchObservedRunningTime="2025-12-08 09:24:52.064955895 +0000 UTC m=+1568.328180927" Dec 08 09:24:54 crc kubenswrapper[4776]: I1208 09:24:54.917168 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="cfea15ec-7a58-4283-831a-fdfb2c03918c" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 08 09:24:59 crc kubenswrapper[4776]: I1208 09:24:59.675292 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9qcrg"] Dec 08 09:24:59 crc kubenswrapper[4776]: E1208 09:24:59.676195 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edc53406-6d30-4981-aa38-cd183ebf1b7d" containerName="heat-cfnapi" Dec 08 09:24:59 crc kubenswrapper[4776]: I1208 09:24:59.676206 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="edc53406-6d30-4981-aa38-cd183ebf1b7d" containerName="heat-cfnapi" Dec 08 09:24:59 crc kubenswrapper[4776]: I1208 09:24:59.676431 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="773f52b6-826d-4179-8777-96d795b10c5d" containerName="heat-api" Dec 08 09:24:59 crc kubenswrapper[4776]: I1208 09:24:59.686936 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9qcrg" Dec 08 09:24:59 crc kubenswrapper[4776]: I1208 09:24:59.696836 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9qcrg"] Dec 08 09:24:59 crc kubenswrapper[4776]: I1208 09:24:59.714716 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/effd4b52-0471-4d81-b2bf-9b46ac73db66-catalog-content\") pod \"certified-operators-9qcrg\" (UID: \"effd4b52-0471-4d81-b2bf-9b46ac73db66\") " pod="openshift-marketplace/certified-operators-9qcrg" Dec 08 09:24:59 crc kubenswrapper[4776]: I1208 09:24:59.714953 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hnrj\" (UniqueName: \"kubernetes.io/projected/effd4b52-0471-4d81-b2bf-9b46ac73db66-kube-api-access-4hnrj\") pod \"certified-operators-9qcrg\" (UID: \"effd4b52-0471-4d81-b2bf-9b46ac73db66\") " pod="openshift-marketplace/certified-operators-9qcrg" Dec 08 09:24:59 crc kubenswrapper[4776]: I1208 09:24:59.715019 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/effd4b52-0471-4d81-b2bf-9b46ac73db66-utilities\") pod \"certified-operators-9qcrg\" (UID: \"effd4b52-0471-4d81-b2bf-9b46ac73db66\") " pod="openshift-marketplace/certified-operators-9qcrg" Dec 08 09:24:59 crc kubenswrapper[4776]: I1208 09:24:59.781425 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-b033-account-create-update-dzl9x"] Dec 08 09:24:59 crc kubenswrapper[4776]: I1208 09:24:59.783433 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-b033-account-create-update-dzl9x" Dec 08 09:24:59 crc kubenswrapper[4776]: I1208 09:24:59.785432 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Dec 08 09:24:59 crc kubenswrapper[4776]: I1208 09:24:59.793559 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-qcnxq"] Dec 08 09:24:59 crc kubenswrapper[4776]: I1208 09:24:59.795090 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-qcnxq" Dec 08 09:24:59 crc kubenswrapper[4776]: I1208 09:24:59.806343 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-b033-account-create-update-dzl9x"] Dec 08 09:24:59 crc kubenswrapper[4776]: I1208 09:24:59.818000 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0bfa985-0dfe-42bb-95ea-2e40830c7a23-operator-scripts\") pod \"aodh-db-create-qcnxq\" (UID: \"b0bfa985-0dfe-42bb-95ea-2e40830c7a23\") " pod="openstack/aodh-db-create-qcnxq" Dec 08 09:24:59 crc kubenswrapper[4776]: I1208 09:24:59.818111 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hnrj\" (UniqueName: \"kubernetes.io/projected/effd4b52-0471-4d81-b2bf-9b46ac73db66-kube-api-access-4hnrj\") pod \"certified-operators-9qcrg\" (UID: \"effd4b52-0471-4d81-b2bf-9b46ac73db66\") " pod="openshift-marketplace/certified-operators-9qcrg" Dec 08 09:24:59 crc kubenswrapper[4776]: I1208 09:24:59.818277 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cmrn\" (UniqueName: \"kubernetes.io/projected/fa674a56-b583-453e-9c66-e8ff93895b50-kube-api-access-9cmrn\") pod \"aodh-b033-account-create-update-dzl9x\" (UID: \"fa674a56-b583-453e-9c66-e8ff93895b50\") " pod="openstack/aodh-b033-account-create-update-dzl9x" Dec 08 
09:24:59 crc kubenswrapper[4776]: I1208 09:24:59.818316 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/effd4b52-0471-4d81-b2bf-9b46ac73db66-utilities\") pod \"certified-operators-9qcrg\" (UID: \"effd4b52-0471-4d81-b2bf-9b46ac73db66\") " pod="openshift-marketplace/certified-operators-9qcrg" Dec 08 09:24:59 crc kubenswrapper[4776]: I1208 09:24:59.818427 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa674a56-b583-453e-9c66-e8ff93895b50-operator-scripts\") pod \"aodh-b033-account-create-update-dzl9x\" (UID: \"fa674a56-b583-453e-9c66-e8ff93895b50\") " pod="openstack/aodh-b033-account-create-update-dzl9x" Dec 08 09:24:59 crc kubenswrapper[4776]: I1208 09:24:59.818485 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/effd4b52-0471-4d81-b2bf-9b46ac73db66-catalog-content\") pod \"certified-operators-9qcrg\" (UID: \"effd4b52-0471-4d81-b2bf-9b46ac73db66\") " pod="openshift-marketplace/certified-operators-9qcrg" Dec 08 09:24:59 crc kubenswrapper[4776]: I1208 09:24:59.818677 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsc6k\" (UniqueName: \"kubernetes.io/projected/b0bfa985-0dfe-42bb-95ea-2e40830c7a23-kube-api-access-bsc6k\") pod \"aodh-db-create-qcnxq\" (UID: \"b0bfa985-0dfe-42bb-95ea-2e40830c7a23\") " pod="openstack/aodh-db-create-qcnxq" Dec 08 09:24:59 crc kubenswrapper[4776]: I1208 09:24:59.819129 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/effd4b52-0471-4d81-b2bf-9b46ac73db66-utilities\") pod \"certified-operators-9qcrg\" (UID: \"effd4b52-0471-4d81-b2bf-9b46ac73db66\") " pod="openshift-marketplace/certified-operators-9qcrg" Dec 08 
09:24:59 crc kubenswrapper[4776]: I1208 09:24:59.819189 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-qcnxq"] Dec 08 09:24:59 crc kubenswrapper[4776]: I1208 09:24:59.819255 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/effd4b52-0471-4d81-b2bf-9b46ac73db66-catalog-content\") pod \"certified-operators-9qcrg\" (UID: \"effd4b52-0471-4d81-b2bf-9b46ac73db66\") " pod="openshift-marketplace/certified-operators-9qcrg" Dec 08 09:24:59 crc kubenswrapper[4776]: I1208 09:24:59.839066 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hnrj\" (UniqueName: \"kubernetes.io/projected/effd4b52-0471-4d81-b2bf-9b46ac73db66-kube-api-access-4hnrj\") pod \"certified-operators-9qcrg\" (UID: \"effd4b52-0471-4d81-b2bf-9b46ac73db66\") " pod="openshift-marketplace/certified-operators-9qcrg" Dec 08 09:24:59 crc kubenswrapper[4776]: I1208 09:24:59.919685 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cmrn\" (UniqueName: \"kubernetes.io/projected/fa674a56-b583-453e-9c66-e8ff93895b50-kube-api-access-9cmrn\") pod \"aodh-b033-account-create-update-dzl9x\" (UID: \"fa674a56-b583-453e-9c66-e8ff93895b50\") " pod="openstack/aodh-b033-account-create-update-dzl9x" Dec 08 09:24:59 crc kubenswrapper[4776]: I1208 09:24:59.919789 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa674a56-b583-453e-9c66-e8ff93895b50-operator-scripts\") pod \"aodh-b033-account-create-update-dzl9x\" (UID: \"fa674a56-b583-453e-9c66-e8ff93895b50\") " pod="openstack/aodh-b033-account-create-update-dzl9x" Dec 08 09:24:59 crc kubenswrapper[4776]: I1208 09:24:59.919880 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsc6k\" (UniqueName: 
\"kubernetes.io/projected/b0bfa985-0dfe-42bb-95ea-2e40830c7a23-kube-api-access-bsc6k\") pod \"aodh-db-create-qcnxq\" (UID: \"b0bfa985-0dfe-42bb-95ea-2e40830c7a23\") " pod="openstack/aodh-db-create-qcnxq" Dec 08 09:24:59 crc kubenswrapper[4776]: I1208 09:24:59.919906 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0bfa985-0dfe-42bb-95ea-2e40830c7a23-operator-scripts\") pod \"aodh-db-create-qcnxq\" (UID: \"b0bfa985-0dfe-42bb-95ea-2e40830c7a23\") " pod="openstack/aodh-db-create-qcnxq" Dec 08 09:24:59 crc kubenswrapper[4776]: I1208 09:24:59.920641 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0bfa985-0dfe-42bb-95ea-2e40830c7a23-operator-scripts\") pod \"aodh-db-create-qcnxq\" (UID: \"b0bfa985-0dfe-42bb-95ea-2e40830c7a23\") " pod="openstack/aodh-db-create-qcnxq" Dec 08 09:24:59 crc kubenswrapper[4776]: I1208 09:24:59.920920 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa674a56-b583-453e-9c66-e8ff93895b50-operator-scripts\") pod \"aodh-b033-account-create-update-dzl9x\" (UID: \"fa674a56-b583-453e-9c66-e8ff93895b50\") " pod="openstack/aodh-b033-account-create-update-dzl9x" Dec 08 09:24:59 crc kubenswrapper[4776]: I1208 09:24:59.936228 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cmrn\" (UniqueName: \"kubernetes.io/projected/fa674a56-b583-453e-9c66-e8ff93895b50-kube-api-access-9cmrn\") pod \"aodh-b033-account-create-update-dzl9x\" (UID: \"fa674a56-b583-453e-9c66-e8ff93895b50\") " pod="openstack/aodh-b033-account-create-update-dzl9x" Dec 08 09:24:59 crc kubenswrapper[4776]: I1208 09:24:59.939325 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsc6k\" (UniqueName: 
\"kubernetes.io/projected/b0bfa985-0dfe-42bb-95ea-2e40830c7a23-kube-api-access-bsc6k\") pod \"aodh-db-create-qcnxq\" (UID: \"b0bfa985-0dfe-42bb-95ea-2e40830c7a23\") " pod="openstack/aodh-db-create-qcnxq" Dec 08 09:25:00 crc kubenswrapper[4776]: I1208 09:25:00.017626 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9qcrg" Dec 08 09:25:00 crc kubenswrapper[4776]: I1208 09:25:00.104807 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-b033-account-create-update-dzl9x" Dec 08 09:25:00 crc kubenswrapper[4776]: I1208 09:25:00.126451 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-qcnxq" Dec 08 09:25:00 crc kubenswrapper[4776]: I1208 09:25:00.591934 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 08 09:25:00 crc kubenswrapper[4776]: I1208 09:25:00.733805 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9qcrg"] Dec 08 09:25:00 crc kubenswrapper[4776]: I1208 09:25:00.892159 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-qcnxq"] Dec 08 09:25:00 crc kubenswrapper[4776]: W1208 09:25:00.895722 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0bfa985_0dfe_42bb_95ea_2e40830c7a23.slice/crio-4c7baf615ea5f24572be2f1e6cbb0a2bfe026cca747de754fd2c9e0e37e65b70 WatchSource:0}: Error finding container 4c7baf615ea5f24572be2f1e6cbb0a2bfe026cca747de754fd2c9e0e37e65b70: Status 404 returned error can't find the container with id 4c7baf615ea5f24572be2f1e6cbb0a2bfe026cca747de754fd2c9e0e37e65b70 Dec 08 09:25:00 crc kubenswrapper[4776]: W1208 09:25:00.900185 4776 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa674a56_b583_453e_9c66_e8ff93895b50.slice/crio-abe2e990357a6f1a7faaebc913d076d26d2c4146942a636e72398bdf0ca855e2 WatchSource:0}: Error finding container abe2e990357a6f1a7faaebc913d076d26d2c4146942a636e72398bdf0ca855e2: Status 404 returned error can't find the container with id abe2e990357a6f1a7faaebc913d076d26d2c4146942a636e72398bdf0ca855e2
Dec 08 09:25:00 crc kubenswrapper[4776]: I1208 09:25:00.904130 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-b033-account-create-update-dzl9x"]
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.181992 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-b033-account-create-update-dzl9x" event={"ID":"fa674a56-b583-453e-9c66-e8ff93895b50","Type":"ContainerStarted","Data":"abe2e990357a6f1a7faaebc913d076d26d2c4146942a636e72398bdf0ca855e2"}
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.186781 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qcrg" event={"ID":"effd4b52-0471-4d81-b2bf-9b46ac73db66","Type":"ContainerStarted","Data":"bbcda11ac2edecd998914d0452359b729cd9a725fc82d588eb86ed2742b9ff5d"}
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.186836 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qcrg" event={"ID":"effd4b52-0471-4d81-b2bf-9b46ac73db66","Type":"ContainerStarted","Data":"433413c18af34192251de4926e27f251afe9b6bed6df2810f35ac9d10843aedd"}
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.189266 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-qcnxq" event={"ID":"b0bfa985-0dfe-42bb-95ea-2e40830c7a23","Type":"ContainerStarted","Data":"4c7baf615ea5f24572be2f1e6cbb0a2bfe026cca747de754fd2c9e0e37e65b70"}
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.357218 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-v54bb"]
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.358651 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-v54bb"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.360662 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.362766 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.373922 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-v54bb"]
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.469502 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxs2c\" (UniqueName: \"kubernetes.io/projected/d88c7e2b-caa4-4d68-acc2-1483da2dfef3-kube-api-access-cxs2c\") pod \"nova-cell0-cell-mapping-v54bb\" (UID: \"d88c7e2b-caa4-4d68-acc2-1483da2dfef3\") " pod="openstack/nova-cell0-cell-mapping-v54bb"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.469565 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d88c7e2b-caa4-4d68-acc2-1483da2dfef3-scripts\") pod \"nova-cell0-cell-mapping-v54bb\" (UID: \"d88c7e2b-caa4-4d68-acc2-1483da2dfef3\") " pod="openstack/nova-cell0-cell-mapping-v54bb"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.469617 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d88c7e2b-caa4-4d68-acc2-1483da2dfef3-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-v54bb\" (UID: \"d88c7e2b-caa4-4d68-acc2-1483da2dfef3\") " pod="openstack/nova-cell0-cell-mapping-v54bb"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.469736 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d88c7e2b-caa4-4d68-acc2-1483da2dfef3-config-data\") pod \"nova-cell0-cell-mapping-v54bb\" (UID: \"d88c7e2b-caa4-4d68-acc2-1483da2dfef3\") " pod="openstack/nova-cell0-cell-mapping-v54bb"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.553372 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.555281 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.557647 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.563455 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.571498 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxs2c\" (UniqueName: \"kubernetes.io/projected/d88c7e2b-caa4-4d68-acc2-1483da2dfef3-kube-api-access-cxs2c\") pod \"nova-cell0-cell-mapping-v54bb\" (UID: \"d88c7e2b-caa4-4d68-acc2-1483da2dfef3\") " pod="openstack/nova-cell0-cell-mapping-v54bb"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.571551 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d88c7e2b-caa4-4d68-acc2-1483da2dfef3-scripts\") pod \"nova-cell0-cell-mapping-v54bb\" (UID: \"d88c7e2b-caa4-4d68-acc2-1483da2dfef3\") " pod="openstack/nova-cell0-cell-mapping-v54bb"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.571600 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d88c7e2b-caa4-4d68-acc2-1483da2dfef3-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-v54bb\" (UID: \"d88c7e2b-caa4-4d68-acc2-1483da2dfef3\") " pod="openstack/nova-cell0-cell-mapping-v54bb"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.571720 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d88c7e2b-caa4-4d68-acc2-1483da2dfef3-config-data\") pod \"nova-cell0-cell-mapping-v54bb\" (UID: \"d88c7e2b-caa4-4d68-acc2-1483da2dfef3\") " pod="openstack/nova-cell0-cell-mapping-v54bb"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.579392 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d88c7e2b-caa4-4d68-acc2-1483da2dfef3-scripts\") pod \"nova-cell0-cell-mapping-v54bb\" (UID: \"d88c7e2b-caa4-4d68-acc2-1483da2dfef3\") " pod="openstack/nova-cell0-cell-mapping-v54bb"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.587791 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d88c7e2b-caa4-4d68-acc2-1483da2dfef3-config-data\") pod \"nova-cell0-cell-mapping-v54bb\" (UID: \"d88c7e2b-caa4-4d68-acc2-1483da2dfef3\") " pod="openstack/nova-cell0-cell-mapping-v54bb"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.587948 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d88c7e2b-caa4-4d68-acc2-1483da2dfef3-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-v54bb\" (UID: \"d88c7e2b-caa4-4d68-acc2-1483da2dfef3\") " pod="openstack/nova-cell0-cell-mapping-v54bb"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.613235 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.614774 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.629540 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.645853 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxs2c\" (UniqueName: \"kubernetes.io/projected/d88c7e2b-caa4-4d68-acc2-1483da2dfef3-kube-api-access-cxs2c\") pod \"nova-cell0-cell-mapping-v54bb\" (UID: \"d88c7e2b-caa4-4d68-acc2-1483da2dfef3\") " pod="openstack/nova-cell0-cell-mapping-v54bb"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.682029 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.683996 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxn7t\" (UniqueName: \"kubernetes.io/projected/5d325803-a91c-4c5a-8b77-999224ba963d-kube-api-access-wxn7t\") pod \"nova-api-0\" (UID: \"5d325803-a91c-4c5a-8b77-999224ba963d\") " pod="openstack/nova-api-0"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.684032 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75d3ac7a-8205-4462-8c9d-83029a4deeaf-config-data\") pod \"nova-scheduler-0\" (UID: \"75d3ac7a-8205-4462-8c9d-83029a4deeaf\") " pod="openstack/nova-scheduler-0"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.684071 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75d3ac7a-8205-4462-8c9d-83029a4deeaf-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"75d3ac7a-8205-4462-8c9d-83029a4deeaf\") " pod="openstack/nova-scheduler-0"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.684123 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-847z2\" (UniqueName: \"kubernetes.io/projected/75d3ac7a-8205-4462-8c9d-83029a4deeaf-kube-api-access-847z2\") pod \"nova-scheduler-0\" (UID: \"75d3ac7a-8205-4462-8c9d-83029a4deeaf\") " pod="openstack/nova-scheduler-0"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.684236 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d325803-a91c-4c5a-8b77-999224ba963d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5d325803-a91c-4c5a-8b77-999224ba963d\") " pod="openstack/nova-api-0"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.684253 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d325803-a91c-4c5a-8b77-999224ba963d-logs\") pod \"nova-api-0\" (UID: \"5d325803-a91c-4c5a-8b77-999224ba963d\") " pod="openstack/nova-api-0"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.684273 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d325803-a91c-4c5a-8b77-999224ba963d-config-data\") pod \"nova-api-0\" (UID: \"5d325803-a91c-4c5a-8b77-999224ba963d\") " pod="openstack/nova-api-0"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.684524 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-v54bb"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.716543 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.719192 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.724214 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.741116 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.792607 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-847z2\" (UniqueName: \"kubernetes.io/projected/75d3ac7a-8205-4462-8c9d-83029a4deeaf-kube-api-access-847z2\") pod \"nova-scheduler-0\" (UID: \"75d3ac7a-8205-4462-8c9d-83029a4deeaf\") " pod="openstack/nova-scheduler-0"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.792730 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d325803-a91c-4c5a-8b77-999224ba963d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5d325803-a91c-4c5a-8b77-999224ba963d\") " pod="openstack/nova-api-0"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.792748 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d325803-a91c-4c5a-8b77-999224ba963d-logs\") pod \"nova-api-0\" (UID: \"5d325803-a91c-4c5a-8b77-999224ba963d\") " pod="openstack/nova-api-0"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.792769 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d325803-a91c-4c5a-8b77-999224ba963d-config-data\") pod \"nova-api-0\" (UID: \"5d325803-a91c-4c5a-8b77-999224ba963d\") " pod="openstack/nova-api-0"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.792844 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxn7t\" (UniqueName: \"kubernetes.io/projected/5d325803-a91c-4c5a-8b77-999224ba963d-kube-api-access-wxn7t\") pod \"nova-api-0\" (UID: \"5d325803-a91c-4c5a-8b77-999224ba963d\") " pod="openstack/nova-api-0"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.792865 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75d3ac7a-8205-4462-8c9d-83029a4deeaf-config-data\") pod \"nova-scheduler-0\" (UID: \"75d3ac7a-8205-4462-8c9d-83029a4deeaf\") " pod="openstack/nova-scheduler-0"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.792900 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75d3ac7a-8205-4462-8c9d-83029a4deeaf-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"75d3ac7a-8205-4462-8c9d-83029a4deeaf\") " pod="openstack/nova-scheduler-0"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.793189 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d325803-a91c-4c5a-8b77-999224ba963d-logs\") pod \"nova-api-0\" (UID: \"5d325803-a91c-4c5a-8b77-999224ba963d\") " pod="openstack/nova-api-0"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.795305 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.796896 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.797494 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d325803-a91c-4c5a-8b77-999224ba963d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5d325803-a91c-4c5a-8b77-999224ba963d\") " pod="openstack/nova-api-0"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.798499 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d325803-a91c-4c5a-8b77-999224ba963d-config-data\") pod \"nova-api-0\" (UID: \"5d325803-a91c-4c5a-8b77-999224ba963d\") " pod="openstack/nova-api-0"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.803825 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.804327 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75d3ac7a-8205-4462-8c9d-83029a4deeaf-config-data\") pod \"nova-scheduler-0\" (UID: \"75d3ac7a-8205-4462-8c9d-83029a4deeaf\") " pod="openstack/nova-scheduler-0"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.804548 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75d3ac7a-8205-4462-8c9d-83029a4deeaf-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"75d3ac7a-8205-4462-8c9d-83029a4deeaf\") " pod="openstack/nova-scheduler-0"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.818814 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-847z2\" (UniqueName: \"kubernetes.io/projected/75d3ac7a-8205-4462-8c9d-83029a4deeaf-kube-api-access-847z2\") pod \"nova-scheduler-0\" (UID: \"75d3ac7a-8205-4462-8c9d-83029a4deeaf\") " pod="openstack/nova-scheduler-0"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.828248 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxn7t\" (UniqueName: \"kubernetes.io/projected/5d325803-a91c-4c5a-8b77-999224ba963d-kube-api-access-wxn7t\") pod \"nova-api-0\" (UID: \"5d325803-a91c-4c5a-8b77-999224ba963d\") " pod="openstack/nova-api-0"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.840246 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.874784 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-hzfv9"]
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.877690 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-hzfv9"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.888597 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-hzfv9"]
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.895149 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.949096 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e5b39ee-eda8-48e5-b374-b1330cbb7b08-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8e5b39ee-eda8-48e5-b374-b1330cbb7b08\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.949431 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jzcm\" (UniqueName: \"kubernetes.io/projected/162bfd33-a716-4c84-8639-f5047819367f-kube-api-access-7jzcm\") pod \"nova-metadata-0\" (UID: \"162bfd33-a716-4c84-8639-f5047819367f\") " pod="openstack/nova-metadata-0"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.949584 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgsfs\" (UniqueName: \"kubernetes.io/projected/8e5b39ee-eda8-48e5-b374-b1330cbb7b08-kube-api-access-jgsfs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8e5b39ee-eda8-48e5-b374-b1330cbb7b08\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.949626 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/162bfd33-a716-4c84-8639-f5047819367f-config-data\") pod \"nova-metadata-0\" (UID: \"162bfd33-a716-4c84-8639-f5047819367f\") " pod="openstack/nova-metadata-0"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.949699 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/162bfd33-a716-4c84-8639-f5047819367f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"162bfd33-a716-4c84-8639-f5047819367f\") " pod="openstack/nova-metadata-0"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.949845 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/162bfd33-a716-4c84-8639-f5047819367f-logs\") pod \"nova-metadata-0\" (UID: \"162bfd33-a716-4c84-8639-f5047819367f\") " pod="openstack/nova-metadata-0"
Dec 08 09:25:01 crc kubenswrapper[4776]: I1208 09:25:01.949876 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e5b39ee-eda8-48e5-b374-b1330cbb7b08-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8e5b39ee-eda8-48e5-b374-b1330cbb7b08\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 08 09:25:02 crc kubenswrapper[4776]: I1208 09:25:02.022995 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 08 09:25:02 crc kubenswrapper[4776]: I1208 09:25:02.052241 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fk7px\" (UniqueName: \"kubernetes.io/projected/474cb911-9e81-43ed-a828-52d9f03eb4df-kube-api-access-fk7px\") pod \"dnsmasq-dns-9b86998b5-hzfv9\" (UID: \"474cb911-9e81-43ed-a828-52d9f03eb4df\") " pod="openstack/dnsmasq-dns-9b86998b5-hzfv9"
Dec 08 09:25:02 crc kubenswrapper[4776]: I1208 09:25:02.052295 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/474cb911-9e81-43ed-a828-52d9f03eb4df-dns-swift-storage-0\") pod \"dnsmasq-dns-9b86998b5-hzfv9\" (UID: \"474cb911-9e81-43ed-a828-52d9f03eb4df\") " pod="openstack/dnsmasq-dns-9b86998b5-hzfv9"
Dec 08 09:25:02 crc kubenswrapper[4776]: I1208 09:25:02.052342 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/474cb911-9e81-43ed-a828-52d9f03eb4df-config\") pod \"dnsmasq-dns-9b86998b5-hzfv9\" (UID: \"474cb911-9e81-43ed-a828-52d9f03eb4df\") " pod="openstack/dnsmasq-dns-9b86998b5-hzfv9"
Dec 08 09:25:02 crc kubenswrapper[4776]: I1208 09:25:02.052388 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/474cb911-9e81-43ed-a828-52d9f03eb4df-ovsdbserver-nb\") pod \"dnsmasq-dns-9b86998b5-hzfv9\" (UID: \"474cb911-9e81-43ed-a828-52d9f03eb4df\") " pod="openstack/dnsmasq-dns-9b86998b5-hzfv9"
Dec 08 09:25:02 crc kubenswrapper[4776]: I1208 09:25:02.052409 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/474cb911-9e81-43ed-a828-52d9f03eb4df-dns-svc\") pod \"dnsmasq-dns-9b86998b5-hzfv9\" (UID: \"474cb911-9e81-43ed-a828-52d9f03eb4df\") " pod="openstack/dnsmasq-dns-9b86998b5-hzfv9"
Dec 08 09:25:02 crc kubenswrapper[4776]: I1208 09:25:02.052440 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jzcm\" (UniqueName: \"kubernetes.io/projected/162bfd33-a716-4c84-8639-f5047819367f-kube-api-access-7jzcm\") pod \"nova-metadata-0\" (UID: \"162bfd33-a716-4c84-8639-f5047819367f\") " pod="openstack/nova-metadata-0"
Dec 08 09:25:02 crc kubenswrapper[4776]: I1208 09:25:02.052469 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/474cb911-9e81-43ed-a828-52d9f03eb4df-ovsdbserver-sb\") pod \"dnsmasq-dns-9b86998b5-hzfv9\" (UID: \"474cb911-9e81-43ed-a828-52d9f03eb4df\") " pod="openstack/dnsmasq-dns-9b86998b5-hzfv9"
Dec 08 09:25:02 crc kubenswrapper[4776]: I1208 09:25:02.052527 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgsfs\" (UniqueName: \"kubernetes.io/projected/8e5b39ee-eda8-48e5-b374-b1330cbb7b08-kube-api-access-jgsfs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8e5b39ee-eda8-48e5-b374-b1330cbb7b08\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 08 09:25:02 crc kubenswrapper[4776]: I1208 09:25:02.052561 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/162bfd33-a716-4c84-8639-f5047819367f-config-data\") pod \"nova-metadata-0\" (UID: \"162bfd33-a716-4c84-8639-f5047819367f\") " pod="openstack/nova-metadata-0"
Dec 08 09:25:02 crc kubenswrapper[4776]: I1208 09:25:02.052598 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/162bfd33-a716-4c84-8639-f5047819367f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"162bfd33-a716-4c84-8639-f5047819367f\") " pod="openstack/nova-metadata-0"
Dec 08 09:25:02 crc kubenswrapper[4776]: I1208 09:25:02.052658 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/162bfd33-a716-4c84-8639-f5047819367f-logs\") pod \"nova-metadata-0\" (UID: \"162bfd33-a716-4c84-8639-f5047819367f\") " pod="openstack/nova-metadata-0"
Dec 08 09:25:02 crc kubenswrapper[4776]: I1208 09:25:02.052677 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e5b39ee-eda8-48e5-b374-b1330cbb7b08-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8e5b39ee-eda8-48e5-b374-b1330cbb7b08\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 08 09:25:02 crc kubenswrapper[4776]: I1208 09:25:02.052702 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e5b39ee-eda8-48e5-b374-b1330cbb7b08-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8e5b39ee-eda8-48e5-b374-b1330cbb7b08\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 08 09:25:02 crc kubenswrapper[4776]: I1208 09:25:02.053516 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/162bfd33-a716-4c84-8639-f5047819367f-logs\") pod \"nova-metadata-0\" (UID: \"162bfd33-a716-4c84-8639-f5047819367f\") " pod="openstack/nova-metadata-0"
Dec 08 09:25:02 crc kubenswrapper[4776]: I1208 09:25:02.064004 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e5b39ee-eda8-48e5-b374-b1330cbb7b08-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8e5b39ee-eda8-48e5-b374-b1330cbb7b08\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 08 09:25:02 crc kubenswrapper[4776]: I1208 09:25:02.064591 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/162bfd33-a716-4c84-8639-f5047819367f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"162bfd33-a716-4c84-8639-f5047819367f\") " pod="openstack/nova-metadata-0"
Dec 08 09:25:02 crc kubenswrapper[4776]: I1208 09:25:02.092067 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/162bfd33-a716-4c84-8639-f5047819367f-config-data\") pod \"nova-metadata-0\" (UID: \"162bfd33-a716-4c84-8639-f5047819367f\") " pod="openstack/nova-metadata-0"
Dec 08 09:25:02 crc kubenswrapper[4776]: I1208 09:25:02.092462 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jzcm\" (UniqueName: \"kubernetes.io/projected/162bfd33-a716-4c84-8639-f5047819367f-kube-api-access-7jzcm\") pod \"nova-metadata-0\" (UID: \"162bfd33-a716-4c84-8639-f5047819367f\") " pod="openstack/nova-metadata-0"
Dec 08 09:25:02 crc kubenswrapper[4776]: I1208 09:25:02.093794 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgsfs\" (UniqueName: \"kubernetes.io/projected/8e5b39ee-eda8-48e5-b374-b1330cbb7b08-kube-api-access-jgsfs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8e5b39ee-eda8-48e5-b374-b1330cbb7b08\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 08 09:25:02 crc kubenswrapper[4776]: I1208 09:25:02.095264 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e5b39ee-eda8-48e5-b374-b1330cbb7b08-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8e5b39ee-eda8-48e5-b374-b1330cbb7b08\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 08 09:25:02 crc kubenswrapper[4776]: I1208 09:25:02.154382 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/474cb911-9e81-43ed-a828-52d9f03eb4df-ovsdbserver-nb\") pod \"dnsmasq-dns-9b86998b5-hzfv9\" (UID: \"474cb911-9e81-43ed-a828-52d9f03eb4df\") " pod="openstack/dnsmasq-dns-9b86998b5-hzfv9"
Dec 08 09:25:02 crc kubenswrapper[4776]: I1208 09:25:02.154438 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/474cb911-9e81-43ed-a828-52d9f03eb4df-dns-svc\") pod \"dnsmasq-dns-9b86998b5-hzfv9\" (UID: \"474cb911-9e81-43ed-a828-52d9f03eb4df\") " pod="openstack/dnsmasq-dns-9b86998b5-hzfv9"
Dec 08 09:25:02 crc kubenswrapper[4776]: I1208 09:25:02.154476 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/474cb911-9e81-43ed-a828-52d9f03eb4df-ovsdbserver-sb\") pod \"dnsmasq-dns-9b86998b5-hzfv9\" (UID: \"474cb911-9e81-43ed-a828-52d9f03eb4df\") " pod="openstack/dnsmasq-dns-9b86998b5-hzfv9"
Dec 08 09:25:02 crc kubenswrapper[4776]: I1208 09:25:02.154628 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fk7px\" (UniqueName: \"kubernetes.io/projected/474cb911-9e81-43ed-a828-52d9f03eb4df-kube-api-access-fk7px\") pod \"dnsmasq-dns-9b86998b5-hzfv9\" (UID: \"474cb911-9e81-43ed-a828-52d9f03eb4df\") " pod="openstack/dnsmasq-dns-9b86998b5-hzfv9"
Dec 08 09:25:02 crc kubenswrapper[4776]: I1208 09:25:02.154653 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/474cb911-9e81-43ed-a828-52d9f03eb4df-dns-swift-storage-0\") pod \"dnsmasq-dns-9b86998b5-hzfv9\" (UID: \"474cb911-9e81-43ed-a828-52d9f03eb4df\") " pod="openstack/dnsmasq-dns-9b86998b5-hzfv9"
Dec 08 09:25:02 crc kubenswrapper[4776]: I1208 09:25:02.154691 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/474cb911-9e81-43ed-a828-52d9f03eb4df-config\") pod \"dnsmasq-dns-9b86998b5-hzfv9\" (UID: \"474cb911-9e81-43ed-a828-52d9f03eb4df\") " pod="openstack/dnsmasq-dns-9b86998b5-hzfv9"
Dec 08 09:25:02 crc kubenswrapper[4776]: I1208 09:25:02.155689 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/474cb911-9e81-43ed-a828-52d9f03eb4df-config\") pod \"dnsmasq-dns-9b86998b5-hzfv9\" (UID: \"474cb911-9e81-43ed-a828-52d9f03eb4df\") " pod="openstack/dnsmasq-dns-9b86998b5-hzfv9"
Dec 08 09:25:02 crc kubenswrapper[4776]: I1208 09:25:02.156479 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/474cb911-9e81-43ed-a828-52d9f03eb4df-ovsdbserver-sb\") pod \"dnsmasq-dns-9b86998b5-hzfv9\" (UID: \"474cb911-9e81-43ed-a828-52d9f03eb4df\") " pod="openstack/dnsmasq-dns-9b86998b5-hzfv9"
Dec 08 09:25:02 crc kubenswrapper[4776]: I1208 09:25:02.156795 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/474cb911-9e81-43ed-a828-52d9f03eb4df-dns-swift-storage-0\") pod \"dnsmasq-dns-9b86998b5-hzfv9\" (UID: \"474cb911-9e81-43ed-a828-52d9f03eb4df\") " pod="openstack/dnsmasq-dns-9b86998b5-hzfv9"
Dec 08 09:25:02 crc kubenswrapper[4776]: I1208 09:25:02.157142 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/474cb911-9e81-43ed-a828-52d9f03eb4df-ovsdbserver-nb\") pod \"dnsmasq-dns-9b86998b5-hzfv9\" (UID: \"474cb911-9e81-43ed-a828-52d9f03eb4df\") " pod="openstack/dnsmasq-dns-9b86998b5-hzfv9"
Dec 08 09:25:02 crc kubenswrapper[4776]: I1208 09:25:02.157324 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/474cb911-9e81-43ed-a828-52d9f03eb4df-dns-svc\") pod \"dnsmasq-dns-9b86998b5-hzfv9\" (UID: \"474cb911-9e81-43ed-a828-52d9f03eb4df\") " pod="openstack/dnsmasq-dns-9b86998b5-hzfv9"
Dec 08 09:25:02 crc kubenswrapper[4776]: I1208 09:25:02.191490 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fk7px\" (UniqueName: \"kubernetes.io/projected/474cb911-9e81-43ed-a828-52d9f03eb4df-kube-api-access-fk7px\") pod \"dnsmasq-dns-9b86998b5-hzfv9\" (UID: \"474cb911-9e81-43ed-a828-52d9f03eb4df\") " pod="openstack/dnsmasq-dns-9b86998b5-hzfv9"
Dec 08 09:25:02 crc kubenswrapper[4776]: I1208 09:25:02.215374 4776 generic.go:334] "Generic (PLEG): container finished" podID="fa674a56-b583-453e-9c66-e8ff93895b50" containerID="35ec05672b585df9de15c49b45532bbf2ada66b7ec860aa22ae4af9d93e7c753" exitCode=0
Dec 08 09:25:02 crc kubenswrapper[4776]: I1208 09:25:02.215528 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-b033-account-create-update-dzl9x" event={"ID":"fa674a56-b583-453e-9c66-e8ff93895b50","Type":"ContainerDied","Data":"35ec05672b585df9de15c49b45532bbf2ada66b7ec860aa22ae4af9d93e7c753"}
Dec 08 09:25:02 crc kubenswrapper[4776]: I1208 09:25:02.224613 4776 generic.go:334] "Generic (PLEG): container finished" podID="effd4b52-0471-4d81-b2bf-9b46ac73db66" containerID="bbcda11ac2edecd998914d0452359b729cd9a725fc82d588eb86ed2742b9ff5d" exitCode=0
Dec 08 09:25:02 crc kubenswrapper[4776]: I1208 09:25:02.224697 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qcrg" event={"ID":"effd4b52-0471-4d81-b2bf-9b46ac73db66","Type":"ContainerDied","Data":"bbcda11ac2edecd998914d0452359b729cd9a725fc82d588eb86ed2742b9ff5d"}
Dec 08 09:25:02 crc kubenswrapper[4776]: I1208 09:25:02.235985 4776 generic.go:334] "Generic (PLEG): container finished" podID="b0bfa985-0dfe-42bb-95ea-2e40830c7a23" containerID="205e021d4497e71dfc63f8de58525faaed65ab904a2219d4ef0a157f7eb5b478" exitCode=0
Dec 08 09:25:02 crc kubenswrapper[4776]: I1208 09:25:02.236027 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-qcnxq" event={"ID":"b0bfa985-0dfe-42bb-95ea-2e40830c7a23","Type":"ContainerDied","Data":"205e021d4497e71dfc63f8de58525faaed65ab904a2219d4ef0a157f7eb5b478"}
Dec 08 09:25:02 crc kubenswrapper[4776]: I1208 09:25:02.292108 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Dec 08 09:25:02 crc kubenswrapper[4776]: I1208 09:25:02.308408 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-hzfv9"
Dec 08 09:25:02 crc kubenswrapper[4776]: I1208 09:25:02.367783 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 08 09:25:02 crc kubenswrapper[4776]: I1208 09:25:02.517841 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-v54bb"]
Dec 08 09:25:03 crc kubenswrapper[4776]: I1208 09:25:03.031490 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 08 09:25:03 crc kubenswrapper[4776]: I1208 09:25:03.341754 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5d325803-a91c-4c5a-8b77-999224ba963d","Type":"ContainerStarted","Data":"4a67a4350698ef1fd94d9fb7c93e5e36743ddf81df96a7c150c5b4cc23139efb"}
Dec 08 09:25:03 crc kubenswrapper[4776]: I1208 09:25:03.342502 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dl2hd"]
Dec 08 09:25:03 crc kubenswrapper[4776]: I1208 09:25:03.345759 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-dl2hd"
Dec 08 09:25:03 crc kubenswrapper[4776]: I1208 09:25:03.361776 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts"
Dec 08 09:25:03 crc kubenswrapper[4776]: I1208 09:25:03.361998 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Dec 08 09:25:03 crc kubenswrapper[4776]: I1208 09:25:03.363794 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5756d118-f614-4000-82d2-ffa1623179cd-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-dl2hd\" (UID: \"5756d118-f614-4000-82d2-ffa1623179cd\") " pod="openstack/nova-cell1-conductor-db-sync-dl2hd"
Dec 08 09:25:03 crc kubenswrapper[4776]: I1208 09:25:03.363936 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqm8k\" (UniqueName: \"kubernetes.io/projected/5756d118-f614-4000-82d2-ffa1623179cd-kube-api-access-zqm8k\") pod \"nova-cell1-conductor-db-sync-dl2hd\" (UID: \"5756d118-f614-4000-82d2-ffa1623179cd\") " pod="openstack/nova-cell1-conductor-db-sync-dl2hd"
Dec 08 09:25:03 crc kubenswrapper[4776]: I1208 09:25:03.364067 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5756d118-f614-4000-82d2-ffa1623179cd-scripts\") pod \"nova-cell1-conductor-db-sync-dl2hd\" (UID: \"5756d118-f614-4000-82d2-ffa1623179cd\") " pod="openstack/nova-cell1-conductor-db-sync-dl2hd"
Dec 08 09:25:03 crc kubenswrapper[4776]: I1208 09:25:03.364096 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5756d118-f614-4000-82d2-ffa1623179cd-config-data\") pod \"nova-cell1-conductor-db-sync-dl2hd\" (UID: \"5756d118-f614-4000-82d2-ffa1623179cd\") " pod="openstack/nova-cell1-conductor-db-sync-dl2hd"
Dec 08 09:25:03 crc kubenswrapper[4776]: I1208 09:25:03.368578 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-hzfv9"]
Dec 08 09:25:03 crc kubenswrapper[4776]: I1208 09:25:03.368622 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-v54bb" event={"ID":"d88c7e2b-caa4-4d68-acc2-1483da2dfef3","Type":"ContainerStarted","Data":"e214b7cfb399e3befb67bad9702061a23721c8411131d72dfbc07162b8f4c215"}
Dec 08 09:25:03 crc kubenswrapper[4776]: I1208 09:25:03.375610 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qcrg" event={"ID":"effd4b52-0471-4d81-b2bf-9b46ac73db66","Type":"ContainerStarted","Data":"6f62cccd22294b109ae29089055740c58bddc6ebe1da4e0532ac21c2b37320a2"}
Dec 08 09:25:03 crc kubenswrapper[4776]: I1208 09:25:03.470814 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5756d118-f614-4000-82d2-ffa1623179cd-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-dl2hd\" (UID: \"5756d118-f614-4000-82d2-ffa1623179cd\") " pod="openstack/nova-cell1-conductor-db-sync-dl2hd"
Dec 08 09:25:03 crc kubenswrapper[4776]: I1208 09:25:03.470968 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqm8k\" (UniqueName: \"kubernetes.io/projected/5756d118-f614-4000-82d2-ffa1623179cd-kube-api-access-zqm8k\") pod \"nova-cell1-conductor-db-sync-dl2hd\" (UID: \"5756d118-f614-4000-82d2-ffa1623179cd\") " pod="openstack/nova-cell1-conductor-db-sync-dl2hd"
Dec 08 09:25:03 crc kubenswrapper[4776]: I1208 09:25:03.471017 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5756d118-f614-4000-82d2-ffa1623179cd-scripts\") pod \"nova-cell1-conductor-db-sync-dl2hd\" (UID: \"5756d118-f614-4000-82d2-ffa1623179cd\") " pod="openstack/nova-cell1-conductor-db-sync-dl2hd"
Dec 08 09:25:03 crc kubenswrapper[4776]: I1208 09:25:03.471052 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5756d118-f614-4000-82d2-ffa1623179cd-config-data\") pod \"nova-cell1-conductor-db-sync-dl2hd\" (UID: \"5756d118-f614-4000-82d2-ffa1623179cd\") " pod="openstack/nova-cell1-conductor-db-sync-dl2hd"
Dec 08 09:25:03 crc kubenswrapper[4776]: I1208 09:25:03.479996 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5756d118-f614-4000-82d2-ffa1623179cd-scripts\") pod \"nova-cell1-conductor-db-sync-dl2hd\" (UID: \"5756d118-f614-4000-82d2-ffa1623179cd\") " pod="openstack/nova-cell1-conductor-db-sync-dl2hd"
Dec 08 09:25:03 crc kubenswrapper[4776]: I1208 09:25:03.481803 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName:
\"kubernetes.io/secret/5756d118-f614-4000-82d2-ffa1623179cd-config-data\") pod \"nova-cell1-conductor-db-sync-dl2hd\" (UID: \"5756d118-f614-4000-82d2-ffa1623179cd\") " pod="openstack/nova-cell1-conductor-db-sync-dl2hd" Dec 08 09:25:03 crc kubenswrapper[4776]: I1208 09:25:03.482546 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5756d118-f614-4000-82d2-ffa1623179cd-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-dl2hd\" (UID: \"5756d118-f614-4000-82d2-ffa1623179cd\") " pod="openstack/nova-cell1-conductor-db-sync-dl2hd" Dec 08 09:25:03 crc kubenswrapper[4776]: I1208 09:25:03.502567 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqm8k\" (UniqueName: \"kubernetes.io/projected/5756d118-f614-4000-82d2-ffa1623179cd-kube-api-access-zqm8k\") pod \"nova-cell1-conductor-db-sync-dl2hd\" (UID: \"5756d118-f614-4000-82d2-ffa1623179cd\") " pod="openstack/nova-cell1-conductor-db-sync-dl2hd" Dec 08 09:25:03 crc kubenswrapper[4776]: I1208 09:25:03.532368 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 08 09:25:03 crc kubenswrapper[4776]: W1208 09:25:03.541746 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75d3ac7a_8205_4462_8c9d_83029a4deeaf.slice/crio-64fd4ac1e5d1723769f232dd3ae513f476467080638c8c73c84af9c6ad013a9a WatchSource:0}: Error finding container 64fd4ac1e5d1723769f232dd3ae513f476467080638c8c73c84af9c6ad013a9a: Status 404 returned error can't find the container with id 64fd4ac1e5d1723769f232dd3ae513f476467080638c8c73c84af9c6ad013a9a Dec 08 09:25:03 crc kubenswrapper[4776]: I1208 09:25:03.578576 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-dl2hd" Dec 08 09:25:03 crc kubenswrapper[4776]: I1208 09:25:03.589461 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dl2hd"] Dec 08 09:25:03 crc kubenswrapper[4776]: I1208 09:25:03.996090 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 08 09:25:04 crc kubenswrapper[4776]: I1208 09:25:04.008776 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 08 09:25:04 crc kubenswrapper[4776]: I1208 09:25:04.394101 4776 generic.go:334] "Generic (PLEG): container finished" podID="effd4b52-0471-4d81-b2bf-9b46ac73db66" containerID="6f62cccd22294b109ae29089055740c58bddc6ebe1da4e0532ac21c2b37320a2" exitCode=0 Dec 08 09:25:04 crc kubenswrapper[4776]: I1208 09:25:04.402606 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-qcnxq" Dec 08 09:25:04 crc kubenswrapper[4776]: I1208 09:25:04.403476 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-b033-account-create-update-dzl9x" event={"ID":"fa674a56-b583-453e-9c66-e8ff93895b50","Type":"ContainerDied","Data":"abe2e990357a6f1a7faaebc913d076d26d2c4146942a636e72398bdf0ca855e2"} Dec 08 09:25:04 crc kubenswrapper[4776]: I1208 09:25:04.403583 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abe2e990357a6f1a7faaebc913d076d26d2c4146942a636e72398bdf0ca855e2" Dec 08 09:25:04 crc kubenswrapper[4776]: I1208 09:25:04.403597 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"75d3ac7a-8205-4462-8c9d-83029a4deeaf","Type":"ContainerStarted","Data":"64fd4ac1e5d1723769f232dd3ae513f476467080638c8c73c84af9c6ad013a9a"} Dec 08 09:25:04 crc kubenswrapper[4776]: I1208 09:25:04.403609 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-9qcrg" event={"ID":"effd4b52-0471-4d81-b2bf-9b46ac73db66","Type":"ContainerDied","Data":"6f62cccd22294b109ae29089055740c58bddc6ebe1da4e0532ac21c2b37320a2"} Dec 08 09:25:04 crc kubenswrapper[4776]: I1208 09:25:04.406522 4776 generic.go:334] "Generic (PLEG): container finished" podID="474cb911-9e81-43ed-a828-52d9f03eb4df" containerID="09604120125d21ca38bf66a4b036ff07206fa8247d1f249d62a02d93c94ef6a7" exitCode=0 Dec 08 09:25:04 crc kubenswrapper[4776]: I1208 09:25:04.406579 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-hzfv9" event={"ID":"474cb911-9e81-43ed-a828-52d9f03eb4df","Type":"ContainerDied","Data":"09604120125d21ca38bf66a4b036ff07206fa8247d1f249d62a02d93c94ef6a7"} Dec 08 09:25:04 crc kubenswrapper[4776]: I1208 09:25:04.406597 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-hzfv9" event={"ID":"474cb911-9e81-43ed-a828-52d9f03eb4df","Type":"ContainerStarted","Data":"7fe3f2599b53a869df7e6c2dcb468e6f3ae530cb00400251edb047dc618c43c4"} Dec 08 09:25:04 crc kubenswrapper[4776]: I1208 09:25:04.409761 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-qcnxq" event={"ID":"b0bfa985-0dfe-42bb-95ea-2e40830c7a23","Type":"ContainerDied","Data":"4c7baf615ea5f24572be2f1e6cbb0a2bfe026cca747de754fd2c9e0e37e65b70"} Dec 08 09:25:04 crc kubenswrapper[4776]: I1208 09:25:04.409814 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c7baf615ea5f24572be2f1e6cbb0a2bfe026cca747de754fd2c9e0e37e65b70" Dec 08 09:25:04 crc kubenswrapper[4776]: I1208 09:25:04.409855 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-qcnxq" Dec 08 09:25:04 crc kubenswrapper[4776]: I1208 09:25:04.412542 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8e5b39ee-eda8-48e5-b374-b1330cbb7b08","Type":"ContainerStarted","Data":"7b64da380396f6648898561ca56790d4c958ecb39013a4d8ea56be5d835c9eee"} Dec 08 09:25:04 crc kubenswrapper[4776]: I1208 09:25:04.413855 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-b033-account-create-update-dzl9x" Dec 08 09:25:04 crc kubenswrapper[4776]: I1208 09:25:04.419266 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-v54bb" event={"ID":"d88c7e2b-caa4-4d68-acc2-1483da2dfef3","Type":"ContainerStarted","Data":"bf7d40f85562b59617eff8f372ce1b5dc118aa06a87cd416ec83aa2c3a74f745"} Dec 08 09:25:04 crc kubenswrapper[4776]: I1208 09:25:04.421908 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"162bfd33-a716-4c84-8639-f5047819367f","Type":"ContainerStarted","Data":"7fe31b7c7d737dc0a7adef72cbc22694e453c7a89a1396a5e15faed83641ec71"} Dec 08 09:25:04 crc kubenswrapper[4776]: I1208 09:25:04.521307 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa674a56-b583-453e-9c66-e8ff93895b50-operator-scripts\") pod \"fa674a56-b583-453e-9c66-e8ff93895b50\" (UID: \"fa674a56-b583-453e-9c66-e8ff93895b50\") " Dec 08 09:25:04 crc kubenswrapper[4776]: I1208 09:25:04.521494 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cmrn\" (UniqueName: \"kubernetes.io/projected/fa674a56-b583-453e-9c66-e8ff93895b50-kube-api-access-9cmrn\") pod \"fa674a56-b583-453e-9c66-e8ff93895b50\" (UID: \"fa674a56-b583-453e-9c66-e8ff93895b50\") " Dec 08 09:25:04 crc kubenswrapper[4776]: I1208 09:25:04.521614 4776 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsc6k\" (UniqueName: \"kubernetes.io/projected/b0bfa985-0dfe-42bb-95ea-2e40830c7a23-kube-api-access-bsc6k\") pod \"b0bfa985-0dfe-42bb-95ea-2e40830c7a23\" (UID: \"b0bfa985-0dfe-42bb-95ea-2e40830c7a23\") " Dec 08 09:25:04 crc kubenswrapper[4776]: I1208 09:25:04.521754 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0bfa985-0dfe-42bb-95ea-2e40830c7a23-operator-scripts\") pod \"b0bfa985-0dfe-42bb-95ea-2e40830c7a23\" (UID: \"b0bfa985-0dfe-42bb-95ea-2e40830c7a23\") " Dec 08 09:25:04 crc kubenswrapper[4776]: I1208 09:25:04.523782 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa674a56-b583-453e-9c66-e8ff93895b50-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fa674a56-b583-453e-9c66-e8ff93895b50" (UID: "fa674a56-b583-453e-9c66-e8ff93895b50"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:25:04 crc kubenswrapper[4776]: I1208 09:25:04.529411 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0bfa985-0dfe-42bb-95ea-2e40830c7a23-kube-api-access-bsc6k" (OuterVolumeSpecName: "kube-api-access-bsc6k") pod "b0bfa985-0dfe-42bb-95ea-2e40830c7a23" (UID: "b0bfa985-0dfe-42bb-95ea-2e40830c7a23"). InnerVolumeSpecName "kube-api-access-bsc6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:25:04 crc kubenswrapper[4776]: I1208 09:25:04.531783 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0bfa985-0dfe-42bb-95ea-2e40830c7a23-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b0bfa985-0dfe-42bb-95ea-2e40830c7a23" (UID: "b0bfa985-0dfe-42bb-95ea-2e40830c7a23"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:25:04 crc kubenswrapper[4776]: I1208 09:25:04.539378 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa674a56-b583-453e-9c66-e8ff93895b50-kube-api-access-9cmrn" (OuterVolumeSpecName: "kube-api-access-9cmrn") pod "fa674a56-b583-453e-9c66-e8ff93895b50" (UID: "fa674a56-b583-453e-9c66-e8ff93895b50"). InnerVolumeSpecName "kube-api-access-9cmrn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:25:04 crc kubenswrapper[4776]: I1208 09:25:04.590462 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dl2hd"] Dec 08 09:25:04 crc kubenswrapper[4776]: I1208 09:25:04.624563 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0bfa985-0dfe-42bb-95ea-2e40830c7a23-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:04 crc kubenswrapper[4776]: I1208 09:25:04.624592 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa674a56-b583-453e-9c66-e8ff93895b50-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:04 crc kubenswrapper[4776]: I1208 09:25:04.624601 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cmrn\" (UniqueName: \"kubernetes.io/projected/fa674a56-b583-453e-9c66-e8ff93895b50-kube-api-access-9cmrn\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:04 crc kubenswrapper[4776]: I1208 09:25:04.624611 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsc6k\" (UniqueName: \"kubernetes.io/projected/b0bfa985-0dfe-42bb-95ea-2e40830c7a23-kube-api-access-bsc6k\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:04 crc kubenswrapper[4776]: I1208 09:25:04.638953 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-v54bb" 
podStartSLOduration=3.638934059 podStartE2EDuration="3.638934059s" podCreationTimestamp="2025-12-08 09:25:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:25:04.581626882 +0000 UTC m=+1580.844851904" watchObservedRunningTime="2025-12-08 09:25:04.638934059 +0000 UTC m=+1580.902159071" Dec 08 09:25:05 crc kubenswrapper[4776]: I1208 09:25:05.488525 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qcrg" event={"ID":"effd4b52-0471-4d81-b2bf-9b46ac73db66","Type":"ContainerStarted","Data":"2c7d535a175883577fb099ac9e330b9de9d6f7ff9934e8c8e4ea9fb4f3f2ed74"} Dec 08 09:25:05 crc kubenswrapper[4776]: I1208 09:25:05.500884 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-hzfv9" event={"ID":"474cb911-9e81-43ed-a828-52d9f03eb4df","Type":"ContainerStarted","Data":"25ec6d64e4d550aad188f5bf9c1c6906cd4de1837719de1cdc3e17d679d5b853"} Dec 08 09:25:05 crc kubenswrapper[4776]: I1208 09:25:05.501150 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-9b86998b5-hzfv9" Dec 08 09:25:05 crc kubenswrapper[4776]: I1208 09:25:05.536118 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-dl2hd" event={"ID":"5756d118-f614-4000-82d2-ffa1623179cd","Type":"ContainerStarted","Data":"254e6b53d20f5019a28759d11c91d73a6eaf8a3751091df3d4e8bb0cb0912d0c"} Dec 08 09:25:05 crc kubenswrapper[4776]: I1208 09:25:05.536161 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-dl2hd" event={"ID":"5756d118-f614-4000-82d2-ffa1623179cd","Type":"ContainerStarted","Data":"42bd218007224e606a6431ae28b9825d4585d06eb4850f65f079508190da2edc"} Dec 08 09:25:05 crc kubenswrapper[4776]: I1208 09:25:05.536272 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-b033-account-create-update-dzl9x" Dec 08 09:25:05 crc kubenswrapper[4776]: I1208 09:25:05.539554 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9qcrg" podStartSLOduration=2.874165477 podStartE2EDuration="6.539536083s" podCreationTimestamp="2025-12-08 09:24:59 +0000 UTC" firstStartedPulling="2025-12-08 09:25:01.188606803 +0000 UTC m=+1577.451831825" lastFinishedPulling="2025-12-08 09:25:04.853977409 +0000 UTC m=+1581.117202431" observedRunningTime="2025-12-08 09:25:05.535970908 +0000 UTC m=+1581.799195930" watchObservedRunningTime="2025-12-08 09:25:05.539536083 +0000 UTC m=+1581.802761105" Dec 08 09:25:05 crc kubenswrapper[4776]: I1208 09:25:05.602836 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-dl2hd" podStartSLOduration=2.602814741 podStartE2EDuration="2.602814741s" podCreationTimestamp="2025-12-08 09:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:25:05.575144158 +0000 UTC m=+1581.838369180" watchObservedRunningTime="2025-12-08 09:25:05.602814741 +0000 UTC m=+1581.866039753" Dec 08 09:25:05 crc kubenswrapper[4776]: I1208 09:25:05.689516 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-9b86998b5-hzfv9" podStartSLOduration=4.689495107 podStartE2EDuration="4.689495107s" podCreationTimestamp="2025-12-08 09:25:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:25:05.610634691 +0000 UTC m=+1581.873859713" watchObservedRunningTime="2025-12-08 09:25:05.689495107 +0000 UTC m=+1581.952720129" Dec 08 09:25:06 crc kubenswrapper[4776]: I1208 09:25:06.398495 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell1-novncproxy-0"] Dec 08 09:25:06 crc kubenswrapper[4776]: I1208 09:25:06.428842 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 08 09:25:07 crc kubenswrapper[4776]: I1208 09:25:07.581906 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cfea15ec-7a58-4283-831a-fdfb2c03918c","Type":"ContainerDied","Data":"bfdbcca55a723020a0a48fb1fe9ea3cb82c4c7fba82729b0932cf82c1e82cdab"} Dec 08 09:25:07 crc kubenswrapper[4776]: I1208 09:25:07.581266 4776 generic.go:334] "Generic (PLEG): container finished" podID="cfea15ec-7a58-4283-831a-fdfb2c03918c" containerID="bfdbcca55a723020a0a48fb1fe9ea3cb82c4c7fba82729b0932cf82c1e82cdab" exitCode=137 Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.234823 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.364101 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbl79\" (UniqueName: \"kubernetes.io/projected/cfea15ec-7a58-4283-831a-fdfb2c03918c-kube-api-access-rbl79\") pod \"cfea15ec-7a58-4283-831a-fdfb2c03918c\" (UID: \"cfea15ec-7a58-4283-831a-fdfb2c03918c\") " Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.364714 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cfea15ec-7a58-4283-831a-fdfb2c03918c-sg-core-conf-yaml\") pod \"cfea15ec-7a58-4283-831a-fdfb2c03918c\" (UID: \"cfea15ec-7a58-4283-831a-fdfb2c03918c\") " Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.364858 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfea15ec-7a58-4283-831a-fdfb2c03918c-run-httpd\") pod \"cfea15ec-7a58-4283-831a-fdfb2c03918c\" (UID: \"cfea15ec-7a58-4283-831a-fdfb2c03918c\") " Dec 08 
09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.365004 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfea15ec-7a58-4283-831a-fdfb2c03918c-scripts\") pod \"cfea15ec-7a58-4283-831a-fdfb2c03918c\" (UID: \"cfea15ec-7a58-4283-831a-fdfb2c03918c\") " Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.365163 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfea15ec-7a58-4283-831a-fdfb2c03918c-config-data\") pod \"cfea15ec-7a58-4283-831a-fdfb2c03918c\" (UID: \"cfea15ec-7a58-4283-831a-fdfb2c03918c\") " Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.365446 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfea15ec-7a58-4283-831a-fdfb2c03918c-combined-ca-bundle\") pod \"cfea15ec-7a58-4283-831a-fdfb2c03918c\" (UID: \"cfea15ec-7a58-4283-831a-fdfb2c03918c\") " Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.365566 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfea15ec-7a58-4283-831a-fdfb2c03918c-log-httpd\") pod \"cfea15ec-7a58-4283-831a-fdfb2c03918c\" (UID: \"cfea15ec-7a58-4283-831a-fdfb2c03918c\") " Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.366291 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfea15ec-7a58-4283-831a-fdfb2c03918c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "cfea15ec-7a58-4283-831a-fdfb2c03918c" (UID: "cfea15ec-7a58-4283-831a-fdfb2c03918c"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.366562 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfea15ec-7a58-4283-831a-fdfb2c03918c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "cfea15ec-7a58-4283-831a-fdfb2c03918c" (UID: "cfea15ec-7a58-4283-831a-fdfb2c03918c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.377902 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfea15ec-7a58-4283-831a-fdfb2c03918c-scripts" (OuterVolumeSpecName: "scripts") pod "cfea15ec-7a58-4283-831a-fdfb2c03918c" (UID: "cfea15ec-7a58-4283-831a-fdfb2c03918c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.382551 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfea15ec-7a58-4283-831a-fdfb2c03918c-kube-api-access-rbl79" (OuterVolumeSpecName: "kube-api-access-rbl79") pod "cfea15ec-7a58-4283-831a-fdfb2c03918c" (UID: "cfea15ec-7a58-4283-831a-fdfb2c03918c"). InnerVolumeSpecName "kube-api-access-rbl79". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.467689 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfea15ec-7a58-4283-831a-fdfb2c03918c-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.468054 4776 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfea15ec-7a58-4283-831a-fdfb2c03918c-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.468065 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbl79\" (UniqueName: \"kubernetes.io/projected/cfea15ec-7a58-4283-831a-fdfb2c03918c-kube-api-access-rbl79\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.468076 4776 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfea15ec-7a58-4283-831a-fdfb2c03918c-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.517424 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfea15ec-7a58-4283-831a-fdfb2c03918c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "cfea15ec-7a58-4283-831a-fdfb2c03918c" (UID: "cfea15ec-7a58-4283-831a-fdfb2c03918c"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.571690 4776 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cfea15ec-7a58-4283-831a-fdfb2c03918c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.606503 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfea15ec-7a58-4283-831a-fdfb2c03918c-config-data" (OuterVolumeSpecName: "config-data") pod "cfea15ec-7a58-4283-831a-fdfb2c03918c" (UID: "cfea15ec-7a58-4283-831a-fdfb2c03918c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.625341 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfea15ec-7a58-4283-831a-fdfb2c03918c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cfea15ec-7a58-4283-831a-fdfb2c03918c" (UID: "cfea15ec-7a58-4283-831a-fdfb2c03918c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.646670 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"75d3ac7a-8205-4462-8c9d-83029a4deeaf","Type":"ContainerStarted","Data":"e07d10db2d409eaf812277bc8d45b23e5dbf58cd173497c7d15fd5c3257f5361"} Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.653426 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5d325803-a91c-4c5a-8b77-999224ba963d","Type":"ContainerStarted","Data":"4463d2c51be80f98e9cb0b963df9480af26dc0ac2b01e82a99c3cddf0f3f2f53"} Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.663799 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cfea15ec-7a58-4283-831a-fdfb2c03918c","Type":"ContainerDied","Data":"c77951c107e18990ff33d002be46dcf0cd726f856d2efc98783f338e047f3ccc"} Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.663931 4776 scope.go:117] "RemoveContainer" containerID="bfdbcca55a723020a0a48fb1fe9ea3cb82c4c7fba82729b0932cf82c1e82cdab" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.664136 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.671983 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8e5b39ee-eda8-48e5-b374-b1330cbb7b08","Type":"ContainerStarted","Data":"62b751b12ee0868b33a91f533e59e54e9cb21df1538d1c662c61aeee0f49e77b"} Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.672120 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="8e5b39ee-eda8-48e5-b374-b1330cbb7b08" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://62b751b12ee0868b33a91f533e59e54e9cb21df1538d1c662c61aeee0f49e77b" gracePeriod=30 Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.673848 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfea15ec-7a58-4283-831a-fdfb2c03918c-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.673867 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfea15ec-7a58-4283-831a-fdfb2c03918c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.681765 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"162bfd33-a716-4c84-8639-f5047819367f","Type":"ContainerStarted","Data":"ba23870b505b0ef94be94d47cdf7c48d114cb504f16ecc4001bd71cb0cd5ecb0"} Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.693573 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.397328366 podStartE2EDuration="8.693554371s" podCreationTimestamp="2025-12-08 09:25:01 +0000 UTC" firstStartedPulling="2025-12-08 09:25:03.554468381 +0000 UTC m=+1579.817693403" lastFinishedPulling="2025-12-08 09:25:08.850694386 +0000 
UTC m=+1585.113919408" observedRunningTime="2025-12-08 09:25:09.687475737 +0000 UTC m=+1585.950700749" watchObservedRunningTime="2025-12-08 09:25:09.693554371 +0000 UTC m=+1585.956779383" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.723294 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.888851764 podStartE2EDuration="8.723273688s" podCreationTimestamp="2025-12-08 09:25:01 +0000 UTC" firstStartedPulling="2025-12-08 09:25:04.016108438 +0000 UTC m=+1580.279333460" lastFinishedPulling="2025-12-08 09:25:08.850530362 +0000 UTC m=+1585.113755384" observedRunningTime="2025-12-08 09:25:09.710557027 +0000 UTC m=+1585.973782049" watchObservedRunningTime="2025-12-08 09:25:09.723273688 +0000 UTC m=+1585.986498700" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.749296 4776 scope.go:117] "RemoveContainer" containerID="a9fa0769ba2de6cc958985479f709ad8854021d01491c46212649412b74c487a" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.775549 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.789784 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.801879 4776 scope.go:117] "RemoveContainer" containerID="d148e2dc18f91d85414476dee2f5a9fc857b69cbe2d63e25d9a31d47ba2264da" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.811241 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 08 09:25:09 crc kubenswrapper[4776]: E1208 09:25:09.811816 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa674a56-b583-453e-9c66-e8ff93895b50" containerName="mariadb-account-create-update" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.811830 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa674a56-b583-453e-9c66-e8ff93895b50" 
containerName="mariadb-account-create-update" Dec 08 09:25:09 crc kubenswrapper[4776]: E1208 09:25:09.811847 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfea15ec-7a58-4283-831a-fdfb2c03918c" containerName="proxy-httpd" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.811855 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfea15ec-7a58-4283-831a-fdfb2c03918c" containerName="proxy-httpd" Dec 08 09:25:09 crc kubenswrapper[4776]: E1208 09:25:09.811891 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0bfa985-0dfe-42bb-95ea-2e40830c7a23" containerName="mariadb-database-create" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.811899 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0bfa985-0dfe-42bb-95ea-2e40830c7a23" containerName="mariadb-database-create" Dec 08 09:25:09 crc kubenswrapper[4776]: E1208 09:25:09.811923 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfea15ec-7a58-4283-831a-fdfb2c03918c" containerName="ceilometer-central-agent" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.811930 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfea15ec-7a58-4283-831a-fdfb2c03918c" containerName="ceilometer-central-agent" Dec 08 09:25:09 crc kubenswrapper[4776]: E1208 09:25:09.811946 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfea15ec-7a58-4283-831a-fdfb2c03918c" containerName="ceilometer-notification-agent" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.811954 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfea15ec-7a58-4283-831a-fdfb2c03918c" containerName="ceilometer-notification-agent" Dec 08 09:25:09 crc kubenswrapper[4776]: E1208 09:25:09.811976 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfea15ec-7a58-4283-831a-fdfb2c03918c" containerName="sg-core" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.811984 4776 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="cfea15ec-7a58-4283-831a-fdfb2c03918c" containerName="sg-core" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.814489 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfea15ec-7a58-4283-831a-fdfb2c03918c" containerName="ceilometer-notification-agent" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.814511 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa674a56-b583-453e-9c66-e8ff93895b50" containerName="mariadb-account-create-update" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.814520 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0bfa985-0dfe-42bb-95ea-2e40830c7a23" containerName="mariadb-database-create" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.814534 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfea15ec-7a58-4283-831a-fdfb2c03918c" containerName="ceilometer-central-agent" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.814548 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfea15ec-7a58-4283-831a-fdfb2c03918c" containerName="sg-core" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.814566 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfea15ec-7a58-4283-831a-fdfb2c03918c" containerName="proxy-httpd" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.816616 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.834691 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.835090 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.838424 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.881089 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eef52d5d-1c6a-4586-aa17-8c8253a53262-log-httpd\") pod \"ceilometer-0\" (UID: \"eef52d5d-1c6a-4586-aa17-8c8253a53262\") " pod="openstack/ceilometer-0" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.881150 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eef52d5d-1c6a-4586-aa17-8c8253a53262-config-data\") pod \"ceilometer-0\" (UID: \"eef52d5d-1c6a-4586-aa17-8c8253a53262\") " pod="openstack/ceilometer-0" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.881259 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eef52d5d-1c6a-4586-aa17-8c8253a53262-run-httpd\") pod \"ceilometer-0\" (UID: \"eef52d5d-1c6a-4586-aa17-8c8253a53262\") " pod="openstack/ceilometer-0" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.881333 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef52d5d-1c6a-4586-aa17-8c8253a53262-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eef52d5d-1c6a-4586-aa17-8c8253a53262\") " 
pod="openstack/ceilometer-0" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.881378 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2k77\" (UniqueName: \"kubernetes.io/projected/eef52d5d-1c6a-4586-aa17-8c8253a53262-kube-api-access-h2k77\") pod \"ceilometer-0\" (UID: \"eef52d5d-1c6a-4586-aa17-8c8253a53262\") " pod="openstack/ceilometer-0" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.881402 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eef52d5d-1c6a-4586-aa17-8c8253a53262-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eef52d5d-1c6a-4586-aa17-8c8253a53262\") " pod="openstack/ceilometer-0" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.881518 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eef52d5d-1c6a-4586-aa17-8c8253a53262-scripts\") pod \"ceilometer-0\" (UID: \"eef52d5d-1c6a-4586-aa17-8c8253a53262\") " pod="openstack/ceilometer-0" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.896885 4776 scope.go:117] "RemoveContainer" containerID="b5fc4f7297ced84673c6a0f935b668a11655f90f300d50c9876bdf1e8b9217b3" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.983960 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef52d5d-1c6a-4586-aa17-8c8253a53262-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eef52d5d-1c6a-4586-aa17-8c8253a53262\") " pod="openstack/ceilometer-0" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.984323 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2k77\" (UniqueName: \"kubernetes.io/projected/eef52d5d-1c6a-4586-aa17-8c8253a53262-kube-api-access-h2k77\") pod \"ceilometer-0\" (UID: 
\"eef52d5d-1c6a-4586-aa17-8c8253a53262\") " pod="openstack/ceilometer-0" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.984453 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eef52d5d-1c6a-4586-aa17-8c8253a53262-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eef52d5d-1c6a-4586-aa17-8c8253a53262\") " pod="openstack/ceilometer-0" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.984484 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eef52d5d-1c6a-4586-aa17-8c8253a53262-scripts\") pod \"ceilometer-0\" (UID: \"eef52d5d-1c6a-4586-aa17-8c8253a53262\") " pod="openstack/ceilometer-0" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.984539 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eef52d5d-1c6a-4586-aa17-8c8253a53262-log-httpd\") pod \"ceilometer-0\" (UID: \"eef52d5d-1c6a-4586-aa17-8c8253a53262\") " pod="openstack/ceilometer-0" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.984571 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eef52d5d-1c6a-4586-aa17-8c8253a53262-config-data\") pod \"ceilometer-0\" (UID: \"eef52d5d-1c6a-4586-aa17-8c8253a53262\") " pod="openstack/ceilometer-0" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.984625 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eef52d5d-1c6a-4586-aa17-8c8253a53262-run-httpd\") pod \"ceilometer-0\" (UID: \"eef52d5d-1c6a-4586-aa17-8c8253a53262\") " pod="openstack/ceilometer-0" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.985115 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/eef52d5d-1c6a-4586-aa17-8c8253a53262-run-httpd\") pod \"ceilometer-0\" (UID: \"eef52d5d-1c6a-4586-aa17-8c8253a53262\") " pod="openstack/ceilometer-0" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.985471 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eef52d5d-1c6a-4586-aa17-8c8253a53262-log-httpd\") pod \"ceilometer-0\" (UID: \"eef52d5d-1c6a-4586-aa17-8c8253a53262\") " pod="openstack/ceilometer-0" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.989482 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef52d5d-1c6a-4586-aa17-8c8253a53262-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eef52d5d-1c6a-4586-aa17-8c8253a53262\") " pod="openstack/ceilometer-0" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.998438 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eef52d5d-1c6a-4586-aa17-8c8253a53262-config-data\") pod \"ceilometer-0\" (UID: \"eef52d5d-1c6a-4586-aa17-8c8253a53262\") " pod="openstack/ceilometer-0" Dec 08 09:25:09 crc kubenswrapper[4776]: I1208 09:25:09.998564 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eef52d5d-1c6a-4586-aa17-8c8253a53262-scripts\") pod \"ceilometer-0\" (UID: \"eef52d5d-1c6a-4586-aa17-8c8253a53262\") " pod="openstack/ceilometer-0" Dec 08 09:25:10 crc kubenswrapper[4776]: I1208 09:25:10.000453 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eef52d5d-1c6a-4586-aa17-8c8253a53262-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eef52d5d-1c6a-4586-aa17-8c8253a53262\") " pod="openstack/ceilometer-0" Dec 08 09:25:10 crc kubenswrapper[4776]: I1208 09:25:10.016657 4776 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-h2k77\" (UniqueName: \"kubernetes.io/projected/eef52d5d-1c6a-4586-aa17-8c8253a53262-kube-api-access-h2k77\") pod \"ceilometer-0\" (UID: \"eef52d5d-1c6a-4586-aa17-8c8253a53262\") " pod="openstack/ceilometer-0" Dec 08 09:25:10 crc kubenswrapper[4776]: I1208 09:25:10.020353 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9qcrg" Dec 08 09:25:10 crc kubenswrapper[4776]: I1208 09:25:10.020386 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9qcrg" Dec 08 09:25:10 crc kubenswrapper[4776]: I1208 09:25:10.093225 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9qcrg" Dec 08 09:25:10 crc kubenswrapper[4776]: I1208 09:25:10.164612 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 08 09:25:10 crc kubenswrapper[4776]: I1208 09:25:10.265655 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-47z2b"] Dec 08 09:25:10 crc kubenswrapper[4776]: I1208 09:25:10.267275 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-47z2b" Dec 08 09:25:10 crc kubenswrapper[4776]: I1208 09:25:10.276279 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 08 09:25:10 crc kubenswrapper[4776]: I1208 09:25:10.276475 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 08 09:25:10 crc kubenswrapper[4776]: I1208 09:25:10.276574 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-rtn2h" Dec 08 09:25:10 crc kubenswrapper[4776]: I1208 09:25:10.276753 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 08 09:25:10 crc kubenswrapper[4776]: I1208 09:25:10.283501 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-47z2b"] Dec 08 09:25:10 crc kubenswrapper[4776]: I1208 09:25:10.379943 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfea15ec-7a58-4283-831a-fdfb2c03918c" path="/var/lib/kubelet/pods/cfea15ec-7a58-4283-831a-fdfb2c03918c/volumes" Dec 08 09:25:10 crc kubenswrapper[4776]: I1208 09:25:10.395264 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f3c7425-49ed-4491-9422-4d50616e53c4-scripts\") pod \"aodh-db-sync-47z2b\" (UID: \"4f3c7425-49ed-4491-9422-4d50616e53c4\") " pod="openstack/aodh-db-sync-47z2b" Dec 08 09:25:10 crc kubenswrapper[4776]: I1208 09:25:10.395389 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f3c7425-49ed-4491-9422-4d50616e53c4-combined-ca-bundle\") pod \"aodh-db-sync-47z2b\" (UID: \"4f3c7425-49ed-4491-9422-4d50616e53c4\") " pod="openstack/aodh-db-sync-47z2b" Dec 08 09:25:10 crc kubenswrapper[4776]: I1208 09:25:10.395430 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsjqx\" (UniqueName: \"kubernetes.io/projected/4f3c7425-49ed-4491-9422-4d50616e53c4-kube-api-access-rsjqx\") pod \"aodh-db-sync-47z2b\" (UID: \"4f3c7425-49ed-4491-9422-4d50616e53c4\") " pod="openstack/aodh-db-sync-47z2b" Dec 08 09:25:10 crc kubenswrapper[4776]: I1208 09:25:10.395550 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f3c7425-49ed-4491-9422-4d50616e53c4-config-data\") pod \"aodh-db-sync-47z2b\" (UID: \"4f3c7425-49ed-4491-9422-4d50616e53c4\") " pod="openstack/aodh-db-sync-47z2b" Dec 08 09:25:10 crc kubenswrapper[4776]: I1208 09:25:10.522963 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsjqx\" (UniqueName: \"kubernetes.io/projected/4f3c7425-49ed-4491-9422-4d50616e53c4-kube-api-access-rsjqx\") pod \"aodh-db-sync-47z2b\" (UID: \"4f3c7425-49ed-4491-9422-4d50616e53c4\") " pod="openstack/aodh-db-sync-47z2b" Dec 08 09:25:10 crc kubenswrapper[4776]: I1208 09:25:10.523465 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f3c7425-49ed-4491-9422-4d50616e53c4-config-data\") pod \"aodh-db-sync-47z2b\" (UID: \"4f3c7425-49ed-4491-9422-4d50616e53c4\") " pod="openstack/aodh-db-sync-47z2b" Dec 08 09:25:10 crc kubenswrapper[4776]: I1208 09:25:10.523722 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f3c7425-49ed-4491-9422-4d50616e53c4-scripts\") pod \"aodh-db-sync-47z2b\" (UID: \"4f3c7425-49ed-4491-9422-4d50616e53c4\") " pod="openstack/aodh-db-sync-47z2b" Dec 08 09:25:10 crc kubenswrapper[4776]: I1208 09:25:10.523803 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4f3c7425-49ed-4491-9422-4d50616e53c4-combined-ca-bundle\") pod \"aodh-db-sync-47z2b\" (UID: \"4f3c7425-49ed-4491-9422-4d50616e53c4\") " pod="openstack/aodh-db-sync-47z2b" Dec 08 09:25:10 crc kubenswrapper[4776]: I1208 09:25:10.536639 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f3c7425-49ed-4491-9422-4d50616e53c4-config-data\") pod \"aodh-db-sync-47z2b\" (UID: \"4f3c7425-49ed-4491-9422-4d50616e53c4\") " pod="openstack/aodh-db-sync-47z2b" Dec 08 09:25:10 crc kubenswrapper[4776]: I1208 09:25:10.543435 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f3c7425-49ed-4491-9422-4d50616e53c4-combined-ca-bundle\") pod \"aodh-db-sync-47z2b\" (UID: \"4f3c7425-49ed-4491-9422-4d50616e53c4\") " pod="openstack/aodh-db-sync-47z2b" Dec 08 09:25:10 crc kubenswrapper[4776]: I1208 09:25:10.558510 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f3c7425-49ed-4491-9422-4d50616e53c4-scripts\") pod \"aodh-db-sync-47z2b\" (UID: \"4f3c7425-49ed-4491-9422-4d50616e53c4\") " pod="openstack/aodh-db-sync-47z2b" Dec 08 09:25:10 crc kubenswrapper[4776]: I1208 09:25:10.601277 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsjqx\" (UniqueName: \"kubernetes.io/projected/4f3c7425-49ed-4491-9422-4d50616e53c4-kube-api-access-rsjqx\") pod \"aodh-db-sync-47z2b\" (UID: \"4f3c7425-49ed-4491-9422-4d50616e53c4\") " pod="openstack/aodh-db-sync-47z2b" Dec 08 09:25:10 crc kubenswrapper[4776]: I1208 09:25:10.636925 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-47z2b" Dec 08 09:25:10 crc kubenswrapper[4776]: I1208 09:25:10.700536 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"162bfd33-a716-4c84-8639-f5047819367f","Type":"ContainerStarted","Data":"90ed83760f8cdd165b83b7d2fba082abfdd5f35f3c31dbc8975f98844edb6cfc"} Dec 08 09:25:10 crc kubenswrapper[4776]: I1208 09:25:10.700916 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="162bfd33-a716-4c84-8639-f5047819367f" containerName="nova-metadata-log" containerID="cri-o://ba23870b505b0ef94be94d47cdf7c48d114cb504f16ecc4001bd71cb0cd5ecb0" gracePeriod=30 Dec 08 09:25:10 crc kubenswrapper[4776]: I1208 09:25:10.701455 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="162bfd33-a716-4c84-8639-f5047819367f" containerName="nova-metadata-metadata" containerID="cri-o://90ed83760f8cdd165b83b7d2fba082abfdd5f35f3c31dbc8975f98844edb6cfc" gracePeriod=30 Dec 08 09:25:10 crc kubenswrapper[4776]: I1208 09:25:10.716145 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5d325803-a91c-4c5a-8b77-999224ba963d","Type":"ContainerStarted","Data":"94d975c0078678b47fa426c56e073c14f30298ad22a675567a8f654285452af3"} Dec 08 09:25:10 crc kubenswrapper[4776]: I1208 09:25:10.746969 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=4.984169164 podStartE2EDuration="9.746949965s" podCreationTimestamp="2025-12-08 09:25:01 +0000 UTC" firstStartedPulling="2025-12-08 09:25:04.093882715 +0000 UTC m=+1580.357107737" lastFinishedPulling="2025-12-08 09:25:08.856663516 +0000 UTC m=+1585.119888538" observedRunningTime="2025-12-08 09:25:10.725235972 +0000 UTC m=+1586.988460994" watchObservedRunningTime="2025-12-08 09:25:10.746949965 +0000 UTC m=+1587.010174987" Dec 08 09:25:10 
crc kubenswrapper[4776]: I1208 09:25:10.776346 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.988042407 podStartE2EDuration="9.776330043s" podCreationTimestamp="2025-12-08 09:25:01 +0000 UTC" firstStartedPulling="2025-12-08 09:25:03.062638946 +0000 UTC m=+1579.325863968" lastFinishedPulling="2025-12-08 09:25:08.850926582 +0000 UTC m=+1585.114151604" observedRunningTime="2025-12-08 09:25:10.755086793 +0000 UTC m=+1587.018311815" watchObservedRunningTime="2025-12-08 09:25:10.776330043 +0000 UTC m=+1587.039555065" Dec 08 09:25:10 crc kubenswrapper[4776]: I1208 09:25:10.851431 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9qcrg" Dec 08 09:25:11 crc kubenswrapper[4776]: I1208 09:25:10.935223 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 08 09:25:11 crc kubenswrapper[4776]: I1208 09:25:10.963825 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9qcrg"] Dec 08 09:25:11 crc kubenswrapper[4776]: I1208 09:25:11.398674 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 09:25:11 crc kubenswrapper[4776]: I1208 09:25:11.399002 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 09:25:11 crc kubenswrapper[4776]: I1208 09:25:11.399048 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" Dec 08 09:25:11 crc kubenswrapper[4776]: I1208 09:25:11.399788 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bd00cf9685c68dc97ea68681aad0bfeeb2d56965a763cd3cffb3c7d3789a7341"} pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 08 09:25:11 crc kubenswrapper[4776]: I1208 09:25:11.399843 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" containerID="cri-o://bd00cf9685c68dc97ea68681aad0bfeeb2d56965a763cd3cffb3c7d3789a7341" gracePeriod=600 Dec 08 09:25:11 crc kubenswrapper[4776]: E1208 09:25:11.552147 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:25:11 crc kubenswrapper[4776]: E1208 09:25:11.582064 4776 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9788ab1_1031_4103_a769_a4b3177c7268.slice/crio-bd00cf9685c68dc97ea68681aad0bfeeb2d56965a763cd3cffb3c7d3789a7341.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9788ab1_1031_4103_a769_a4b3177c7268.slice/crio-conmon-bd00cf9685c68dc97ea68681aad0bfeeb2d56965a763cd3cffb3c7d3789a7341.scope\": 
RecentStats: unable to find data in memory cache]" Dec 08 09:25:11 crc kubenswrapper[4776]: I1208 09:25:11.837594 4776 generic.go:334] "Generic (PLEG): container finished" podID="c9788ab1-1031-4103-a769-a4b3177c7268" containerID="bd00cf9685c68dc97ea68681aad0bfeeb2d56965a763cd3cffb3c7d3789a7341" exitCode=0 Dec 08 09:25:11 crc kubenswrapper[4776]: I1208 09:25:11.837711 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" event={"ID":"c9788ab1-1031-4103-a769-a4b3177c7268","Type":"ContainerDied","Data":"bd00cf9685c68dc97ea68681aad0bfeeb2d56965a763cd3cffb3c7d3789a7341"} Dec 08 09:25:11 crc kubenswrapper[4776]: I1208 09:25:11.837750 4776 scope.go:117] "RemoveContainer" containerID="6a5351febb0de8fddebf4555b73007dffb77eb52f317fae03ed23b485a212557" Dec 08 09:25:11 crc kubenswrapper[4776]: I1208 09:25:11.838446 4776 scope.go:117] "RemoveContainer" containerID="bd00cf9685c68dc97ea68681aad0bfeeb2d56965a763cd3cffb3c7d3789a7341" Dec 08 09:25:11 crc kubenswrapper[4776]: E1208 09:25:11.838935 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:25:11 crc kubenswrapper[4776]: I1208 09:25:11.855987 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eef52d5d-1c6a-4586-aa17-8c8253a53262","Type":"ContainerStarted","Data":"6a779dc9287df2c4cd300466349f4407466c657fabfa7776944bd0dfa0f2c662"} Dec 08 09:25:11 crc kubenswrapper[4776]: I1208 09:25:11.866915 4776 generic.go:334] "Generic (PLEG): container finished" podID="162bfd33-a716-4c84-8639-f5047819367f" 
containerID="90ed83760f8cdd165b83b7d2fba082abfdd5f35f3c31dbc8975f98844edb6cfc" exitCode=0 Dec 08 09:25:11 crc kubenswrapper[4776]: I1208 09:25:11.866948 4776 generic.go:334] "Generic (PLEG): container finished" podID="162bfd33-a716-4c84-8639-f5047819367f" containerID="ba23870b505b0ef94be94d47cdf7c48d114cb504f16ecc4001bd71cb0cd5ecb0" exitCode=143 Dec 08 09:25:11 crc kubenswrapper[4776]: I1208 09:25:11.867228 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"162bfd33-a716-4c84-8639-f5047819367f","Type":"ContainerDied","Data":"90ed83760f8cdd165b83b7d2fba082abfdd5f35f3c31dbc8975f98844edb6cfc"} Dec 08 09:25:11 crc kubenswrapper[4776]: I1208 09:25:11.867282 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"162bfd33-a716-4c84-8639-f5047819367f","Type":"ContainerDied","Data":"ba23870b505b0ef94be94d47cdf7c48d114cb504f16ecc4001bd71cb0cd5ecb0"} Dec 08 09:25:11 crc kubenswrapper[4776]: I1208 09:25:11.910606 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 08 09:25:11 crc kubenswrapper[4776]: I1208 09:25:11.910871 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 08 09:25:12 crc kubenswrapper[4776]: I1208 09:25:12.024437 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 08 09:25:12 crc kubenswrapper[4776]: I1208 09:25:12.024675 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 08 09:25:12 crc kubenswrapper[4776]: I1208 09:25:12.064289 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-47z2b"] Dec 08 09:25:12 crc kubenswrapper[4776]: I1208 09:25:12.094429 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 08 09:25:12 crc kubenswrapper[4776]: I1208 
09:25:12.292584 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 08 09:25:12 crc kubenswrapper[4776]: I1208 09:25:12.310403 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-9b86998b5-hzfv9" Dec 08 09:25:12 crc kubenswrapper[4776]: I1208 09:25:12.322981 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 08 09:25:12 crc kubenswrapper[4776]: I1208 09:25:12.391146 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/162bfd33-a716-4c84-8639-f5047819367f-combined-ca-bundle\") pod \"162bfd33-a716-4c84-8639-f5047819367f\" (UID: \"162bfd33-a716-4c84-8639-f5047819367f\") " Dec 08 09:25:12 crc kubenswrapper[4776]: I1208 09:25:12.391441 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/162bfd33-a716-4c84-8639-f5047819367f-config-data\") pod \"162bfd33-a716-4c84-8639-f5047819367f\" (UID: \"162bfd33-a716-4c84-8639-f5047819367f\") " Dec 08 09:25:12 crc kubenswrapper[4776]: I1208 09:25:12.391633 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/162bfd33-a716-4c84-8639-f5047819367f-logs\") pod \"162bfd33-a716-4c84-8639-f5047819367f\" (UID: \"162bfd33-a716-4c84-8639-f5047819367f\") " Dec 08 09:25:12 crc kubenswrapper[4776]: I1208 09:25:12.391852 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jzcm\" (UniqueName: \"kubernetes.io/projected/162bfd33-a716-4c84-8639-f5047819367f-kube-api-access-7jzcm\") pod \"162bfd33-a716-4c84-8639-f5047819367f\" (UID: \"162bfd33-a716-4c84-8639-f5047819367f\") " Dec 08 09:25:12 crc kubenswrapper[4776]: I1208 09:25:12.399268 4776 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/162bfd33-a716-4c84-8639-f5047819367f-logs" (OuterVolumeSpecName: "logs") pod "162bfd33-a716-4c84-8639-f5047819367f" (UID: "162bfd33-a716-4c84-8639-f5047819367f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:25:12 crc kubenswrapper[4776]: I1208 09:25:12.423879 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/162bfd33-a716-4c84-8639-f5047819367f-kube-api-access-7jzcm" (OuterVolumeSpecName: "kube-api-access-7jzcm") pod "162bfd33-a716-4c84-8639-f5047819367f" (UID: "162bfd33-a716-4c84-8639-f5047819367f"). InnerVolumeSpecName "kube-api-access-7jzcm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:25:12 crc kubenswrapper[4776]: I1208 09:25:12.477987 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/162bfd33-a716-4c84-8639-f5047819367f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "162bfd33-a716-4c84-8639-f5047819367f" (UID: "162bfd33-a716-4c84-8639-f5047819367f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:25:12 crc kubenswrapper[4776]: I1208 09:25:12.478196 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-mj9ps"] Dec 08 09:25:12 crc kubenswrapper[4776]: I1208 09:25:12.478753 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7756b9d78c-mj9ps" podUID="dc053d70-b785-4b45-91be-49cbd27952d9" containerName="dnsmasq-dns" containerID="cri-o://4907dcf9f2faa7f2cc873138ae497143d0e9422dae53538c76829fc51214ebef" gracePeriod=10 Dec 08 09:25:12 crc kubenswrapper[4776]: I1208 09:25:12.489351 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/162bfd33-a716-4c84-8639-f5047819367f-config-data" (OuterVolumeSpecName: "config-data") pod "162bfd33-a716-4c84-8639-f5047819367f" (UID: "162bfd33-a716-4c84-8639-f5047819367f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:25:12 crc kubenswrapper[4776]: I1208 09:25:12.495561 4776 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/162bfd33-a716-4c84-8639-f5047819367f-logs\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:12 crc kubenswrapper[4776]: I1208 09:25:12.495596 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jzcm\" (UniqueName: \"kubernetes.io/projected/162bfd33-a716-4c84-8639-f5047819367f-kube-api-access-7jzcm\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:12 crc kubenswrapper[4776]: I1208 09:25:12.495608 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/162bfd33-a716-4c84-8639-f5047819367f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:12 crc kubenswrapper[4776]: I1208 09:25:12.495617 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/162bfd33-a716-4c84-8639-f5047819367f-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:12 crc kubenswrapper[4776]: I1208 09:25:12.909142 4776 generic.go:334] "Generic (PLEG): container finished" podID="dc053d70-b785-4b45-91be-49cbd27952d9" containerID="4907dcf9f2faa7f2cc873138ae497143d0e9422dae53538c76829fc51214ebef" exitCode=0 Dec 08 09:25:12 crc kubenswrapper[4776]: I1208 09:25:12.909498 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-mj9ps" event={"ID":"dc053d70-b785-4b45-91be-49cbd27952d9","Type":"ContainerDied","Data":"4907dcf9f2faa7f2cc873138ae497143d0e9422dae53538c76829fc51214ebef"} Dec 08 09:25:12 crc kubenswrapper[4776]: I1208 09:25:12.917707 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eef52d5d-1c6a-4586-aa17-8c8253a53262","Type":"ContainerStarted","Data":"8aa882221fda78196665be8e0570afff1ed74786c32425fea6e82af36b1f67fa"} Dec 08 09:25:12 crc kubenswrapper[4776]: I1208 09:25:12.924727 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 08 09:25:12 crc kubenswrapper[4776]: I1208 09:25:12.924781 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"162bfd33-a716-4c84-8639-f5047819367f","Type":"ContainerDied","Data":"7fe31b7c7d737dc0a7adef72cbc22694e453c7a89a1396a5e15faed83641ec71"} Dec 08 09:25:12 crc kubenswrapper[4776]: I1208 09:25:12.924842 4776 scope.go:117] "RemoveContainer" containerID="90ed83760f8cdd165b83b7d2fba082abfdd5f35f3c31dbc8975f98844edb6cfc" Dec 08 09:25:12 crc kubenswrapper[4776]: I1208 09:25:12.931930 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-47z2b" event={"ID":"4f3c7425-49ed-4491-9422-4d50616e53c4","Type":"ContainerStarted","Data":"884c699486a504319368043a44e5ff9b3a557bb14dc273be9e475f68e6a933fa"} Dec 08 09:25:12 crc kubenswrapper[4776]: I1208 09:25:12.932590 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9qcrg" podUID="effd4b52-0471-4d81-b2bf-9b46ac73db66" containerName="registry-server" containerID="cri-o://2c7d535a175883577fb099ac9e330b9de9d6f7ff9934e8c8e4ea9fb4f3f2ed74" gracePeriod=2 Dec 08 09:25:12 crc kubenswrapper[4776]: I1208 09:25:12.971583 4776 scope.go:117] "RemoveContainer" containerID="ba23870b505b0ef94be94d47cdf7c48d114cb504f16ecc4001bd71cb0cd5ecb0" Dec 08 09:25:12 crc kubenswrapper[4776]: I1208 09:25:12.989372 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.003736 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.005295 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5d325803-a91c-4c5a-8b77-999224ba963d" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.233:8774/\": 
context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.005436 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5d325803-a91c-4c5a-8b77-999224ba963d" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.233:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.038229 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.113365 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 08 09:25:13 crc kubenswrapper[4776]: E1208 09:25:13.113966 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="162bfd33-a716-4c84-8639-f5047819367f" containerName="nova-metadata-metadata" Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.113984 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="162bfd33-a716-4c84-8639-f5047819367f" containerName="nova-metadata-metadata" Dec 08 09:25:13 crc kubenswrapper[4776]: E1208 09:25:13.114015 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="162bfd33-a716-4c84-8639-f5047819367f" containerName="nova-metadata-log" Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.114023 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="162bfd33-a716-4c84-8639-f5047819367f" containerName="nova-metadata-log" Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.114324 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="162bfd33-a716-4c84-8639-f5047819367f" containerName="nova-metadata-log" Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.114351 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="162bfd33-a716-4c84-8639-f5047819367f" containerName="nova-metadata-metadata" Dec 08 09:25:13 crc 
kubenswrapper[4776]: I1208 09:25:13.115763 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.123887 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.124467 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.133476 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.222726 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wjhx\" (UniqueName: \"kubernetes.io/projected/1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75-kube-api-access-9wjhx\") pod \"nova-metadata-0\" (UID: \"1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75\") " pod="openstack/nova-metadata-0" Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.222837 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75\") " pod="openstack/nova-metadata-0" Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.222873 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75-config-data\") pod \"nova-metadata-0\" (UID: \"1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75\") " pod="openstack/nova-metadata-0" Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.222901 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75\") " pod="openstack/nova-metadata-0" Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.222949 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75-logs\") pod \"nova-metadata-0\" (UID: \"1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75\") " pod="openstack/nova-metadata-0" Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.227857 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-mj9ps" Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.324394 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc053d70-b785-4b45-91be-49cbd27952d9-dns-svc\") pod \"dc053d70-b785-4b45-91be-49cbd27952d9\" (UID: \"dc053d70-b785-4b45-91be-49cbd27952d9\") " Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.324477 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dc053d70-b785-4b45-91be-49cbd27952d9-dns-swift-storage-0\") pod \"dc053d70-b785-4b45-91be-49cbd27952d9\" (UID: \"dc053d70-b785-4b45-91be-49cbd27952d9\") " Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.324592 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgsgl\" (UniqueName: \"kubernetes.io/projected/dc053d70-b785-4b45-91be-49cbd27952d9-kube-api-access-kgsgl\") pod \"dc053d70-b785-4b45-91be-49cbd27952d9\" (UID: \"dc053d70-b785-4b45-91be-49cbd27952d9\") " Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.324691 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/dc053d70-b785-4b45-91be-49cbd27952d9-ovsdbserver-sb\") pod \"dc053d70-b785-4b45-91be-49cbd27952d9\" (UID: \"dc053d70-b785-4b45-91be-49cbd27952d9\") " Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.324717 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc053d70-b785-4b45-91be-49cbd27952d9-config\") pod \"dc053d70-b785-4b45-91be-49cbd27952d9\" (UID: \"dc053d70-b785-4b45-91be-49cbd27952d9\") " Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.324734 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dc053d70-b785-4b45-91be-49cbd27952d9-ovsdbserver-nb\") pod \"dc053d70-b785-4b45-91be-49cbd27952d9\" (UID: \"dc053d70-b785-4b45-91be-49cbd27952d9\") " Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.325045 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75\") " pod="openstack/nova-metadata-0" Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.325100 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75-logs\") pod \"nova-metadata-0\" (UID: \"1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75\") " pod="openstack/nova-metadata-0" Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.325254 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wjhx\" (UniqueName: \"kubernetes.io/projected/1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75-kube-api-access-9wjhx\") pod \"nova-metadata-0\" (UID: \"1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75\") " pod="openstack/nova-metadata-0" Dec 08 09:25:13 crc 
kubenswrapper[4776]: I1208 09:25:13.325329 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75\") " pod="openstack/nova-metadata-0" Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.325354 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75-config-data\") pod \"nova-metadata-0\" (UID: \"1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75\") " pod="openstack/nova-metadata-0" Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.334303 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc053d70-b785-4b45-91be-49cbd27952d9-kube-api-access-kgsgl" (OuterVolumeSpecName: "kube-api-access-kgsgl") pod "dc053d70-b785-4b45-91be-49cbd27952d9" (UID: "dc053d70-b785-4b45-91be-49cbd27952d9"). InnerVolumeSpecName "kube-api-access-kgsgl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.334755 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75-logs\") pod \"nova-metadata-0\" (UID: \"1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75\") " pod="openstack/nova-metadata-0" Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.349387 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75\") " pod="openstack/nova-metadata-0" Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.351300 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75-config-data\") pod \"nova-metadata-0\" (UID: \"1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75\") " pod="openstack/nova-metadata-0" Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.356424 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75\") " pod="openstack/nova-metadata-0" Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.368188 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wjhx\" (UniqueName: \"kubernetes.io/projected/1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75-kube-api-access-9wjhx\") pod \"nova-metadata-0\" (UID: \"1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75\") " pod="openstack/nova-metadata-0" Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.421089 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/dc053d70-b785-4b45-91be-49cbd27952d9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dc053d70-b785-4b45-91be-49cbd27952d9" (UID: "dc053d70-b785-4b45-91be-49cbd27952d9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.428632 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgsgl\" (UniqueName: \"kubernetes.io/projected/dc053d70-b785-4b45-91be-49cbd27952d9-kube-api-access-kgsgl\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.428659 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dc053d70-b785-4b45-91be-49cbd27952d9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.471842 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.474520 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc053d70-b785-4b45-91be-49cbd27952d9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "dc053d70-b785-4b45-91be-49cbd27952d9" (UID: "dc053d70-b785-4b45-91be-49cbd27952d9"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.531932 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dc053d70-b785-4b45-91be-49cbd27952d9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.572158 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc053d70-b785-4b45-91be-49cbd27952d9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "dc053d70-b785-4b45-91be-49cbd27952d9" (UID: "dc053d70-b785-4b45-91be-49cbd27952d9"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.581671 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc053d70-b785-4b45-91be-49cbd27952d9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dc053d70-b785-4b45-91be-49cbd27952d9" (UID: "dc053d70-b785-4b45-91be-49cbd27952d9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.595806 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc053d70-b785-4b45-91be-49cbd27952d9-config" (OuterVolumeSpecName: "config") pod "dc053d70-b785-4b45-91be-49cbd27952d9" (UID: "dc053d70-b785-4b45-91be-49cbd27952d9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.634729 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc053d70-b785-4b45-91be-49cbd27952d9-config\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.634766 4776 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc053d70-b785-4b45-91be-49cbd27952d9-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.634799 4776 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dc053d70-b785-4b45-91be-49cbd27952d9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.782442 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9qcrg" Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.843700 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/effd4b52-0471-4d81-b2bf-9b46ac73db66-catalog-content\") pod \"effd4b52-0471-4d81-b2bf-9b46ac73db66\" (UID: \"effd4b52-0471-4d81-b2bf-9b46ac73db66\") " Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.843949 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hnrj\" (UniqueName: \"kubernetes.io/projected/effd4b52-0471-4d81-b2bf-9b46ac73db66-kube-api-access-4hnrj\") pod \"effd4b52-0471-4d81-b2bf-9b46ac73db66\" (UID: \"effd4b52-0471-4d81-b2bf-9b46ac73db66\") " Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.855320 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/effd4b52-0471-4d81-b2bf-9b46ac73db66-kube-api-access-4hnrj" 
(OuterVolumeSpecName: "kube-api-access-4hnrj") pod "effd4b52-0471-4d81-b2bf-9b46ac73db66" (UID: "effd4b52-0471-4d81-b2bf-9b46ac73db66"). InnerVolumeSpecName "kube-api-access-4hnrj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.957632 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/effd4b52-0471-4d81-b2bf-9b46ac73db66-utilities\") pod \"effd4b52-0471-4d81-b2bf-9b46ac73db66\" (UID: \"effd4b52-0471-4d81-b2bf-9b46ac73db66\") " Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.958689 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hnrj\" (UniqueName: \"kubernetes.io/projected/effd4b52-0471-4d81-b2bf-9b46ac73db66-kube-api-access-4hnrj\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.958868 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/effd4b52-0471-4d81-b2bf-9b46ac73db66-utilities" (OuterVolumeSpecName: "utilities") pod "effd4b52-0471-4d81-b2bf-9b46ac73db66" (UID: "effd4b52-0471-4d81-b2bf-9b46ac73db66"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.971800 4776 generic.go:334] "Generic (PLEG): container finished" podID="effd4b52-0471-4d81-b2bf-9b46ac73db66" containerID="2c7d535a175883577fb099ac9e330b9de9d6f7ff9934e8c8e4ea9fb4f3f2ed74" exitCode=0 Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.971861 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qcrg" event={"ID":"effd4b52-0471-4d81-b2bf-9b46ac73db66","Type":"ContainerDied","Data":"2c7d535a175883577fb099ac9e330b9de9d6f7ff9934e8c8e4ea9fb4f3f2ed74"} Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.971890 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qcrg" event={"ID":"effd4b52-0471-4d81-b2bf-9b46ac73db66","Type":"ContainerDied","Data":"433413c18af34192251de4926e27f251afe9b6bed6df2810f35ac9d10843aedd"} Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.971907 4776 scope.go:117] "RemoveContainer" containerID="2c7d535a175883577fb099ac9e330b9de9d6f7ff9934e8c8e4ea9fb4f3f2ed74" Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.972013 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9qcrg" Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.988312 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-mj9ps" event={"ID":"dc053d70-b785-4b45-91be-49cbd27952d9","Type":"ContainerDied","Data":"8939b4776c486ec837a2d88d9d011a3784a3ebce21c71b3c742e0a23388732a5"} Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.988326 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-mj9ps" Dec 08 09:25:13 crc kubenswrapper[4776]: I1208 09:25:13.991736 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/effd4b52-0471-4d81-b2bf-9b46ac73db66-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "effd4b52-0471-4d81-b2bf-9b46ac73db66" (UID: "effd4b52-0471-4d81-b2bf-9b46ac73db66"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:25:14 crc kubenswrapper[4776]: I1208 09:25:13.997601 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eef52d5d-1c6a-4586-aa17-8c8253a53262","Type":"ContainerStarted","Data":"646836525a446b068e3151718ebebfcc3c3efd7400b3f77a14ed0d387525e01e"} Dec 08 09:25:14 crc kubenswrapper[4776]: I1208 09:25:13.997646 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eef52d5d-1c6a-4586-aa17-8c8253a53262","Type":"ContainerStarted","Data":"6c62245ebb8c72b32f93d1b13b07b9058dee4c2d948da5e63277902020064833"} Dec 08 09:25:14 crc kubenswrapper[4776]: I1208 09:25:14.038486 4776 scope.go:117] "RemoveContainer" containerID="6f62cccd22294b109ae29089055740c58bddc6ebe1da4e0532ac21c2b37320a2" Dec 08 09:25:14 crc kubenswrapper[4776]: I1208 09:25:14.058266 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-mj9ps"] Dec 08 09:25:14 crc kubenswrapper[4776]: I1208 09:25:14.067921 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/effd4b52-0471-4d81-b2bf-9b46ac73db66-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:14 crc kubenswrapper[4776]: I1208 09:25:14.067948 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/effd4b52-0471-4d81-b2bf-9b46ac73db66-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:14 
crc kubenswrapper[4776]: I1208 09:25:14.069560 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-mj9ps"] Dec 08 09:25:14 crc kubenswrapper[4776]: I1208 09:25:14.173906 4776 scope.go:117] "RemoveContainer" containerID="bbcda11ac2edecd998914d0452359b729cd9a725fc82d588eb86ed2742b9ff5d" Dec 08 09:25:14 crc kubenswrapper[4776]: I1208 09:25:14.177478 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 08 09:25:14 crc kubenswrapper[4776]: I1208 09:25:14.204474 4776 scope.go:117] "RemoveContainer" containerID="2c7d535a175883577fb099ac9e330b9de9d6f7ff9934e8c8e4ea9fb4f3f2ed74" Dec 08 09:25:14 crc kubenswrapper[4776]: E1208 09:25:14.210254 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c7d535a175883577fb099ac9e330b9de9d6f7ff9934e8c8e4ea9fb4f3f2ed74\": container with ID starting with 2c7d535a175883577fb099ac9e330b9de9d6f7ff9934e8c8e4ea9fb4f3f2ed74 not found: ID does not exist" containerID="2c7d535a175883577fb099ac9e330b9de9d6f7ff9934e8c8e4ea9fb4f3f2ed74" Dec 08 09:25:14 crc kubenswrapper[4776]: I1208 09:25:14.210292 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c7d535a175883577fb099ac9e330b9de9d6f7ff9934e8c8e4ea9fb4f3f2ed74"} err="failed to get container status \"2c7d535a175883577fb099ac9e330b9de9d6f7ff9934e8c8e4ea9fb4f3f2ed74\": rpc error: code = NotFound desc = could not find container \"2c7d535a175883577fb099ac9e330b9de9d6f7ff9934e8c8e4ea9fb4f3f2ed74\": container with ID starting with 2c7d535a175883577fb099ac9e330b9de9d6f7ff9934e8c8e4ea9fb4f3f2ed74 not found: ID does not exist" Dec 08 09:25:14 crc kubenswrapper[4776]: I1208 09:25:14.210314 4776 scope.go:117] "RemoveContainer" containerID="6f62cccd22294b109ae29089055740c58bddc6ebe1da4e0532ac21c2b37320a2" Dec 08 09:25:14 crc kubenswrapper[4776]: E1208 09:25:14.212149 4776 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f62cccd22294b109ae29089055740c58bddc6ebe1da4e0532ac21c2b37320a2\": container with ID starting with 6f62cccd22294b109ae29089055740c58bddc6ebe1da4e0532ac21c2b37320a2 not found: ID does not exist" containerID="6f62cccd22294b109ae29089055740c58bddc6ebe1da4e0532ac21c2b37320a2" Dec 08 09:25:14 crc kubenswrapper[4776]: I1208 09:25:14.212188 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f62cccd22294b109ae29089055740c58bddc6ebe1da4e0532ac21c2b37320a2"} err="failed to get container status \"6f62cccd22294b109ae29089055740c58bddc6ebe1da4e0532ac21c2b37320a2\": rpc error: code = NotFound desc = could not find container \"6f62cccd22294b109ae29089055740c58bddc6ebe1da4e0532ac21c2b37320a2\": container with ID starting with 6f62cccd22294b109ae29089055740c58bddc6ebe1da4e0532ac21c2b37320a2 not found: ID does not exist" Dec 08 09:25:14 crc kubenswrapper[4776]: I1208 09:25:14.212204 4776 scope.go:117] "RemoveContainer" containerID="bbcda11ac2edecd998914d0452359b729cd9a725fc82d588eb86ed2742b9ff5d" Dec 08 09:25:14 crc kubenswrapper[4776]: E1208 09:25:14.213195 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbcda11ac2edecd998914d0452359b729cd9a725fc82d588eb86ed2742b9ff5d\": container with ID starting with bbcda11ac2edecd998914d0452359b729cd9a725fc82d588eb86ed2742b9ff5d not found: ID does not exist" containerID="bbcda11ac2edecd998914d0452359b729cd9a725fc82d588eb86ed2742b9ff5d" Dec 08 09:25:14 crc kubenswrapper[4776]: I1208 09:25:14.213221 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbcda11ac2edecd998914d0452359b729cd9a725fc82d588eb86ed2742b9ff5d"} err="failed to get container status \"bbcda11ac2edecd998914d0452359b729cd9a725fc82d588eb86ed2742b9ff5d\": rpc error: code = NotFound desc = could not find container 
\"bbcda11ac2edecd998914d0452359b729cd9a725fc82d588eb86ed2742b9ff5d\": container with ID starting with bbcda11ac2edecd998914d0452359b729cd9a725fc82d588eb86ed2742b9ff5d not found: ID does not exist" Dec 08 09:25:14 crc kubenswrapper[4776]: I1208 09:25:14.213234 4776 scope.go:117] "RemoveContainer" containerID="4907dcf9f2faa7f2cc873138ae497143d0e9422dae53538c76829fc51214ebef" Dec 08 09:25:14 crc kubenswrapper[4776]: I1208 09:25:14.247223 4776 scope.go:117] "RemoveContainer" containerID="8682930b1b479cacb27372ed25dc1d31db5c639561dc0106988ecdc8f469a90d" Dec 08 09:25:14 crc kubenswrapper[4776]: I1208 09:25:14.305399 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9qcrg"] Dec 08 09:25:14 crc kubenswrapper[4776]: I1208 09:25:14.325475 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9qcrg"] Dec 08 09:25:14 crc kubenswrapper[4776]: I1208 09:25:14.367012 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="162bfd33-a716-4c84-8639-f5047819367f" path="/var/lib/kubelet/pods/162bfd33-a716-4c84-8639-f5047819367f/volumes" Dec 08 09:25:14 crc kubenswrapper[4776]: I1208 09:25:14.367719 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc053d70-b785-4b45-91be-49cbd27952d9" path="/var/lib/kubelet/pods/dc053d70-b785-4b45-91be-49cbd27952d9/volumes" Dec 08 09:25:14 crc kubenswrapper[4776]: I1208 09:25:14.368319 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="effd4b52-0471-4d81-b2bf-9b46ac73db66" path="/var/lib/kubelet/pods/effd4b52-0471-4d81-b2bf-9b46ac73db66/volumes" Dec 08 09:25:15 crc kubenswrapper[4776]: I1208 09:25:15.017795 4776 generic.go:334] "Generic (PLEG): container finished" podID="d88c7e2b-caa4-4d68-acc2-1483da2dfef3" containerID="bf7d40f85562b59617eff8f372ce1b5dc118aa06a87cd416ec83aa2c3a74f745" exitCode=0 Dec 08 09:25:15 crc kubenswrapper[4776]: I1208 09:25:15.017938 4776 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-v54bb" event={"ID":"d88c7e2b-caa4-4d68-acc2-1483da2dfef3","Type":"ContainerDied","Data":"bf7d40f85562b59617eff8f372ce1b5dc118aa06a87cd416ec83aa2c3a74f745"} Dec 08 09:25:15 crc kubenswrapper[4776]: I1208 09:25:15.020598 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75","Type":"ContainerStarted","Data":"566b9a12cdc4d3a523516e3eb314fd76b07b5488750d4f4487dca745285b784e"} Dec 08 09:25:15 crc kubenswrapper[4776]: I1208 09:25:15.020631 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75","Type":"ContainerStarted","Data":"e7da692e168add470413301a1d9a5ebea9f92b9765e5d17088e2ca307bbeb6a0"} Dec 08 09:25:15 crc kubenswrapper[4776]: I1208 09:25:15.020641 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75","Type":"ContainerStarted","Data":"fdbbd87bfd7fc7240ba99ff7b39d57f0b78075a9df701f85146e1073a88bca77"} Dec 08 09:25:15 crc kubenswrapper[4776]: I1208 09:25:15.111230 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.111217563 podStartE2EDuration="3.111217563s" podCreationTimestamp="2025-12-08 09:25:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:25:15.102284703 +0000 UTC m=+1591.365509725" watchObservedRunningTime="2025-12-08 09:25:15.111217563 +0000 UTC m=+1591.374442585" Dec 08 09:25:16 crc kubenswrapper[4776]: I1208 09:25:16.049328 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eef52d5d-1c6a-4586-aa17-8c8253a53262","Type":"ContainerStarted","Data":"3f5694ecb1a6ccaa0fa96e50aa464c4d6b30fd55ace8cb713e383d0037d6c549"} Dec 08 
09:25:16 crc kubenswrapper[4776]: I1208 09:25:16.050374 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 08 09:25:17 crc kubenswrapper[4776]: I1208 09:25:17.051872 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.691600691 podStartE2EDuration="8.051855852s" podCreationTimestamp="2025-12-08 09:25:09 +0000 UTC" firstStartedPulling="2025-12-08 09:25:10.948481982 +0000 UTC m=+1587.211707004" lastFinishedPulling="2025-12-08 09:25:15.308737143 +0000 UTC m=+1591.571962165" observedRunningTime="2025-12-08 09:25:16.07044888 +0000 UTC m=+1592.333673922" watchObservedRunningTime="2025-12-08 09:25:17.051855852 +0000 UTC m=+1593.315080874" Dec 08 09:25:17 crc kubenswrapper[4776]: I1208 09:25:17.072623 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ljzdd"] Dec 08 09:25:17 crc kubenswrapper[4776]: E1208 09:25:17.073323 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="effd4b52-0471-4d81-b2bf-9b46ac73db66" containerName="extract-utilities" Dec 08 09:25:17 crc kubenswrapper[4776]: I1208 09:25:17.073341 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="effd4b52-0471-4d81-b2bf-9b46ac73db66" containerName="extract-utilities" Dec 08 09:25:17 crc kubenswrapper[4776]: E1208 09:25:17.073371 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="effd4b52-0471-4d81-b2bf-9b46ac73db66" containerName="extract-content" Dec 08 09:25:17 crc kubenswrapper[4776]: I1208 09:25:17.073383 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="effd4b52-0471-4d81-b2bf-9b46ac73db66" containerName="extract-content" Dec 08 09:25:17 crc kubenswrapper[4776]: E1208 09:25:17.073415 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc053d70-b785-4b45-91be-49cbd27952d9" containerName="dnsmasq-dns" Dec 08 09:25:17 crc kubenswrapper[4776]: I1208 09:25:17.073424 4776 
state_mem.go:107] "Deleted CPUSet assignment" podUID="dc053d70-b785-4b45-91be-49cbd27952d9" containerName="dnsmasq-dns" Dec 08 09:25:17 crc kubenswrapper[4776]: E1208 09:25:17.073453 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc053d70-b785-4b45-91be-49cbd27952d9" containerName="init" Dec 08 09:25:17 crc kubenswrapper[4776]: I1208 09:25:17.073462 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc053d70-b785-4b45-91be-49cbd27952d9" containerName="init" Dec 08 09:25:17 crc kubenswrapper[4776]: E1208 09:25:17.073477 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="effd4b52-0471-4d81-b2bf-9b46ac73db66" containerName="registry-server" Dec 08 09:25:17 crc kubenswrapper[4776]: I1208 09:25:17.073486 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="effd4b52-0471-4d81-b2bf-9b46ac73db66" containerName="registry-server" Dec 08 09:25:17 crc kubenswrapper[4776]: I1208 09:25:17.073810 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="effd4b52-0471-4d81-b2bf-9b46ac73db66" containerName="registry-server" Dec 08 09:25:17 crc kubenswrapper[4776]: I1208 09:25:17.073836 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc053d70-b785-4b45-91be-49cbd27952d9" containerName="dnsmasq-dns" Dec 08 09:25:17 crc kubenswrapper[4776]: I1208 09:25:17.075855 4776 generic.go:334] "Generic (PLEG): container finished" podID="5756d118-f614-4000-82d2-ffa1623179cd" containerID="254e6b53d20f5019a28759d11c91d73a6eaf8a3751091df3d4e8bb0cb0912d0c" exitCode=0 Dec 08 09:25:17 crc kubenswrapper[4776]: I1208 09:25:17.076414 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-dl2hd" event={"ID":"5756d118-f614-4000-82d2-ffa1623179cd","Type":"ContainerDied","Data":"254e6b53d20f5019a28759d11c91d73a6eaf8a3751091df3d4e8bb0cb0912d0c"} Dec 08 09:25:17 crc kubenswrapper[4776]: I1208 09:25:17.076986 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ljzdd" Dec 08 09:25:17 crc kubenswrapper[4776]: I1208 09:25:17.095933 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ljzdd"] Dec 08 09:25:17 crc kubenswrapper[4776]: I1208 09:25:17.154455 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcvqx\" (UniqueName: \"kubernetes.io/projected/b81998e9-151a-47c9-a3ca-c678fd6aeb96-kube-api-access-tcvqx\") pod \"community-operators-ljzdd\" (UID: \"b81998e9-151a-47c9-a3ca-c678fd6aeb96\") " pod="openshift-marketplace/community-operators-ljzdd" Dec 08 09:25:17 crc kubenswrapper[4776]: I1208 09:25:17.154537 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b81998e9-151a-47c9-a3ca-c678fd6aeb96-catalog-content\") pod \"community-operators-ljzdd\" (UID: \"b81998e9-151a-47c9-a3ca-c678fd6aeb96\") " pod="openshift-marketplace/community-operators-ljzdd" Dec 08 09:25:17 crc kubenswrapper[4776]: I1208 09:25:17.154719 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b81998e9-151a-47c9-a3ca-c678fd6aeb96-utilities\") pod \"community-operators-ljzdd\" (UID: \"b81998e9-151a-47c9-a3ca-c678fd6aeb96\") " pod="openshift-marketplace/community-operators-ljzdd" Dec 08 09:25:17 crc kubenswrapper[4776]: I1208 09:25:17.256635 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b81998e9-151a-47c9-a3ca-c678fd6aeb96-utilities\") pod \"community-operators-ljzdd\" (UID: \"b81998e9-151a-47c9-a3ca-c678fd6aeb96\") " pod="openshift-marketplace/community-operators-ljzdd" Dec 08 09:25:17 crc kubenswrapper[4776]: I1208 09:25:17.256715 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-tcvqx\" (UniqueName: \"kubernetes.io/projected/b81998e9-151a-47c9-a3ca-c678fd6aeb96-kube-api-access-tcvqx\") pod \"community-operators-ljzdd\" (UID: \"b81998e9-151a-47c9-a3ca-c678fd6aeb96\") " pod="openshift-marketplace/community-operators-ljzdd" Dec 08 09:25:17 crc kubenswrapper[4776]: I1208 09:25:17.256772 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b81998e9-151a-47c9-a3ca-c678fd6aeb96-catalog-content\") pod \"community-operators-ljzdd\" (UID: \"b81998e9-151a-47c9-a3ca-c678fd6aeb96\") " pod="openshift-marketplace/community-operators-ljzdd" Dec 08 09:25:17 crc kubenswrapper[4776]: I1208 09:25:17.257111 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b81998e9-151a-47c9-a3ca-c678fd6aeb96-utilities\") pod \"community-operators-ljzdd\" (UID: \"b81998e9-151a-47c9-a3ca-c678fd6aeb96\") " pod="openshift-marketplace/community-operators-ljzdd" Dec 08 09:25:17 crc kubenswrapper[4776]: I1208 09:25:17.257229 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b81998e9-151a-47c9-a3ca-c678fd6aeb96-catalog-content\") pod \"community-operators-ljzdd\" (UID: \"b81998e9-151a-47c9-a3ca-c678fd6aeb96\") " pod="openshift-marketplace/community-operators-ljzdd" Dec 08 09:25:17 crc kubenswrapper[4776]: I1208 09:25:17.287687 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcvqx\" (UniqueName: \"kubernetes.io/projected/b81998e9-151a-47c9-a3ca-c678fd6aeb96-kube-api-access-tcvqx\") pod \"community-operators-ljzdd\" (UID: \"b81998e9-151a-47c9-a3ca-c678fd6aeb96\") " pod="openshift-marketplace/community-operators-ljzdd" Dec 08 09:25:17 crc kubenswrapper[4776]: I1208 09:25:17.410921 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ljzdd" Dec 08 09:25:17 crc kubenswrapper[4776]: I1208 09:25:17.758685 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7756b9d78c-mj9ps" podUID="dc053d70-b785-4b45-91be-49cbd27952d9" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.210:5353: i/o timeout" Dec 08 09:25:18 crc kubenswrapper[4776]: I1208 09:25:18.473730 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 08 09:25:18 crc kubenswrapper[4776]: I1208 09:25:18.474388 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 08 09:25:18 crc kubenswrapper[4776]: I1208 09:25:18.735140 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-dl2hd" Dec 08 09:25:18 crc kubenswrapper[4776]: I1208 09:25:18.793085 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqm8k\" (UniqueName: \"kubernetes.io/projected/5756d118-f614-4000-82d2-ffa1623179cd-kube-api-access-zqm8k\") pod \"5756d118-f614-4000-82d2-ffa1623179cd\" (UID: \"5756d118-f614-4000-82d2-ffa1623179cd\") " Dec 08 09:25:18 crc kubenswrapper[4776]: I1208 09:25:18.793142 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5756d118-f614-4000-82d2-ffa1623179cd-scripts\") pod \"5756d118-f614-4000-82d2-ffa1623179cd\" (UID: \"5756d118-f614-4000-82d2-ffa1623179cd\") " Dec 08 09:25:18 crc kubenswrapper[4776]: I1208 09:25:18.793313 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5756d118-f614-4000-82d2-ffa1623179cd-combined-ca-bundle\") pod \"5756d118-f614-4000-82d2-ffa1623179cd\" (UID: \"5756d118-f614-4000-82d2-ffa1623179cd\") " Dec 08 09:25:18 crc 
kubenswrapper[4776]: I1208 09:25:18.793481 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5756d118-f614-4000-82d2-ffa1623179cd-config-data\") pod \"5756d118-f614-4000-82d2-ffa1623179cd\" (UID: \"5756d118-f614-4000-82d2-ffa1623179cd\") " Dec 08 09:25:18 crc kubenswrapper[4776]: I1208 09:25:18.800444 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5756d118-f614-4000-82d2-ffa1623179cd-kube-api-access-zqm8k" (OuterVolumeSpecName: "kube-api-access-zqm8k") pod "5756d118-f614-4000-82d2-ffa1623179cd" (UID: "5756d118-f614-4000-82d2-ffa1623179cd"). InnerVolumeSpecName "kube-api-access-zqm8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:25:18 crc kubenswrapper[4776]: I1208 09:25:18.804547 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5756d118-f614-4000-82d2-ffa1623179cd-scripts" (OuterVolumeSpecName: "scripts") pod "5756d118-f614-4000-82d2-ffa1623179cd" (UID: "5756d118-f614-4000-82d2-ffa1623179cd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:25:18 crc kubenswrapper[4776]: I1208 09:25:18.853022 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5756d118-f614-4000-82d2-ffa1623179cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5756d118-f614-4000-82d2-ffa1623179cd" (UID: "5756d118-f614-4000-82d2-ffa1623179cd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:25:18 crc kubenswrapper[4776]: I1208 09:25:18.853344 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5756d118-f614-4000-82d2-ffa1623179cd-config-data" (OuterVolumeSpecName: "config-data") pod "5756d118-f614-4000-82d2-ffa1623179cd" (UID: "5756d118-f614-4000-82d2-ffa1623179cd"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:25:18 crc kubenswrapper[4776]: I1208 09:25:18.896725 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5756d118-f614-4000-82d2-ffa1623179cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:18 crc kubenswrapper[4776]: I1208 09:25:18.896770 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5756d118-f614-4000-82d2-ffa1623179cd-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:18 crc kubenswrapper[4776]: I1208 09:25:18.896788 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqm8k\" (UniqueName: \"kubernetes.io/projected/5756d118-f614-4000-82d2-ffa1623179cd-kube-api-access-zqm8k\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:18 crc kubenswrapper[4776]: I1208 09:25:18.896801 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5756d118-f614-4000-82d2-ffa1623179cd-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:19 crc kubenswrapper[4776]: I1208 09:25:19.102210 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-dl2hd" Dec 08 09:25:19 crc kubenswrapper[4776]: I1208 09:25:19.102507 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-dl2hd" event={"ID":"5756d118-f614-4000-82d2-ffa1623179cd","Type":"ContainerDied","Data":"42bd218007224e606a6431ae28b9825d4585d06eb4850f65f079508190da2edc"} Dec 08 09:25:19 crc kubenswrapper[4776]: I1208 09:25:19.102551 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42bd218007224e606a6431ae28b9825d4585d06eb4850f65f079508190da2edc" Dec 08 09:25:19 crc kubenswrapper[4776]: I1208 09:25:19.182644 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 08 09:25:19 crc kubenswrapper[4776]: E1208 09:25:19.183276 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5756d118-f614-4000-82d2-ffa1623179cd" containerName="nova-cell1-conductor-db-sync" Dec 08 09:25:19 crc kubenswrapper[4776]: I1208 09:25:19.183295 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="5756d118-f614-4000-82d2-ffa1623179cd" containerName="nova-cell1-conductor-db-sync" Dec 08 09:25:19 crc kubenswrapper[4776]: I1208 09:25:19.183577 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="5756d118-f614-4000-82d2-ffa1623179cd" containerName="nova-cell1-conductor-db-sync" Dec 08 09:25:19 crc kubenswrapper[4776]: I1208 09:25:19.184429 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 08 09:25:19 crc kubenswrapper[4776]: I1208 09:25:19.186428 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 08 09:25:19 crc kubenswrapper[4776]: I1208 09:25:19.209310 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 08 09:25:19 crc kubenswrapper[4776]: I1208 09:25:19.304857 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9791ac59-89ef-4429-b797-d89d7ce62024-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9791ac59-89ef-4429-b797-d89d7ce62024\") " pod="openstack/nova-cell1-conductor-0" Dec 08 09:25:19 crc kubenswrapper[4776]: I1208 09:25:19.305001 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64zpx\" (UniqueName: \"kubernetes.io/projected/9791ac59-89ef-4429-b797-d89d7ce62024-kube-api-access-64zpx\") pod \"nova-cell1-conductor-0\" (UID: \"9791ac59-89ef-4429-b797-d89d7ce62024\") " pod="openstack/nova-cell1-conductor-0" Dec 08 09:25:19 crc kubenswrapper[4776]: I1208 09:25:19.305063 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9791ac59-89ef-4429-b797-d89d7ce62024-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9791ac59-89ef-4429-b797-d89d7ce62024\") " pod="openstack/nova-cell1-conductor-0" Dec 08 09:25:19 crc kubenswrapper[4776]: I1208 09:25:19.406664 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9791ac59-89ef-4429-b797-d89d7ce62024-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9791ac59-89ef-4429-b797-d89d7ce62024\") " pod="openstack/nova-cell1-conductor-0" Dec 08 09:25:19 crc 
kubenswrapper[4776]: I1208 09:25:19.407283 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64zpx\" (UniqueName: \"kubernetes.io/projected/9791ac59-89ef-4429-b797-d89d7ce62024-kube-api-access-64zpx\") pod \"nova-cell1-conductor-0\" (UID: \"9791ac59-89ef-4429-b797-d89d7ce62024\") " pod="openstack/nova-cell1-conductor-0" Dec 08 09:25:19 crc kubenswrapper[4776]: I1208 09:25:19.407347 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9791ac59-89ef-4429-b797-d89d7ce62024-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9791ac59-89ef-4429-b797-d89d7ce62024\") " pod="openstack/nova-cell1-conductor-0" Dec 08 09:25:19 crc kubenswrapper[4776]: I1208 09:25:19.412400 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9791ac59-89ef-4429-b797-d89d7ce62024-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9791ac59-89ef-4429-b797-d89d7ce62024\") " pod="openstack/nova-cell1-conductor-0" Dec 08 09:25:19 crc kubenswrapper[4776]: I1208 09:25:19.412417 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9791ac59-89ef-4429-b797-d89d7ce62024-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9791ac59-89ef-4429-b797-d89d7ce62024\") " pod="openstack/nova-cell1-conductor-0" Dec 08 09:25:19 crc kubenswrapper[4776]: I1208 09:25:19.431670 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64zpx\" (UniqueName: \"kubernetes.io/projected/9791ac59-89ef-4429-b797-d89d7ce62024-kube-api-access-64zpx\") pod \"nova-cell1-conductor-0\" (UID: \"9791ac59-89ef-4429-b797-d89d7ce62024\") " pod="openstack/nova-cell1-conductor-0" Dec 08 09:25:19 crc kubenswrapper[4776]: I1208 09:25:19.515166 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 08 09:25:19 crc kubenswrapper[4776]: I1208 09:25:19.587260 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-v54bb" Dec 08 09:25:19 crc kubenswrapper[4776]: I1208 09:25:19.723099 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d88c7e2b-caa4-4d68-acc2-1483da2dfef3-scripts\") pod \"d88c7e2b-caa4-4d68-acc2-1483da2dfef3\" (UID: \"d88c7e2b-caa4-4d68-acc2-1483da2dfef3\") " Dec 08 09:25:19 crc kubenswrapper[4776]: I1208 09:25:19.723698 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d88c7e2b-caa4-4d68-acc2-1483da2dfef3-config-data\") pod \"d88c7e2b-caa4-4d68-acc2-1483da2dfef3\" (UID: \"d88c7e2b-caa4-4d68-acc2-1483da2dfef3\") " Dec 08 09:25:19 crc kubenswrapper[4776]: I1208 09:25:19.725301 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d88c7e2b-caa4-4d68-acc2-1483da2dfef3-combined-ca-bundle\") pod \"d88c7e2b-caa4-4d68-acc2-1483da2dfef3\" (UID: \"d88c7e2b-caa4-4d68-acc2-1483da2dfef3\") " Dec 08 09:25:19 crc kubenswrapper[4776]: I1208 09:25:19.725532 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxs2c\" (UniqueName: \"kubernetes.io/projected/d88c7e2b-caa4-4d68-acc2-1483da2dfef3-kube-api-access-cxs2c\") pod \"d88c7e2b-caa4-4d68-acc2-1483da2dfef3\" (UID: \"d88c7e2b-caa4-4d68-acc2-1483da2dfef3\") " Dec 08 09:25:19 crc kubenswrapper[4776]: I1208 09:25:19.728992 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d88c7e2b-caa4-4d68-acc2-1483da2dfef3-scripts" (OuterVolumeSpecName: "scripts") pod "d88c7e2b-caa4-4d68-acc2-1483da2dfef3" (UID: "d88c7e2b-caa4-4d68-acc2-1483da2dfef3"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:25:19 crc kubenswrapper[4776]: I1208 09:25:19.731887 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d88c7e2b-caa4-4d68-acc2-1483da2dfef3-kube-api-access-cxs2c" (OuterVolumeSpecName: "kube-api-access-cxs2c") pod "d88c7e2b-caa4-4d68-acc2-1483da2dfef3" (UID: "d88c7e2b-caa4-4d68-acc2-1483da2dfef3"). InnerVolumeSpecName "kube-api-access-cxs2c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:25:19 crc kubenswrapper[4776]: I1208 09:25:19.778329 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d88c7e2b-caa4-4d68-acc2-1483da2dfef3-config-data" (OuterVolumeSpecName: "config-data") pod "d88c7e2b-caa4-4d68-acc2-1483da2dfef3" (UID: "d88c7e2b-caa4-4d68-acc2-1483da2dfef3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:25:19 crc kubenswrapper[4776]: I1208 09:25:19.783827 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d88c7e2b-caa4-4d68-acc2-1483da2dfef3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d88c7e2b-caa4-4d68-acc2-1483da2dfef3" (UID: "d88c7e2b-caa4-4d68-acc2-1483da2dfef3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:25:19 crc kubenswrapper[4776]: I1208 09:25:19.829290 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d88c7e2b-caa4-4d68-acc2-1483da2dfef3-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:19 crc kubenswrapper[4776]: I1208 09:25:19.829320 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d88c7e2b-caa4-4d68-acc2-1483da2dfef3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:19 crc kubenswrapper[4776]: I1208 09:25:19.829364 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxs2c\" (UniqueName: \"kubernetes.io/projected/d88c7e2b-caa4-4d68-acc2-1483da2dfef3-kube-api-access-cxs2c\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:19 crc kubenswrapper[4776]: I1208 09:25:19.829374 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d88c7e2b-caa4-4d68-acc2-1483da2dfef3-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:20 crc kubenswrapper[4776]: I1208 09:25:20.120537 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-47z2b" event={"ID":"4f3c7425-49ed-4491-9422-4d50616e53c4","Type":"ContainerStarted","Data":"3bffbe9698f2e2ccf2cad57c49c83041fd2508e934573699704b5affd85f6027"} Dec 08 09:25:20 crc kubenswrapper[4776]: I1208 09:25:20.127063 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 08 09:25:20 crc kubenswrapper[4776]: I1208 09:25:20.129595 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-v54bb" event={"ID":"d88c7e2b-caa4-4d68-acc2-1483da2dfef3","Type":"ContainerDied","Data":"e214b7cfb399e3befb67bad9702061a23721c8411131d72dfbc07162b8f4c215"} Dec 08 09:25:20 crc kubenswrapper[4776]: I1208 09:25:20.129636 4776 pod_container_deletor.go:80] 
"Container not found in pod's containers" containerID="e214b7cfb399e3befb67bad9702061a23721c8411131d72dfbc07162b8f4c215" Dec 08 09:25:20 crc kubenswrapper[4776]: I1208 09:25:20.129701 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-v54bb" Dec 08 09:25:20 crc kubenswrapper[4776]: I1208 09:25:20.161727 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-47z2b" podStartSLOduration=2.6037501240000003 podStartE2EDuration="10.161709133s" podCreationTimestamp="2025-12-08 09:25:10 +0000 UTC" firstStartedPulling="2025-12-08 09:25:12.098378315 +0000 UTC m=+1588.361603337" lastFinishedPulling="2025-12-08 09:25:19.656337324 +0000 UTC m=+1595.919562346" observedRunningTime="2025-12-08 09:25:20.144025429 +0000 UTC m=+1596.407250451" watchObservedRunningTime="2025-12-08 09:25:20.161709133 +0000 UTC m=+1596.424934155" Dec 08 09:25:20 crc kubenswrapper[4776]: I1208 09:25:20.279295 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ljzdd"] Dec 08 09:25:20 crc kubenswrapper[4776]: I1208 09:25:20.706499 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 08 09:25:20 crc kubenswrapper[4776]: I1208 09:25:20.706913 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="75d3ac7a-8205-4462-8c9d-83029a4deeaf" containerName="nova-scheduler-scheduler" containerID="cri-o://e07d10db2d409eaf812277bc8d45b23e5dbf58cd173497c7d15fd5c3257f5361" gracePeriod=30 Dec 08 09:25:20 crc kubenswrapper[4776]: I1208 09:25:20.721277 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 08 09:25:20 crc kubenswrapper[4776]: I1208 09:25:20.721600 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5d325803-a91c-4c5a-8b77-999224ba963d" 
containerName="nova-api-log" containerID="cri-o://4463d2c51be80f98e9cb0b963df9480af26dc0ac2b01e82a99c3cddf0f3f2f53" gracePeriod=30 Dec 08 09:25:20 crc kubenswrapper[4776]: I1208 09:25:20.721652 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5d325803-a91c-4c5a-8b77-999224ba963d" containerName="nova-api-api" containerID="cri-o://94d975c0078678b47fa426c56e073c14f30298ad22a675567a8f654285452af3" gracePeriod=30 Dec 08 09:25:20 crc kubenswrapper[4776]: I1208 09:25:20.743204 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 08 09:25:20 crc kubenswrapper[4776]: I1208 09:25:20.743469 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75" containerName="nova-metadata-log" containerID="cri-o://e7da692e168add470413301a1d9a5ebea9f92b9765e5d17088e2ca307bbeb6a0" gracePeriod=30 Dec 08 09:25:20 crc kubenswrapper[4776]: I1208 09:25:20.744079 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75" containerName="nova-metadata-metadata" containerID="cri-o://566b9a12cdc4d3a523516e3eb314fd76b07b5488750d4f4487dca745285b784e" gracePeriod=30 Dec 08 09:25:21 crc kubenswrapper[4776]: I1208 09:25:21.144042 4776 generic.go:334] "Generic (PLEG): container finished" podID="5d325803-a91c-4c5a-8b77-999224ba963d" containerID="4463d2c51be80f98e9cb0b963df9480af26dc0ac2b01e82a99c3cddf0f3f2f53" exitCode=143 Dec 08 09:25:21 crc kubenswrapper[4776]: I1208 09:25:21.144329 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5d325803-a91c-4c5a-8b77-999224ba963d","Type":"ContainerDied","Data":"4463d2c51be80f98e9cb0b963df9480af26dc0ac2b01e82a99c3cddf0f3f2f53"} Dec 08 09:25:21 crc kubenswrapper[4776]: I1208 09:25:21.146155 4776 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"9791ac59-89ef-4429-b797-d89d7ce62024","Type":"ContainerStarted","Data":"1aa4b5f564fb1b354ec9dccab6965296d57fb2ef42863ae6c796aba6b8692cad"} Dec 08 09:25:21 crc kubenswrapper[4776]: I1208 09:25:21.146192 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"9791ac59-89ef-4429-b797-d89d7ce62024","Type":"ContainerStarted","Data":"04e2a7a212a07fe74425d5f83f771691b9213cd912769a710f175ffedd42c9e2"} Dec 08 09:25:21 crc kubenswrapper[4776]: I1208 09:25:21.147389 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 08 09:25:21 crc kubenswrapper[4776]: I1208 09:25:21.176463 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.17644599 podStartE2EDuration="2.17644599s" podCreationTimestamp="2025-12-08 09:25:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:25:21.170966533 +0000 UTC m=+1597.434191555" watchObservedRunningTime="2025-12-08 09:25:21.17644599 +0000 UTC m=+1597.439671012" Dec 08 09:25:21 crc kubenswrapper[4776]: I1208 09:25:21.190403 4776 generic.go:334] "Generic (PLEG): container finished" podID="1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75" containerID="566b9a12cdc4d3a523516e3eb314fd76b07b5488750d4f4487dca745285b784e" exitCode=0 Dec 08 09:25:21 crc kubenswrapper[4776]: I1208 09:25:21.190583 4776 generic.go:334] "Generic (PLEG): container finished" podID="1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75" containerID="e7da692e168add470413301a1d9a5ebea9f92b9765e5d17088e2ca307bbeb6a0" exitCode=143 Dec 08 09:25:21 crc kubenswrapper[4776]: I1208 09:25:21.190484 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75","Type":"ContainerDied","Data":"566b9a12cdc4d3a523516e3eb314fd76b07b5488750d4f4487dca745285b784e"} Dec 08 09:25:21 crc kubenswrapper[4776]: I1208 09:25:21.190789 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75","Type":"ContainerDied","Data":"e7da692e168add470413301a1d9a5ebea9f92b9765e5d17088e2ca307bbeb6a0"} Dec 08 09:25:21 crc kubenswrapper[4776]: I1208 09:25:21.193462 4776 generic.go:334] "Generic (PLEG): container finished" podID="b81998e9-151a-47c9-a3ca-c678fd6aeb96" containerID="8e5cf977a8d58e8c43e0bb1ffbc8ad1693f046d533137f1d00190524437dbd3f" exitCode=0 Dec 08 09:25:21 crc kubenswrapper[4776]: I1208 09:25:21.194977 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ljzdd" event={"ID":"b81998e9-151a-47c9-a3ca-c678fd6aeb96","Type":"ContainerDied","Data":"8e5cf977a8d58e8c43e0bb1ffbc8ad1693f046d533137f1d00190524437dbd3f"} Dec 08 09:25:21 crc kubenswrapper[4776]: I1208 09:25:21.195007 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ljzdd" event={"ID":"b81998e9-151a-47c9-a3ca-c678fd6aeb96","Type":"ContainerStarted","Data":"19605733de7421e812e6f923ae2fa50f2d06a9aba52244b36c7f6e2ffe404cfe"} Dec 08 09:25:21 crc kubenswrapper[4776]: I1208 09:25:21.526476 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 08 09:25:21 crc kubenswrapper[4776]: I1208 09:25:21.609458 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wjhx\" (UniqueName: \"kubernetes.io/projected/1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75-kube-api-access-9wjhx\") pod \"1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75\" (UID: \"1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75\") " Dec 08 09:25:21 crc kubenswrapper[4776]: I1208 09:25:21.609791 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75-logs\") pod \"1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75\" (UID: \"1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75\") " Dec 08 09:25:21 crc kubenswrapper[4776]: I1208 09:25:21.609888 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75-config-data\") pod \"1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75\" (UID: \"1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75\") " Dec 08 09:25:21 crc kubenswrapper[4776]: I1208 09:25:21.610009 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75-nova-metadata-tls-certs\") pod \"1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75\" (UID: \"1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75\") " Dec 08 09:25:21 crc kubenswrapper[4776]: I1208 09:25:21.610116 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75-combined-ca-bundle\") pod \"1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75\" (UID: \"1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75\") " Dec 08 09:25:21 crc kubenswrapper[4776]: I1208 09:25:21.612558 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75-logs" (OuterVolumeSpecName: "logs") pod "1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75" (UID: "1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:25:21 crc kubenswrapper[4776]: I1208 09:25:21.623814 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75-kube-api-access-9wjhx" (OuterVolumeSpecName: "kube-api-access-9wjhx") pod "1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75" (UID: "1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75"). InnerVolumeSpecName "kube-api-access-9wjhx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:25:21 crc kubenswrapper[4776]: I1208 09:25:21.671235 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75-config-data" (OuterVolumeSpecName: "config-data") pod "1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75" (UID: "1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:25:21 crc kubenswrapper[4776]: I1208 09:25:21.705361 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75" (UID: "1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:25:21 crc kubenswrapper[4776]: I1208 09:25:21.712546 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wjhx\" (UniqueName: \"kubernetes.io/projected/1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75-kube-api-access-9wjhx\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:21 crc kubenswrapper[4776]: I1208 09:25:21.712572 4776 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75-logs\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:21 crc kubenswrapper[4776]: I1208 09:25:21.712582 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:21 crc kubenswrapper[4776]: I1208 09:25:21.712591 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:21 crc kubenswrapper[4776]: I1208 09:25:21.784410 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75" (UID: "1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:25:21 crc kubenswrapper[4776]: I1208 09:25:21.833264 4776 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:22 crc kubenswrapper[4776]: E1208 09:25:22.026959 4776 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e07d10db2d409eaf812277bc8d45b23e5dbf58cd173497c7d15fd5c3257f5361" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 08 09:25:22 crc kubenswrapper[4776]: E1208 09:25:22.031630 4776 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e07d10db2d409eaf812277bc8d45b23e5dbf58cd173497c7d15fd5c3257f5361" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 08 09:25:22 crc kubenswrapper[4776]: E1208 09:25:22.033981 4776 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e07d10db2d409eaf812277bc8d45b23e5dbf58cd173497c7d15fd5c3257f5361" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 08 09:25:22 crc kubenswrapper[4776]: E1208 09:25:22.034020 4776 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="75d3ac7a-8205-4462-8c9d-83029a4deeaf" containerName="nova-scheduler-scheduler" Dec 08 09:25:22 crc kubenswrapper[4776]: I1208 09:25:22.234244 4776 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 08 09:25:22 crc kubenswrapper[4776]: I1208 09:25:22.234273 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75","Type":"ContainerDied","Data":"fdbbd87bfd7fc7240ba99ff7b39d57f0b78075a9df701f85146e1073a88bca77"} Dec 08 09:25:22 crc kubenswrapper[4776]: I1208 09:25:22.234334 4776 scope.go:117] "RemoveContainer" containerID="566b9a12cdc4d3a523516e3eb314fd76b07b5488750d4f4487dca745285b784e" Dec 08 09:25:22 crc kubenswrapper[4776]: I1208 09:25:22.243346 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ljzdd" event={"ID":"b81998e9-151a-47c9-a3ca-c678fd6aeb96","Type":"ContainerStarted","Data":"39ab98ee1e2eee4744337b17a85bfd33ebdf2162e187e455783208ca4f7ff31c"} Dec 08 09:25:22 crc kubenswrapper[4776]: I1208 09:25:22.287230 4776 scope.go:117] "RemoveContainer" containerID="e7da692e168add470413301a1d9a5ebea9f92b9765e5d17088e2ca307bbeb6a0" Dec 08 09:25:22 crc kubenswrapper[4776]: I1208 09:25:22.367448 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 08 09:25:22 crc kubenswrapper[4776]: I1208 09:25:22.367519 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 08 09:25:22 crc kubenswrapper[4776]: I1208 09:25:22.372516 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 08 09:25:22 crc kubenswrapper[4776]: E1208 09:25:22.372973 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d88c7e2b-caa4-4d68-acc2-1483da2dfef3" containerName="nova-manage" Dec 08 09:25:22 crc kubenswrapper[4776]: I1208 09:25:22.372993 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="d88c7e2b-caa4-4d68-acc2-1483da2dfef3" containerName="nova-manage" Dec 08 09:25:22 crc kubenswrapper[4776]: E1208 09:25:22.373019 4776 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75" containerName="nova-metadata-log" Dec 08 09:25:22 crc kubenswrapper[4776]: I1208 09:25:22.373026 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75" containerName="nova-metadata-log" Dec 08 09:25:22 crc kubenswrapper[4776]: E1208 09:25:22.373036 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75" containerName="nova-metadata-metadata" Dec 08 09:25:22 crc kubenswrapper[4776]: I1208 09:25:22.373042 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75" containerName="nova-metadata-metadata" Dec 08 09:25:22 crc kubenswrapper[4776]: I1208 09:25:22.375352 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75" containerName="nova-metadata-metadata" Dec 08 09:25:22 crc kubenswrapper[4776]: I1208 09:25:22.375385 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75" containerName="nova-metadata-log" Dec 08 09:25:22 crc kubenswrapper[4776]: I1208 09:25:22.375403 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="d88c7e2b-caa4-4d68-acc2-1483da2dfef3" containerName="nova-manage" Dec 08 09:25:22 crc kubenswrapper[4776]: I1208 09:25:22.382846 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 08 09:25:22 crc kubenswrapper[4776]: I1208 09:25:22.382974 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 08 09:25:22 crc kubenswrapper[4776]: I1208 09:25:22.385037 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 08 09:25:22 crc kubenswrapper[4776]: I1208 09:25:22.387231 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 08 09:25:22 crc kubenswrapper[4776]: I1208 09:25:22.446037 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8647023-4573-46b1-a713-c153d75d160b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f8647023-4573-46b1-a713-c153d75d160b\") " pod="openstack/nova-metadata-0" Dec 08 09:25:22 crc kubenswrapper[4776]: I1208 09:25:22.446137 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8647023-4573-46b1-a713-c153d75d160b-config-data\") pod \"nova-metadata-0\" (UID: \"f8647023-4573-46b1-a713-c153d75d160b\") " pod="openstack/nova-metadata-0" Dec 08 09:25:22 crc kubenswrapper[4776]: I1208 09:25:22.446208 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7psmc\" (UniqueName: \"kubernetes.io/projected/f8647023-4573-46b1-a713-c153d75d160b-kube-api-access-7psmc\") pod \"nova-metadata-0\" (UID: \"f8647023-4573-46b1-a713-c153d75d160b\") " pod="openstack/nova-metadata-0" Dec 08 09:25:22 crc kubenswrapper[4776]: I1208 09:25:22.446272 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8647023-4573-46b1-a713-c153d75d160b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f8647023-4573-46b1-a713-c153d75d160b\") " pod="openstack/nova-metadata-0" Dec 08 09:25:22 crc kubenswrapper[4776]: 
I1208 09:25:22.446328 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8647023-4573-46b1-a713-c153d75d160b-logs\") pod \"nova-metadata-0\" (UID: \"f8647023-4573-46b1-a713-c153d75d160b\") " pod="openstack/nova-metadata-0" Dec 08 09:25:22 crc kubenswrapper[4776]: I1208 09:25:22.547887 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8647023-4573-46b1-a713-c153d75d160b-logs\") pod \"nova-metadata-0\" (UID: \"f8647023-4573-46b1-a713-c153d75d160b\") " pod="openstack/nova-metadata-0" Dec 08 09:25:22 crc kubenswrapper[4776]: I1208 09:25:22.548038 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8647023-4573-46b1-a713-c153d75d160b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f8647023-4573-46b1-a713-c153d75d160b\") " pod="openstack/nova-metadata-0" Dec 08 09:25:22 crc kubenswrapper[4776]: I1208 09:25:22.548092 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8647023-4573-46b1-a713-c153d75d160b-config-data\") pod \"nova-metadata-0\" (UID: \"f8647023-4573-46b1-a713-c153d75d160b\") " pod="openstack/nova-metadata-0" Dec 08 09:25:22 crc kubenswrapper[4776]: I1208 09:25:22.548149 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7psmc\" (UniqueName: \"kubernetes.io/projected/f8647023-4573-46b1-a713-c153d75d160b-kube-api-access-7psmc\") pod \"nova-metadata-0\" (UID: \"f8647023-4573-46b1-a713-c153d75d160b\") " pod="openstack/nova-metadata-0" Dec 08 09:25:22 crc kubenswrapper[4776]: I1208 09:25:22.548227 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f8647023-4573-46b1-a713-c153d75d160b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f8647023-4573-46b1-a713-c153d75d160b\") " pod="openstack/nova-metadata-0" Dec 08 09:25:22 crc kubenswrapper[4776]: I1208 09:25:22.548365 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8647023-4573-46b1-a713-c153d75d160b-logs\") pod \"nova-metadata-0\" (UID: \"f8647023-4573-46b1-a713-c153d75d160b\") " pod="openstack/nova-metadata-0" Dec 08 09:25:22 crc kubenswrapper[4776]: I1208 09:25:22.555362 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8647023-4573-46b1-a713-c153d75d160b-config-data\") pod \"nova-metadata-0\" (UID: \"f8647023-4573-46b1-a713-c153d75d160b\") " pod="openstack/nova-metadata-0" Dec 08 09:25:22 crc kubenswrapper[4776]: I1208 09:25:22.556655 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8647023-4573-46b1-a713-c153d75d160b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f8647023-4573-46b1-a713-c153d75d160b\") " pod="openstack/nova-metadata-0" Dec 08 09:25:22 crc kubenswrapper[4776]: I1208 09:25:22.556783 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8647023-4573-46b1-a713-c153d75d160b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f8647023-4573-46b1-a713-c153d75d160b\") " pod="openstack/nova-metadata-0" Dec 08 09:25:22 crc kubenswrapper[4776]: I1208 09:25:22.568239 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7psmc\" (UniqueName: \"kubernetes.io/projected/f8647023-4573-46b1-a713-c153d75d160b-kube-api-access-7psmc\") pod \"nova-metadata-0\" (UID: \"f8647023-4573-46b1-a713-c153d75d160b\") " pod="openstack/nova-metadata-0" Dec 08 09:25:22 crc 
kubenswrapper[4776]: I1208 09:25:22.705270 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 08 09:25:23 crc kubenswrapper[4776]: I1208 09:25:23.319317 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 08 09:25:23 crc kubenswrapper[4776]: I1208 09:25:23.344232 4776 scope.go:117] "RemoveContainer" containerID="bd00cf9685c68dc97ea68681aad0bfeeb2d56965a763cd3cffb3c7d3789a7341" Dec 08 09:25:23 crc kubenswrapper[4776]: E1208 09:25:23.344522 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:25:23 crc kubenswrapper[4776]: W1208 09:25:23.371333 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8647023_4573_46b1_a713_c153d75d160b.slice/crio-a51ccb03dee95d3e3f412f85820fb5d8d892c2c34d42ac8b795bb96353511d1e WatchSource:0}: Error finding container a51ccb03dee95d3e3f412f85820fb5d8d892c2c34d42ac8b795bb96353511d1e: Status 404 returned error can't find the container with id a51ccb03dee95d3e3f412f85820fb5d8d892c2c34d42ac8b795bb96353511d1e Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.179151 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.186514 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75d3ac7a-8205-4462-8c9d-83029a4deeaf-combined-ca-bundle\") pod \"75d3ac7a-8205-4462-8c9d-83029a4deeaf\" (UID: \"75d3ac7a-8205-4462-8c9d-83029a4deeaf\") " Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.186610 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75d3ac7a-8205-4462-8c9d-83029a4deeaf-config-data\") pod \"75d3ac7a-8205-4462-8c9d-83029a4deeaf\" (UID: \"75d3ac7a-8205-4462-8c9d-83029a4deeaf\") " Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.186802 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-847z2\" (UniqueName: \"kubernetes.io/projected/75d3ac7a-8205-4462-8c9d-83029a4deeaf-kube-api-access-847z2\") pod \"75d3ac7a-8205-4462-8c9d-83029a4deeaf\" (UID: \"75d3ac7a-8205-4462-8c9d-83029a4deeaf\") " Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.194036 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75d3ac7a-8205-4462-8c9d-83029a4deeaf-kube-api-access-847z2" (OuterVolumeSpecName: "kube-api-access-847z2") pod "75d3ac7a-8205-4462-8c9d-83029a4deeaf" (UID: "75d3ac7a-8205-4462-8c9d-83029a4deeaf"). InnerVolumeSpecName "kube-api-access-847z2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.242007 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75d3ac7a-8205-4462-8c9d-83029a4deeaf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "75d3ac7a-8205-4462-8c9d-83029a4deeaf" (UID: "75d3ac7a-8205-4462-8c9d-83029a4deeaf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.244378 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75d3ac7a-8205-4462-8c9d-83029a4deeaf-config-data" (OuterVolumeSpecName: "config-data") pod "75d3ac7a-8205-4462-8c9d-83029a4deeaf" (UID: "75d3ac7a-8205-4462-8c9d-83029a4deeaf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.269155 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f8647023-4573-46b1-a713-c153d75d160b","Type":"ContainerStarted","Data":"2be87160fc4bd4b7e818817c80fc6b49383cb19dc8120ad2c77d39aa32b0bea8"} Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.269270 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f8647023-4573-46b1-a713-c153d75d160b","Type":"ContainerStarted","Data":"a51ccb03dee95d3e3f412f85820fb5d8d892c2c34d42ac8b795bb96353511d1e"} Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.274350 4776 generic.go:334] "Generic (PLEG): container finished" podID="5d325803-a91c-4c5a-8b77-999224ba963d" containerID="94d975c0078678b47fa426c56e073c14f30298ad22a675567a8f654285452af3" exitCode=0 Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.274472 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5d325803-a91c-4c5a-8b77-999224ba963d","Type":"ContainerDied","Data":"94d975c0078678b47fa426c56e073c14f30298ad22a675567a8f654285452af3"} Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.277155 4776 generic.go:334] "Generic (PLEG): container finished" podID="4f3c7425-49ed-4491-9422-4d50616e53c4" containerID="3bffbe9698f2e2ccf2cad57c49c83041fd2508e934573699704b5affd85f6027" exitCode=0 Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.277300 4776 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/aodh-db-sync-47z2b" event={"ID":"4f3c7425-49ed-4491-9422-4d50616e53c4","Type":"ContainerDied","Data":"3bffbe9698f2e2ccf2cad57c49c83041fd2508e934573699704b5affd85f6027"} Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.279060 4776 generic.go:334] "Generic (PLEG): container finished" podID="75d3ac7a-8205-4462-8c9d-83029a4deeaf" containerID="e07d10db2d409eaf812277bc8d45b23e5dbf58cd173497c7d15fd5c3257f5361" exitCode=0 Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.279131 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"75d3ac7a-8205-4462-8c9d-83029a4deeaf","Type":"ContainerDied","Data":"e07d10db2d409eaf812277bc8d45b23e5dbf58cd173497c7d15fd5c3257f5361"} Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.279164 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"75d3ac7a-8205-4462-8c9d-83029a4deeaf","Type":"ContainerDied","Data":"64fd4ac1e5d1723769f232dd3ae513f476467080638c8c73c84af9c6ad013a9a"} Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.279204 4776 scope.go:117] "RemoveContainer" containerID="e07d10db2d409eaf812277bc8d45b23e5dbf58cd173497c7d15fd5c3257f5361" Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.279329 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.288317 4776 generic.go:334] "Generic (PLEG): container finished" podID="b81998e9-151a-47c9-a3ca-c678fd6aeb96" containerID="39ab98ee1e2eee4744337b17a85bfd33ebdf2162e187e455783208ca4f7ff31c" exitCode=0 Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.288361 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ljzdd" event={"ID":"b81998e9-151a-47c9-a3ca-c678fd6aeb96","Type":"ContainerDied","Data":"39ab98ee1e2eee4744337b17a85bfd33ebdf2162e187e455783208ca4f7ff31c"} Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.289705 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75d3ac7a-8205-4462-8c9d-83029a4deeaf-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.289721 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-847z2\" (UniqueName: \"kubernetes.io/projected/75d3ac7a-8205-4462-8c9d-83029a4deeaf-kube-api-access-847z2\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.289731 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75d3ac7a-8205-4462-8c9d-83029a4deeaf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.305971 4776 scope.go:117] "RemoveContainer" containerID="e07d10db2d409eaf812277bc8d45b23e5dbf58cd173497c7d15fd5c3257f5361" Dec 08 09:25:24 crc kubenswrapper[4776]: E1208 09:25:24.306482 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e07d10db2d409eaf812277bc8d45b23e5dbf58cd173497c7d15fd5c3257f5361\": container with ID starting with e07d10db2d409eaf812277bc8d45b23e5dbf58cd173497c7d15fd5c3257f5361 not found: ID 
does not exist" containerID="e07d10db2d409eaf812277bc8d45b23e5dbf58cd173497c7d15fd5c3257f5361" Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.306539 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e07d10db2d409eaf812277bc8d45b23e5dbf58cd173497c7d15fd5c3257f5361"} err="failed to get container status \"e07d10db2d409eaf812277bc8d45b23e5dbf58cd173497c7d15fd5c3257f5361\": rpc error: code = NotFound desc = could not find container \"e07d10db2d409eaf812277bc8d45b23e5dbf58cd173497c7d15fd5c3257f5361\": container with ID starting with e07d10db2d409eaf812277bc8d45b23e5dbf58cd173497c7d15fd5c3257f5361 not found: ID does not exist" Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.347619 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.370971 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75" path="/var/lib/kubelet/pods/1b63b1c4-ffcc-4c41-98a2-4ac72a9b4b75/volumes" Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.383732 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.402319 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.418360 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 08 09:25:24 crc kubenswrapper[4776]: E1208 09:25:24.419471 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75d3ac7a-8205-4462-8c9d-83029a4deeaf" containerName="nova-scheduler-scheduler" Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.419494 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="75d3ac7a-8205-4462-8c9d-83029a4deeaf" containerName="nova-scheduler-scheduler" Dec 08 09:25:24 crc 
kubenswrapper[4776]: E1208 09:25:24.419522 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d325803-a91c-4c5a-8b77-999224ba963d" containerName="nova-api-api" Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.419531 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d325803-a91c-4c5a-8b77-999224ba963d" containerName="nova-api-api" Dec 08 09:25:24 crc kubenswrapper[4776]: E1208 09:25:24.419562 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d325803-a91c-4c5a-8b77-999224ba963d" containerName="nova-api-log" Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.419569 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d325803-a91c-4c5a-8b77-999224ba963d" containerName="nova-api-log" Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.419795 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="75d3ac7a-8205-4462-8c9d-83029a4deeaf" containerName="nova-scheduler-scheduler" Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.419820 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d325803-a91c-4c5a-8b77-999224ba963d" containerName="nova-api-api" Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.419851 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d325803-a91c-4c5a-8b77-999224ba963d" containerName="nova-api-log" Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.420727 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.422942 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.452187 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.493318 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d325803-a91c-4c5a-8b77-999224ba963d-config-data\") pod \"5d325803-a91c-4c5a-8b77-999224ba963d\" (UID: \"5d325803-a91c-4c5a-8b77-999224ba963d\") " Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.493475 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d325803-a91c-4c5a-8b77-999224ba963d-logs\") pod \"5d325803-a91c-4c5a-8b77-999224ba963d\" (UID: \"5d325803-a91c-4c5a-8b77-999224ba963d\") " Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.493707 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d325803-a91c-4c5a-8b77-999224ba963d-combined-ca-bundle\") pod \"5d325803-a91c-4c5a-8b77-999224ba963d\" (UID: \"5d325803-a91c-4c5a-8b77-999224ba963d\") " Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.493770 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxn7t\" (UniqueName: \"kubernetes.io/projected/5d325803-a91c-4c5a-8b77-999224ba963d-kube-api-access-wxn7t\") pod \"5d325803-a91c-4c5a-8b77-999224ba963d\" (UID: \"5d325803-a91c-4c5a-8b77-999224ba963d\") " Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.494247 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d325803-a91c-4c5a-8b77-999224ba963d-logs" 
(OuterVolumeSpecName: "logs") pod "5d325803-a91c-4c5a-8b77-999224ba963d" (UID: "5d325803-a91c-4c5a-8b77-999224ba963d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.494556 4776 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d325803-a91c-4c5a-8b77-999224ba963d-logs\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.498682 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d325803-a91c-4c5a-8b77-999224ba963d-kube-api-access-wxn7t" (OuterVolumeSpecName: "kube-api-access-wxn7t") pod "5d325803-a91c-4c5a-8b77-999224ba963d" (UID: "5d325803-a91c-4c5a-8b77-999224ba963d"). InnerVolumeSpecName "kube-api-access-wxn7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.530345 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d325803-a91c-4c5a-8b77-999224ba963d-config-data" (OuterVolumeSpecName: "config-data") pod "5d325803-a91c-4c5a-8b77-999224ba963d" (UID: "5d325803-a91c-4c5a-8b77-999224ba963d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.533390 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d325803-a91c-4c5a-8b77-999224ba963d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d325803-a91c-4c5a-8b77-999224ba963d" (UID: "5d325803-a91c-4c5a-8b77-999224ba963d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.595915 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff46e335-2c8f-4011-85ac-de45611f8e45-config-data\") pod \"nova-scheduler-0\" (UID: \"ff46e335-2c8f-4011-85ac-de45611f8e45\") " pod="openstack/nova-scheduler-0" Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.596395 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff46e335-2c8f-4011-85ac-de45611f8e45-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ff46e335-2c8f-4011-85ac-de45611f8e45\") " pod="openstack/nova-scheduler-0" Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.596513 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49bqs\" (UniqueName: \"kubernetes.io/projected/ff46e335-2c8f-4011-85ac-de45611f8e45-kube-api-access-49bqs\") pod \"nova-scheduler-0\" (UID: \"ff46e335-2c8f-4011-85ac-de45611f8e45\") " pod="openstack/nova-scheduler-0" Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.596813 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d325803-a91c-4c5a-8b77-999224ba963d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.596831 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxn7t\" (UniqueName: \"kubernetes.io/projected/5d325803-a91c-4c5a-8b77-999224ba963d-kube-api-access-wxn7t\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.596842 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d325803-a91c-4c5a-8b77-999224ba963d-config-data\") on 
node \"crc\" DevicePath \"\"" Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.699031 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff46e335-2c8f-4011-85ac-de45611f8e45-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ff46e335-2c8f-4011-85ac-de45611f8e45\") " pod="openstack/nova-scheduler-0" Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.699094 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49bqs\" (UniqueName: \"kubernetes.io/projected/ff46e335-2c8f-4011-85ac-de45611f8e45-kube-api-access-49bqs\") pod \"nova-scheduler-0\" (UID: \"ff46e335-2c8f-4011-85ac-de45611f8e45\") " pod="openstack/nova-scheduler-0" Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.699185 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff46e335-2c8f-4011-85ac-de45611f8e45-config-data\") pod \"nova-scheduler-0\" (UID: \"ff46e335-2c8f-4011-85ac-de45611f8e45\") " pod="openstack/nova-scheduler-0" Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.702944 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff46e335-2c8f-4011-85ac-de45611f8e45-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ff46e335-2c8f-4011-85ac-de45611f8e45\") " pod="openstack/nova-scheduler-0" Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.702957 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff46e335-2c8f-4011-85ac-de45611f8e45-config-data\") pod \"nova-scheduler-0\" (UID: \"ff46e335-2c8f-4011-85ac-de45611f8e45\") " pod="openstack/nova-scheduler-0" Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.714980 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-49bqs\" (UniqueName: \"kubernetes.io/projected/ff46e335-2c8f-4011-85ac-de45611f8e45-kube-api-access-49bqs\") pod \"nova-scheduler-0\" (UID: \"ff46e335-2c8f-4011-85ac-de45611f8e45\") " pod="openstack/nova-scheduler-0" Dec 08 09:25:24 crc kubenswrapper[4776]: I1208 09:25:24.751355 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 08 09:25:25 crc kubenswrapper[4776]: I1208 09:25:25.255661 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 08 09:25:25 crc kubenswrapper[4776]: W1208 09:25:25.268559 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff46e335_2c8f_4011_85ac_de45611f8e45.slice/crio-e51b3546b58f53c9203b61c2ae4ad70e973d001366e664d69fa4752ba7f8148b WatchSource:0}: Error finding container e51b3546b58f53c9203b61c2ae4ad70e973d001366e664d69fa4752ba7f8148b: Status 404 returned error can't find the container with id e51b3546b58f53c9203b61c2ae4ad70e973d001366e664d69fa4752ba7f8148b Dec 08 09:25:25 crc kubenswrapper[4776]: I1208 09:25:25.315469 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ff46e335-2c8f-4011-85ac-de45611f8e45","Type":"ContainerStarted","Data":"e51b3546b58f53c9203b61c2ae4ad70e973d001366e664d69fa4752ba7f8148b"} Dec 08 09:25:25 crc kubenswrapper[4776]: I1208 09:25:25.324862 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ljzdd" event={"ID":"b81998e9-151a-47c9-a3ca-c678fd6aeb96","Type":"ContainerStarted","Data":"0321213e78ab1c8df5a3e5038b0cff028cfd60ddc71edf9cd59cfa3e6aaeda93"} Dec 08 09:25:25 crc kubenswrapper[4776]: I1208 09:25:25.328270 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"f8647023-4573-46b1-a713-c153d75d160b","Type":"ContainerStarted","Data":"79f3a895f3e200395ad04fd83a6567b267b9c69b94cebcb3229f29398657dd93"} Dec 08 09:25:25 crc kubenswrapper[4776]: I1208 09:25:25.334671 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 08 09:25:25 crc kubenswrapper[4776]: I1208 09:25:25.334725 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5d325803-a91c-4c5a-8b77-999224ba963d","Type":"ContainerDied","Data":"4a67a4350698ef1fd94d9fb7c93e5e36743ddf81df96a7c150c5b4cc23139efb"} Dec 08 09:25:25 crc kubenswrapper[4776]: I1208 09:25:25.334789 4776 scope.go:117] "RemoveContainer" containerID="94d975c0078678b47fa426c56e073c14f30298ad22a675567a8f654285452af3" Dec 08 09:25:25 crc kubenswrapper[4776]: I1208 09:25:25.353828 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ljzdd" podStartSLOduration=4.8197488 podStartE2EDuration="8.353806594s" podCreationTimestamp="2025-12-08 09:25:17 +0000 UTC" firstStartedPulling="2025-12-08 09:25:21.202944261 +0000 UTC m=+1597.466169283" lastFinishedPulling="2025-12-08 09:25:24.737002055 +0000 UTC m=+1601.000227077" observedRunningTime="2025-12-08 09:25:25.338583825 +0000 UTC m=+1601.601808847" watchObservedRunningTime="2025-12-08 09:25:25.353806594 +0000 UTC m=+1601.617031616" Dec 08 09:25:25 crc kubenswrapper[4776]: I1208 09:25:25.378565 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.378550689 podStartE2EDuration="3.378550689s" podCreationTimestamp="2025-12-08 09:25:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:25:25.374566581 +0000 UTC m=+1601.637791603" watchObservedRunningTime="2025-12-08 09:25:25.378550689 +0000 UTC m=+1601.641775711" Dec 08 
09:25:25 crc kubenswrapper[4776]: I1208 09:25:25.448686 4776 scope.go:117] "RemoveContainer" containerID="4463d2c51be80f98e9cb0b963df9480af26dc0ac2b01e82a99c3cddf0f3f2f53" Dec 08 09:25:25 crc kubenswrapper[4776]: I1208 09:25:25.490447 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 08 09:25:25 crc kubenswrapper[4776]: I1208 09:25:25.510657 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 08 09:25:25 crc kubenswrapper[4776]: I1208 09:25:25.525469 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 08 09:25:25 crc kubenswrapper[4776]: I1208 09:25:25.527821 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 08 09:25:25 crc kubenswrapper[4776]: I1208 09:25:25.531701 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 08 09:25:25 crc kubenswrapper[4776]: I1208 09:25:25.536668 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 08 09:25:25 crc kubenswrapper[4776]: I1208 09:25:25.730479 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdj8f\" (UniqueName: \"kubernetes.io/projected/2fe2799e-2d16-4569-874b-a0066d38087b-kube-api-access-pdj8f\") pod \"nova-api-0\" (UID: \"2fe2799e-2d16-4569-874b-a0066d38087b\") " pod="openstack/nova-api-0" Dec 08 09:25:25 crc kubenswrapper[4776]: I1208 09:25:25.730535 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fe2799e-2d16-4569-874b-a0066d38087b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2fe2799e-2d16-4569-874b-a0066d38087b\") " pod="openstack/nova-api-0" Dec 08 09:25:25 crc kubenswrapper[4776]: I1208 09:25:25.730759 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fe2799e-2d16-4569-874b-a0066d38087b-config-data\") pod \"nova-api-0\" (UID: \"2fe2799e-2d16-4569-874b-a0066d38087b\") " pod="openstack/nova-api-0" Dec 08 09:25:25 crc kubenswrapper[4776]: I1208 09:25:25.730806 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fe2799e-2d16-4569-874b-a0066d38087b-logs\") pod \"nova-api-0\" (UID: \"2fe2799e-2d16-4569-874b-a0066d38087b\") " pod="openstack/nova-api-0" Dec 08 09:25:25 crc kubenswrapper[4776]: I1208 09:25:25.832474 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fe2799e-2d16-4569-874b-a0066d38087b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2fe2799e-2d16-4569-874b-a0066d38087b\") " pod="openstack/nova-api-0" Dec 08 09:25:25 crc kubenswrapper[4776]: I1208 09:25:25.832658 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fe2799e-2d16-4569-874b-a0066d38087b-config-data\") pod \"nova-api-0\" (UID: \"2fe2799e-2d16-4569-874b-a0066d38087b\") " pod="openstack/nova-api-0" Dec 08 09:25:25 crc kubenswrapper[4776]: I1208 09:25:25.832701 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fe2799e-2d16-4569-874b-a0066d38087b-logs\") pod \"nova-api-0\" (UID: \"2fe2799e-2d16-4569-874b-a0066d38087b\") " pod="openstack/nova-api-0" Dec 08 09:25:25 crc kubenswrapper[4776]: I1208 09:25:25.832739 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdj8f\" (UniqueName: \"kubernetes.io/projected/2fe2799e-2d16-4569-874b-a0066d38087b-kube-api-access-pdj8f\") pod \"nova-api-0\" (UID: \"2fe2799e-2d16-4569-874b-a0066d38087b\") " pod="openstack/nova-api-0" Dec 08 09:25:25 crc 
kubenswrapper[4776]: I1208 09:25:25.835431 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fe2799e-2d16-4569-874b-a0066d38087b-logs\") pod \"nova-api-0\" (UID: \"2fe2799e-2d16-4569-874b-a0066d38087b\") " pod="openstack/nova-api-0" Dec 08 09:25:25 crc kubenswrapper[4776]: I1208 09:25:25.842389 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fe2799e-2d16-4569-874b-a0066d38087b-config-data\") pod \"nova-api-0\" (UID: \"2fe2799e-2d16-4569-874b-a0066d38087b\") " pod="openstack/nova-api-0" Dec 08 09:25:25 crc kubenswrapper[4776]: I1208 09:25:25.842813 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fe2799e-2d16-4569-874b-a0066d38087b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2fe2799e-2d16-4569-874b-a0066d38087b\") " pod="openstack/nova-api-0" Dec 08 09:25:25 crc kubenswrapper[4776]: I1208 09:25:25.885726 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdj8f\" (UniqueName: \"kubernetes.io/projected/2fe2799e-2d16-4569-874b-a0066d38087b-kube-api-access-pdj8f\") pod \"nova-api-0\" (UID: \"2fe2799e-2d16-4569-874b-a0066d38087b\") " pod="openstack/nova-api-0" Dec 08 09:25:25 crc kubenswrapper[4776]: I1208 09:25:25.977700 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-47z2b" Dec 08 09:25:26 crc kubenswrapper[4776]: I1208 09:25:26.139863 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsjqx\" (UniqueName: \"kubernetes.io/projected/4f3c7425-49ed-4491-9422-4d50616e53c4-kube-api-access-rsjqx\") pod \"4f3c7425-49ed-4491-9422-4d50616e53c4\" (UID: \"4f3c7425-49ed-4491-9422-4d50616e53c4\") " Dec 08 09:25:26 crc kubenswrapper[4776]: I1208 09:25:26.140355 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f3c7425-49ed-4491-9422-4d50616e53c4-combined-ca-bundle\") pod \"4f3c7425-49ed-4491-9422-4d50616e53c4\" (UID: \"4f3c7425-49ed-4491-9422-4d50616e53c4\") " Dec 08 09:25:26 crc kubenswrapper[4776]: I1208 09:25:26.140458 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f3c7425-49ed-4491-9422-4d50616e53c4-config-data\") pod \"4f3c7425-49ed-4491-9422-4d50616e53c4\" (UID: \"4f3c7425-49ed-4491-9422-4d50616e53c4\") " Dec 08 09:25:26 crc kubenswrapper[4776]: I1208 09:25:26.140567 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f3c7425-49ed-4491-9422-4d50616e53c4-scripts\") pod \"4f3c7425-49ed-4491-9422-4d50616e53c4\" (UID: \"4f3c7425-49ed-4491-9422-4d50616e53c4\") " Dec 08 09:25:26 crc kubenswrapper[4776]: I1208 09:25:26.154361 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f3c7425-49ed-4491-9422-4d50616e53c4-kube-api-access-rsjqx" (OuterVolumeSpecName: "kube-api-access-rsjqx") pod "4f3c7425-49ed-4491-9422-4d50616e53c4" (UID: "4f3c7425-49ed-4491-9422-4d50616e53c4"). InnerVolumeSpecName "kube-api-access-rsjqx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:25:26 crc kubenswrapper[4776]: I1208 09:25:26.154449 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f3c7425-49ed-4491-9422-4d50616e53c4-scripts" (OuterVolumeSpecName: "scripts") pod "4f3c7425-49ed-4491-9422-4d50616e53c4" (UID: "4f3c7425-49ed-4491-9422-4d50616e53c4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:25:26 crc kubenswrapper[4776]: I1208 09:25:26.155986 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 08 09:25:26 crc kubenswrapper[4776]: I1208 09:25:26.174256 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f3c7425-49ed-4491-9422-4d50616e53c4-config-data" (OuterVolumeSpecName: "config-data") pod "4f3c7425-49ed-4491-9422-4d50616e53c4" (UID: "4f3c7425-49ed-4491-9422-4d50616e53c4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:25:26 crc kubenswrapper[4776]: I1208 09:25:26.199275 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f3c7425-49ed-4491-9422-4d50616e53c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f3c7425-49ed-4491-9422-4d50616e53c4" (UID: "4f3c7425-49ed-4491-9422-4d50616e53c4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:25:26 crc kubenswrapper[4776]: I1208 09:25:26.243751 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f3c7425-49ed-4491-9422-4d50616e53c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:26 crc kubenswrapper[4776]: I1208 09:25:26.243868 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f3c7425-49ed-4491-9422-4d50616e53c4-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:26 crc kubenswrapper[4776]: I1208 09:25:26.243879 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f3c7425-49ed-4491-9422-4d50616e53c4-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:26 crc kubenswrapper[4776]: I1208 09:25:26.243889 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsjqx\" (UniqueName: \"kubernetes.io/projected/4f3c7425-49ed-4491-9422-4d50616e53c4-kube-api-access-rsjqx\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:26 crc kubenswrapper[4776]: I1208 09:25:26.354004 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-47z2b" Dec 08 09:25:26 crc kubenswrapper[4776]: I1208 09:25:26.361163 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d325803-a91c-4c5a-8b77-999224ba963d" path="/var/lib/kubelet/pods/5d325803-a91c-4c5a-8b77-999224ba963d/volumes" Dec 08 09:25:26 crc kubenswrapper[4776]: I1208 09:25:26.362660 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75d3ac7a-8205-4462-8c9d-83029a4deeaf" path="/var/lib/kubelet/pods/75d3ac7a-8205-4462-8c9d-83029a4deeaf/volumes" Dec 08 09:25:26 crc kubenswrapper[4776]: I1208 09:25:26.363518 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-47z2b" event={"ID":"4f3c7425-49ed-4491-9422-4d50616e53c4","Type":"ContainerDied","Data":"884c699486a504319368043a44e5ff9b3a557bb14dc273be9e475f68e6a933fa"} Dec 08 09:25:26 crc kubenswrapper[4776]: I1208 09:25:26.363547 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="884c699486a504319368043a44e5ff9b3a557bb14dc273be9e475f68e6a933fa" Dec 08 09:25:26 crc kubenswrapper[4776]: I1208 09:25:26.363706 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ff46e335-2c8f-4011-85ac-de45611f8e45","Type":"ContainerStarted","Data":"b65eae25ec32cadfd6e7546c1297efa384bb5d0626ffa9c5dcb1b9ebe1dcb88d"} Dec 08 09:25:26 crc kubenswrapper[4776]: I1208 09:25:26.391408 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.391386924 podStartE2EDuration="2.391386924s" podCreationTimestamp="2025-12-08 09:25:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:25:26.379896346 +0000 UTC m=+1602.643121388" watchObservedRunningTime="2025-12-08 09:25:26.391386924 +0000 UTC m=+1602.654611946" Dec 08 09:25:26 crc kubenswrapper[4776]: I1208 
09:25:26.666582 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 08 09:25:26 crc kubenswrapper[4776]: W1208 09:25:26.676666 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fe2799e_2d16_4569_874b_a0066d38087b.slice/crio-e3f5c2b4235a2950e0e59795cc7b8912b807d9f4fd5a548962dea30abd744634 WatchSource:0}: Error finding container e3f5c2b4235a2950e0e59795cc7b8912b807d9f4fd5a548962dea30abd744634: Status 404 returned error can't find the container with id e3f5c2b4235a2950e0e59795cc7b8912b807d9f4fd5a548962dea30abd744634 Dec 08 09:25:27 crc kubenswrapper[4776]: I1208 09:25:27.391326 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2fe2799e-2d16-4569-874b-a0066d38087b","Type":"ContainerStarted","Data":"3d0078c422c14eb90a83825f183764ed8c85b60f82f3c7def9c1d237c12d5859"} Dec 08 09:25:27 crc kubenswrapper[4776]: I1208 09:25:27.392496 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2fe2799e-2d16-4569-874b-a0066d38087b","Type":"ContainerStarted","Data":"2e51c1ea01b38323d942e9c96fc1a6ccdffedfd437e17ebbca279d55f9072135"} Dec 08 09:25:27 crc kubenswrapper[4776]: I1208 09:25:27.392594 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2fe2799e-2d16-4569-874b-a0066d38087b","Type":"ContainerStarted","Data":"e3f5c2b4235a2950e0e59795cc7b8912b807d9f4fd5a548962dea30abd744634"} Dec 08 09:25:27 crc kubenswrapper[4776]: I1208 09:25:27.411257 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ljzdd" Dec 08 09:25:27 crc kubenswrapper[4776]: I1208 09:25:27.412633 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ljzdd" Dec 08 09:25:27 crc kubenswrapper[4776]: I1208 09:25:27.414217 4776 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.414199587 podStartE2EDuration="2.414199587s" podCreationTimestamp="2025-12-08 09:25:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:25:27.407033515 +0000 UTC m=+1603.670258537" watchObservedRunningTime="2025-12-08 09:25:27.414199587 +0000 UTC m=+1603.677424609" Dec 08 09:25:27 crc kubenswrapper[4776]: I1208 09:25:27.463316 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ljzdd" Dec 08 09:25:27 crc kubenswrapper[4776]: I1208 09:25:27.706261 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 08 09:25:27 crc kubenswrapper[4776]: I1208 09:25:27.706367 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 08 09:25:29 crc kubenswrapper[4776]: I1208 09:25:29.681402 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 08 09:25:29 crc kubenswrapper[4776]: I1208 09:25:29.751883 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 08 09:25:29 crc kubenswrapper[4776]: I1208 09:25:29.835795 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Dec 08 09:25:29 crc kubenswrapper[4776]: E1208 09:25:29.836638 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f3c7425-49ed-4491-9422-4d50616e53c4" containerName="aodh-db-sync" Dec 08 09:25:29 crc kubenswrapper[4776]: I1208 09:25:29.836660 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f3c7425-49ed-4491-9422-4d50616e53c4" containerName="aodh-db-sync" Dec 08 09:25:29 crc kubenswrapper[4776]: I1208 09:25:29.836928 4776 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4f3c7425-49ed-4491-9422-4d50616e53c4" containerName="aodh-db-sync" Dec 08 09:25:29 crc kubenswrapper[4776]: I1208 09:25:29.840437 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 08 09:25:29 crc kubenswrapper[4776]: I1208 09:25:29.847658 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-rtn2h" Dec 08 09:25:29 crc kubenswrapper[4776]: I1208 09:25:29.847738 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 08 09:25:29 crc kubenswrapper[4776]: I1208 09:25:29.847897 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 08 09:25:29 crc kubenswrapper[4776]: I1208 09:25:29.880551 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 08 09:25:29 crc kubenswrapper[4776]: I1208 09:25:29.944996 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnsl9\" (UniqueName: \"kubernetes.io/projected/18e62c81-f058-4044-9bb3-e64e5892a4e6-kube-api-access-gnsl9\") pod \"aodh-0\" (UID: \"18e62c81-f058-4044-9bb3-e64e5892a4e6\") " pod="openstack/aodh-0" Dec 08 09:25:29 crc kubenswrapper[4776]: I1208 09:25:29.945445 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18e62c81-f058-4044-9bb3-e64e5892a4e6-config-data\") pod \"aodh-0\" (UID: \"18e62c81-f058-4044-9bb3-e64e5892a4e6\") " pod="openstack/aodh-0" Dec 08 09:25:29 crc kubenswrapper[4776]: I1208 09:25:29.945571 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18e62c81-f058-4044-9bb3-e64e5892a4e6-scripts\") pod \"aodh-0\" (UID: \"18e62c81-f058-4044-9bb3-e64e5892a4e6\") " pod="openstack/aodh-0" Dec 08 09:25:29 crc kubenswrapper[4776]: 
I1208 09:25:29.945594 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18e62c81-f058-4044-9bb3-e64e5892a4e6-combined-ca-bundle\") pod \"aodh-0\" (UID: \"18e62c81-f058-4044-9bb3-e64e5892a4e6\") " pod="openstack/aodh-0" Dec 08 09:25:30 crc kubenswrapper[4776]: I1208 09:25:30.048143 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnsl9\" (UniqueName: \"kubernetes.io/projected/18e62c81-f058-4044-9bb3-e64e5892a4e6-kube-api-access-gnsl9\") pod \"aodh-0\" (UID: \"18e62c81-f058-4044-9bb3-e64e5892a4e6\") " pod="openstack/aodh-0" Dec 08 09:25:30 crc kubenswrapper[4776]: I1208 09:25:30.048243 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18e62c81-f058-4044-9bb3-e64e5892a4e6-config-data\") pod \"aodh-0\" (UID: \"18e62c81-f058-4044-9bb3-e64e5892a4e6\") " pod="openstack/aodh-0" Dec 08 09:25:30 crc kubenswrapper[4776]: I1208 09:25:30.048291 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18e62c81-f058-4044-9bb3-e64e5892a4e6-scripts\") pod \"aodh-0\" (UID: \"18e62c81-f058-4044-9bb3-e64e5892a4e6\") " pod="openstack/aodh-0" Dec 08 09:25:30 crc kubenswrapper[4776]: I1208 09:25:30.048314 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18e62c81-f058-4044-9bb3-e64e5892a4e6-combined-ca-bundle\") pod \"aodh-0\" (UID: \"18e62c81-f058-4044-9bb3-e64e5892a4e6\") " pod="openstack/aodh-0" Dec 08 09:25:30 crc kubenswrapper[4776]: I1208 09:25:30.055459 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18e62c81-f058-4044-9bb3-e64e5892a4e6-combined-ca-bundle\") pod \"aodh-0\" (UID: 
\"18e62c81-f058-4044-9bb3-e64e5892a4e6\") " pod="openstack/aodh-0" Dec 08 09:25:30 crc kubenswrapper[4776]: I1208 09:25:30.057998 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18e62c81-f058-4044-9bb3-e64e5892a4e6-config-data\") pod \"aodh-0\" (UID: \"18e62c81-f058-4044-9bb3-e64e5892a4e6\") " pod="openstack/aodh-0" Dec 08 09:25:30 crc kubenswrapper[4776]: I1208 09:25:30.075274 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18e62c81-f058-4044-9bb3-e64e5892a4e6-scripts\") pod \"aodh-0\" (UID: \"18e62c81-f058-4044-9bb3-e64e5892a4e6\") " pod="openstack/aodh-0" Dec 08 09:25:30 crc kubenswrapper[4776]: I1208 09:25:30.075694 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnsl9\" (UniqueName: \"kubernetes.io/projected/18e62c81-f058-4044-9bb3-e64e5892a4e6-kube-api-access-gnsl9\") pod \"aodh-0\" (UID: \"18e62c81-f058-4044-9bb3-e64e5892a4e6\") " pod="openstack/aodh-0" Dec 08 09:25:30 crc kubenswrapper[4776]: I1208 09:25:30.191815 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 08 09:25:30 crc kubenswrapper[4776]: I1208 09:25:30.818791 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 08 09:25:30 crc kubenswrapper[4776]: W1208 09:25:30.825700 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18e62c81_f058_4044_9bb3_e64e5892a4e6.slice/crio-43f9ca7ae59440bd2eb87b148dac60c8753172b6abb7a34c331734c36b675dea WatchSource:0}: Error finding container 43f9ca7ae59440bd2eb87b148dac60c8753172b6abb7a34c331734c36b675dea: Status 404 returned error can't find the container with id 43f9ca7ae59440bd2eb87b148dac60c8753172b6abb7a34c331734c36b675dea Dec 08 09:25:31 crc kubenswrapper[4776]: I1208 09:25:31.484118 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"18e62c81-f058-4044-9bb3-e64e5892a4e6","Type":"ContainerStarted","Data":"43f9ca7ae59440bd2eb87b148dac60c8753172b6abb7a34c331734c36b675dea"} Dec 08 09:25:32 crc kubenswrapper[4776]: I1208 09:25:32.503875 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"18e62c81-f058-4044-9bb3-e64e5892a4e6","Type":"ContainerStarted","Data":"64b3b939b07a3a8fe0eb2b6fbefd5433d8dc0237a675fa60a0b7dcd76faeaa5a"} Dec 08 09:25:32 crc kubenswrapper[4776]: I1208 09:25:32.570407 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 08 09:25:32 crc kubenswrapper[4776]: I1208 09:25:32.570676 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eef52d5d-1c6a-4586-aa17-8c8253a53262" containerName="ceilometer-central-agent" containerID="cri-o://8aa882221fda78196665be8e0570afff1ed74786c32425fea6e82af36b1f67fa" gracePeriod=30 Dec 08 09:25:32 crc kubenswrapper[4776]: I1208 09:25:32.570738 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="eef52d5d-1c6a-4586-aa17-8c8253a53262" containerName="sg-core" containerID="cri-o://646836525a446b068e3151718ebebfcc3c3efd7400b3f77a14ed0d387525e01e" gracePeriod=30 Dec 08 09:25:32 crc kubenswrapper[4776]: I1208 09:25:32.570762 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eef52d5d-1c6a-4586-aa17-8c8253a53262" containerName="ceilometer-notification-agent" containerID="cri-o://6c62245ebb8c72b32f93d1b13b07b9058dee4c2d948da5e63277902020064833" gracePeriod=30 Dec 08 09:25:32 crc kubenswrapper[4776]: I1208 09:25:32.570794 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eef52d5d-1c6a-4586-aa17-8c8253a53262" containerName="proxy-httpd" containerID="cri-o://3f5694ecb1a6ccaa0fa96e50aa464c4d6b30fd55ace8cb713e383d0037d6c549" gracePeriod=30 Dec 08 09:25:32 crc kubenswrapper[4776]: I1208 09:25:32.614458 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="eef52d5d-1c6a-4586-aa17-8c8253a53262" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Dec 08 09:25:32 crc kubenswrapper[4776]: I1208 09:25:32.705788 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 08 09:25:32 crc kubenswrapper[4776]: I1208 09:25:32.706435 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 08 09:25:33 crc kubenswrapper[4776]: I1208 09:25:33.224955 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Dec 08 09:25:33 crc kubenswrapper[4776]: I1208 09:25:33.518758 4776 generic.go:334] "Generic (PLEG): container finished" podID="eef52d5d-1c6a-4586-aa17-8c8253a53262" containerID="3f5694ecb1a6ccaa0fa96e50aa464c4d6b30fd55ace8cb713e383d0037d6c549" exitCode=0 Dec 08 09:25:33 crc kubenswrapper[4776]: I1208 09:25:33.519112 4776 generic.go:334] 
"Generic (PLEG): container finished" podID="eef52d5d-1c6a-4586-aa17-8c8253a53262" containerID="646836525a446b068e3151718ebebfcc3c3efd7400b3f77a14ed0d387525e01e" exitCode=2
Dec 08 09:25:33 crc kubenswrapper[4776]: I1208 09:25:33.519122 4776 generic.go:334] "Generic (PLEG): container finished" podID="eef52d5d-1c6a-4586-aa17-8c8253a53262" containerID="8aa882221fda78196665be8e0570afff1ed74786c32425fea6e82af36b1f67fa" exitCode=0
Dec 08 09:25:33 crc kubenswrapper[4776]: I1208 09:25:33.518815 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eef52d5d-1c6a-4586-aa17-8c8253a53262","Type":"ContainerDied","Data":"3f5694ecb1a6ccaa0fa96e50aa464c4d6b30fd55ace8cb713e383d0037d6c549"}
Dec 08 09:25:33 crc kubenswrapper[4776]: I1208 09:25:33.519276 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eef52d5d-1c6a-4586-aa17-8c8253a53262","Type":"ContainerDied","Data":"646836525a446b068e3151718ebebfcc3c3efd7400b3f77a14ed0d387525e01e"}
Dec 08 09:25:33 crc kubenswrapper[4776]: I1208 09:25:33.519311 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eef52d5d-1c6a-4586-aa17-8c8253a53262","Type":"ContainerDied","Data":"8aa882221fda78196665be8e0570afff1ed74786c32425fea6e82af36b1f67fa"}
Dec 08 09:25:33 crc kubenswrapper[4776]: I1208 09:25:33.723389 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f8647023-4573-46b1-a713-c153d75d160b" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.244:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 08 09:25:33 crc kubenswrapper[4776]: I1208 09:25:33.741380 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f8647023-4573-46b1-a713-c153d75d160b" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.244:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 08 09:25:34 crc kubenswrapper[4776]: I1208 09:25:34.361836 4776 scope.go:117] "RemoveContainer" containerID="bd00cf9685c68dc97ea68681aad0bfeeb2d56965a763cd3cffb3c7d3789a7341"
Dec 08 09:25:34 crc kubenswrapper[4776]: E1208 09:25:34.371094 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268"
Dec 08 09:25:34 crc kubenswrapper[4776]: I1208 09:25:34.534014 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"18e62c81-f058-4044-9bb3-e64e5892a4e6","Type":"ContainerStarted","Data":"7f5bdee8fb273e9371d024ad04a26b136a3700dce97ab99c9a229cb649912b5b"}
Dec 08 09:25:34 crc kubenswrapper[4776]: I1208 09:25:34.753471 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Dec 08 09:25:34 crc kubenswrapper[4776]: I1208 09:25:34.808859 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Dec 08 09:25:35 crc kubenswrapper[4776]: I1208 09:25:35.545105 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"18e62c81-f058-4044-9bb3-e64e5892a4e6","Type":"ContainerStarted","Data":"f514feeb66285d8fc9e0b189ba31e987d1f136c05518d2f1f6f19dd5b51bd824"}
Dec 08 09:25:35 crc kubenswrapper[4776]: I1208 09:25:35.574844 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Dec 08 09:25:36 crc kubenswrapper[4776]: I1208 09:25:36.157153 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Dec 08 09:25:36 crc kubenswrapper[4776]: I1208 09:25:36.157237 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Dec 08 09:25:36 crc kubenswrapper[4776]: I1208 09:25:36.603939 4776 generic.go:334] "Generic (PLEG): container finished" podID="eef52d5d-1c6a-4586-aa17-8c8253a53262" containerID="6c62245ebb8c72b32f93d1b13b07b9058dee4c2d948da5e63277902020064833" exitCode=0
Dec 08 09:25:36 crc kubenswrapper[4776]: I1208 09:25:36.604191 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eef52d5d-1c6a-4586-aa17-8c8253a53262","Type":"ContainerDied","Data":"6c62245ebb8c72b32f93d1b13b07b9058dee4c2d948da5e63277902020064833"}
Dec 08 09:25:36 crc kubenswrapper[4776]: I1208 09:25:36.816136 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 08 09:25:36 crc kubenswrapper[4776]: I1208 09:25:36.964067 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eef52d5d-1c6a-4586-aa17-8c8253a53262-config-data\") pod \"eef52d5d-1c6a-4586-aa17-8c8253a53262\" (UID: \"eef52d5d-1c6a-4586-aa17-8c8253a53262\") "
Dec 08 09:25:36 crc kubenswrapper[4776]: I1208 09:25:36.964110 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eef52d5d-1c6a-4586-aa17-8c8253a53262-sg-core-conf-yaml\") pod \"eef52d5d-1c6a-4586-aa17-8c8253a53262\" (UID: \"eef52d5d-1c6a-4586-aa17-8c8253a53262\") "
Dec 08 09:25:36 crc kubenswrapper[4776]: I1208 09:25:36.964433 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2k77\" (UniqueName: \"kubernetes.io/projected/eef52d5d-1c6a-4586-aa17-8c8253a53262-kube-api-access-h2k77\") pod \"eef52d5d-1c6a-4586-aa17-8c8253a53262\" (UID: \"eef52d5d-1c6a-4586-aa17-8c8253a53262\") "
Dec 08 09:25:36 crc kubenswrapper[4776]: I1208 09:25:36.964473 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eef52d5d-1c6a-4586-aa17-8c8253a53262-run-httpd\") pod \"eef52d5d-1c6a-4586-aa17-8c8253a53262\" (UID: \"eef52d5d-1c6a-4586-aa17-8c8253a53262\") "
Dec 08 09:25:36 crc kubenswrapper[4776]: I1208 09:25:36.964495 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eef52d5d-1c6a-4586-aa17-8c8253a53262-log-httpd\") pod \"eef52d5d-1c6a-4586-aa17-8c8253a53262\" (UID: \"eef52d5d-1c6a-4586-aa17-8c8253a53262\") "
Dec 08 09:25:36 crc kubenswrapper[4776]: I1208 09:25:36.964576 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eef52d5d-1c6a-4586-aa17-8c8253a53262-scripts\") pod \"eef52d5d-1c6a-4586-aa17-8c8253a53262\" (UID: \"eef52d5d-1c6a-4586-aa17-8c8253a53262\") "
Dec 08 09:25:36 crc kubenswrapper[4776]: I1208 09:25:36.964599 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef52d5d-1c6a-4586-aa17-8c8253a53262-combined-ca-bundle\") pod \"eef52d5d-1c6a-4586-aa17-8c8253a53262\" (UID: \"eef52d5d-1c6a-4586-aa17-8c8253a53262\") "
Dec 08 09:25:36 crc kubenswrapper[4776]: I1208 09:25:36.967424 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eef52d5d-1c6a-4586-aa17-8c8253a53262-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "eef52d5d-1c6a-4586-aa17-8c8253a53262" (UID: "eef52d5d-1c6a-4586-aa17-8c8253a53262"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 08 09:25:36 crc kubenswrapper[4776]: I1208 09:25:36.968035 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eef52d5d-1c6a-4586-aa17-8c8253a53262-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "eef52d5d-1c6a-4586-aa17-8c8253a53262" (UID: "eef52d5d-1c6a-4586-aa17-8c8253a53262"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 08 09:25:36 crc kubenswrapper[4776]: I1208 09:25:36.971283 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eef52d5d-1c6a-4586-aa17-8c8253a53262-scripts" (OuterVolumeSpecName: "scripts") pod "eef52d5d-1c6a-4586-aa17-8c8253a53262" (UID: "eef52d5d-1c6a-4586-aa17-8c8253a53262"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 08 09:25:36 crc kubenswrapper[4776]: I1208 09:25:36.971439 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eef52d5d-1c6a-4586-aa17-8c8253a53262-kube-api-access-h2k77" (OuterVolumeSpecName: "kube-api-access-h2k77") pod "eef52d5d-1c6a-4586-aa17-8c8253a53262" (UID: "eef52d5d-1c6a-4586-aa17-8c8253a53262"). InnerVolumeSpecName "kube-api-access-h2k77". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 08 09:25:37 crc kubenswrapper[4776]: I1208 09:25:37.004266 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eef52d5d-1c6a-4586-aa17-8c8253a53262-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "eef52d5d-1c6a-4586-aa17-8c8253a53262" (UID: "eef52d5d-1c6a-4586-aa17-8c8253a53262"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 08 09:25:37 crc kubenswrapper[4776]: I1208 09:25:37.067764 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eef52d5d-1c6a-4586-aa17-8c8253a53262-scripts\") on node \"crc\" DevicePath \"\""
Dec 08 09:25:37 crc kubenswrapper[4776]: I1208 09:25:37.067975 4776 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eef52d5d-1c6a-4586-aa17-8c8253a53262-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Dec 08 09:25:37 crc kubenswrapper[4776]: I1208 09:25:37.068072 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2k77\" (UniqueName: \"kubernetes.io/projected/eef52d5d-1c6a-4586-aa17-8c8253a53262-kube-api-access-h2k77\") on node \"crc\" DevicePath \"\""
Dec 08 09:25:37 crc kubenswrapper[4776]: I1208 09:25:37.068136 4776 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eef52d5d-1c6a-4586-aa17-8c8253a53262-run-httpd\") on node \"crc\" DevicePath \"\""
Dec 08 09:25:37 crc kubenswrapper[4776]: I1208 09:25:37.068221 4776 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eef52d5d-1c6a-4586-aa17-8c8253a53262-log-httpd\") on node \"crc\" DevicePath \"\""
Dec 08 09:25:37 crc kubenswrapper[4776]: I1208 09:25:37.084254 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eef52d5d-1c6a-4586-aa17-8c8253a53262-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eef52d5d-1c6a-4586-aa17-8c8253a53262" (UID: "eef52d5d-1c6a-4586-aa17-8c8253a53262"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 08 09:25:37 crc kubenswrapper[4776]: I1208 09:25:37.106332 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eef52d5d-1c6a-4586-aa17-8c8253a53262-config-data" (OuterVolumeSpecName: "config-data") pod "eef52d5d-1c6a-4586-aa17-8c8253a53262" (UID: "eef52d5d-1c6a-4586-aa17-8c8253a53262"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 08 09:25:37 crc kubenswrapper[4776]: I1208 09:25:37.169952 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eef52d5d-1c6a-4586-aa17-8c8253a53262-config-data\") on node \"crc\" DevicePath \"\""
Dec 08 09:25:37 crc kubenswrapper[4776]: I1208 09:25:37.170004 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef52d5d-1c6a-4586-aa17-8c8253a53262-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 08 09:25:37 crc kubenswrapper[4776]: I1208 09:25:37.240678 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2fe2799e-2d16-4569-874b-a0066d38087b" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.246:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 08 09:25:37 crc kubenswrapper[4776]: I1208 09:25:37.240860 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2fe2799e-2d16-4569-874b-a0066d38087b" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.246:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 08 09:25:37 crc kubenswrapper[4776]: I1208 09:25:37.471551 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ljzdd"
Dec 08 09:25:37 crc kubenswrapper[4776]: I1208 09:25:37.524986 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ljzdd"]
Dec 08 09:25:37 crc kubenswrapper[4776]: I1208 09:25:37.619649 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ljzdd" podUID="b81998e9-151a-47c9-a3ca-c678fd6aeb96" containerName="registry-server" containerID="cri-o://0321213e78ab1c8df5a3e5038b0cff028cfd60ddc71edf9cd59cfa3e6aaeda93" gracePeriod=2
Dec 08 09:25:37 crc kubenswrapper[4776]: I1208 09:25:37.619773 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 08 09:25:37 crc kubenswrapper[4776]: I1208 09:25:37.620289 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eef52d5d-1c6a-4586-aa17-8c8253a53262","Type":"ContainerDied","Data":"6a779dc9287df2c4cd300466349f4407466c657fabfa7776944bd0dfa0f2c662"}
Dec 08 09:25:37 crc kubenswrapper[4776]: I1208 09:25:37.620356 4776 scope.go:117] "RemoveContainer" containerID="3f5694ecb1a6ccaa0fa96e50aa464c4d6b30fd55ace8cb713e383d0037d6c549"
Dec 08 09:25:37 crc kubenswrapper[4776]: I1208 09:25:37.933550 4776 scope.go:117] "RemoveContainer" containerID="646836525a446b068e3151718ebebfcc3c3efd7400b3f77a14ed0d387525e01e"
Dec 08 09:25:37 crc kubenswrapper[4776]: I1208 09:25:37.949259 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 08 09:25:37 crc kubenswrapper[4776]: I1208 09:25:37.965518 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Dec 08 09:25:37 crc kubenswrapper[4776]: I1208 09:25:37.980260 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 08 09:25:37 crc kubenswrapper[4776]: E1208 09:25:37.980828 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eef52d5d-1c6a-4586-aa17-8c8253a53262" containerName="proxy-httpd"
Dec 08 09:25:37 crc kubenswrapper[4776]: I1208 09:25:37.980847 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="eef52d5d-1c6a-4586-aa17-8c8253a53262" containerName="proxy-httpd"
Dec 08 09:25:37 crc kubenswrapper[4776]: E1208 09:25:37.980857 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eef52d5d-1c6a-4586-aa17-8c8253a53262" containerName="ceilometer-central-agent"
Dec 08 09:25:37 crc kubenswrapper[4776]: I1208 09:25:37.980866 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="eef52d5d-1c6a-4586-aa17-8c8253a53262" containerName="ceilometer-central-agent"
Dec 08 09:25:37 crc kubenswrapper[4776]: E1208 09:25:37.980896 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eef52d5d-1c6a-4586-aa17-8c8253a53262" containerName="ceilometer-notification-agent"
Dec 08 09:25:37 crc kubenswrapper[4776]: I1208 09:25:37.980903 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="eef52d5d-1c6a-4586-aa17-8c8253a53262" containerName="ceilometer-notification-agent"
Dec 08 09:25:37 crc kubenswrapper[4776]: E1208 09:25:37.980922 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eef52d5d-1c6a-4586-aa17-8c8253a53262" containerName="sg-core"
Dec 08 09:25:37 crc kubenswrapper[4776]: I1208 09:25:37.980927 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="eef52d5d-1c6a-4586-aa17-8c8253a53262" containerName="sg-core"
Dec 08 09:25:37 crc kubenswrapper[4776]: I1208 09:25:37.981145 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="eef52d5d-1c6a-4586-aa17-8c8253a53262" containerName="proxy-httpd"
Dec 08 09:25:37 crc kubenswrapper[4776]: I1208 09:25:37.981159 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="eef52d5d-1c6a-4586-aa17-8c8253a53262" containerName="sg-core"
Dec 08 09:25:37 crc kubenswrapper[4776]: I1208 09:25:37.981189 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="eef52d5d-1c6a-4586-aa17-8c8253a53262" containerName="ceilometer-notification-agent"
Dec 08 09:25:37 crc kubenswrapper[4776]: I1208 09:25:37.981207 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="eef52d5d-1c6a-4586-aa17-8c8253a53262" containerName="ceilometer-central-agent"
Dec 08 09:25:37 crc kubenswrapper[4776]: I1208 09:25:37.983313 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 08 09:25:37 crc kubenswrapper[4776]: I1208 09:25:37.984997 4776 scope.go:117] "RemoveContainer" containerID="6c62245ebb8c72b32f93d1b13b07b9058dee4c2d948da5e63277902020064833"
Dec 08 09:25:37 crc kubenswrapper[4776]: I1208 09:25:37.985881 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 08 09:25:37 crc kubenswrapper[4776]: I1208 09:25:37.994503 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 08 09:25:37 crc kubenswrapper[4776]: I1208 09:25:37.998390 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.050272 4776 scope.go:117] "RemoveContainer" containerID="8aa882221fda78196665be8e0570afff1ed74786c32425fea6e82af36b1f67fa"
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.092829 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/91f1949a-9307-4999-b50c-2d5a747e4571-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"91f1949a-9307-4999-b50c-2d5a747e4571\") " pod="openstack/ceilometer-0"
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.092910 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91f1949a-9307-4999-b50c-2d5a747e4571-run-httpd\") pod \"ceilometer-0\" (UID: \"91f1949a-9307-4999-b50c-2d5a747e4571\") " pod="openstack/ceilometer-0"
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.093058 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq55z\" (UniqueName: \"kubernetes.io/projected/91f1949a-9307-4999-b50c-2d5a747e4571-kube-api-access-dq55z\") pod \"ceilometer-0\" (UID: \"91f1949a-9307-4999-b50c-2d5a747e4571\") " pod="openstack/ceilometer-0"
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.093095 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91f1949a-9307-4999-b50c-2d5a747e4571-log-httpd\") pod \"ceilometer-0\" (UID: \"91f1949a-9307-4999-b50c-2d5a747e4571\") " pod="openstack/ceilometer-0"
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.093165 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91f1949a-9307-4999-b50c-2d5a747e4571-config-data\") pod \"ceilometer-0\" (UID: \"91f1949a-9307-4999-b50c-2d5a747e4571\") " pod="openstack/ceilometer-0"
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.093208 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91f1949a-9307-4999-b50c-2d5a747e4571-scripts\") pod \"ceilometer-0\" (UID: \"91f1949a-9307-4999-b50c-2d5a747e4571\") " pod="openstack/ceilometer-0"
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.093280 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91f1949a-9307-4999-b50c-2d5a747e4571-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"91f1949a-9307-4999-b50c-2d5a747e4571\") " pod="openstack/ceilometer-0"
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.102591 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ljzdd"
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.194774 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b81998e9-151a-47c9-a3ca-c678fd6aeb96-catalog-content\") pod \"b81998e9-151a-47c9-a3ca-c678fd6aeb96\" (UID: \"b81998e9-151a-47c9-a3ca-c678fd6aeb96\") "
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.195150 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b81998e9-151a-47c9-a3ca-c678fd6aeb96-utilities\") pod \"b81998e9-151a-47c9-a3ca-c678fd6aeb96\" (UID: \"b81998e9-151a-47c9-a3ca-c678fd6aeb96\") "
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.195200 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcvqx\" (UniqueName: \"kubernetes.io/projected/b81998e9-151a-47c9-a3ca-c678fd6aeb96-kube-api-access-tcvqx\") pod \"b81998e9-151a-47c9-a3ca-c678fd6aeb96\" (UID: \"b81998e9-151a-47c9-a3ca-c678fd6aeb96\") "
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.195667 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91f1949a-9307-4999-b50c-2d5a747e4571-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"91f1949a-9307-4999-b50c-2d5a747e4571\") " pod="openstack/ceilometer-0"
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.195698 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b81998e9-151a-47c9-a3ca-c678fd6aeb96-utilities" (OuterVolumeSpecName: "utilities") pod "b81998e9-151a-47c9-a3ca-c678fd6aeb96" (UID: "b81998e9-151a-47c9-a3ca-c678fd6aeb96"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.195770 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/91f1949a-9307-4999-b50c-2d5a747e4571-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"91f1949a-9307-4999-b50c-2d5a747e4571\") " pod="openstack/ceilometer-0"
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.195807 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91f1949a-9307-4999-b50c-2d5a747e4571-run-httpd\") pod \"ceilometer-0\" (UID: \"91f1949a-9307-4999-b50c-2d5a747e4571\") " pod="openstack/ceilometer-0"
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.195891 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dq55z\" (UniqueName: \"kubernetes.io/projected/91f1949a-9307-4999-b50c-2d5a747e4571-kube-api-access-dq55z\") pod \"ceilometer-0\" (UID: \"91f1949a-9307-4999-b50c-2d5a747e4571\") " pod="openstack/ceilometer-0"
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.195919 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91f1949a-9307-4999-b50c-2d5a747e4571-log-httpd\") pod \"ceilometer-0\" (UID: \"91f1949a-9307-4999-b50c-2d5a747e4571\") " pod="openstack/ceilometer-0"
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.195944 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91f1949a-9307-4999-b50c-2d5a747e4571-config-data\") pod \"ceilometer-0\" (UID: \"91f1949a-9307-4999-b50c-2d5a747e4571\") " pod="openstack/ceilometer-0"
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.195967 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91f1949a-9307-4999-b50c-2d5a747e4571-scripts\") pod \"ceilometer-0\" (UID: \"91f1949a-9307-4999-b50c-2d5a747e4571\") " pod="openstack/ceilometer-0"
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.196039 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b81998e9-151a-47c9-a3ca-c678fd6aeb96-utilities\") on node \"crc\" DevicePath \"\""
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.196483 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91f1949a-9307-4999-b50c-2d5a747e4571-run-httpd\") pod \"ceilometer-0\" (UID: \"91f1949a-9307-4999-b50c-2d5a747e4571\") " pod="openstack/ceilometer-0"
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.196580 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91f1949a-9307-4999-b50c-2d5a747e4571-log-httpd\") pod \"ceilometer-0\" (UID: \"91f1949a-9307-4999-b50c-2d5a747e4571\") " pod="openstack/ceilometer-0"
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.200636 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/91f1949a-9307-4999-b50c-2d5a747e4571-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"91f1949a-9307-4999-b50c-2d5a747e4571\") " pod="openstack/ceilometer-0"
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.205388 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91f1949a-9307-4999-b50c-2d5a747e4571-scripts\") pod \"ceilometer-0\" (UID: \"91f1949a-9307-4999-b50c-2d5a747e4571\") " pod="openstack/ceilometer-0"
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.211651 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91f1949a-9307-4999-b50c-2d5a747e4571-config-data\") pod \"ceilometer-0\" (UID: \"91f1949a-9307-4999-b50c-2d5a747e4571\") " pod="openstack/ceilometer-0"
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.211865 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91f1949a-9307-4999-b50c-2d5a747e4571-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"91f1949a-9307-4999-b50c-2d5a747e4571\") " pod="openstack/ceilometer-0"
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.213025 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq55z\" (UniqueName: \"kubernetes.io/projected/91f1949a-9307-4999-b50c-2d5a747e4571-kube-api-access-dq55z\") pod \"ceilometer-0\" (UID: \"91f1949a-9307-4999-b50c-2d5a747e4571\") " pod="openstack/ceilometer-0"
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.213713 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b81998e9-151a-47c9-a3ca-c678fd6aeb96-kube-api-access-tcvqx" (OuterVolumeSpecName: "kube-api-access-tcvqx") pod "b81998e9-151a-47c9-a3ca-c678fd6aeb96" (UID: "b81998e9-151a-47c9-a3ca-c678fd6aeb96"). InnerVolumeSpecName "kube-api-access-tcvqx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.245209 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b81998e9-151a-47c9-a3ca-c678fd6aeb96-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b81998e9-151a-47c9-a3ca-c678fd6aeb96" (UID: "b81998e9-151a-47c9-a3ca-c678fd6aeb96"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.297958 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b81998e9-151a-47c9-a3ca-c678fd6aeb96-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.298009 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcvqx\" (UniqueName: \"kubernetes.io/projected/b81998e9-151a-47c9-a3ca-c678fd6aeb96-kube-api-access-tcvqx\") on node \"crc\" DevicePath \"\""
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.309662 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.365052 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eef52d5d-1c6a-4586-aa17-8c8253a53262" path="/var/lib/kubelet/pods/eef52d5d-1c6a-4586-aa17-8c8253a53262/volumes"
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.646382 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"18e62c81-f058-4044-9bb3-e64e5892a4e6","Type":"ContainerStarted","Data":"64831e1fe3bff43093ab2b6e8712d8f7bd0ea7251240539c05884628e2e6a005"}
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.647029 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="18e62c81-f058-4044-9bb3-e64e5892a4e6" containerName="aodh-api" containerID="cri-o://64b3b939b07a3a8fe0eb2b6fbefd5433d8dc0237a675fa60a0b7dcd76faeaa5a" gracePeriod=30
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.647798 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="18e62c81-f058-4044-9bb3-e64e5892a4e6" containerName="aodh-listener" containerID="cri-o://64831e1fe3bff43093ab2b6e8712d8f7bd0ea7251240539c05884628e2e6a005" gracePeriod=30
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.647867 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="18e62c81-f058-4044-9bb3-e64e5892a4e6" containerName="aodh-notifier" containerID="cri-o://f514feeb66285d8fc9e0b189ba31e987d1f136c05518d2f1f6f19dd5b51bd824" gracePeriod=30
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.647925 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="18e62c81-f058-4044-9bb3-e64e5892a4e6" containerName="aodh-evaluator" containerID="cri-o://7f5bdee8fb273e9371d024ad04a26b136a3700dce97ab99c9a229cb649912b5b" gracePeriod=30
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.675255 4776 generic.go:334] "Generic (PLEG): container finished" podID="b81998e9-151a-47c9-a3ca-c678fd6aeb96" containerID="0321213e78ab1c8df5a3e5038b0cff028cfd60ddc71edf9cd59cfa3e6aaeda93" exitCode=0
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.675318 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ljzdd" event={"ID":"b81998e9-151a-47c9-a3ca-c678fd6aeb96","Type":"ContainerDied","Data":"0321213e78ab1c8df5a3e5038b0cff028cfd60ddc71edf9cd59cfa3e6aaeda93"}
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.675345 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ljzdd" event={"ID":"b81998e9-151a-47c9-a3ca-c678fd6aeb96","Type":"ContainerDied","Data":"19605733de7421e812e6f923ae2fa50f2d06a9aba52244b36c7f6e2ffe404cfe"}
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.675361 4776 scope.go:117] "RemoveContainer" containerID="0321213e78ab1c8df5a3e5038b0cff028cfd60ddc71edf9cd59cfa3e6aaeda93"
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.675468 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ljzdd"
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.677017 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.989905078 podStartE2EDuration="9.676300702s" podCreationTimestamp="2025-12-08 09:25:29 +0000 UTC" firstStartedPulling="2025-12-08 09:25:30.834068416 +0000 UTC m=+1607.097293438" lastFinishedPulling="2025-12-08 09:25:37.52046404 +0000 UTC m=+1613.783689062" observedRunningTime="2025-12-08 09:25:38.670759635 +0000 UTC m=+1614.933984657" watchObservedRunningTime="2025-12-08 09:25:38.676300702 +0000 UTC m=+1614.939525724"
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.770697 4776 scope.go:117] "RemoveContainer" containerID="39ab98ee1e2eee4744337b17a85bfd33ebdf2162e187e455783208ca4f7ff31c"
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.770852 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ljzdd"]
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.801822 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ljzdd"]
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.853371 4776 scope.go:117] "RemoveContainer" containerID="8e5cf977a8d58e8c43e0bb1ffbc8ad1693f046d533137f1d00190524437dbd3f"
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.907845 4776 scope.go:117] "RemoveContainer" containerID="0321213e78ab1c8df5a3e5038b0cff028cfd60ddc71edf9cd59cfa3e6aaeda93"
Dec 08 09:25:38 crc kubenswrapper[4776]: E1208 09:25:38.910752 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0321213e78ab1c8df5a3e5038b0cff028cfd60ddc71edf9cd59cfa3e6aaeda93\": container with ID starting with 0321213e78ab1c8df5a3e5038b0cff028cfd60ddc71edf9cd59cfa3e6aaeda93 not found: ID does not exist" containerID="0321213e78ab1c8df5a3e5038b0cff028cfd60ddc71edf9cd59cfa3e6aaeda93"
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.910807 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0321213e78ab1c8df5a3e5038b0cff028cfd60ddc71edf9cd59cfa3e6aaeda93"} err="failed to get container status \"0321213e78ab1c8df5a3e5038b0cff028cfd60ddc71edf9cd59cfa3e6aaeda93\": rpc error: code = NotFound desc = could not find container \"0321213e78ab1c8df5a3e5038b0cff028cfd60ddc71edf9cd59cfa3e6aaeda93\": container with ID starting with 0321213e78ab1c8df5a3e5038b0cff028cfd60ddc71edf9cd59cfa3e6aaeda93 not found: ID does not exist"
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.910834 4776 scope.go:117] "RemoveContainer" containerID="39ab98ee1e2eee4744337b17a85bfd33ebdf2162e187e455783208ca4f7ff31c"
Dec 08 09:25:38 crc kubenswrapper[4776]: E1208 09:25:38.917798 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39ab98ee1e2eee4744337b17a85bfd33ebdf2162e187e455783208ca4f7ff31c\": container with ID starting with 39ab98ee1e2eee4744337b17a85bfd33ebdf2162e187e455783208ca4f7ff31c not found: ID does not exist" containerID="39ab98ee1e2eee4744337b17a85bfd33ebdf2162e187e455783208ca4f7ff31c"
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.917852 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39ab98ee1e2eee4744337b17a85bfd33ebdf2162e187e455783208ca4f7ff31c"} err="failed to get container status \"39ab98ee1e2eee4744337b17a85bfd33ebdf2162e187e455783208ca4f7ff31c\": rpc error: code = NotFound desc = could not find container \"39ab98ee1e2eee4744337b17a85bfd33ebdf2162e187e455783208ca4f7ff31c\": container with ID starting with 39ab98ee1e2eee4744337b17a85bfd33ebdf2162e187e455783208ca4f7ff31c not found: ID does not exist"
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.917878 4776 scope.go:117] "RemoveContainer" containerID="8e5cf977a8d58e8c43e0bb1ffbc8ad1693f046d533137f1d00190524437dbd3f"
Dec 08 09:25:38 crc kubenswrapper[4776]: E1208 09:25:38.924383 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e5cf977a8d58e8c43e0bb1ffbc8ad1693f046d533137f1d00190524437dbd3f\": container with ID starting with 8e5cf977a8d58e8c43e0bb1ffbc8ad1693f046d533137f1d00190524437dbd3f not found: ID does not exist" containerID="8e5cf977a8d58e8c43e0bb1ffbc8ad1693f046d533137f1d00190524437dbd3f"
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.924427 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e5cf977a8d58e8c43e0bb1ffbc8ad1693f046d533137f1d00190524437dbd3f"} err="failed to get container status \"8e5cf977a8d58e8c43e0bb1ffbc8ad1693f046d533137f1d00190524437dbd3f\": rpc error: code = NotFound desc = could not find container \"8e5cf977a8d58e8c43e0bb1ffbc8ad1693f046d533137f1d00190524437dbd3f\": container with ID starting with 8e5cf977a8d58e8c43e0bb1ffbc8ad1693f046d533137f1d00190524437dbd3f not found: ID does not exist"
Dec 08 09:25:38 crc kubenswrapper[4776]: I1208 09:25:38.973152 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 08 09:25:39 crc kubenswrapper[4776]: I1208 09:25:39.697995 4776 generic.go:334] "Generic (PLEG): container finished" podID="18e62c81-f058-4044-9bb3-e64e5892a4e6" containerID="7f5bdee8fb273e9371d024ad04a26b136a3700dce97ab99c9a229cb649912b5b" exitCode=0
Dec 08 09:25:39 crc kubenswrapper[4776]: I1208 09:25:39.698286 4776 generic.go:334] "Generic (PLEG): container finished" podID="18e62c81-f058-4044-9bb3-e64e5892a4e6" containerID="64b3b939b07a3a8fe0eb2b6fbefd5433d8dc0237a675fa60a0b7dcd76faeaa5a" exitCode=0
Dec 08 09:25:39 crc kubenswrapper[4776]: I1208 09:25:39.698071 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0"
event={"ID":"18e62c81-f058-4044-9bb3-e64e5892a4e6","Type":"ContainerDied","Data":"7f5bdee8fb273e9371d024ad04a26b136a3700dce97ab99c9a229cb649912b5b"} Dec 08 09:25:39 crc kubenswrapper[4776]: I1208 09:25:39.698324 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"18e62c81-f058-4044-9bb3-e64e5892a4e6","Type":"ContainerDied","Data":"64b3b939b07a3a8fe0eb2b6fbefd5433d8dc0237a675fa60a0b7dcd76faeaa5a"} Dec 08 09:25:39 crc kubenswrapper[4776]: I1208 09:25:39.701400 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91f1949a-9307-4999-b50c-2d5a747e4571","Type":"ContainerStarted","Data":"f246e6066e84c7582e111a2e7f327024e5a574ed6b2cb5b105ee762db911c7e3"} Dec 08 09:25:40 crc kubenswrapper[4776]: I1208 09:25:40.071469 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 08 09:25:40 crc kubenswrapper[4776]: I1208 09:25:40.246122 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e5b39ee-eda8-48e5-b374-b1330cbb7b08-combined-ca-bundle\") pod \"8e5b39ee-eda8-48e5-b374-b1330cbb7b08\" (UID: \"8e5b39ee-eda8-48e5-b374-b1330cbb7b08\") " Dec 08 09:25:40 crc kubenswrapper[4776]: I1208 09:25:40.246401 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgsfs\" (UniqueName: \"kubernetes.io/projected/8e5b39ee-eda8-48e5-b374-b1330cbb7b08-kube-api-access-jgsfs\") pod \"8e5b39ee-eda8-48e5-b374-b1330cbb7b08\" (UID: \"8e5b39ee-eda8-48e5-b374-b1330cbb7b08\") " Dec 08 09:25:40 crc kubenswrapper[4776]: I1208 09:25:40.246453 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e5b39ee-eda8-48e5-b374-b1330cbb7b08-config-data\") pod \"8e5b39ee-eda8-48e5-b374-b1330cbb7b08\" (UID: \"8e5b39ee-eda8-48e5-b374-b1330cbb7b08\") " Dec 
08 09:25:40 crc kubenswrapper[4776]: I1208 09:25:40.251718 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e5b39ee-eda8-48e5-b374-b1330cbb7b08-kube-api-access-jgsfs" (OuterVolumeSpecName: "kube-api-access-jgsfs") pod "8e5b39ee-eda8-48e5-b374-b1330cbb7b08" (UID: "8e5b39ee-eda8-48e5-b374-b1330cbb7b08"). InnerVolumeSpecName "kube-api-access-jgsfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:25:40 crc kubenswrapper[4776]: I1208 09:25:40.281443 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e5b39ee-eda8-48e5-b374-b1330cbb7b08-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8e5b39ee-eda8-48e5-b374-b1330cbb7b08" (UID: "8e5b39ee-eda8-48e5-b374-b1330cbb7b08"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:25:40 crc kubenswrapper[4776]: I1208 09:25:40.291664 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e5b39ee-eda8-48e5-b374-b1330cbb7b08-config-data" (OuterVolumeSpecName: "config-data") pod "8e5b39ee-eda8-48e5-b374-b1330cbb7b08" (UID: "8e5b39ee-eda8-48e5-b374-b1330cbb7b08"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:25:40 crc kubenswrapper[4776]: I1208 09:25:40.351073 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgsfs\" (UniqueName: \"kubernetes.io/projected/8e5b39ee-eda8-48e5-b374-b1330cbb7b08-kube-api-access-jgsfs\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:40 crc kubenswrapper[4776]: I1208 09:25:40.351104 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e5b39ee-eda8-48e5-b374-b1330cbb7b08-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:40 crc kubenswrapper[4776]: I1208 09:25:40.351114 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e5b39ee-eda8-48e5-b374-b1330cbb7b08-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:40 crc kubenswrapper[4776]: I1208 09:25:40.360286 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b81998e9-151a-47c9-a3ca-c678fd6aeb96" path="/var/lib/kubelet/pods/b81998e9-151a-47c9-a3ca-c678fd6aeb96/volumes" Dec 08 09:25:40 crc kubenswrapper[4776]: I1208 09:25:40.721804 4776 generic.go:334] "Generic (PLEG): container finished" podID="18e62c81-f058-4044-9bb3-e64e5892a4e6" containerID="f514feeb66285d8fc9e0b189ba31e987d1f136c05518d2f1f6f19dd5b51bd824" exitCode=0 Dec 08 09:25:40 crc kubenswrapper[4776]: I1208 09:25:40.721948 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"18e62c81-f058-4044-9bb3-e64e5892a4e6","Type":"ContainerDied","Data":"f514feeb66285d8fc9e0b189ba31e987d1f136c05518d2f1f6f19dd5b51bd824"} Dec 08 09:25:40 crc kubenswrapper[4776]: I1208 09:25:40.725114 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91f1949a-9307-4999-b50c-2d5a747e4571","Type":"ContainerStarted","Data":"9dbce126bf8853edf4f1c03952ba57abdf8542625da28eb66fcc8d5b21342589"} Dec 08 09:25:40 crc 
kubenswrapper[4776]: I1208 09:25:40.725209 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91f1949a-9307-4999-b50c-2d5a747e4571","Type":"ContainerStarted","Data":"a067de7097b95ef850013617661524d3759368af152b1652daee43af5022e6c4"} Dec 08 09:25:40 crc kubenswrapper[4776]: I1208 09:25:40.727086 4776 generic.go:334] "Generic (PLEG): container finished" podID="8e5b39ee-eda8-48e5-b374-b1330cbb7b08" containerID="62b751b12ee0868b33a91f533e59e54e9cb21df1538d1c662c61aeee0f49e77b" exitCode=137 Dec 08 09:25:40 crc kubenswrapper[4776]: I1208 09:25:40.727123 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8e5b39ee-eda8-48e5-b374-b1330cbb7b08","Type":"ContainerDied","Data":"62b751b12ee0868b33a91f533e59e54e9cb21df1538d1c662c61aeee0f49e77b"} Dec 08 09:25:40 crc kubenswrapper[4776]: I1208 09:25:40.727214 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8e5b39ee-eda8-48e5-b374-b1330cbb7b08","Type":"ContainerDied","Data":"7b64da380396f6648898561ca56790d4c958ecb39013a4d8ea56be5d835c9eee"} Dec 08 09:25:40 crc kubenswrapper[4776]: I1208 09:25:40.727255 4776 scope.go:117] "RemoveContainer" containerID="62b751b12ee0868b33a91f533e59e54e9cb21df1538d1c662c61aeee0f49e77b" Dec 08 09:25:40 crc kubenswrapper[4776]: I1208 09:25:40.727416 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 08 09:25:40 crc kubenswrapper[4776]: I1208 09:25:40.766964 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 08 09:25:40 crc kubenswrapper[4776]: I1208 09:25:40.769761 4776 scope.go:117] "RemoveContainer" containerID="62b751b12ee0868b33a91f533e59e54e9cb21df1538d1c662c61aeee0f49e77b" Dec 08 09:25:40 crc kubenswrapper[4776]: E1208 09:25:40.771564 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62b751b12ee0868b33a91f533e59e54e9cb21df1538d1c662c61aeee0f49e77b\": container with ID starting with 62b751b12ee0868b33a91f533e59e54e9cb21df1538d1c662c61aeee0f49e77b not found: ID does not exist" containerID="62b751b12ee0868b33a91f533e59e54e9cb21df1538d1c662c61aeee0f49e77b" Dec 08 09:25:40 crc kubenswrapper[4776]: I1208 09:25:40.771609 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62b751b12ee0868b33a91f533e59e54e9cb21df1538d1c662c61aeee0f49e77b"} err="failed to get container status \"62b751b12ee0868b33a91f533e59e54e9cb21df1538d1c662c61aeee0f49e77b\": rpc error: code = NotFound desc = could not find container \"62b751b12ee0868b33a91f533e59e54e9cb21df1538d1c662c61aeee0f49e77b\": container with ID starting with 62b751b12ee0868b33a91f533e59e54e9cb21df1538d1c662c61aeee0f49e77b not found: ID does not exist" Dec 08 09:25:40 crc kubenswrapper[4776]: I1208 09:25:40.784432 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 08 09:25:40 crc kubenswrapper[4776]: I1208 09:25:40.801902 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 08 09:25:40 crc kubenswrapper[4776]: E1208 09:25:40.802446 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b81998e9-151a-47c9-a3ca-c678fd6aeb96" containerName="extract-content" Dec 08 09:25:40 
crc kubenswrapper[4776]: I1208 09:25:40.802463 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="b81998e9-151a-47c9-a3ca-c678fd6aeb96" containerName="extract-content" Dec 08 09:25:40 crc kubenswrapper[4776]: E1208 09:25:40.802479 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b81998e9-151a-47c9-a3ca-c678fd6aeb96" containerName="registry-server" Dec 08 09:25:40 crc kubenswrapper[4776]: I1208 09:25:40.802486 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="b81998e9-151a-47c9-a3ca-c678fd6aeb96" containerName="registry-server" Dec 08 09:25:40 crc kubenswrapper[4776]: E1208 09:25:40.802495 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b81998e9-151a-47c9-a3ca-c678fd6aeb96" containerName="extract-utilities" Dec 08 09:25:40 crc kubenswrapper[4776]: I1208 09:25:40.802501 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="b81998e9-151a-47c9-a3ca-c678fd6aeb96" containerName="extract-utilities" Dec 08 09:25:40 crc kubenswrapper[4776]: E1208 09:25:40.802510 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e5b39ee-eda8-48e5-b374-b1330cbb7b08" containerName="nova-cell1-novncproxy-novncproxy" Dec 08 09:25:40 crc kubenswrapper[4776]: I1208 09:25:40.802516 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e5b39ee-eda8-48e5-b374-b1330cbb7b08" containerName="nova-cell1-novncproxy-novncproxy" Dec 08 09:25:40 crc kubenswrapper[4776]: I1208 09:25:40.802726 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="b81998e9-151a-47c9-a3ca-c678fd6aeb96" containerName="registry-server" Dec 08 09:25:40 crc kubenswrapper[4776]: I1208 09:25:40.802756 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e5b39ee-eda8-48e5-b374-b1330cbb7b08" containerName="nova-cell1-novncproxy-novncproxy" Dec 08 09:25:40 crc kubenswrapper[4776]: I1208 09:25:40.804559 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 08 09:25:40 crc kubenswrapper[4776]: I1208 09:25:40.807246 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 08 09:25:40 crc kubenswrapper[4776]: I1208 09:25:40.807401 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 08 09:25:40 crc kubenswrapper[4776]: I1208 09:25:40.808101 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 08 09:25:40 crc kubenswrapper[4776]: I1208 09:25:40.819271 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 08 09:25:40 crc kubenswrapper[4776]: I1208 09:25:40.964808 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dptx\" (UniqueName: \"kubernetes.io/projected/f485895b-f2aa-427f-b592-811f09089a49-kube-api-access-5dptx\") pod \"nova-cell1-novncproxy-0\" (UID: \"f485895b-f2aa-427f-b592-811f09089a49\") " pod="openstack/nova-cell1-novncproxy-0" Dec 08 09:25:40 crc kubenswrapper[4776]: I1208 09:25:40.965290 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/f485895b-f2aa-427f-b592-811f09089a49-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f485895b-f2aa-427f-b592-811f09089a49\") " pod="openstack/nova-cell1-novncproxy-0" Dec 08 09:25:40 crc kubenswrapper[4776]: I1208 09:25:40.965370 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/f485895b-f2aa-427f-b592-811f09089a49-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f485895b-f2aa-427f-b592-811f09089a49\") " pod="openstack/nova-cell1-novncproxy-0" Dec 
08 09:25:40 crc kubenswrapper[4776]: I1208 09:25:40.965422 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f485895b-f2aa-427f-b592-811f09089a49-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f485895b-f2aa-427f-b592-811f09089a49\") " pod="openstack/nova-cell1-novncproxy-0" Dec 08 09:25:40 crc kubenswrapper[4776]: I1208 09:25:40.965443 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f485895b-f2aa-427f-b592-811f09089a49-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f485895b-f2aa-427f-b592-811f09089a49\") " pod="openstack/nova-cell1-novncproxy-0" Dec 08 09:25:41 crc kubenswrapper[4776]: I1208 09:25:41.066888 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/f485895b-f2aa-427f-b592-811f09089a49-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f485895b-f2aa-427f-b592-811f09089a49\") " pod="openstack/nova-cell1-novncproxy-0" Dec 08 09:25:41 crc kubenswrapper[4776]: I1208 09:25:41.066959 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f485895b-f2aa-427f-b592-811f09089a49-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f485895b-f2aa-427f-b592-811f09089a49\") " pod="openstack/nova-cell1-novncproxy-0" Dec 08 09:25:41 crc kubenswrapper[4776]: I1208 09:25:41.066979 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f485895b-f2aa-427f-b592-811f09089a49-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f485895b-f2aa-427f-b592-811f09089a49\") " pod="openstack/nova-cell1-novncproxy-0" Dec 08 09:25:41 crc kubenswrapper[4776]: I1208 
09:25:41.067066 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dptx\" (UniqueName: \"kubernetes.io/projected/f485895b-f2aa-427f-b592-811f09089a49-kube-api-access-5dptx\") pod \"nova-cell1-novncproxy-0\" (UID: \"f485895b-f2aa-427f-b592-811f09089a49\") " pod="openstack/nova-cell1-novncproxy-0" Dec 08 09:25:41 crc kubenswrapper[4776]: I1208 09:25:41.067145 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/f485895b-f2aa-427f-b592-811f09089a49-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f485895b-f2aa-427f-b592-811f09089a49\") " pod="openstack/nova-cell1-novncproxy-0" Dec 08 09:25:41 crc kubenswrapper[4776]: I1208 09:25:41.073355 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f485895b-f2aa-427f-b592-811f09089a49-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f485895b-f2aa-427f-b592-811f09089a49\") " pod="openstack/nova-cell1-novncproxy-0" Dec 08 09:25:41 crc kubenswrapper[4776]: I1208 09:25:41.074366 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f485895b-f2aa-427f-b592-811f09089a49-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f485895b-f2aa-427f-b592-811f09089a49\") " pod="openstack/nova-cell1-novncproxy-0" Dec 08 09:25:41 crc kubenswrapper[4776]: I1208 09:25:41.074759 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/f485895b-f2aa-427f-b592-811f09089a49-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f485895b-f2aa-427f-b592-811f09089a49\") " pod="openstack/nova-cell1-novncproxy-0" Dec 08 09:25:41 crc kubenswrapper[4776]: I1208 09:25:41.076675 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/f485895b-f2aa-427f-b592-811f09089a49-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f485895b-f2aa-427f-b592-811f09089a49\") " pod="openstack/nova-cell1-novncproxy-0" Dec 08 09:25:41 crc kubenswrapper[4776]: I1208 09:25:41.097965 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dptx\" (UniqueName: \"kubernetes.io/projected/f485895b-f2aa-427f-b592-811f09089a49-kube-api-access-5dptx\") pod \"nova-cell1-novncproxy-0\" (UID: \"f485895b-f2aa-427f-b592-811f09089a49\") " pod="openstack/nova-cell1-novncproxy-0" Dec 08 09:25:41 crc kubenswrapper[4776]: I1208 09:25:41.144797 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 08 09:25:41 crc kubenswrapper[4776]: I1208 09:25:41.662343 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 08 09:25:41 crc kubenswrapper[4776]: W1208 09:25:41.675736 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf485895b_f2aa_427f_b592_811f09089a49.slice/crio-ac1ff0530232497fc41b3e5ce0de2e9f3b9c4ce050dfc2fb0d7e6d5d7dcb485c WatchSource:0}: Error finding container ac1ff0530232497fc41b3e5ce0de2e9f3b9c4ce050dfc2fb0d7e6d5d7dcb485c: Status 404 returned error can't find the container with id ac1ff0530232497fc41b3e5ce0de2e9f3b9c4ce050dfc2fb0d7e6d5d7dcb485c Dec 08 09:25:41 crc kubenswrapper[4776]: I1208 09:25:41.740600 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f485895b-f2aa-427f-b592-811f09089a49","Type":"ContainerStarted","Data":"ac1ff0530232497fc41b3e5ce0de2e9f3b9c4ce050dfc2fb0d7e6d5d7dcb485c"} Dec 08 09:25:41 crc kubenswrapper[4776]: I1208 09:25:41.741961 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"91f1949a-9307-4999-b50c-2d5a747e4571","Type":"ContainerStarted","Data":"f67f3ab1a1668d9e04c5cd79e1030d334f5d8f0cd3b3f1af06d6b25d1e5bc706"} Dec 08 09:25:42 crc kubenswrapper[4776]: I1208 09:25:42.355442 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e5b39ee-eda8-48e5-b374-b1330cbb7b08" path="/var/lib/kubelet/pods/8e5b39ee-eda8-48e5-b374-b1330cbb7b08/volumes" Dec 08 09:25:42 crc kubenswrapper[4776]: I1208 09:25:42.754836 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91f1949a-9307-4999-b50c-2d5a747e4571","Type":"ContainerStarted","Data":"278c3db6ab4e3d2e6cb99faf31918c9c5eab103a82a74caa6c858958825f8d14"} Dec 08 09:25:42 crc kubenswrapper[4776]: I1208 09:25:42.754975 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 08 09:25:42 crc kubenswrapper[4776]: I1208 09:25:42.755958 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f485895b-f2aa-427f-b592-811f09089a49","Type":"ContainerStarted","Data":"18d685ccaa25dbc5e9a3703b3f83780dc6e2e751f9d5170cac410e9522d91117"} Dec 08 09:25:42 crc kubenswrapper[4776]: I1208 09:25:42.788869 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.360362707 podStartE2EDuration="5.788852278s" podCreationTimestamp="2025-12-08 09:25:37 +0000 UTC" firstStartedPulling="2025-12-08 09:25:38.963857068 +0000 UTC m=+1615.227082090" lastFinishedPulling="2025-12-08 09:25:42.392346639 +0000 UTC m=+1618.655571661" observedRunningTime="2025-12-08 09:25:42.78594834 +0000 UTC m=+1619.049173362" watchObservedRunningTime="2025-12-08 09:25:42.788852278 +0000 UTC m=+1619.052077300" Dec 08 09:25:42 crc kubenswrapper[4776]: I1208 09:25:42.820067 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.820047345 
podStartE2EDuration="2.820047345s" podCreationTimestamp="2025-12-08 09:25:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:25:42.80944075 +0000 UTC m=+1619.072665772" watchObservedRunningTime="2025-12-08 09:25:42.820047345 +0000 UTC m=+1619.083272367" Dec 08 09:25:42 crc kubenswrapper[4776]: I1208 09:25:42.840949 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 08 09:25:42 crc kubenswrapper[4776]: I1208 09:25:42.854630 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 08 09:25:42 crc kubenswrapper[4776]: I1208 09:25:42.862320 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 08 09:25:43 crc kubenswrapper[4776]: I1208 09:25:43.783089 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 08 09:25:46 crc kubenswrapper[4776]: I1208 09:25:46.145608 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 08 09:25:46 crc kubenswrapper[4776]: I1208 09:25:46.160541 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 08 09:25:46 crc kubenswrapper[4776]: I1208 09:25:46.160834 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 08 09:25:46 crc kubenswrapper[4776]: I1208 09:25:46.162405 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 08 09:25:46 crc kubenswrapper[4776]: I1208 09:25:46.163452 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 08 09:25:46 crc kubenswrapper[4776]: I1208 09:25:46.823506 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-api-0" Dec 08 09:25:46 crc kubenswrapper[4776]: I1208 09:25:46.913699 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 08 09:25:47 crc kubenswrapper[4776]: I1208 09:25:47.121820 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-rgtmn"] Dec 08 09:25:47 crc kubenswrapper[4776]: I1208 09:25:47.132119 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-rgtmn" Dec 08 09:25:47 crc kubenswrapper[4776]: I1208 09:25:47.134630 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-rgtmn"] Dec 08 09:25:47 crc kubenswrapper[4776]: I1208 09:25:47.238423 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s64h8\" (UniqueName: \"kubernetes.io/projected/9f6c0b05-3b5e-465b-8edc-f276b2ff8c42-kube-api-access-s64h8\") pod \"dnsmasq-dns-6b7bbf7cf9-rgtmn\" (UID: \"9f6c0b05-3b5e-465b-8edc-f276b2ff8c42\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-rgtmn" Dec 08 09:25:47 crc kubenswrapper[4776]: I1208 09:25:47.239074 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9f6c0b05-3b5e-465b-8edc-f276b2ff8c42-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7bbf7cf9-rgtmn\" (UID: \"9f6c0b05-3b5e-465b-8edc-f276b2ff8c42\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-rgtmn" Dec 08 09:25:47 crc kubenswrapper[4776]: I1208 09:25:47.239237 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f6c0b05-3b5e-465b-8edc-f276b2ff8c42-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7bbf7cf9-rgtmn\" (UID: \"9f6c0b05-3b5e-465b-8edc-f276b2ff8c42\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-rgtmn" Dec 08 09:25:47 crc kubenswrapper[4776]: I1208 
09:25:47.239456 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f6c0b05-3b5e-465b-8edc-f276b2ff8c42-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7bbf7cf9-rgtmn\" (UID: \"9f6c0b05-3b5e-465b-8edc-f276b2ff8c42\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-rgtmn"
Dec 08 09:25:47 crc kubenswrapper[4776]: I1208 09:25:47.239553 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f6c0b05-3b5e-465b-8edc-f276b2ff8c42-config\") pod \"dnsmasq-dns-6b7bbf7cf9-rgtmn\" (UID: \"9f6c0b05-3b5e-465b-8edc-f276b2ff8c42\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-rgtmn"
Dec 08 09:25:47 crc kubenswrapper[4776]: I1208 09:25:47.239665 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f6c0b05-3b5e-465b-8edc-f276b2ff8c42-dns-svc\") pod \"dnsmasq-dns-6b7bbf7cf9-rgtmn\" (UID: \"9f6c0b05-3b5e-465b-8edc-f276b2ff8c42\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-rgtmn"
Dec 08 09:25:47 crc kubenswrapper[4776]: I1208 09:25:47.341941 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s64h8\" (UniqueName: \"kubernetes.io/projected/9f6c0b05-3b5e-465b-8edc-f276b2ff8c42-kube-api-access-s64h8\") pod \"dnsmasq-dns-6b7bbf7cf9-rgtmn\" (UID: \"9f6c0b05-3b5e-465b-8edc-f276b2ff8c42\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-rgtmn"
Dec 08 09:25:47 crc kubenswrapper[4776]: I1208 09:25:47.342196 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9f6c0b05-3b5e-465b-8edc-f276b2ff8c42-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7bbf7cf9-rgtmn\" (UID: \"9f6c0b05-3b5e-465b-8edc-f276b2ff8c42\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-rgtmn"
Dec 08 09:25:47 crc kubenswrapper[4776]: I1208 09:25:47.342300 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f6c0b05-3b5e-465b-8edc-f276b2ff8c42-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7bbf7cf9-rgtmn\" (UID: \"9f6c0b05-3b5e-465b-8edc-f276b2ff8c42\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-rgtmn"
Dec 08 09:25:47 crc kubenswrapper[4776]: I1208 09:25:47.342454 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f6c0b05-3b5e-465b-8edc-f276b2ff8c42-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7bbf7cf9-rgtmn\" (UID: \"9f6c0b05-3b5e-465b-8edc-f276b2ff8c42\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-rgtmn"
Dec 08 09:25:47 crc kubenswrapper[4776]: I1208 09:25:47.342529 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f6c0b05-3b5e-465b-8edc-f276b2ff8c42-config\") pod \"dnsmasq-dns-6b7bbf7cf9-rgtmn\" (UID: \"9f6c0b05-3b5e-465b-8edc-f276b2ff8c42\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-rgtmn"
Dec 08 09:25:47 crc kubenswrapper[4776]: I1208 09:25:47.342624 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f6c0b05-3b5e-465b-8edc-f276b2ff8c42-dns-svc\") pod \"dnsmasq-dns-6b7bbf7cf9-rgtmn\" (UID: \"9f6c0b05-3b5e-465b-8edc-f276b2ff8c42\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-rgtmn"
Dec 08 09:25:47 crc kubenswrapper[4776]: I1208 09:25:47.343490 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f6c0b05-3b5e-465b-8edc-f276b2ff8c42-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7bbf7cf9-rgtmn\" (UID: \"9f6c0b05-3b5e-465b-8edc-f276b2ff8c42\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-rgtmn"
Dec 08 09:25:47 crc kubenswrapper[4776]: I1208 09:25:47.343507 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f6c0b05-3b5e-465b-8edc-f276b2ff8c42-dns-svc\") pod \"dnsmasq-dns-6b7bbf7cf9-rgtmn\" (UID: \"9f6c0b05-3b5e-465b-8edc-f276b2ff8c42\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-rgtmn"
Dec 08 09:25:47 crc kubenswrapper[4776]: I1208 09:25:47.343536 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f6c0b05-3b5e-465b-8edc-f276b2ff8c42-config\") pod \"dnsmasq-dns-6b7bbf7cf9-rgtmn\" (UID: \"9f6c0b05-3b5e-465b-8edc-f276b2ff8c42\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-rgtmn"
Dec 08 09:25:47 crc kubenswrapper[4776]: I1208 09:25:47.343641 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f6c0b05-3b5e-465b-8edc-f276b2ff8c42-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7bbf7cf9-rgtmn\" (UID: \"9f6c0b05-3b5e-465b-8edc-f276b2ff8c42\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-rgtmn"
Dec 08 09:25:47 crc kubenswrapper[4776]: I1208 09:25:47.344914 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9f6c0b05-3b5e-465b-8edc-f276b2ff8c42-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7bbf7cf9-rgtmn\" (UID: \"9f6c0b05-3b5e-465b-8edc-f276b2ff8c42\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-rgtmn"
Dec 08 09:25:47 crc kubenswrapper[4776]: I1208 09:25:47.363957 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s64h8\" (UniqueName: \"kubernetes.io/projected/9f6c0b05-3b5e-465b-8edc-f276b2ff8c42-kube-api-access-s64h8\") pod \"dnsmasq-dns-6b7bbf7cf9-rgtmn\" (UID: \"9f6c0b05-3b5e-465b-8edc-f276b2ff8c42\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-rgtmn"
Dec 08 09:25:47 crc kubenswrapper[4776]: I1208 09:25:47.450233 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-rgtmn"
Dec 08 09:25:47 crc kubenswrapper[4776]: I1208 09:25:47.940999 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-rgtmn"]
Dec 08 09:25:48 crc kubenswrapper[4776]: I1208 09:25:48.344632 4776 scope.go:117] "RemoveContainer" containerID="bd00cf9685c68dc97ea68681aad0bfeeb2d56965a763cd3cffb3c7d3789a7341"
Dec 08 09:25:48 crc kubenswrapper[4776]: E1208 09:25:48.345343 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268"
Dec 08 09:25:48 crc kubenswrapper[4776]: I1208 09:25:48.844690 4776 generic.go:334] "Generic (PLEG): container finished" podID="9f6c0b05-3b5e-465b-8edc-f276b2ff8c42" containerID="40af2f431091aebd32f2fa532e0dde354c0315c8ecef6bc5bd3d8b323982e4f1" exitCode=0
Dec 08 09:25:48 crc kubenswrapper[4776]: I1208 09:25:48.844739 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-rgtmn" event={"ID":"9f6c0b05-3b5e-465b-8edc-f276b2ff8c42","Type":"ContainerDied","Data":"40af2f431091aebd32f2fa532e0dde354c0315c8ecef6bc5bd3d8b323982e4f1"}
Dec 08 09:25:48 crc kubenswrapper[4776]: I1208 09:25:48.844789 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-rgtmn" event={"ID":"9f6c0b05-3b5e-465b-8edc-f276b2ff8c42","Type":"ContainerStarted","Data":"51d3ee02d4b53d9166424cae686014239ebb0846e93c617f0113c453ef823857"}
Dec 08 09:25:49 crc kubenswrapper[4776]: I1208 09:25:49.857483 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-rgtmn" event={"ID":"9f6c0b05-3b5e-465b-8edc-f276b2ff8c42","Type":"ContainerStarted","Data":"0341717a9debbbd6ed2e3f02feaee7c8666680fa5d4f9aa99ce1fdfeef78fb8c"}
Dec 08 09:25:49 crc kubenswrapper[4776]: I1208 09:25:49.857926 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b7bbf7cf9-rgtmn"
Dec 08 09:25:49 crc kubenswrapper[4776]: I1208 09:25:49.889200 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b7bbf7cf9-rgtmn" podStartSLOduration=2.889159263 podStartE2EDuration="2.889159263s" podCreationTimestamp="2025-12-08 09:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:25:49.874003726 +0000 UTC m=+1626.137228738" watchObservedRunningTime="2025-12-08 09:25:49.889159263 +0000 UTC m=+1626.152384295"
Dec 08 09:25:50 crc kubenswrapper[4776]: I1208 09:25:50.078987 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 08 09:25:50 crc kubenswrapper[4776]: I1208 09:25:50.079350 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="91f1949a-9307-4999-b50c-2d5a747e4571" containerName="ceilometer-central-agent" containerID="cri-o://a067de7097b95ef850013617661524d3759368af152b1652daee43af5022e6c4" gracePeriod=30
Dec 08 09:25:50 crc kubenswrapper[4776]: I1208 09:25:50.080720 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="91f1949a-9307-4999-b50c-2d5a747e4571" containerName="proxy-httpd" containerID="cri-o://278c3db6ab4e3d2e6cb99faf31918c9c5eab103a82a74caa6c858958825f8d14" gracePeriod=30
Dec 08 09:25:50 crc kubenswrapper[4776]: I1208 09:25:50.080894 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="91f1949a-9307-4999-b50c-2d5a747e4571" containerName="ceilometer-notification-agent" containerID="cri-o://9dbce126bf8853edf4f1c03952ba57abdf8542625da28eb66fcc8d5b21342589" gracePeriod=30
Dec 08 09:25:50 crc kubenswrapper[4776]: I1208 09:25:50.080949 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="91f1949a-9307-4999-b50c-2d5a747e4571" containerName="sg-core" containerID="cri-o://f67f3ab1a1668d9e04c5cd79e1030d334f5d8f0cd3b3f1af06d6b25d1e5bc706" gracePeriod=30
Dec 08 09:25:50 crc kubenswrapper[4776]: I1208 09:25:50.313311 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Dec 08 09:25:50 crc kubenswrapper[4776]: I1208 09:25:50.314303 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2fe2799e-2d16-4569-874b-a0066d38087b" containerName="nova-api-log" containerID="cri-o://2e51c1ea01b38323d942e9c96fc1a6ccdffedfd437e17ebbca279d55f9072135" gracePeriod=30
Dec 08 09:25:50 crc kubenswrapper[4776]: I1208 09:25:50.314437 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2fe2799e-2d16-4569-874b-a0066d38087b" containerName="nova-api-api" containerID="cri-o://3d0078c422c14eb90a83825f183764ed8c85b60f82f3c7def9c1d237c12d5859" gracePeriod=30
Dec 08 09:25:50 crc kubenswrapper[4776]: I1208 09:25:50.874097 4776 generic.go:334] "Generic (PLEG): container finished" podID="2fe2799e-2d16-4569-874b-a0066d38087b" containerID="2e51c1ea01b38323d942e9c96fc1a6ccdffedfd437e17ebbca279d55f9072135" exitCode=143
Dec 08 09:25:50 crc kubenswrapper[4776]: I1208 09:25:50.874213 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2fe2799e-2d16-4569-874b-a0066d38087b","Type":"ContainerDied","Data":"2e51c1ea01b38323d942e9c96fc1a6ccdffedfd437e17ebbca279d55f9072135"}
Dec 08 09:25:50 crc kubenswrapper[4776]: I1208 09:25:50.877611 4776 generic.go:334] "Generic (PLEG): container finished" podID="91f1949a-9307-4999-b50c-2d5a747e4571" containerID="278c3db6ab4e3d2e6cb99faf31918c9c5eab103a82a74caa6c858958825f8d14" exitCode=0
Dec 08 09:25:50 crc kubenswrapper[4776]: I1208 09:25:50.877639 4776 generic.go:334] "Generic (PLEG): container finished" podID="91f1949a-9307-4999-b50c-2d5a747e4571" containerID="f67f3ab1a1668d9e04c5cd79e1030d334f5d8f0cd3b3f1af06d6b25d1e5bc706" exitCode=2
Dec 08 09:25:50 crc kubenswrapper[4776]: I1208 09:25:50.877646 4776 generic.go:334] "Generic (PLEG): container finished" podID="91f1949a-9307-4999-b50c-2d5a747e4571" containerID="9dbce126bf8853edf4f1c03952ba57abdf8542625da28eb66fcc8d5b21342589" exitCode=0
Dec 08 09:25:50 crc kubenswrapper[4776]: I1208 09:25:50.877653 4776 generic.go:334] "Generic (PLEG): container finished" podID="91f1949a-9307-4999-b50c-2d5a747e4571" containerID="a067de7097b95ef850013617661524d3759368af152b1652daee43af5022e6c4" exitCode=0
Dec 08 09:25:50 crc kubenswrapper[4776]: I1208 09:25:50.878112 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91f1949a-9307-4999-b50c-2d5a747e4571","Type":"ContainerDied","Data":"278c3db6ab4e3d2e6cb99faf31918c9c5eab103a82a74caa6c858958825f8d14"}
Dec 08 09:25:50 crc kubenswrapper[4776]: I1208 09:25:50.878151 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91f1949a-9307-4999-b50c-2d5a747e4571","Type":"ContainerDied","Data":"f67f3ab1a1668d9e04c5cd79e1030d334f5d8f0cd3b3f1af06d6b25d1e5bc706"}
Dec 08 09:25:50 crc kubenswrapper[4776]: I1208 09:25:50.878161 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91f1949a-9307-4999-b50c-2d5a747e4571","Type":"ContainerDied","Data":"9dbce126bf8853edf4f1c03952ba57abdf8542625da28eb66fcc8d5b21342589"}
Dec 08 09:25:50 crc kubenswrapper[4776]: I1208 09:25:50.878182 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91f1949a-9307-4999-b50c-2d5a747e4571","Type":"ContainerDied","Data":"a067de7097b95ef850013617661524d3759368af152b1652daee43af5022e6c4"}
Dec 08 09:25:51 crc kubenswrapper[4776]: I1208 09:25:51.146537 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Dec 08 09:25:52 crc kubenswrapper[4776]: I1208 09:25:52.435777 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Dec 08 09:25:52 crc kubenswrapper[4776]: I1208 09:25:52.459134 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Dec 08 09:25:52 crc kubenswrapper[4776]: I1208 09:25:52.531874 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 08 09:25:52 crc kubenswrapper[4776]: I1208 09:25:52.608933 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/91f1949a-9307-4999-b50c-2d5a747e4571-sg-core-conf-yaml\") pod \"91f1949a-9307-4999-b50c-2d5a747e4571\" (UID: \"91f1949a-9307-4999-b50c-2d5a747e4571\") "
Dec 08 09:25:52 crc kubenswrapper[4776]: I1208 09:25:52.609418 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91f1949a-9307-4999-b50c-2d5a747e4571-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "91f1949a-9307-4999-b50c-2d5a747e4571" (UID: "91f1949a-9307-4999-b50c-2d5a747e4571"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 08 09:25:52 crc kubenswrapper[4776]: I1208 09:25:52.610200 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91f1949a-9307-4999-b50c-2d5a747e4571-log-httpd\") pod \"91f1949a-9307-4999-b50c-2d5a747e4571\" (UID: \"91f1949a-9307-4999-b50c-2d5a747e4571\") "
Dec 08 09:25:52 crc kubenswrapper[4776]: I1208 09:25:52.610310 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91f1949a-9307-4999-b50c-2d5a747e4571-config-data\") pod \"91f1949a-9307-4999-b50c-2d5a747e4571\" (UID: \"91f1949a-9307-4999-b50c-2d5a747e4571\") "
Dec 08 09:25:52 crc kubenswrapper[4776]: I1208 09:25:52.610387 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dq55z\" (UniqueName: \"kubernetes.io/projected/91f1949a-9307-4999-b50c-2d5a747e4571-kube-api-access-dq55z\") pod \"91f1949a-9307-4999-b50c-2d5a747e4571\" (UID: \"91f1949a-9307-4999-b50c-2d5a747e4571\") "
Dec 08 09:25:52 crc kubenswrapper[4776]: I1208 09:25:52.610475 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91f1949a-9307-4999-b50c-2d5a747e4571-scripts\") pod \"91f1949a-9307-4999-b50c-2d5a747e4571\" (UID: \"91f1949a-9307-4999-b50c-2d5a747e4571\") "
Dec 08 09:25:52 crc kubenswrapper[4776]: I1208 09:25:52.610501 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91f1949a-9307-4999-b50c-2d5a747e4571-combined-ca-bundle\") pod \"91f1949a-9307-4999-b50c-2d5a747e4571\" (UID: \"91f1949a-9307-4999-b50c-2d5a747e4571\") "
Dec 08 09:25:52 crc kubenswrapper[4776]: I1208 09:25:52.610607 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91f1949a-9307-4999-b50c-2d5a747e4571-run-httpd\") pod \"91f1949a-9307-4999-b50c-2d5a747e4571\" (UID: \"91f1949a-9307-4999-b50c-2d5a747e4571\") "
Dec 08 09:25:52 crc kubenswrapper[4776]: I1208 09:25:52.611286 4776 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91f1949a-9307-4999-b50c-2d5a747e4571-log-httpd\") on node \"crc\" DevicePath \"\""
Dec 08 09:25:52 crc kubenswrapper[4776]: I1208 09:25:52.620664 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91f1949a-9307-4999-b50c-2d5a747e4571-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "91f1949a-9307-4999-b50c-2d5a747e4571" (UID: "91f1949a-9307-4999-b50c-2d5a747e4571"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 08 09:25:52 crc kubenswrapper[4776]: I1208 09:25:52.621647 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91f1949a-9307-4999-b50c-2d5a747e4571-kube-api-access-dq55z" (OuterVolumeSpecName: "kube-api-access-dq55z") pod "91f1949a-9307-4999-b50c-2d5a747e4571" (UID: "91f1949a-9307-4999-b50c-2d5a747e4571"). InnerVolumeSpecName "kube-api-access-dq55z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 08 09:25:52 crc kubenswrapper[4776]: I1208 09:25:52.623263 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91f1949a-9307-4999-b50c-2d5a747e4571-scripts" (OuterVolumeSpecName: "scripts") pod "91f1949a-9307-4999-b50c-2d5a747e4571" (UID: "91f1949a-9307-4999-b50c-2d5a747e4571"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 08 09:25:52 crc kubenswrapper[4776]: I1208 09:25:52.644728 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91f1949a-9307-4999-b50c-2d5a747e4571-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "91f1949a-9307-4999-b50c-2d5a747e4571" (UID: "91f1949a-9307-4999-b50c-2d5a747e4571"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 08 09:25:52 crc kubenswrapper[4776]: I1208 09:25:52.713795 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dq55z\" (UniqueName: \"kubernetes.io/projected/91f1949a-9307-4999-b50c-2d5a747e4571-kube-api-access-dq55z\") on node \"crc\" DevicePath \"\""
Dec 08 09:25:52 crc kubenswrapper[4776]: I1208 09:25:52.713825 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91f1949a-9307-4999-b50c-2d5a747e4571-scripts\") on node \"crc\" DevicePath \"\""
Dec 08 09:25:52 crc kubenswrapper[4776]: I1208 09:25:52.713833 4776 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91f1949a-9307-4999-b50c-2d5a747e4571-run-httpd\") on node \"crc\" DevicePath \"\""
Dec 08 09:25:52 crc kubenswrapper[4776]: I1208 09:25:52.713842 4776 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/91f1949a-9307-4999-b50c-2d5a747e4571-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Dec 08 09:25:52 crc kubenswrapper[4776]: I1208 09:25:52.767110 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-kdpqs"]
Dec 08 09:25:52 crc kubenswrapper[4776]: E1208 09:25:52.768597 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91f1949a-9307-4999-b50c-2d5a747e4571" containerName="sg-core"
Dec 08 09:25:52 crc kubenswrapper[4776]: I1208 09:25:52.768624 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="91f1949a-9307-4999-b50c-2d5a747e4571" containerName="sg-core"
Dec 08 09:25:52 crc kubenswrapper[4776]: E1208 09:25:52.768649 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91f1949a-9307-4999-b50c-2d5a747e4571" containerName="ceilometer-central-agent"
Dec 08 09:25:52 crc kubenswrapper[4776]: I1208 09:25:52.768659 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="91f1949a-9307-4999-b50c-2d5a747e4571" containerName="ceilometer-central-agent"
Dec 08 09:25:52 crc kubenswrapper[4776]: E1208 09:25:52.768694 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91f1949a-9307-4999-b50c-2d5a747e4571" containerName="proxy-httpd"
Dec 08 09:25:52 crc kubenswrapper[4776]: I1208 09:25:52.768703 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="91f1949a-9307-4999-b50c-2d5a747e4571" containerName="proxy-httpd"
Dec 08 09:25:52 crc kubenswrapper[4776]: E1208 09:25:52.768727 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91f1949a-9307-4999-b50c-2d5a747e4571" containerName="ceilometer-notification-agent"
Dec 08 09:25:52 crc kubenswrapper[4776]: I1208 09:25:52.768739 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="91f1949a-9307-4999-b50c-2d5a747e4571" containerName="ceilometer-notification-agent"
Dec 08 09:25:52 crc kubenswrapper[4776]: I1208 09:25:52.769109 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="91f1949a-9307-4999-b50c-2d5a747e4571" containerName="ceilometer-central-agent"
Dec 08 09:25:52 crc kubenswrapper[4776]: I1208 09:25:52.769133 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="91f1949a-9307-4999-b50c-2d5a747e4571" containerName="sg-core"
Dec 08 09:25:52 crc kubenswrapper[4776]: I1208 09:25:52.769156 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="91f1949a-9307-4999-b50c-2d5a747e4571" containerName="proxy-httpd"
Dec 08 09:25:52 crc kubenswrapper[4776]: I1208 09:25:52.770327 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="91f1949a-9307-4999-b50c-2d5a747e4571" containerName="ceilometer-notification-agent"
Dec 08 09:25:52 crc kubenswrapper[4776]: I1208 09:25:52.774350 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-kdpqs"
Dec 08 09:25:52 crc kubenswrapper[4776]: I1208 09:25:52.779450 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Dec 08 09:25:52 crc kubenswrapper[4776]: I1208 09:25:52.779664 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Dec 08 09:25:52 crc kubenswrapper[4776]: I1208 09:25:52.786310 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91f1949a-9307-4999-b50c-2d5a747e4571-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "91f1949a-9307-4999-b50c-2d5a747e4571" (UID: "91f1949a-9307-4999-b50c-2d5a747e4571"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 08 09:25:52 crc kubenswrapper[4776]: I1208 09:25:52.804081 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-kdpqs"]
Dec 08 09:25:52 crc kubenswrapper[4776]: I1208 09:25:52.829260 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91f1949a-9307-4999-b50c-2d5a747e4571-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 08 09:25:52 crc kubenswrapper[4776]: I1208 09:25:52.845321 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91f1949a-9307-4999-b50c-2d5a747e4571-config-data" (OuterVolumeSpecName: "config-data") pod "91f1949a-9307-4999-b50c-2d5a747e4571" (UID: "91f1949a-9307-4999-b50c-2d5a747e4571"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 08 09:25:52 crc kubenswrapper[4776]: I1208 09:25:52.917421 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 08 09:25:52 crc kubenswrapper[4776]: I1208 09:25:52.918353 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91f1949a-9307-4999-b50c-2d5a747e4571","Type":"ContainerDied","Data":"f246e6066e84c7582e111a2e7f327024e5a574ed6b2cb5b105ee762db911c7e3"}
Dec 08 09:25:52 crc kubenswrapper[4776]: I1208 09:25:52.918396 4776 scope.go:117] "RemoveContainer" containerID="278c3db6ab4e3d2e6cb99faf31918c9c5eab103a82a74caa6c858958825f8d14"
Dec 08 09:25:52 crc kubenswrapper[4776]: I1208 09:25:52.934599 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc988dd0-b7f8-4793-8922-238ec7c3081b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-kdpqs\" (UID: \"bc988dd0-b7f8-4793-8922-238ec7c3081b\") " pod="openstack/nova-cell1-cell-mapping-kdpqs"
Dec 08 09:25:52 crc kubenswrapper[4776]: I1208 09:25:52.934693 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpkfg\" (UniqueName: \"kubernetes.io/projected/bc988dd0-b7f8-4793-8922-238ec7c3081b-kube-api-access-cpkfg\") pod \"nova-cell1-cell-mapping-kdpqs\" (UID: \"bc988dd0-b7f8-4793-8922-238ec7c3081b\") " pod="openstack/nova-cell1-cell-mapping-kdpqs"
Dec 08 09:25:52 crc kubenswrapper[4776]: I1208 09:25:52.934736 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc988dd0-b7f8-4793-8922-238ec7c3081b-scripts\") pod \"nova-cell1-cell-mapping-kdpqs\" (UID: \"bc988dd0-b7f8-4793-8922-238ec7c3081b\") " pod="openstack/nova-cell1-cell-mapping-kdpqs"
Dec 08 09:25:52 crc kubenswrapper[4776]: I1208 09:25:52.934756 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc988dd0-b7f8-4793-8922-238ec7c3081b-config-data\") pod \"nova-cell1-cell-mapping-kdpqs\" (UID: \"bc988dd0-b7f8-4793-8922-238ec7c3081b\") " pod="openstack/nova-cell1-cell-mapping-kdpqs"
Dec 08 09:25:52 crc kubenswrapper[4776]: I1208 09:25:52.934875 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91f1949a-9307-4999-b50c-2d5a747e4571-config-data\") on node \"crc\" DevicePath \"\""
Dec 08 09:25:52 crc kubenswrapper[4776]: I1208 09:25:52.988246 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 08 09:25:52 crc kubenswrapper[4776]: I1208 09:25:52.994361 4776 scope.go:117] "RemoveContainer" containerID="f67f3ab1a1668d9e04c5cd79e1030d334f5d8f0cd3b3f1af06d6b25d1e5bc706"
Dec 08 09:25:53 crc kubenswrapper[4776]: I1208 09:25:53.010244 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Dec 08 09:25:53 crc kubenswrapper[4776]: I1208 09:25:53.022310 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 08 09:25:53 crc kubenswrapper[4776]: I1208 09:25:53.025692 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 08 09:25:53 crc kubenswrapper[4776]: I1208 09:25:53.029561 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 08 09:25:53 crc kubenswrapper[4776]: I1208 09:25:53.029810 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 08 09:25:53 crc kubenswrapper[4776]: I1208 09:25:53.036165 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc988dd0-b7f8-4793-8922-238ec7c3081b-scripts\") pod \"nova-cell1-cell-mapping-kdpqs\" (UID: \"bc988dd0-b7f8-4793-8922-238ec7c3081b\") " pod="openstack/nova-cell1-cell-mapping-kdpqs"
Dec 08 09:25:53 crc kubenswrapper[4776]: I1208 09:25:53.036213 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc988dd0-b7f8-4793-8922-238ec7c3081b-config-data\") pod \"nova-cell1-cell-mapping-kdpqs\" (UID: \"bc988dd0-b7f8-4793-8922-238ec7c3081b\") " pod="openstack/nova-cell1-cell-mapping-kdpqs"
Dec 08 09:25:53 crc kubenswrapper[4776]: I1208 09:25:53.036375 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc988dd0-b7f8-4793-8922-238ec7c3081b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-kdpqs\" (UID: \"bc988dd0-b7f8-4793-8922-238ec7c3081b\") " pod="openstack/nova-cell1-cell-mapping-kdpqs"
Dec 08 09:25:53 crc kubenswrapper[4776]: I1208 09:25:53.036447 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpkfg\" (UniqueName: \"kubernetes.io/projected/bc988dd0-b7f8-4793-8922-238ec7c3081b-kube-api-access-cpkfg\") pod \"nova-cell1-cell-mapping-kdpqs\" (UID: \"bc988dd0-b7f8-4793-8922-238ec7c3081b\") " pod="openstack/nova-cell1-cell-mapping-kdpqs"
Dec 08 09:25:53 crc kubenswrapper[4776]: I1208 09:25:53.041827 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc988dd0-b7f8-4793-8922-238ec7c3081b-scripts\") pod \"nova-cell1-cell-mapping-kdpqs\" (UID: \"bc988dd0-b7f8-4793-8922-238ec7c3081b\") " pod="openstack/nova-cell1-cell-mapping-kdpqs"
Dec 08 09:25:53 crc kubenswrapper[4776]: I1208 09:25:53.045872 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc988dd0-b7f8-4793-8922-238ec7c3081b-config-data\") pod \"nova-cell1-cell-mapping-kdpqs\" (UID: \"bc988dd0-b7f8-4793-8922-238ec7c3081b\") " pod="openstack/nova-cell1-cell-mapping-kdpqs"
Dec 08 09:25:53 crc kubenswrapper[4776]: I1208 09:25:53.047924 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 08 09:25:53 crc kubenswrapper[4776]: I1208 09:25:53.053085 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc988dd0-b7f8-4793-8922-238ec7c3081b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-kdpqs\" (UID: \"bc988dd0-b7f8-4793-8922-238ec7c3081b\") " pod="openstack/nova-cell1-cell-mapping-kdpqs"
Dec 08 09:25:53 crc kubenswrapper[4776]: I1208 09:25:53.066806 4776 scope.go:117] "RemoveContainer" containerID="9dbce126bf8853edf4f1c03952ba57abdf8542625da28eb66fcc8d5b21342589"
Dec 08 09:25:53 crc kubenswrapper[4776]: I1208 09:25:53.068612 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpkfg\" (UniqueName: \"kubernetes.io/projected/bc988dd0-b7f8-4793-8922-238ec7c3081b-kube-api-access-cpkfg\") pod \"nova-cell1-cell-mapping-kdpqs\" (UID: \"bc988dd0-b7f8-4793-8922-238ec7c3081b\") " pod="openstack/nova-cell1-cell-mapping-kdpqs"
Dec 08 09:25:53 crc kubenswrapper[4776]: I1208 09:25:53.098406 4776 scope.go:117] "RemoveContainer" containerID="a067de7097b95ef850013617661524d3759368af152b1652daee43af5022e6c4"
Dec 08 09:25:53 crc kubenswrapper[4776]: I1208 09:25:53.106094 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-kdpqs"
Dec 08 09:25:53 crc kubenswrapper[4776]: I1208 09:25:53.138577 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2-config-data\") pod \"ceilometer-0\" (UID: \"4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2\") " pod="openstack/ceilometer-0"
Dec 08 09:25:53 crc kubenswrapper[4776]: I1208 09:25:53.138665 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tg2x\" (UniqueName: \"kubernetes.io/projected/4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2-kube-api-access-9tg2x\") pod \"ceilometer-0\" (UID: \"4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2\") " pod="openstack/ceilometer-0"
Dec 08 09:25:53 crc kubenswrapper[4776]: I1208 09:25:53.138918 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2-run-httpd\") pod \"ceilometer-0\" (UID: \"4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2\") " pod="openstack/ceilometer-0"
Dec 08 09:25:53 crc kubenswrapper[4776]: I1208 09:25:53.139206 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2-log-httpd\") pod \"ceilometer-0\" (UID: \"4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2\") " pod="openstack/ceilometer-0"
Dec 08 09:25:53 crc kubenswrapper[4776]: I1208 09:25:53.139260 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2\") " pod="openstack/ceilometer-0"
Dec 08 09:25:53 crc kubenswrapper[4776]: I1208 09:25:53.139378 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2\") " pod="openstack/ceilometer-0"
Dec 08 09:25:53 crc kubenswrapper[4776]: I1208 09:25:53.139422 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2-scripts\") pod \"ceilometer-0\" (UID: \"4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2\") " pod="openstack/ceilometer-0"
Dec 08 09:25:53 crc kubenswrapper[4776]: I1208 09:25:53.241794 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2-log-httpd\") pod \"ceilometer-0\" (UID: \"4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2\") " pod="openstack/ceilometer-0"
Dec 08 09:25:53 crc kubenswrapper[4776]: I1208 09:25:53.242215 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2\") " pod="openstack/ceilometer-0"
Dec 08 09:25:53 crc kubenswrapper[4776]: I1208 09:25:53.242284 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2\") " pod="openstack/ceilometer-0"
Dec 08 09:25:53 crc kubenswrapper[4776]: I1208 09:25:53.242317 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2-scripts\") pod \"ceilometer-0\" (UID: \"4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2\") " pod="openstack/ceilometer-0"
Dec 08 09:25:53 crc kubenswrapper[4776]: I1208 09:25:53.242383 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2-config-data\") pod \"ceilometer-0\" (UID: \"4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2\") " pod="openstack/ceilometer-0"
Dec 08 09:25:53 crc kubenswrapper[4776]: I1208 09:25:53.242466 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tg2x\" (UniqueName: \"kubernetes.io/projected/4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2-kube-api-access-9tg2x\") pod \"ceilometer-0\" (UID: \"4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2\") " pod="openstack/ceilometer-0"
Dec 08 09:25:53 crc kubenswrapper[4776]: I1208 09:25:53.242510 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2-run-httpd\") pod \"ceilometer-0\" (UID: \"4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2\") " pod="openstack/ceilometer-0"
Dec 08 09:25:53 crc kubenswrapper[4776]: I1208 09:25:53.243218 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2-log-httpd\") pod \"ceilometer-0\" (UID: \"4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2\") " pod="openstack/ceilometer-0"
Dec 08 09:25:53 crc kubenswrapper[4776]: I1208 09:25:53.243270 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2-run-httpd\") pod \"ceilometer-0\" (UID: \"4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2\") " pod="openstack/ceilometer-0"
Dec 08 09:25:53 crc kubenswrapper[4776]: I1208 09:25:53.247645 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2\") " pod="openstack/ceilometer-0"
Dec 08 09:25:53 crc kubenswrapper[4776]: I1208 09:25:53.251031 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2\") " pod="openstack/ceilometer-0"
Dec 08 09:25:53 crc kubenswrapper[4776]: I1208 09:25:53.255054 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2-config-data\") pod \"ceilometer-0\" (UID: \"4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2\") " pod="openstack/ceilometer-0"
Dec 08 09:25:53 crc kubenswrapper[4776]: I1208 09:25:53.255629 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2-scripts\") pod \"ceilometer-0\" (UID: \"4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2\") " pod="openstack/ceilometer-0"
Dec 08 09:25:53 crc kubenswrapper[4776]: I1208 09:25:53.260019 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tg2x\" (UniqueName: \"kubernetes.io/projected/4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2-kube-api-access-9tg2x\") pod \"ceilometer-0\" (UID: \"4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2\") " pod="openstack/ceilometer-0"
Dec 08 09:25:53 crc kubenswrapper[4776]: I1208 09:25:53.353683 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 08 09:25:53 crc kubenswrapper[4776]: I1208 09:25:53.630852 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-kdpqs"] Dec 08 09:25:53 crc kubenswrapper[4776]: I1208 09:25:53.853049 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 08 09:25:53 crc kubenswrapper[4776]: I1208 09:25:53.930980 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2","Type":"ContainerStarted","Data":"2f4a7f03efa29b59d81df582b5c478847c5d89a4bb37990aa5e85d63a0ad4344"} Dec 08 09:25:53 crc kubenswrapper[4776]: I1208 09:25:53.942552 4776 generic.go:334] "Generic (PLEG): container finished" podID="2fe2799e-2d16-4569-874b-a0066d38087b" containerID="3d0078c422c14eb90a83825f183764ed8c85b60f82f3c7def9c1d237c12d5859" exitCode=0 Dec 08 09:25:53 crc kubenswrapper[4776]: I1208 09:25:53.942614 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2fe2799e-2d16-4569-874b-a0066d38087b","Type":"ContainerDied","Data":"3d0078c422c14eb90a83825f183764ed8c85b60f82f3c7def9c1d237c12d5859"} Dec 08 09:25:53 crc kubenswrapper[4776]: I1208 09:25:53.948509 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-kdpqs" event={"ID":"bc988dd0-b7f8-4793-8922-238ec7c3081b","Type":"ContainerStarted","Data":"2df6524946b75a97f411eca2a816359eeaeed3ff3ede4991c33760287b254865"} Dec 08 09:25:53 crc kubenswrapper[4776]: I1208 09:25:53.980365 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 08 09:25:53 crc kubenswrapper[4776]: I1208 09:25:53.988135 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-kdpqs" podStartSLOduration=1.988117709 podStartE2EDuration="1.988117709s" podCreationTimestamp="2025-12-08 09:25:52 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:25:53.964842134 +0000 UTC m=+1630.228067146" watchObservedRunningTime="2025-12-08 09:25:53.988117709 +0000 UTC m=+1630.251342731" Dec 08 09:25:53 crc kubenswrapper[4776]: I1208 09:25:53.995741 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 08 09:25:54 crc kubenswrapper[4776]: I1208 09:25:54.059945 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fe2799e-2d16-4569-874b-a0066d38087b-combined-ca-bundle\") pod \"2fe2799e-2d16-4569-874b-a0066d38087b\" (UID: \"2fe2799e-2d16-4569-874b-a0066d38087b\") " Dec 08 09:25:54 crc kubenswrapper[4776]: I1208 09:25:54.060013 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fe2799e-2d16-4569-874b-a0066d38087b-config-data\") pod \"2fe2799e-2d16-4569-874b-a0066d38087b\" (UID: \"2fe2799e-2d16-4569-874b-a0066d38087b\") " Dec 08 09:25:54 crc kubenswrapper[4776]: I1208 09:25:54.060126 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fe2799e-2d16-4569-874b-a0066d38087b-logs\") pod \"2fe2799e-2d16-4569-874b-a0066d38087b\" (UID: \"2fe2799e-2d16-4569-874b-a0066d38087b\") " Dec 08 09:25:54 crc kubenswrapper[4776]: I1208 09:25:54.060260 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdj8f\" (UniqueName: \"kubernetes.io/projected/2fe2799e-2d16-4569-874b-a0066d38087b-kube-api-access-pdj8f\") pod \"2fe2799e-2d16-4569-874b-a0066d38087b\" (UID: \"2fe2799e-2d16-4569-874b-a0066d38087b\") " Dec 08 09:25:54 crc kubenswrapper[4776]: I1208 09:25:54.061654 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/2fe2799e-2d16-4569-874b-a0066d38087b-logs" (OuterVolumeSpecName: "logs") pod "2fe2799e-2d16-4569-874b-a0066d38087b" (UID: "2fe2799e-2d16-4569-874b-a0066d38087b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:25:54 crc kubenswrapper[4776]: I1208 09:25:54.069012 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fe2799e-2d16-4569-874b-a0066d38087b-kube-api-access-pdj8f" (OuterVolumeSpecName: "kube-api-access-pdj8f") pod "2fe2799e-2d16-4569-874b-a0066d38087b" (UID: "2fe2799e-2d16-4569-874b-a0066d38087b"). InnerVolumeSpecName "kube-api-access-pdj8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:25:54 crc kubenswrapper[4776]: I1208 09:25:54.094900 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fe2799e-2d16-4569-874b-a0066d38087b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2fe2799e-2d16-4569-874b-a0066d38087b" (UID: "2fe2799e-2d16-4569-874b-a0066d38087b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:25:54 crc kubenswrapper[4776]: I1208 09:25:54.101871 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fe2799e-2d16-4569-874b-a0066d38087b-config-data" (OuterVolumeSpecName: "config-data") pod "2fe2799e-2d16-4569-874b-a0066d38087b" (UID: "2fe2799e-2d16-4569-874b-a0066d38087b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:25:54 crc kubenswrapper[4776]: I1208 09:25:54.163500 4776 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fe2799e-2d16-4569-874b-a0066d38087b-logs\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:54 crc kubenswrapper[4776]: I1208 09:25:54.163558 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdj8f\" (UniqueName: \"kubernetes.io/projected/2fe2799e-2d16-4569-874b-a0066d38087b-kube-api-access-pdj8f\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:54 crc kubenswrapper[4776]: I1208 09:25:54.163571 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fe2799e-2d16-4569-874b-a0066d38087b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:54 crc kubenswrapper[4776]: I1208 09:25:54.163581 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fe2799e-2d16-4569-874b-a0066d38087b-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:54 crc kubenswrapper[4776]: I1208 09:25:54.358531 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91f1949a-9307-4999-b50c-2d5a747e4571" path="/var/lib/kubelet/pods/91f1949a-9307-4999-b50c-2d5a747e4571/volumes" Dec 08 09:25:54 crc kubenswrapper[4776]: I1208 09:25:54.973830 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2fe2799e-2d16-4569-874b-a0066d38087b","Type":"ContainerDied","Data":"e3f5c2b4235a2950e0e59795cc7b8912b807d9f4fd5a548962dea30abd744634"} Dec 08 09:25:54 crc kubenswrapper[4776]: I1208 09:25:54.973860 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 08 09:25:54 crc kubenswrapper[4776]: I1208 09:25:54.973884 4776 scope.go:117] "RemoveContainer" containerID="3d0078c422c14eb90a83825f183764ed8c85b60f82f3c7def9c1d237c12d5859" Dec 08 09:25:54 crc kubenswrapper[4776]: I1208 09:25:54.977693 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-kdpqs" event={"ID":"bc988dd0-b7f8-4793-8922-238ec7c3081b","Type":"ContainerStarted","Data":"aa98e5947586fa94673e30b99b6340d6bed0a6f212e05e6655444f8f45cbe1af"} Dec 08 09:25:54 crc kubenswrapper[4776]: I1208 09:25:54.983730 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2","Type":"ContainerStarted","Data":"6bc1cc169f8425246640502113f3fac50a76b9053faab1783a66c831bd0f556f"} Dec 08 09:25:55 crc kubenswrapper[4776]: I1208 09:25:55.100817 4776 scope.go:117] "RemoveContainer" containerID="2e51c1ea01b38323d942e9c96fc1a6ccdffedfd437e17ebbca279d55f9072135" Dec 08 09:25:55 crc kubenswrapper[4776]: I1208 09:25:55.127304 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 08 09:25:55 crc kubenswrapper[4776]: I1208 09:25:55.151434 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 08 09:25:55 crc kubenswrapper[4776]: I1208 09:25:55.160038 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 08 09:25:55 crc kubenswrapper[4776]: E1208 09:25:55.160682 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fe2799e-2d16-4569-874b-a0066d38087b" containerName="nova-api-log" Dec 08 09:25:55 crc kubenswrapper[4776]: I1208 09:25:55.160701 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fe2799e-2d16-4569-874b-a0066d38087b" containerName="nova-api-log" Dec 08 09:25:55 crc kubenswrapper[4776]: E1208 09:25:55.160717 4776 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2fe2799e-2d16-4569-874b-a0066d38087b" containerName="nova-api-api" Dec 08 09:25:55 crc kubenswrapper[4776]: I1208 09:25:55.160725 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fe2799e-2d16-4569-874b-a0066d38087b" containerName="nova-api-api" Dec 08 09:25:55 crc kubenswrapper[4776]: I1208 09:25:55.161026 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fe2799e-2d16-4569-874b-a0066d38087b" containerName="nova-api-log" Dec 08 09:25:55 crc kubenswrapper[4776]: I1208 09:25:55.161066 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fe2799e-2d16-4569-874b-a0066d38087b" containerName="nova-api-api" Dec 08 09:25:55 crc kubenswrapper[4776]: I1208 09:25:55.165451 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 08 09:25:55 crc kubenswrapper[4776]: I1208 09:25:55.168368 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 08 09:25:55 crc kubenswrapper[4776]: I1208 09:25:55.168765 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 08 09:25:55 crc kubenswrapper[4776]: I1208 09:25:55.168859 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 08 09:25:55 crc kubenswrapper[4776]: I1208 09:25:55.189109 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 08 09:25:55 crc kubenswrapper[4776]: I1208 09:25:55.289119 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9434f83d-ef01-4184-b7e7-e26d478e0589-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9434f83d-ef01-4184-b7e7-e26d478e0589\") " pod="openstack/nova-api-0" Dec 08 09:25:55 crc kubenswrapper[4776]: I1208 09:25:55.289279 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-dpv2w\" (UniqueName: \"kubernetes.io/projected/9434f83d-ef01-4184-b7e7-e26d478e0589-kube-api-access-dpv2w\") pod \"nova-api-0\" (UID: \"9434f83d-ef01-4184-b7e7-e26d478e0589\") " pod="openstack/nova-api-0" Dec 08 09:25:55 crc kubenswrapper[4776]: I1208 09:25:55.289482 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9434f83d-ef01-4184-b7e7-e26d478e0589-config-data\") pod \"nova-api-0\" (UID: \"9434f83d-ef01-4184-b7e7-e26d478e0589\") " pod="openstack/nova-api-0" Dec 08 09:25:55 crc kubenswrapper[4776]: I1208 09:25:55.289535 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9434f83d-ef01-4184-b7e7-e26d478e0589-logs\") pod \"nova-api-0\" (UID: \"9434f83d-ef01-4184-b7e7-e26d478e0589\") " pod="openstack/nova-api-0" Dec 08 09:25:55 crc kubenswrapper[4776]: I1208 09:25:55.289594 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9434f83d-ef01-4184-b7e7-e26d478e0589-public-tls-certs\") pod \"nova-api-0\" (UID: \"9434f83d-ef01-4184-b7e7-e26d478e0589\") " pod="openstack/nova-api-0" Dec 08 09:25:55 crc kubenswrapper[4776]: I1208 09:25:55.289764 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9434f83d-ef01-4184-b7e7-e26d478e0589-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9434f83d-ef01-4184-b7e7-e26d478e0589\") " pod="openstack/nova-api-0" Dec 08 09:25:55 crc kubenswrapper[4776]: I1208 09:25:55.391831 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpv2w\" (UniqueName: \"kubernetes.io/projected/9434f83d-ef01-4184-b7e7-e26d478e0589-kube-api-access-dpv2w\") pod \"nova-api-0\" 
(UID: \"9434f83d-ef01-4184-b7e7-e26d478e0589\") " pod="openstack/nova-api-0" Dec 08 09:25:55 crc kubenswrapper[4776]: I1208 09:25:55.391938 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9434f83d-ef01-4184-b7e7-e26d478e0589-config-data\") pod \"nova-api-0\" (UID: \"9434f83d-ef01-4184-b7e7-e26d478e0589\") " pod="openstack/nova-api-0" Dec 08 09:25:55 crc kubenswrapper[4776]: I1208 09:25:55.391962 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9434f83d-ef01-4184-b7e7-e26d478e0589-logs\") pod \"nova-api-0\" (UID: \"9434f83d-ef01-4184-b7e7-e26d478e0589\") " pod="openstack/nova-api-0" Dec 08 09:25:55 crc kubenswrapper[4776]: I1208 09:25:55.391984 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9434f83d-ef01-4184-b7e7-e26d478e0589-public-tls-certs\") pod \"nova-api-0\" (UID: \"9434f83d-ef01-4184-b7e7-e26d478e0589\") " pod="openstack/nova-api-0" Dec 08 09:25:55 crc kubenswrapper[4776]: I1208 09:25:55.392043 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9434f83d-ef01-4184-b7e7-e26d478e0589-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9434f83d-ef01-4184-b7e7-e26d478e0589\") " pod="openstack/nova-api-0" Dec 08 09:25:55 crc kubenswrapper[4776]: I1208 09:25:55.392103 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9434f83d-ef01-4184-b7e7-e26d478e0589-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9434f83d-ef01-4184-b7e7-e26d478e0589\") " pod="openstack/nova-api-0" Dec 08 09:25:55 crc kubenswrapper[4776]: I1208 09:25:55.392577 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/9434f83d-ef01-4184-b7e7-e26d478e0589-logs\") pod \"nova-api-0\" (UID: \"9434f83d-ef01-4184-b7e7-e26d478e0589\") " pod="openstack/nova-api-0" Dec 08 09:25:55 crc kubenswrapper[4776]: I1208 09:25:55.405266 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9434f83d-ef01-4184-b7e7-e26d478e0589-config-data\") pod \"nova-api-0\" (UID: \"9434f83d-ef01-4184-b7e7-e26d478e0589\") " pod="openstack/nova-api-0" Dec 08 09:25:55 crc kubenswrapper[4776]: I1208 09:25:55.406445 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9434f83d-ef01-4184-b7e7-e26d478e0589-public-tls-certs\") pod \"nova-api-0\" (UID: \"9434f83d-ef01-4184-b7e7-e26d478e0589\") " pod="openstack/nova-api-0" Dec 08 09:25:55 crc kubenswrapper[4776]: I1208 09:25:55.407088 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9434f83d-ef01-4184-b7e7-e26d478e0589-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9434f83d-ef01-4184-b7e7-e26d478e0589\") " pod="openstack/nova-api-0" Dec 08 09:25:55 crc kubenswrapper[4776]: I1208 09:25:55.413747 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpv2w\" (UniqueName: \"kubernetes.io/projected/9434f83d-ef01-4184-b7e7-e26d478e0589-kube-api-access-dpv2w\") pod \"nova-api-0\" (UID: \"9434f83d-ef01-4184-b7e7-e26d478e0589\") " pod="openstack/nova-api-0" Dec 08 09:25:55 crc kubenswrapper[4776]: I1208 09:25:55.413991 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9434f83d-ef01-4184-b7e7-e26d478e0589-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9434f83d-ef01-4184-b7e7-e26d478e0589\") " pod="openstack/nova-api-0" Dec 08 09:25:55 crc kubenswrapper[4776]: I1208 09:25:55.519525 4776 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 08 09:25:56 crc kubenswrapper[4776]: I1208 09:25:56.018150 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2","Type":"ContainerStarted","Data":"3e37b1b91d4df4b5b7adcfd70975fc7812e3ea603ba6ebf17cc7e99bdcfedd83"} Dec 08 09:25:56 crc kubenswrapper[4776]: I1208 09:25:56.167995 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 08 09:25:56 crc kubenswrapper[4776]: I1208 09:25:56.367595 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fe2799e-2d16-4569-874b-a0066d38087b" path="/var/lib/kubelet/pods/2fe2799e-2d16-4569-874b-a0066d38087b/volumes" Dec 08 09:25:57 crc kubenswrapper[4776]: I1208 09:25:57.032604 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2","Type":"ContainerStarted","Data":"f4b88adf886f9ee8dab76fc9ad32d403a2580f6e62e68801eaf20a3fa8877c2f"} Dec 08 09:25:57 crc kubenswrapper[4776]: I1208 09:25:57.035397 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9434f83d-ef01-4184-b7e7-e26d478e0589","Type":"ContainerStarted","Data":"d15ee32adc3021161efbfda2dd6fadba3087d419a555479d89d67fdbd80cd7d4"} Dec 08 09:25:57 crc kubenswrapper[4776]: I1208 09:25:57.035518 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9434f83d-ef01-4184-b7e7-e26d478e0589","Type":"ContainerStarted","Data":"1ce0d2f9912b96a573b4080fb27ea612f781b386667efb8938d9f4855b19e92f"} Dec 08 09:25:57 crc kubenswrapper[4776]: I1208 09:25:57.035583 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9434f83d-ef01-4184-b7e7-e26d478e0589","Type":"ContainerStarted","Data":"0455a5a8e46fb353a1d567c622a1c9d0df627d6a4e850008cbfe73d9705e40ef"} Dec 08 09:25:57 crc kubenswrapper[4776]: 
I1208 09:25:57.054534 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.054519554 podStartE2EDuration="2.054519554s" podCreationTimestamp="2025-12-08 09:25:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:25:57.053025233 +0000 UTC m=+1633.316250265" watchObservedRunningTime="2025-12-08 09:25:57.054519554 +0000 UTC m=+1633.317744576" Dec 08 09:25:57 crc kubenswrapper[4776]: I1208 09:25:57.452155 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b7bbf7cf9-rgtmn" Dec 08 09:25:57 crc kubenswrapper[4776]: I1208 09:25:57.533131 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-hzfv9"] Dec 08 09:25:57 crc kubenswrapper[4776]: I1208 09:25:57.533658 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-9b86998b5-hzfv9" podUID="474cb911-9e81-43ed-a828-52d9f03eb4df" containerName="dnsmasq-dns" containerID="cri-o://25ec6d64e4d550aad188f5bf9c1c6906cd4de1837719de1cdc3e17d679d5b853" gracePeriod=10 Dec 08 09:25:58 crc kubenswrapper[4776]: I1208 09:25:58.067948 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2","Type":"ContainerStarted","Data":"5ad89ed0dad1935ccb668b6aa868170d5ae5c915646c7490e3ca8111b36fd4d7"} Dec 08 09:25:58 crc kubenswrapper[4776]: I1208 09:25:58.068117 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2" containerName="ceilometer-central-agent" containerID="cri-o://6bc1cc169f8425246640502113f3fac50a76b9053faab1783a66c831bd0f556f" gracePeriod=30 Dec 08 09:25:58 crc kubenswrapper[4776]: I1208 09:25:58.068409 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/ceilometer-0" Dec 08 09:25:58 crc kubenswrapper[4776]: I1208 09:25:58.068741 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2" containerName="proxy-httpd" containerID="cri-o://5ad89ed0dad1935ccb668b6aa868170d5ae5c915646c7490e3ca8111b36fd4d7" gracePeriod=30 Dec 08 09:25:58 crc kubenswrapper[4776]: I1208 09:25:58.068787 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2" containerName="sg-core" containerID="cri-o://f4b88adf886f9ee8dab76fc9ad32d403a2580f6e62e68801eaf20a3fa8877c2f" gracePeriod=30 Dec 08 09:25:58 crc kubenswrapper[4776]: I1208 09:25:58.068819 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2" containerName="ceilometer-notification-agent" containerID="cri-o://3e37b1b91d4df4b5b7adcfd70975fc7812e3ea603ba6ebf17cc7e99bdcfedd83" gracePeriod=30 Dec 08 09:25:58 crc kubenswrapper[4776]: I1208 09:25:58.079725 4776 generic.go:334] "Generic (PLEG): container finished" podID="474cb911-9e81-43ed-a828-52d9f03eb4df" containerID="25ec6d64e4d550aad188f5bf9c1c6906cd4de1837719de1cdc3e17d679d5b853" exitCode=0 Dec 08 09:25:58 crc kubenswrapper[4776]: I1208 09:25:58.081941 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-hzfv9" event={"ID":"474cb911-9e81-43ed-a828-52d9f03eb4df","Type":"ContainerDied","Data":"25ec6d64e4d550aad188f5bf9c1c6906cd4de1837719de1cdc3e17d679d5b853"} Dec 08 09:25:58 crc kubenswrapper[4776]: I1208 09:25:58.104915 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.587894459 podStartE2EDuration="6.104896315s" podCreationTimestamp="2025-12-08 09:25:52 +0000 UTC" firstStartedPulling="2025-12-08 09:25:53.839408492 +0000 UTC 
m=+1630.102633514" lastFinishedPulling="2025-12-08 09:25:57.356410328 +0000 UTC m=+1633.619635370" observedRunningTime="2025-12-08 09:25:58.096745735 +0000 UTC m=+1634.359970777" watchObservedRunningTime="2025-12-08 09:25:58.104896315 +0000 UTC m=+1634.368121337" Dec 08 09:25:58 crc kubenswrapper[4776]: I1208 09:25:58.299492 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-hzfv9" Dec 08 09:25:58 crc kubenswrapper[4776]: I1208 09:25:58.375796 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/474cb911-9e81-43ed-a828-52d9f03eb4df-dns-svc\") pod \"474cb911-9e81-43ed-a828-52d9f03eb4df\" (UID: \"474cb911-9e81-43ed-a828-52d9f03eb4df\") " Dec 08 09:25:58 crc kubenswrapper[4776]: I1208 09:25:58.375971 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fk7px\" (UniqueName: \"kubernetes.io/projected/474cb911-9e81-43ed-a828-52d9f03eb4df-kube-api-access-fk7px\") pod \"474cb911-9e81-43ed-a828-52d9f03eb4df\" (UID: \"474cb911-9e81-43ed-a828-52d9f03eb4df\") " Dec 08 09:25:58 crc kubenswrapper[4776]: I1208 09:25:58.376006 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/474cb911-9e81-43ed-a828-52d9f03eb4df-ovsdbserver-nb\") pod \"474cb911-9e81-43ed-a828-52d9f03eb4df\" (UID: \"474cb911-9e81-43ed-a828-52d9f03eb4df\") " Dec 08 09:25:58 crc kubenswrapper[4776]: I1208 09:25:58.376044 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/474cb911-9e81-43ed-a828-52d9f03eb4df-ovsdbserver-sb\") pod \"474cb911-9e81-43ed-a828-52d9f03eb4df\" (UID: \"474cb911-9e81-43ed-a828-52d9f03eb4df\") " Dec 08 09:25:58 crc kubenswrapper[4776]: I1208 09:25:58.376161 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/474cb911-9e81-43ed-a828-52d9f03eb4df-dns-swift-storage-0\") pod \"474cb911-9e81-43ed-a828-52d9f03eb4df\" (UID: \"474cb911-9e81-43ed-a828-52d9f03eb4df\") " Dec 08 09:25:58 crc kubenswrapper[4776]: I1208 09:25:58.376282 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/474cb911-9e81-43ed-a828-52d9f03eb4df-config\") pod \"474cb911-9e81-43ed-a828-52d9f03eb4df\" (UID: \"474cb911-9e81-43ed-a828-52d9f03eb4df\") " Dec 08 09:25:58 crc kubenswrapper[4776]: I1208 09:25:58.407418 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/474cb911-9e81-43ed-a828-52d9f03eb4df-kube-api-access-fk7px" (OuterVolumeSpecName: "kube-api-access-fk7px") pod "474cb911-9e81-43ed-a828-52d9f03eb4df" (UID: "474cb911-9e81-43ed-a828-52d9f03eb4df"). InnerVolumeSpecName "kube-api-access-fk7px". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:25:58 crc kubenswrapper[4776]: I1208 09:25:58.435840 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/474cb911-9e81-43ed-a828-52d9f03eb4df-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "474cb911-9e81-43ed-a828-52d9f03eb4df" (UID: "474cb911-9e81-43ed-a828-52d9f03eb4df"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:25:58 crc kubenswrapper[4776]: I1208 09:25:58.464690 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/474cb911-9e81-43ed-a828-52d9f03eb4df-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "474cb911-9e81-43ed-a828-52d9f03eb4df" (UID: "474cb911-9e81-43ed-a828-52d9f03eb4df"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:25:58 crc kubenswrapper[4776]: I1208 09:25:58.471130 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/474cb911-9e81-43ed-a828-52d9f03eb4df-config" (OuterVolumeSpecName: "config") pod "474cb911-9e81-43ed-a828-52d9f03eb4df" (UID: "474cb911-9e81-43ed-a828-52d9f03eb4df"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:25:58 crc kubenswrapper[4776]: I1208 09:25:58.479519 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fk7px\" (UniqueName: \"kubernetes.io/projected/474cb911-9e81-43ed-a828-52d9f03eb4df-kube-api-access-fk7px\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:58 crc kubenswrapper[4776]: I1208 09:25:58.479556 4776 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/474cb911-9e81-43ed-a828-52d9f03eb4df-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:58 crc kubenswrapper[4776]: I1208 09:25:58.479571 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/474cb911-9e81-43ed-a828-52d9f03eb4df-config\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:58 crc kubenswrapper[4776]: I1208 09:25:58.479583 4776 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/474cb911-9e81-43ed-a828-52d9f03eb4df-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:58 crc kubenswrapper[4776]: I1208 09:25:58.481729 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/474cb911-9e81-43ed-a828-52d9f03eb4df-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "474cb911-9e81-43ed-a828-52d9f03eb4df" (UID: "474cb911-9e81-43ed-a828-52d9f03eb4df"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:25:58 crc kubenswrapper[4776]: I1208 09:25:58.493534 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/474cb911-9e81-43ed-a828-52d9f03eb4df-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "474cb911-9e81-43ed-a828-52d9f03eb4df" (UID: "474cb911-9e81-43ed-a828-52d9f03eb4df"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:25:58 crc kubenswrapper[4776]: I1208 09:25:58.581661 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/474cb911-9e81-43ed-a828-52d9f03eb4df-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:58 crc kubenswrapper[4776]: I1208 09:25:58.581694 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/474cb911-9e81-43ed-a828-52d9f03eb4df-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 08 09:25:59 crc kubenswrapper[4776]: I1208 09:25:59.092257 4776 generic.go:334] "Generic (PLEG): container finished" podID="4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2" containerID="5ad89ed0dad1935ccb668b6aa868170d5ae5c915646c7490e3ca8111b36fd4d7" exitCode=0 Dec 08 09:25:59 crc kubenswrapper[4776]: I1208 09:25:59.092288 4776 generic.go:334] "Generic (PLEG): container finished" podID="4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2" containerID="f4b88adf886f9ee8dab76fc9ad32d403a2580f6e62e68801eaf20a3fa8877c2f" exitCode=2 Dec 08 09:25:59 crc kubenswrapper[4776]: I1208 09:25:59.092297 4776 generic.go:334] "Generic (PLEG): container finished" podID="4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2" containerID="3e37b1b91d4df4b5b7adcfd70975fc7812e3ea603ba6ebf17cc7e99bdcfedd83" exitCode=0 Dec 08 09:25:59 crc kubenswrapper[4776]: I1208 09:25:59.092302 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2","Type":"ContainerDied","Data":"5ad89ed0dad1935ccb668b6aa868170d5ae5c915646c7490e3ca8111b36fd4d7"} Dec 08 09:25:59 crc kubenswrapper[4776]: I1208 09:25:59.092345 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2","Type":"ContainerDied","Data":"f4b88adf886f9ee8dab76fc9ad32d403a2580f6e62e68801eaf20a3fa8877c2f"} Dec 08 09:25:59 crc kubenswrapper[4776]: I1208 09:25:59.092355 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2","Type":"ContainerDied","Data":"3e37b1b91d4df4b5b7adcfd70975fc7812e3ea603ba6ebf17cc7e99bdcfedd83"} Dec 08 09:25:59 crc kubenswrapper[4776]: I1208 09:25:59.094164 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-hzfv9" event={"ID":"474cb911-9e81-43ed-a828-52d9f03eb4df","Type":"ContainerDied","Data":"7fe3f2599b53a869df7e6c2dcb468e6f3ae530cb00400251edb047dc618c43c4"} Dec 08 09:25:59 crc kubenswrapper[4776]: I1208 09:25:59.094222 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-hzfv9" Dec 08 09:25:59 crc kubenswrapper[4776]: I1208 09:25:59.094240 4776 scope.go:117] "RemoveContainer" containerID="25ec6d64e4d550aad188f5bf9c1c6906cd4de1837719de1cdc3e17d679d5b853" Dec 08 09:25:59 crc kubenswrapper[4776]: I1208 09:25:59.114105 4776 scope.go:117] "RemoveContainer" containerID="09604120125d21ca38bf66a4b036ff07206fa8247d1f249d62a02d93c94ef6a7" Dec 08 09:25:59 crc kubenswrapper[4776]: I1208 09:25:59.178905 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-hzfv9"] Dec 08 09:25:59 crc kubenswrapper[4776]: I1208 09:25:59.189572 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-hzfv9"] Dec 08 09:25:59 crc kubenswrapper[4776]: I1208 09:25:59.343818 4776 scope.go:117] "RemoveContainer" containerID="bd00cf9685c68dc97ea68681aad0bfeeb2d56965a763cd3cffb3c7d3789a7341" Dec 08 09:25:59 crc kubenswrapper[4776]: E1208 09:25:59.344094 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:26:00 crc kubenswrapper[4776]: I1208 09:26:00.108599 4776 generic.go:334] "Generic (PLEG): container finished" podID="bc988dd0-b7f8-4793-8922-238ec7c3081b" containerID="aa98e5947586fa94673e30b99b6340d6bed0a6f212e05e6655444f8f45cbe1af" exitCode=0 Dec 08 09:26:00 crc kubenswrapper[4776]: I1208 09:26:00.108732 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-kdpqs" event={"ID":"bc988dd0-b7f8-4793-8922-238ec7c3081b","Type":"ContainerDied","Data":"aa98e5947586fa94673e30b99b6340d6bed0a6f212e05e6655444f8f45cbe1af"} Dec 
08 09:26:00 crc kubenswrapper[4776]: I1208 09:26:00.363607 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="474cb911-9e81-43ed-a828-52d9f03eb4df" path="/var/lib/kubelet/pods/474cb911-9e81-43ed-a828-52d9f03eb4df/volumes" Dec 08 09:26:01 crc kubenswrapper[4776]: I1208 09:26:01.613840 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-kdpqs" Dec 08 09:26:01 crc kubenswrapper[4776]: I1208 09:26:01.665192 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc988dd0-b7f8-4793-8922-238ec7c3081b-config-data\") pod \"bc988dd0-b7f8-4793-8922-238ec7c3081b\" (UID: \"bc988dd0-b7f8-4793-8922-238ec7c3081b\") " Dec 08 09:26:01 crc kubenswrapper[4776]: I1208 09:26:01.665583 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc988dd0-b7f8-4793-8922-238ec7c3081b-combined-ca-bundle\") pod \"bc988dd0-b7f8-4793-8922-238ec7c3081b\" (UID: \"bc988dd0-b7f8-4793-8922-238ec7c3081b\") " Dec 08 09:26:01 crc kubenswrapper[4776]: I1208 09:26:01.666167 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc988dd0-b7f8-4793-8922-238ec7c3081b-scripts\") pod \"bc988dd0-b7f8-4793-8922-238ec7c3081b\" (UID: \"bc988dd0-b7f8-4793-8922-238ec7c3081b\") " Dec 08 09:26:01 crc kubenswrapper[4776]: I1208 09:26:01.666227 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpkfg\" (UniqueName: \"kubernetes.io/projected/bc988dd0-b7f8-4793-8922-238ec7c3081b-kube-api-access-cpkfg\") pod \"bc988dd0-b7f8-4793-8922-238ec7c3081b\" (UID: \"bc988dd0-b7f8-4793-8922-238ec7c3081b\") " Dec 08 09:26:01 crc kubenswrapper[4776]: I1208 09:26:01.675816 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/bc988dd0-b7f8-4793-8922-238ec7c3081b-kube-api-access-cpkfg" (OuterVolumeSpecName: "kube-api-access-cpkfg") pod "bc988dd0-b7f8-4793-8922-238ec7c3081b" (UID: "bc988dd0-b7f8-4793-8922-238ec7c3081b"). InnerVolumeSpecName "kube-api-access-cpkfg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:26:01 crc kubenswrapper[4776]: I1208 09:26:01.675875 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc988dd0-b7f8-4793-8922-238ec7c3081b-scripts" (OuterVolumeSpecName: "scripts") pod "bc988dd0-b7f8-4793-8922-238ec7c3081b" (UID: "bc988dd0-b7f8-4793-8922-238ec7c3081b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:26:01 crc kubenswrapper[4776]: I1208 09:26:01.699533 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc988dd0-b7f8-4793-8922-238ec7c3081b-config-data" (OuterVolumeSpecName: "config-data") pod "bc988dd0-b7f8-4793-8922-238ec7c3081b" (UID: "bc988dd0-b7f8-4793-8922-238ec7c3081b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:26:01 crc kubenswrapper[4776]: I1208 09:26:01.703289 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc988dd0-b7f8-4793-8922-238ec7c3081b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bc988dd0-b7f8-4793-8922-238ec7c3081b" (UID: "bc988dd0-b7f8-4793-8922-238ec7c3081b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:26:01 crc kubenswrapper[4776]: I1208 09:26:01.769285 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc988dd0-b7f8-4793-8922-238ec7c3081b-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 09:26:01 crc kubenswrapper[4776]: I1208 09:26:01.769318 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpkfg\" (UniqueName: \"kubernetes.io/projected/bc988dd0-b7f8-4793-8922-238ec7c3081b-kube-api-access-cpkfg\") on node \"crc\" DevicePath \"\"" Dec 08 09:26:01 crc kubenswrapper[4776]: I1208 09:26:01.769328 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc988dd0-b7f8-4793-8922-238ec7c3081b-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 09:26:01 crc kubenswrapper[4776]: I1208 09:26:01.769337 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc988dd0-b7f8-4793-8922-238ec7c3081b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:26:01 crc kubenswrapper[4776]: I1208 09:26:01.785623 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 08 09:26:01 crc kubenswrapper[4776]: I1208 09:26:01.871080 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tg2x\" (UniqueName: \"kubernetes.io/projected/4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2-kube-api-access-9tg2x\") pod \"4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2\" (UID: \"4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2\") " Dec 08 09:26:01 crc kubenswrapper[4776]: I1208 09:26:01.871131 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2-sg-core-conf-yaml\") pod \"4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2\" (UID: \"4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2\") " Dec 08 09:26:01 crc kubenswrapper[4776]: I1208 09:26:01.871157 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2-scripts\") pod \"4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2\" (UID: \"4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2\") " Dec 08 09:26:01 crc kubenswrapper[4776]: I1208 09:26:01.871287 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2-config-data\") pod \"4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2\" (UID: \"4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2\") " Dec 08 09:26:01 crc kubenswrapper[4776]: I1208 09:26:01.871387 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2-run-httpd\") pod \"4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2\" (UID: \"4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2\") " Dec 08 09:26:01 crc kubenswrapper[4776]: I1208 09:26:01.871446 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2-combined-ca-bundle\") pod \"4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2\" (UID: \"4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2\") " Dec 08 09:26:01 crc kubenswrapper[4776]: I1208 09:26:01.871506 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2-log-httpd\") pod \"4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2\" (UID: \"4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2\") " Dec 08 09:26:01 crc kubenswrapper[4776]: I1208 09:26:01.872025 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2" (UID: "4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:26:01 crc kubenswrapper[4776]: I1208 09:26:01.872490 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2" (UID: "4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:26:01 crc kubenswrapper[4776]: I1208 09:26:01.875341 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2-scripts" (OuterVolumeSpecName: "scripts") pod "4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2" (UID: "4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:26:01 crc kubenswrapper[4776]: I1208 09:26:01.875910 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2-kube-api-access-9tg2x" (OuterVolumeSpecName: "kube-api-access-9tg2x") pod "4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2" (UID: "4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2"). InnerVolumeSpecName "kube-api-access-9tg2x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:26:01 crc kubenswrapper[4776]: I1208 09:26:01.901456 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2" (UID: "4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:26:01 crc kubenswrapper[4776]: I1208 09:26:01.954858 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2" (UID: "4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:26:01 crc kubenswrapper[4776]: I1208 09:26:01.974347 4776 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 08 09:26:01 crc kubenswrapper[4776]: I1208 09:26:01.974391 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:26:01 crc kubenswrapper[4776]: I1208 09:26:01.974403 4776 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 08 09:26:01 crc kubenswrapper[4776]: I1208 09:26:01.974416 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tg2x\" (UniqueName: \"kubernetes.io/projected/4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2-kube-api-access-9tg2x\") on node \"crc\" DevicePath \"\"" Dec 08 09:26:01 crc kubenswrapper[4776]: I1208 09:26:01.974428 4776 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 08 09:26:01 crc kubenswrapper[4776]: I1208 09:26:01.974439 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 09:26:01 crc kubenswrapper[4776]: I1208 09:26:01.985658 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2-config-data" (OuterVolumeSpecName: "config-data") pod "4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2" (UID: "4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.077135 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.144240 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-kdpqs" event={"ID":"bc988dd0-b7f8-4793-8922-238ec7c3081b","Type":"ContainerDied","Data":"2df6524946b75a97f411eca2a816359eeaeed3ff3ede4991c33760287b254865"} Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.144284 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2df6524946b75a97f411eca2a816359eeaeed3ff3ede4991c33760287b254865" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.144295 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-kdpqs" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.148868 4776 generic.go:334] "Generic (PLEG): container finished" podID="4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2" containerID="6bc1cc169f8425246640502113f3fac50a76b9053faab1783a66c831bd0f556f" exitCode=0 Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.148906 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2","Type":"ContainerDied","Data":"6bc1cc169f8425246640502113f3fac50a76b9053faab1783a66c831bd0f556f"} Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.148932 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2","Type":"ContainerDied","Data":"2f4a7f03efa29b59d81df582b5c478847c5d89a4bb37990aa5e85d63a0ad4344"} Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.148953 4776 scope.go:117] 
"RemoveContainer" containerID="5ad89ed0dad1935ccb668b6aa868170d5ae5c915646c7490e3ca8111b36fd4d7" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.149098 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.198546 4776 scope.go:117] "RemoveContainer" containerID="f4b88adf886f9ee8dab76fc9ad32d403a2580f6e62e68801eaf20a3fa8877c2f" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.224125 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.236083 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.247837 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 08 09:26:02 crc kubenswrapper[4776]: E1208 09:26:02.248508 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2" containerName="proxy-httpd" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.248541 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2" containerName="proxy-httpd" Dec 08 09:26:02 crc kubenswrapper[4776]: E1208 09:26:02.248568 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="474cb911-9e81-43ed-a828-52d9f03eb4df" containerName="dnsmasq-dns" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.248581 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="474cb911-9e81-43ed-a828-52d9f03eb4df" containerName="dnsmasq-dns" Dec 08 09:26:02 crc kubenswrapper[4776]: E1208 09:26:02.248610 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="474cb911-9e81-43ed-a828-52d9f03eb4df" containerName="init" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.248622 4776 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="474cb911-9e81-43ed-a828-52d9f03eb4df" containerName="init" Dec 08 09:26:02 crc kubenswrapper[4776]: E1208 09:26:02.248650 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2" containerName="ceilometer-central-agent" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.248661 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2" containerName="ceilometer-central-agent" Dec 08 09:26:02 crc kubenswrapper[4776]: E1208 09:26:02.248688 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2" containerName="ceilometer-notification-agent" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.248702 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2" containerName="ceilometer-notification-agent" Dec 08 09:26:02 crc kubenswrapper[4776]: E1208 09:26:02.248743 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc988dd0-b7f8-4793-8922-238ec7c3081b" containerName="nova-manage" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.248755 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc988dd0-b7f8-4793-8922-238ec7c3081b" containerName="nova-manage" Dec 08 09:26:02 crc kubenswrapper[4776]: E1208 09:26:02.248781 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2" containerName="sg-core" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.248791 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2" containerName="sg-core" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.249128 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2" containerName="ceilometer-central-agent" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.249192 4776 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2" containerName="proxy-httpd" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.249215 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2" containerName="ceilometer-notification-agent" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.249237 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc988dd0-b7f8-4793-8922-238ec7c3081b" containerName="nova-manage" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.249261 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="474cb911-9e81-43ed-a828-52d9f03eb4df" containerName="dnsmasq-dns" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.249294 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2" containerName="sg-core" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.252650 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.255652 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.255804 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.260921 4776 scope.go:117] "RemoveContainer" containerID="3e37b1b91d4df4b5b7adcfd70975fc7812e3ea603ba6ebf17cc7e99bdcfedd83" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.261357 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.374398 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2" path="/var/lib/kubelet/pods/4bb5f9b1-fbbf-4950-a31d-8de3f994ddd2/volumes" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.375565 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.375656 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.375894 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9434f83d-ef01-4184-b7e7-e26d478e0589" containerName="nova-api-log" containerID="cri-o://1ce0d2f9912b96a573b4080fb27ea612f781b386667efb8938d9f4855b19e92f" gracePeriod=30 Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.376028 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f8647023-4573-46b1-a713-c153d75d160b" containerName="nova-metadata-log" containerID="cri-o://2be87160fc4bd4b7e818817c80fc6b49383cb19dc8120ad2c77d39aa32b0bea8" gracePeriod=30 Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 
09:26:02.376076 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9434f83d-ef01-4184-b7e7-e26d478e0589" containerName="nova-api-api" containerID="cri-o://d15ee32adc3021161efbfda2dd6fadba3087d419a555479d89d67fdbd80cd7d4" gracePeriod=30 Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.376289 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f8647023-4573-46b1-a713-c153d75d160b" containerName="nova-metadata-metadata" containerID="cri-o://79f3a895f3e200395ad04fd83a6567b267b9c69b94cebcb3229f29398657dd93" gracePeriod=30 Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.383698 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a9beda06-9357-4e94-a243-78484ede0b97-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a9beda06-9357-4e94-a243-78484ede0b97\") " pod="openstack/ceilometer-0" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.383832 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9beda06-9357-4e94-a243-78484ede0b97-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a9beda06-9357-4e94-a243-78484ede0b97\") " pod="openstack/ceilometer-0" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.383915 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjhcp\" (UniqueName: \"kubernetes.io/projected/a9beda06-9357-4e94-a243-78484ede0b97-kube-api-access-mjhcp\") pod \"ceilometer-0\" (UID: \"a9beda06-9357-4e94-a243-78484ede0b97\") " pod="openstack/ceilometer-0" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.384008 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/a9beda06-9357-4e94-a243-78484ede0b97-run-httpd\") pod \"ceilometer-0\" (UID: \"a9beda06-9357-4e94-a243-78484ede0b97\") " pod="openstack/ceilometer-0" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.384137 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9beda06-9357-4e94-a243-78484ede0b97-scripts\") pod \"ceilometer-0\" (UID: \"a9beda06-9357-4e94-a243-78484ede0b97\") " pod="openstack/ceilometer-0" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.384237 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9beda06-9357-4e94-a243-78484ede0b97-log-httpd\") pod \"ceilometer-0\" (UID: \"a9beda06-9357-4e94-a243-78484ede0b97\") " pod="openstack/ceilometer-0" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.384378 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9beda06-9357-4e94-a243-78484ede0b97-config-data\") pod \"ceilometer-0\" (UID: \"a9beda06-9357-4e94-a243-78484ede0b97\") " pod="openstack/ceilometer-0" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.384006 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.384688 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="ff46e335-2c8f-4011-85ac-de45611f8e45" containerName="nova-scheduler-scheduler" containerID="cri-o://b65eae25ec32cadfd6e7546c1297efa384bb5d0626ffa9c5dcb1b9ebe1dcb88d" gracePeriod=30 Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.409002 4776 scope.go:117] "RemoveContainer" containerID="6bc1cc169f8425246640502113f3fac50a76b9053faab1783a66c831bd0f556f" Dec 08 09:26:02 crc kubenswrapper[4776]: 
I1208 09:26:02.443774 4776 scope.go:117] "RemoveContainer" containerID="5ad89ed0dad1935ccb668b6aa868170d5ae5c915646c7490e3ca8111b36fd4d7" Dec 08 09:26:02 crc kubenswrapper[4776]: E1208 09:26:02.444441 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ad89ed0dad1935ccb668b6aa868170d5ae5c915646c7490e3ca8111b36fd4d7\": container with ID starting with 5ad89ed0dad1935ccb668b6aa868170d5ae5c915646c7490e3ca8111b36fd4d7 not found: ID does not exist" containerID="5ad89ed0dad1935ccb668b6aa868170d5ae5c915646c7490e3ca8111b36fd4d7" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.444528 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ad89ed0dad1935ccb668b6aa868170d5ae5c915646c7490e3ca8111b36fd4d7"} err="failed to get container status \"5ad89ed0dad1935ccb668b6aa868170d5ae5c915646c7490e3ca8111b36fd4d7\": rpc error: code = NotFound desc = could not find container \"5ad89ed0dad1935ccb668b6aa868170d5ae5c915646c7490e3ca8111b36fd4d7\": container with ID starting with 5ad89ed0dad1935ccb668b6aa868170d5ae5c915646c7490e3ca8111b36fd4d7 not found: ID does not exist" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.444599 4776 scope.go:117] "RemoveContainer" containerID="f4b88adf886f9ee8dab76fc9ad32d403a2580f6e62e68801eaf20a3fa8877c2f" Dec 08 09:26:02 crc kubenswrapper[4776]: E1208 09:26:02.444978 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4b88adf886f9ee8dab76fc9ad32d403a2580f6e62e68801eaf20a3fa8877c2f\": container with ID starting with f4b88adf886f9ee8dab76fc9ad32d403a2580f6e62e68801eaf20a3fa8877c2f not found: ID does not exist" containerID="f4b88adf886f9ee8dab76fc9ad32d403a2580f6e62e68801eaf20a3fa8877c2f" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.445063 4776 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f4b88adf886f9ee8dab76fc9ad32d403a2580f6e62e68801eaf20a3fa8877c2f"} err="failed to get container status \"f4b88adf886f9ee8dab76fc9ad32d403a2580f6e62e68801eaf20a3fa8877c2f\": rpc error: code = NotFound desc = could not find container \"f4b88adf886f9ee8dab76fc9ad32d403a2580f6e62e68801eaf20a3fa8877c2f\": container with ID starting with f4b88adf886f9ee8dab76fc9ad32d403a2580f6e62e68801eaf20a3fa8877c2f not found: ID does not exist" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.445131 4776 scope.go:117] "RemoveContainer" containerID="3e37b1b91d4df4b5b7adcfd70975fc7812e3ea603ba6ebf17cc7e99bdcfedd83" Dec 08 09:26:02 crc kubenswrapper[4776]: E1208 09:26:02.448330 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e37b1b91d4df4b5b7adcfd70975fc7812e3ea603ba6ebf17cc7e99bdcfedd83\": container with ID starting with 3e37b1b91d4df4b5b7adcfd70975fc7812e3ea603ba6ebf17cc7e99bdcfedd83 not found: ID does not exist" containerID="3e37b1b91d4df4b5b7adcfd70975fc7812e3ea603ba6ebf17cc7e99bdcfedd83" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.448446 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e37b1b91d4df4b5b7adcfd70975fc7812e3ea603ba6ebf17cc7e99bdcfedd83"} err="failed to get container status \"3e37b1b91d4df4b5b7adcfd70975fc7812e3ea603ba6ebf17cc7e99bdcfedd83\": rpc error: code = NotFound desc = could not find container \"3e37b1b91d4df4b5b7adcfd70975fc7812e3ea603ba6ebf17cc7e99bdcfedd83\": container with ID starting with 3e37b1b91d4df4b5b7adcfd70975fc7812e3ea603ba6ebf17cc7e99bdcfedd83 not found: ID does not exist" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.448511 4776 scope.go:117] "RemoveContainer" containerID="6bc1cc169f8425246640502113f3fac50a76b9053faab1783a66c831bd0f556f" Dec 08 09:26:02 crc kubenswrapper[4776]: E1208 09:26:02.449881 4776 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"6bc1cc169f8425246640502113f3fac50a76b9053faab1783a66c831bd0f556f\": container with ID starting with 6bc1cc169f8425246640502113f3fac50a76b9053faab1783a66c831bd0f556f not found: ID does not exist" containerID="6bc1cc169f8425246640502113f3fac50a76b9053faab1783a66c831bd0f556f" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.450018 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bc1cc169f8425246640502113f3fac50a76b9053faab1783a66c831bd0f556f"} err="failed to get container status \"6bc1cc169f8425246640502113f3fac50a76b9053faab1783a66c831bd0f556f\": rpc error: code = NotFound desc = could not find container \"6bc1cc169f8425246640502113f3fac50a76b9053faab1783a66c831bd0f556f\": container with ID starting with 6bc1cc169f8425246640502113f3fac50a76b9053faab1783a66c831bd0f556f not found: ID does not exist" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.486568 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9beda06-9357-4e94-a243-78484ede0b97-scripts\") pod \"ceilometer-0\" (UID: \"a9beda06-9357-4e94-a243-78484ede0b97\") " pod="openstack/ceilometer-0" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.486768 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9beda06-9357-4e94-a243-78484ede0b97-log-httpd\") pod \"ceilometer-0\" (UID: \"a9beda06-9357-4e94-a243-78484ede0b97\") " pod="openstack/ceilometer-0" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.486963 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9beda06-9357-4e94-a243-78484ede0b97-config-data\") pod \"ceilometer-0\" (UID: \"a9beda06-9357-4e94-a243-78484ede0b97\") " pod="openstack/ceilometer-0" Dec 08 09:26:02 crc 
kubenswrapper[4776]: I1208 09:26:02.487071 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a9beda06-9357-4e94-a243-78484ede0b97-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a9beda06-9357-4e94-a243-78484ede0b97\") " pod="openstack/ceilometer-0" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.487156 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9beda06-9357-4e94-a243-78484ede0b97-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a9beda06-9357-4e94-a243-78484ede0b97\") " pod="openstack/ceilometer-0" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.487263 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjhcp\" (UniqueName: \"kubernetes.io/projected/a9beda06-9357-4e94-a243-78484ede0b97-kube-api-access-mjhcp\") pod \"ceilometer-0\" (UID: \"a9beda06-9357-4e94-a243-78484ede0b97\") " pod="openstack/ceilometer-0" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.487496 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9beda06-9357-4e94-a243-78484ede0b97-run-httpd\") pod \"ceilometer-0\" (UID: \"a9beda06-9357-4e94-a243-78484ede0b97\") " pod="openstack/ceilometer-0" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.488099 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9beda06-9357-4e94-a243-78484ede0b97-run-httpd\") pod \"ceilometer-0\" (UID: \"a9beda06-9357-4e94-a243-78484ede0b97\") " pod="openstack/ceilometer-0" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.488877 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9beda06-9357-4e94-a243-78484ede0b97-log-httpd\") 
pod \"ceilometer-0\" (UID: \"a9beda06-9357-4e94-a243-78484ede0b97\") " pod="openstack/ceilometer-0" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.498683 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9beda06-9357-4e94-a243-78484ede0b97-scripts\") pod \"ceilometer-0\" (UID: \"a9beda06-9357-4e94-a243-78484ede0b97\") " pod="openstack/ceilometer-0" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.499872 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a9beda06-9357-4e94-a243-78484ede0b97-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a9beda06-9357-4e94-a243-78484ede0b97\") " pod="openstack/ceilometer-0" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.500924 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9beda06-9357-4e94-a243-78484ede0b97-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a9beda06-9357-4e94-a243-78484ede0b97\") " pod="openstack/ceilometer-0" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.516154 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9beda06-9357-4e94-a243-78484ede0b97-config-data\") pod \"ceilometer-0\" (UID: \"a9beda06-9357-4e94-a243-78484ede0b97\") " pod="openstack/ceilometer-0" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.518990 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjhcp\" (UniqueName: \"kubernetes.io/projected/a9beda06-9357-4e94-a243-78484ede0b97-kube-api-access-mjhcp\") pod \"ceilometer-0\" (UID: \"a9beda06-9357-4e94-a243-78484ede0b97\") " pod="openstack/ceilometer-0" Dec 08 09:26:02 crc kubenswrapper[4776]: I1208 09:26:02.667083 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 08 09:26:03 crc kubenswrapper[4776]: I1208 09:26:03.163252 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 08 09:26:03 crc kubenswrapper[4776]: I1208 09:26:03.166824 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f8647023-4573-46b1-a713-c153d75d160b","Type":"ContainerDied","Data":"2be87160fc4bd4b7e818817c80fc6b49383cb19dc8120ad2c77d39aa32b0bea8"} Dec 08 09:26:03 crc kubenswrapper[4776]: I1208 09:26:03.163329 4776 generic.go:334] "Generic (PLEG): container finished" podID="f8647023-4573-46b1-a713-c153d75d160b" containerID="2be87160fc4bd4b7e818817c80fc6b49383cb19dc8120ad2c77d39aa32b0bea8" exitCode=143 Dec 08 09:26:03 crc kubenswrapper[4776]: I1208 09:26:03.175690 4776 generic.go:334] "Generic (PLEG): container finished" podID="9434f83d-ef01-4184-b7e7-e26d478e0589" containerID="d15ee32adc3021161efbfda2dd6fadba3087d419a555479d89d67fdbd80cd7d4" exitCode=0 Dec 08 09:26:03 crc kubenswrapper[4776]: I1208 09:26:03.175713 4776 generic.go:334] "Generic (PLEG): container finished" podID="9434f83d-ef01-4184-b7e7-e26d478e0589" containerID="1ce0d2f9912b96a573b4080fb27ea612f781b386667efb8938d9f4855b19e92f" exitCode=143 Dec 08 09:26:03 crc kubenswrapper[4776]: I1208 09:26:03.175762 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9434f83d-ef01-4184-b7e7-e26d478e0589","Type":"ContainerDied","Data":"d15ee32adc3021161efbfda2dd6fadba3087d419a555479d89d67fdbd80cd7d4"} Dec 08 09:26:03 crc kubenswrapper[4776]: I1208 09:26:03.175789 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9434f83d-ef01-4184-b7e7-e26d478e0589","Type":"ContainerDied","Data":"1ce0d2f9912b96a573b4080fb27ea612f781b386667efb8938d9f4855b19e92f"} Dec 08 09:26:03 crc kubenswrapper[4776]: I1208 09:26:03.232090 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 08 09:26:03 crc kubenswrapper[4776]: I1208 09:26:03.308213 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9434f83d-ef01-4184-b7e7-e26d478e0589-logs\") pod \"9434f83d-ef01-4184-b7e7-e26d478e0589\" (UID: \"9434f83d-ef01-4184-b7e7-e26d478e0589\") " Dec 08 09:26:03 crc kubenswrapper[4776]: I1208 09:26:03.308536 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpv2w\" (UniqueName: \"kubernetes.io/projected/9434f83d-ef01-4184-b7e7-e26d478e0589-kube-api-access-dpv2w\") pod \"9434f83d-ef01-4184-b7e7-e26d478e0589\" (UID: \"9434f83d-ef01-4184-b7e7-e26d478e0589\") " Dec 08 09:26:03 crc kubenswrapper[4776]: I1208 09:26:03.308661 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9434f83d-ef01-4184-b7e7-e26d478e0589-public-tls-certs\") pod \"9434f83d-ef01-4184-b7e7-e26d478e0589\" (UID: \"9434f83d-ef01-4184-b7e7-e26d478e0589\") " Dec 08 09:26:03 crc kubenswrapper[4776]: I1208 09:26:03.308761 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9434f83d-ef01-4184-b7e7-e26d478e0589-config-data\") pod \"9434f83d-ef01-4184-b7e7-e26d478e0589\" (UID: \"9434f83d-ef01-4184-b7e7-e26d478e0589\") " Dec 08 09:26:03 crc kubenswrapper[4776]: I1208 09:26:03.308800 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9434f83d-ef01-4184-b7e7-e26d478e0589-logs" (OuterVolumeSpecName: "logs") pod "9434f83d-ef01-4184-b7e7-e26d478e0589" (UID: "9434f83d-ef01-4184-b7e7-e26d478e0589"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:26:03 crc kubenswrapper[4776]: I1208 09:26:03.308885 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9434f83d-ef01-4184-b7e7-e26d478e0589-internal-tls-certs\") pod \"9434f83d-ef01-4184-b7e7-e26d478e0589\" (UID: \"9434f83d-ef01-4184-b7e7-e26d478e0589\") " Dec 08 09:26:03 crc kubenswrapper[4776]: I1208 09:26:03.308984 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9434f83d-ef01-4184-b7e7-e26d478e0589-combined-ca-bundle\") pod \"9434f83d-ef01-4184-b7e7-e26d478e0589\" (UID: \"9434f83d-ef01-4184-b7e7-e26d478e0589\") " Dec 08 09:26:03 crc kubenswrapper[4776]: I1208 09:26:03.309613 4776 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9434f83d-ef01-4184-b7e7-e26d478e0589-logs\") on node \"crc\" DevicePath \"\"" Dec 08 09:26:03 crc kubenswrapper[4776]: I1208 09:26:03.315717 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9434f83d-ef01-4184-b7e7-e26d478e0589-kube-api-access-dpv2w" (OuterVolumeSpecName: "kube-api-access-dpv2w") pod "9434f83d-ef01-4184-b7e7-e26d478e0589" (UID: "9434f83d-ef01-4184-b7e7-e26d478e0589"). InnerVolumeSpecName "kube-api-access-dpv2w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:26:03 crc kubenswrapper[4776]: I1208 09:26:03.344460 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9434f83d-ef01-4184-b7e7-e26d478e0589-config-data" (OuterVolumeSpecName: "config-data") pod "9434f83d-ef01-4184-b7e7-e26d478e0589" (UID: "9434f83d-ef01-4184-b7e7-e26d478e0589"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:26:03 crc kubenswrapper[4776]: I1208 09:26:03.347342 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9434f83d-ef01-4184-b7e7-e26d478e0589-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9434f83d-ef01-4184-b7e7-e26d478e0589" (UID: "9434f83d-ef01-4184-b7e7-e26d478e0589"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:26:03 crc kubenswrapper[4776]: I1208 09:26:03.377512 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9434f83d-ef01-4184-b7e7-e26d478e0589-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9434f83d-ef01-4184-b7e7-e26d478e0589" (UID: "9434f83d-ef01-4184-b7e7-e26d478e0589"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:26:03 crc kubenswrapper[4776]: I1208 09:26:03.389393 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9434f83d-ef01-4184-b7e7-e26d478e0589-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9434f83d-ef01-4184-b7e7-e26d478e0589" (UID: "9434f83d-ef01-4184-b7e7-e26d478e0589"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:26:03 crc kubenswrapper[4776]: I1208 09:26:03.412731 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpv2w\" (UniqueName: \"kubernetes.io/projected/9434f83d-ef01-4184-b7e7-e26d478e0589-kube-api-access-dpv2w\") on node \"crc\" DevicePath \"\"" Dec 08 09:26:03 crc kubenswrapper[4776]: I1208 09:26:03.412759 4776 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9434f83d-ef01-4184-b7e7-e26d478e0589-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 08 09:26:03 crc kubenswrapper[4776]: I1208 09:26:03.412769 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9434f83d-ef01-4184-b7e7-e26d478e0589-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 09:26:03 crc kubenswrapper[4776]: I1208 09:26:03.412778 4776 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9434f83d-ef01-4184-b7e7-e26d478e0589-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 08 09:26:03 crc kubenswrapper[4776]: I1208 09:26:03.412786 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9434f83d-ef01-4184-b7e7-e26d478e0589-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:26:04 crc kubenswrapper[4776]: I1208 09:26:04.213616 4776 generic.go:334] "Generic (PLEG): container finished" podID="ff46e335-2c8f-4011-85ac-de45611f8e45" containerID="b65eae25ec32cadfd6e7546c1297efa384bb5d0626ffa9c5dcb1b9ebe1dcb88d" exitCode=0 Dec 08 09:26:04 crc kubenswrapper[4776]: I1208 09:26:04.213957 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ff46e335-2c8f-4011-85ac-de45611f8e45","Type":"ContainerDied","Data":"b65eae25ec32cadfd6e7546c1297efa384bb5d0626ffa9c5dcb1b9ebe1dcb88d"} Dec 08 09:26:04 crc 
kubenswrapper[4776]: I1208 09:26:04.217818 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9434f83d-ef01-4184-b7e7-e26d478e0589","Type":"ContainerDied","Data":"0455a5a8e46fb353a1d567c622a1c9d0df627d6a4e850008cbfe73d9705e40ef"} Dec 08 09:26:04 crc kubenswrapper[4776]: I1208 09:26:04.218101 4776 scope.go:117] "RemoveContainer" containerID="d15ee32adc3021161efbfda2dd6fadba3087d419a555479d89d67fdbd80cd7d4" Dec 08 09:26:04 crc kubenswrapper[4776]: I1208 09:26:04.218289 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 08 09:26:04 crc kubenswrapper[4776]: I1208 09:26:04.227562 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a9beda06-9357-4e94-a243-78484ede0b97","Type":"ContainerStarted","Data":"e9bd9c9a55a952fafee823ba565ef3ae31d83086f77f47611292b05ab8a8cb79"} Dec 08 09:26:04 crc kubenswrapper[4776]: I1208 09:26:04.227614 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a9beda06-9357-4e94-a243-78484ede0b97","Type":"ContainerStarted","Data":"809c2d04a826e03946dedbf3ca8787df7827d029642207128a013ffb7812a07f"} Dec 08 09:26:04 crc kubenswrapper[4776]: I1208 09:26:04.273237 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 08 09:26:04 crc kubenswrapper[4776]: I1208 09:26:04.295239 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 08 09:26:04 crc kubenswrapper[4776]: I1208 09:26:04.302275 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 08 09:26:04 crc kubenswrapper[4776]: E1208 09:26:04.303166 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9434f83d-ef01-4184-b7e7-e26d478e0589" containerName="nova-api-api" Dec 08 09:26:04 crc kubenswrapper[4776]: I1208 09:26:04.303213 4776 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9434f83d-ef01-4184-b7e7-e26d478e0589" containerName="nova-api-api" Dec 08 09:26:04 crc kubenswrapper[4776]: E1208 09:26:04.303244 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9434f83d-ef01-4184-b7e7-e26d478e0589" containerName="nova-api-log" Dec 08 09:26:04 crc kubenswrapper[4776]: I1208 09:26:04.303253 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="9434f83d-ef01-4184-b7e7-e26d478e0589" containerName="nova-api-log" Dec 08 09:26:04 crc kubenswrapper[4776]: I1208 09:26:04.303553 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="9434f83d-ef01-4184-b7e7-e26d478e0589" containerName="nova-api-api" Dec 08 09:26:04 crc kubenswrapper[4776]: I1208 09:26:04.303590 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="9434f83d-ef01-4184-b7e7-e26d478e0589" containerName="nova-api-log" Dec 08 09:26:04 crc kubenswrapper[4776]: I1208 09:26:04.305310 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 08 09:26:04 crc kubenswrapper[4776]: I1208 09:26:04.314340 4776 scope.go:117] "RemoveContainer" containerID="1ce0d2f9912b96a573b4080fb27ea612f781b386667efb8938d9f4855b19e92f" Dec 08 09:26:04 crc kubenswrapper[4776]: I1208 09:26:04.314937 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 08 09:26:04 crc kubenswrapper[4776]: I1208 09:26:04.315206 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 08 09:26:04 crc kubenswrapper[4776]: I1208 09:26:04.315383 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 08 09:26:04 crc kubenswrapper[4776]: I1208 09:26:04.326091 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 08 09:26:04 crc kubenswrapper[4776]: I1208 09:26:04.334296 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-ffrcb\" (UniqueName: \"kubernetes.io/projected/56c71de4-c00f-47d6-87d7-c5eb97b88eef-kube-api-access-ffrcb\") pod \"nova-api-0\" (UID: \"56c71de4-c00f-47d6-87d7-c5eb97b88eef\") " pod="openstack/nova-api-0" Dec 08 09:26:04 crc kubenswrapper[4776]: I1208 09:26:04.334622 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56c71de4-c00f-47d6-87d7-c5eb97b88eef-config-data\") pod \"nova-api-0\" (UID: \"56c71de4-c00f-47d6-87d7-c5eb97b88eef\") " pod="openstack/nova-api-0" Dec 08 09:26:04 crc kubenswrapper[4776]: I1208 09:26:04.334738 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/56c71de4-c00f-47d6-87d7-c5eb97b88eef-public-tls-certs\") pod \"nova-api-0\" (UID: \"56c71de4-c00f-47d6-87d7-c5eb97b88eef\") " pod="openstack/nova-api-0" Dec 08 09:26:04 crc kubenswrapper[4776]: I1208 09:26:04.335113 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56c71de4-c00f-47d6-87d7-c5eb97b88eef-logs\") pod \"nova-api-0\" (UID: \"56c71de4-c00f-47d6-87d7-c5eb97b88eef\") " pod="openstack/nova-api-0" Dec 08 09:26:04 crc kubenswrapper[4776]: I1208 09:26:04.337267 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56c71de4-c00f-47d6-87d7-c5eb97b88eef-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"56c71de4-c00f-47d6-87d7-c5eb97b88eef\") " pod="openstack/nova-api-0" Dec 08 09:26:04 crc kubenswrapper[4776]: I1208 09:26:04.337401 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/56c71de4-c00f-47d6-87d7-c5eb97b88eef-internal-tls-certs\") pod 
\"nova-api-0\" (UID: \"56c71de4-c00f-47d6-87d7-c5eb97b88eef\") " pod="openstack/nova-api-0" Dec 08 09:26:04 crc kubenswrapper[4776]: I1208 09:26:04.372267 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9434f83d-ef01-4184-b7e7-e26d478e0589" path="/var/lib/kubelet/pods/9434f83d-ef01-4184-b7e7-e26d478e0589/volumes" Dec 08 09:26:04 crc kubenswrapper[4776]: I1208 09:26:04.439636 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56c71de4-c00f-47d6-87d7-c5eb97b88eef-logs\") pod \"nova-api-0\" (UID: \"56c71de4-c00f-47d6-87d7-c5eb97b88eef\") " pod="openstack/nova-api-0" Dec 08 09:26:04 crc kubenswrapper[4776]: I1208 09:26:04.439737 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56c71de4-c00f-47d6-87d7-c5eb97b88eef-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"56c71de4-c00f-47d6-87d7-c5eb97b88eef\") " pod="openstack/nova-api-0" Dec 08 09:26:04 crc kubenswrapper[4776]: I1208 09:26:04.439801 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/56c71de4-c00f-47d6-87d7-c5eb97b88eef-internal-tls-certs\") pod \"nova-api-0\" (UID: \"56c71de4-c00f-47d6-87d7-c5eb97b88eef\") " pod="openstack/nova-api-0" Dec 08 09:26:04 crc kubenswrapper[4776]: I1208 09:26:04.440479 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffrcb\" (UniqueName: \"kubernetes.io/projected/56c71de4-c00f-47d6-87d7-c5eb97b88eef-kube-api-access-ffrcb\") pod \"nova-api-0\" (UID: \"56c71de4-c00f-47d6-87d7-c5eb97b88eef\") " pod="openstack/nova-api-0" Dec 08 09:26:04 crc kubenswrapper[4776]: I1208 09:26:04.440859 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56c71de4-c00f-47d6-87d7-c5eb97b88eef-logs\") pod 
\"nova-api-0\" (UID: \"56c71de4-c00f-47d6-87d7-c5eb97b88eef\") " pod="openstack/nova-api-0" Dec 08 09:26:04 crc kubenswrapper[4776]: I1208 09:26:04.441039 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56c71de4-c00f-47d6-87d7-c5eb97b88eef-config-data\") pod \"nova-api-0\" (UID: \"56c71de4-c00f-47d6-87d7-c5eb97b88eef\") " pod="openstack/nova-api-0" Dec 08 09:26:04 crc kubenswrapper[4776]: I1208 09:26:04.441454 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/56c71de4-c00f-47d6-87d7-c5eb97b88eef-public-tls-certs\") pod \"nova-api-0\" (UID: \"56c71de4-c00f-47d6-87d7-c5eb97b88eef\") " pod="openstack/nova-api-0" Dec 08 09:26:04 crc kubenswrapper[4776]: I1208 09:26:04.443769 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/56c71de4-c00f-47d6-87d7-c5eb97b88eef-internal-tls-certs\") pod \"nova-api-0\" (UID: \"56c71de4-c00f-47d6-87d7-c5eb97b88eef\") " pod="openstack/nova-api-0" Dec 08 09:26:04 crc kubenswrapper[4776]: I1208 09:26:04.444035 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/56c71de4-c00f-47d6-87d7-c5eb97b88eef-public-tls-certs\") pod \"nova-api-0\" (UID: \"56c71de4-c00f-47d6-87d7-c5eb97b88eef\") " pod="openstack/nova-api-0" Dec 08 09:26:04 crc kubenswrapper[4776]: I1208 09:26:04.444275 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56c71de4-c00f-47d6-87d7-c5eb97b88eef-config-data\") pod \"nova-api-0\" (UID: \"56c71de4-c00f-47d6-87d7-c5eb97b88eef\") " pod="openstack/nova-api-0" Dec 08 09:26:04 crc kubenswrapper[4776]: I1208 09:26:04.449108 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/56c71de4-c00f-47d6-87d7-c5eb97b88eef-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"56c71de4-c00f-47d6-87d7-c5eb97b88eef\") " pod="openstack/nova-api-0" Dec 08 09:26:04 crc kubenswrapper[4776]: I1208 09:26:04.461629 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffrcb\" (UniqueName: \"kubernetes.io/projected/56c71de4-c00f-47d6-87d7-c5eb97b88eef-kube-api-access-ffrcb\") pod \"nova-api-0\" (UID: \"56c71de4-c00f-47d6-87d7-c5eb97b88eef\") " pod="openstack/nova-api-0" Dec 08 09:26:04 crc kubenswrapper[4776]: I1208 09:26:04.575756 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 08 09:26:04 crc kubenswrapper[4776]: I1208 09:26:04.579429 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 08 09:26:04 crc kubenswrapper[4776]: I1208 09:26:04.645476 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff46e335-2c8f-4011-85ac-de45611f8e45-combined-ca-bundle\") pod \"ff46e335-2c8f-4011-85ac-de45611f8e45\" (UID: \"ff46e335-2c8f-4011-85ac-de45611f8e45\") " Dec 08 09:26:04 crc kubenswrapper[4776]: I1208 09:26:04.645593 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49bqs\" (UniqueName: \"kubernetes.io/projected/ff46e335-2c8f-4011-85ac-de45611f8e45-kube-api-access-49bqs\") pod \"ff46e335-2c8f-4011-85ac-de45611f8e45\" (UID: \"ff46e335-2c8f-4011-85ac-de45611f8e45\") " Dec 08 09:26:04 crc kubenswrapper[4776]: I1208 09:26:04.645767 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff46e335-2c8f-4011-85ac-de45611f8e45-config-data\") pod \"ff46e335-2c8f-4011-85ac-de45611f8e45\" (UID: \"ff46e335-2c8f-4011-85ac-de45611f8e45\") " Dec 08 09:26:04 crc kubenswrapper[4776]: I1208 
09:26:04.674411 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff46e335-2c8f-4011-85ac-de45611f8e45-kube-api-access-49bqs" (OuterVolumeSpecName: "kube-api-access-49bqs") pod "ff46e335-2c8f-4011-85ac-de45611f8e45" (UID: "ff46e335-2c8f-4011-85ac-de45611f8e45"). InnerVolumeSpecName "kube-api-access-49bqs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:26:04 crc kubenswrapper[4776]: I1208 09:26:04.711342 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff46e335-2c8f-4011-85ac-de45611f8e45-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff46e335-2c8f-4011-85ac-de45611f8e45" (UID: "ff46e335-2c8f-4011-85ac-de45611f8e45"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:26:04 crc kubenswrapper[4776]: I1208 09:26:04.721525 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff46e335-2c8f-4011-85ac-de45611f8e45-config-data" (OuterVolumeSpecName: "config-data") pod "ff46e335-2c8f-4011-85ac-de45611f8e45" (UID: "ff46e335-2c8f-4011-85ac-de45611f8e45"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:26:04 crc kubenswrapper[4776]: I1208 09:26:04.750665 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff46e335-2c8f-4011-85ac-de45611f8e45-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 09:26:04 crc kubenswrapper[4776]: I1208 09:26:04.750950 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff46e335-2c8f-4011-85ac-de45611f8e45-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:26:04 crc kubenswrapper[4776]: I1208 09:26:04.750968 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49bqs\" (UniqueName: \"kubernetes.io/projected/ff46e335-2c8f-4011-85ac-de45611f8e45-kube-api-access-49bqs\") on node \"crc\" DevicePath \"\"" Dec 08 09:26:04 crc kubenswrapper[4776]: E1208 09:26:04.754703 4776 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/758570530d4397049f3551af08b5e506ddb46aa49a21cefd1fbd3f23f2ad002c/diff" to get inode usage: stat /var/lib/containers/storage/overlay/758570530d4397049f3551af08b5e506ddb46aa49a21cefd1fbd3f23f2ad002c/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/openstack_dnsmasq-dns-9b86998b5-hzfv9_474cb911-9e81-43ed-a828-52d9f03eb4df/dnsmasq-dns/0.log" to get inode usage: stat /var/log/pods/openstack_dnsmasq-dns-9b86998b5-hzfv9_474cb911-9e81-43ed-a828-52d9f03eb4df/dnsmasq-dns/0.log: no such file or directory Dec 08 09:26:05 crc kubenswrapper[4776]: I1208 09:26:05.159880 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 08 09:26:05 crc kubenswrapper[4776]: I1208 09:26:05.245571 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 08 09:26:05 crc kubenswrapper[4776]: I1208 09:26:05.245793 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ff46e335-2c8f-4011-85ac-de45611f8e45","Type":"ContainerDied","Data":"e51b3546b58f53c9203b61c2ae4ad70e973d001366e664d69fa4752ba7f8148b"} Dec 08 09:26:05 crc kubenswrapper[4776]: I1208 09:26:05.245840 4776 scope.go:117] "RemoveContainer" containerID="b65eae25ec32cadfd6e7546c1297efa384bb5d0626ffa9c5dcb1b9ebe1dcb88d" Dec 08 09:26:05 crc kubenswrapper[4776]: I1208 09:26:05.278456 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a9beda06-9357-4e94-a243-78484ede0b97","Type":"ContainerStarted","Data":"d26a58055f2eb86cbf5246a2502f9f12d8f99264fdcda9ed097cb1600fc9f4af"} Dec 08 09:26:05 crc kubenswrapper[4776]: I1208 09:26:05.294323 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 08 09:26:05 crc kubenswrapper[4776]: I1208 09:26:05.313721 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"56c71de4-c00f-47d6-87d7-c5eb97b88eef","Type":"ContainerStarted","Data":"4fbe4dd35266d65ab9298c77c37f0e0d706906ff802225d1422559dca949e79f"} Dec 08 09:26:05 crc kubenswrapper[4776]: I1208 09:26:05.352277 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 08 09:26:05 crc kubenswrapper[4776]: I1208 09:26:05.383319 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 08 09:26:05 crc kubenswrapper[4776]: E1208 09:26:05.384068 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff46e335-2c8f-4011-85ac-de45611f8e45" containerName="nova-scheduler-scheduler" Dec 08 09:26:05 crc kubenswrapper[4776]: I1208 09:26:05.384163 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff46e335-2c8f-4011-85ac-de45611f8e45" 
containerName="nova-scheduler-scheduler" Dec 08 09:26:05 crc kubenswrapper[4776]: I1208 09:26:05.384555 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff46e335-2c8f-4011-85ac-de45611f8e45" containerName="nova-scheduler-scheduler" Dec 08 09:26:05 crc kubenswrapper[4776]: I1208 09:26:05.385660 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 08 09:26:05 crc kubenswrapper[4776]: I1208 09:26:05.398198 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 08 09:26:05 crc kubenswrapper[4776]: I1208 09:26:05.399506 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 08 09:26:05 crc kubenswrapper[4776]: I1208 09:26:05.469354 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9frq\" (UniqueName: \"kubernetes.io/projected/e7651697-0db7-476f-8b50-1f04771b4ed2-kube-api-access-g9frq\") pod \"nova-scheduler-0\" (UID: \"e7651697-0db7-476f-8b50-1f04771b4ed2\") " pod="openstack/nova-scheduler-0" Dec 08 09:26:05 crc kubenswrapper[4776]: I1208 09:26:05.469812 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7651697-0db7-476f-8b50-1f04771b4ed2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e7651697-0db7-476f-8b50-1f04771b4ed2\") " pod="openstack/nova-scheduler-0" Dec 08 09:26:05 crc kubenswrapper[4776]: I1208 09:26:05.470121 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7651697-0db7-476f-8b50-1f04771b4ed2-config-data\") pod \"nova-scheduler-0\" (UID: \"e7651697-0db7-476f-8b50-1f04771b4ed2\") " pod="openstack/nova-scheduler-0" Dec 08 09:26:05 crc kubenswrapper[4776]: I1208 09:26:05.572788 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7651697-0db7-476f-8b50-1f04771b4ed2-config-data\") pod \"nova-scheduler-0\" (UID: \"e7651697-0db7-476f-8b50-1f04771b4ed2\") " pod="openstack/nova-scheduler-0" Dec 08 09:26:05 crc kubenswrapper[4776]: I1208 09:26:05.572845 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9frq\" (UniqueName: \"kubernetes.io/projected/e7651697-0db7-476f-8b50-1f04771b4ed2-kube-api-access-g9frq\") pod \"nova-scheduler-0\" (UID: \"e7651697-0db7-476f-8b50-1f04771b4ed2\") " pod="openstack/nova-scheduler-0" Dec 08 09:26:05 crc kubenswrapper[4776]: I1208 09:26:05.572987 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7651697-0db7-476f-8b50-1f04771b4ed2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e7651697-0db7-476f-8b50-1f04771b4ed2\") " pod="openstack/nova-scheduler-0" Dec 08 09:26:05 crc kubenswrapper[4776]: I1208 09:26:05.575922 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7651697-0db7-476f-8b50-1f04771b4ed2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e7651697-0db7-476f-8b50-1f04771b4ed2\") " pod="openstack/nova-scheduler-0" Dec 08 09:26:05 crc kubenswrapper[4776]: I1208 09:26:05.577225 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7651697-0db7-476f-8b50-1f04771b4ed2-config-data\") pod \"nova-scheduler-0\" (UID: \"e7651697-0db7-476f-8b50-1f04771b4ed2\") " pod="openstack/nova-scheduler-0" Dec 08 09:26:05 crc kubenswrapper[4776]: I1208 09:26:05.590882 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9frq\" (UniqueName: 
\"kubernetes.io/projected/e7651697-0db7-476f-8b50-1f04771b4ed2-kube-api-access-g9frq\") pod \"nova-scheduler-0\" (UID: \"e7651697-0db7-476f-8b50-1f04771b4ed2\") " pod="openstack/nova-scheduler-0" Dec 08 09:26:05 crc kubenswrapper[4776]: I1208 09:26:05.864115 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 08 09:26:05 crc kubenswrapper[4776]: I1208 09:26:05.965423 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="f8647023-4573-46b1-a713-c153d75d160b" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.244:8775/\": read tcp 10.217.0.2:52024->10.217.0.244:8775: read: connection reset by peer" Dec 08 09:26:05 crc kubenswrapper[4776]: I1208 09:26:05.965447 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="f8647023-4573-46b1-a713-c153d75d160b" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.244:8775/\": read tcp 10.217.0.2:52028->10.217.0.244:8775: read: connection reset by peer" Dec 08 09:26:06 crc kubenswrapper[4776]: I1208 09:26:06.389979 4776 generic.go:334] "Generic (PLEG): container finished" podID="f8647023-4573-46b1-a713-c153d75d160b" containerID="79f3a895f3e200395ad04fd83a6567b267b9c69b94cebcb3229f29398657dd93" exitCode=0 Dec 08 09:26:06 crc kubenswrapper[4776]: I1208 09:26:06.395536 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff46e335-2c8f-4011-85ac-de45611f8e45" path="/var/lib/kubelet/pods/ff46e335-2c8f-4011-85ac-de45611f8e45/volumes" Dec 08 09:26:06 crc kubenswrapper[4776]: I1208 09:26:06.396312 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a9beda06-9357-4e94-a243-78484ede0b97","Type":"ContainerStarted","Data":"51856807f513f51eb9f9a8ebbc14687300eaf8a6ba8e87b28783e2d333a12aa6"} Dec 08 09:26:06 crc kubenswrapper[4776]: I1208 09:26:06.397140 4776 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f8647023-4573-46b1-a713-c153d75d160b","Type":"ContainerDied","Data":"79f3a895f3e200395ad04fd83a6567b267b9c69b94cebcb3229f29398657dd93"} Dec 08 09:26:06 crc kubenswrapper[4776]: I1208 09:26:06.404158 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"56c71de4-c00f-47d6-87d7-c5eb97b88eef","Type":"ContainerStarted","Data":"26c863532470e226f500c60df71ca64f303879e26e33b0d4b29fe70e75cfc572"} Dec 08 09:26:06 crc kubenswrapper[4776]: I1208 09:26:06.404226 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"56c71de4-c00f-47d6-87d7-c5eb97b88eef","Type":"ContainerStarted","Data":"d134e9b8bd74a56d87fa727b52f7659b9e99a893c9e673191a48c0d523f7ec64"} Dec 08 09:26:06 crc kubenswrapper[4776]: I1208 09:26:06.421961 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.421944959 podStartE2EDuration="2.421944959s" podCreationTimestamp="2025-12-08 09:26:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:26:06.421401325 +0000 UTC m=+1642.684626347" watchObservedRunningTime="2025-12-08 09:26:06.421944959 +0000 UTC m=+1642.685169991" Dec 08 09:26:06 crc kubenswrapper[4776]: I1208 09:26:06.458847 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 08 09:26:06 crc kubenswrapper[4776]: I1208 09:26:06.550670 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 08 09:26:06 crc kubenswrapper[4776]: I1208 09:26:06.594600 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8647023-4573-46b1-a713-c153d75d160b-config-data\") pod \"f8647023-4573-46b1-a713-c153d75d160b\" (UID: \"f8647023-4573-46b1-a713-c153d75d160b\") " Dec 08 09:26:06 crc kubenswrapper[4776]: I1208 09:26:06.594924 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8647023-4573-46b1-a713-c153d75d160b-logs\") pod \"f8647023-4573-46b1-a713-c153d75d160b\" (UID: \"f8647023-4573-46b1-a713-c153d75d160b\") " Dec 08 09:26:06 crc kubenswrapper[4776]: I1208 09:26:06.595212 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8647023-4573-46b1-a713-c153d75d160b-combined-ca-bundle\") pod \"f8647023-4573-46b1-a713-c153d75d160b\" (UID: \"f8647023-4573-46b1-a713-c153d75d160b\") " Dec 08 09:26:06 crc kubenswrapper[4776]: I1208 09:26:06.595259 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8647023-4573-46b1-a713-c153d75d160b-nova-metadata-tls-certs\") pod \"f8647023-4573-46b1-a713-c153d75d160b\" (UID: \"f8647023-4573-46b1-a713-c153d75d160b\") " Dec 08 09:26:06 crc kubenswrapper[4776]: I1208 09:26:06.595287 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7psmc\" (UniqueName: \"kubernetes.io/projected/f8647023-4573-46b1-a713-c153d75d160b-kube-api-access-7psmc\") pod \"f8647023-4573-46b1-a713-c153d75d160b\" (UID: \"f8647023-4573-46b1-a713-c153d75d160b\") " Dec 08 09:26:06 
crc kubenswrapper[4776]: I1208 09:26:06.597371 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8647023-4573-46b1-a713-c153d75d160b-logs" (OuterVolumeSpecName: "logs") pod "f8647023-4573-46b1-a713-c153d75d160b" (UID: "f8647023-4573-46b1-a713-c153d75d160b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:26:06 crc kubenswrapper[4776]: I1208 09:26:06.604387 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8647023-4573-46b1-a713-c153d75d160b-kube-api-access-7psmc" (OuterVolumeSpecName: "kube-api-access-7psmc") pod "f8647023-4573-46b1-a713-c153d75d160b" (UID: "f8647023-4573-46b1-a713-c153d75d160b"). InnerVolumeSpecName "kube-api-access-7psmc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:26:06 crc kubenswrapper[4776]: I1208 09:26:06.631982 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8647023-4573-46b1-a713-c153d75d160b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f8647023-4573-46b1-a713-c153d75d160b" (UID: "f8647023-4573-46b1-a713-c153d75d160b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:26:06 crc kubenswrapper[4776]: I1208 09:26:06.656188 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8647023-4573-46b1-a713-c153d75d160b-config-data" (OuterVolumeSpecName: "config-data") pod "f8647023-4573-46b1-a713-c153d75d160b" (UID: "f8647023-4573-46b1-a713-c153d75d160b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:26:06 crc kubenswrapper[4776]: I1208 09:26:06.697675 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8647023-4573-46b1-a713-c153d75d160b-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 09:26:06 crc kubenswrapper[4776]: I1208 09:26:06.697704 4776 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8647023-4573-46b1-a713-c153d75d160b-logs\") on node \"crc\" DevicePath \"\"" Dec 08 09:26:06 crc kubenswrapper[4776]: I1208 09:26:06.697713 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8647023-4573-46b1-a713-c153d75d160b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:26:06 crc kubenswrapper[4776]: I1208 09:26:06.697723 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7psmc\" (UniqueName: \"kubernetes.io/projected/f8647023-4573-46b1-a713-c153d75d160b-kube-api-access-7psmc\") on node \"crc\" DevicePath \"\"" Dec 08 09:26:06 crc kubenswrapper[4776]: I1208 09:26:06.711434 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8647023-4573-46b1-a713-c153d75d160b-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "f8647023-4573-46b1-a713-c153d75d160b" (UID: "f8647023-4573-46b1-a713-c153d75d160b"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:26:06 crc kubenswrapper[4776]: I1208 09:26:06.799211 4776 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8647023-4573-46b1-a713-c153d75d160b-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 08 09:26:07 crc kubenswrapper[4776]: I1208 09:26:07.419506 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e7651697-0db7-476f-8b50-1f04771b4ed2","Type":"ContainerStarted","Data":"bf7a3a7f3bcf1a661a4c4f652225cd04f6b6d95245d4572d89c46d08970cace0"} Dec 08 09:26:07 crc kubenswrapper[4776]: I1208 09:26:07.419576 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e7651697-0db7-476f-8b50-1f04771b4ed2","Type":"ContainerStarted","Data":"21048075325cf4c6af5980977430c05737dea01567545e56907b602bc18ec86e"} Dec 08 09:26:07 crc kubenswrapper[4776]: I1208 09:26:07.421645 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a9beda06-9357-4e94-a243-78484ede0b97","Type":"ContainerStarted","Data":"0aa8064585618225c49d48e4d16d160546abacdf2ed8d5dde5383fe82da4e8cf"} Dec 08 09:26:07 crc kubenswrapper[4776]: I1208 09:26:07.421781 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 08 09:26:07 crc kubenswrapper[4776]: I1208 09:26:07.423550 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 08 09:26:07 crc kubenswrapper[4776]: I1208 09:26:07.423566 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f8647023-4573-46b1-a713-c153d75d160b","Type":"ContainerDied","Data":"a51ccb03dee95d3e3f412f85820fb5d8d892c2c34d42ac8b795bb96353511d1e"} Dec 08 09:26:07 crc kubenswrapper[4776]: I1208 09:26:07.423631 4776 scope.go:117] "RemoveContainer" containerID="79f3a895f3e200395ad04fd83a6567b267b9c69b94cebcb3229f29398657dd93" Dec 08 09:26:07 crc kubenswrapper[4776]: I1208 09:26:07.445456 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.445435267 podStartE2EDuration="2.445435267s" podCreationTimestamp="2025-12-08 09:26:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:26:07.433858866 +0000 UTC m=+1643.697083888" watchObservedRunningTime="2025-12-08 09:26:07.445435267 +0000 UTC m=+1643.708660289" Dec 08 09:26:07 crc kubenswrapper[4776]: I1208 09:26:07.476893 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.7797901170000001 podStartE2EDuration="5.476864152s" podCreationTimestamp="2025-12-08 09:26:02 +0000 UTC" firstStartedPulling="2025-12-08 09:26:03.161330895 +0000 UTC m=+1639.424555917" lastFinishedPulling="2025-12-08 09:26:06.85840493 +0000 UTC m=+1643.121629952" observedRunningTime="2025-12-08 09:26:07.455468547 +0000 UTC m=+1643.718693589" watchObservedRunningTime="2025-12-08 09:26:07.476864152 +0000 UTC m=+1643.740089184" Dec 08 09:26:07 crc kubenswrapper[4776]: I1208 09:26:07.479930 4776 scope.go:117] "RemoveContainer" containerID="2be87160fc4bd4b7e818817c80fc6b49383cb19dc8120ad2c77d39aa32b0bea8" Dec 08 09:26:07 crc kubenswrapper[4776]: I1208 09:26:07.502579 4776 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/nova-metadata-0"] Dec 08 09:26:07 crc kubenswrapper[4776]: I1208 09:26:07.531546 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 08 09:26:07 crc kubenswrapper[4776]: I1208 09:26:07.552878 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 08 09:26:07 crc kubenswrapper[4776]: E1208 09:26:07.553424 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8647023-4573-46b1-a713-c153d75d160b" containerName="nova-metadata-metadata" Dec 08 09:26:07 crc kubenswrapper[4776]: I1208 09:26:07.553438 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8647023-4573-46b1-a713-c153d75d160b" containerName="nova-metadata-metadata" Dec 08 09:26:07 crc kubenswrapper[4776]: E1208 09:26:07.553493 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8647023-4573-46b1-a713-c153d75d160b" containerName="nova-metadata-log" Dec 08 09:26:07 crc kubenswrapper[4776]: I1208 09:26:07.553499 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8647023-4573-46b1-a713-c153d75d160b" containerName="nova-metadata-log" Dec 08 09:26:07 crc kubenswrapper[4776]: I1208 09:26:07.553732 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8647023-4573-46b1-a713-c153d75d160b" containerName="nova-metadata-metadata" Dec 08 09:26:07 crc kubenswrapper[4776]: I1208 09:26:07.553748 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8647023-4573-46b1-a713-c153d75d160b" containerName="nova-metadata-log" Dec 08 09:26:07 crc kubenswrapper[4776]: I1208 09:26:07.555110 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 08 09:26:07 crc kubenswrapper[4776]: I1208 09:26:07.558000 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 08 09:26:07 crc kubenswrapper[4776]: I1208 09:26:07.558497 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 08 09:26:07 crc kubenswrapper[4776]: I1208 09:26:07.565628 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 08 09:26:07 crc kubenswrapper[4776]: I1208 09:26:07.720286 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6c2fb50-f70b-43cc-a493-b4ffa4292c64-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e6c2fb50-f70b-43cc-a493-b4ffa4292c64\") " pod="openstack/nova-metadata-0" Dec 08 09:26:07 crc kubenswrapper[4776]: I1208 09:26:07.720455 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6c2fb50-f70b-43cc-a493-b4ffa4292c64-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e6c2fb50-f70b-43cc-a493-b4ffa4292c64\") " pod="openstack/nova-metadata-0" Dec 08 09:26:07 crc kubenswrapper[4776]: I1208 09:26:07.720554 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72wzd\" (UniqueName: \"kubernetes.io/projected/e6c2fb50-f70b-43cc-a493-b4ffa4292c64-kube-api-access-72wzd\") pod \"nova-metadata-0\" (UID: \"e6c2fb50-f70b-43cc-a493-b4ffa4292c64\") " pod="openstack/nova-metadata-0" Dec 08 09:26:07 crc kubenswrapper[4776]: I1208 09:26:07.720801 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6c2fb50-f70b-43cc-a493-b4ffa4292c64-logs\") pod 
\"nova-metadata-0\" (UID: \"e6c2fb50-f70b-43cc-a493-b4ffa4292c64\") " pod="openstack/nova-metadata-0" Dec 08 09:26:07 crc kubenswrapper[4776]: I1208 09:26:07.721063 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6c2fb50-f70b-43cc-a493-b4ffa4292c64-config-data\") pod \"nova-metadata-0\" (UID: \"e6c2fb50-f70b-43cc-a493-b4ffa4292c64\") " pod="openstack/nova-metadata-0" Dec 08 09:26:07 crc kubenswrapper[4776]: I1208 09:26:07.822950 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6c2fb50-f70b-43cc-a493-b4ffa4292c64-config-data\") pod \"nova-metadata-0\" (UID: \"e6c2fb50-f70b-43cc-a493-b4ffa4292c64\") " pod="openstack/nova-metadata-0" Dec 08 09:26:07 crc kubenswrapper[4776]: I1208 09:26:07.823087 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6c2fb50-f70b-43cc-a493-b4ffa4292c64-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e6c2fb50-f70b-43cc-a493-b4ffa4292c64\") " pod="openstack/nova-metadata-0" Dec 08 09:26:07 crc kubenswrapper[4776]: I1208 09:26:07.823146 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6c2fb50-f70b-43cc-a493-b4ffa4292c64-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e6c2fb50-f70b-43cc-a493-b4ffa4292c64\") " pod="openstack/nova-metadata-0" Dec 08 09:26:07 crc kubenswrapper[4776]: I1208 09:26:07.823205 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72wzd\" (UniqueName: \"kubernetes.io/projected/e6c2fb50-f70b-43cc-a493-b4ffa4292c64-kube-api-access-72wzd\") pod \"nova-metadata-0\" (UID: \"e6c2fb50-f70b-43cc-a493-b4ffa4292c64\") " pod="openstack/nova-metadata-0" Dec 08 09:26:07 crc kubenswrapper[4776]: I1208 
09:26:07.823310 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6c2fb50-f70b-43cc-a493-b4ffa4292c64-logs\") pod \"nova-metadata-0\" (UID: \"e6c2fb50-f70b-43cc-a493-b4ffa4292c64\") " pod="openstack/nova-metadata-0" Dec 08 09:26:07 crc kubenswrapper[4776]: I1208 09:26:07.823770 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6c2fb50-f70b-43cc-a493-b4ffa4292c64-logs\") pod \"nova-metadata-0\" (UID: \"e6c2fb50-f70b-43cc-a493-b4ffa4292c64\") " pod="openstack/nova-metadata-0" Dec 08 09:26:07 crc kubenswrapper[4776]: I1208 09:26:07.827777 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6c2fb50-f70b-43cc-a493-b4ffa4292c64-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e6c2fb50-f70b-43cc-a493-b4ffa4292c64\") " pod="openstack/nova-metadata-0" Dec 08 09:26:07 crc kubenswrapper[4776]: I1208 09:26:07.828543 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6c2fb50-f70b-43cc-a493-b4ffa4292c64-config-data\") pod \"nova-metadata-0\" (UID: \"e6c2fb50-f70b-43cc-a493-b4ffa4292c64\") " pod="openstack/nova-metadata-0" Dec 08 09:26:07 crc kubenswrapper[4776]: I1208 09:26:07.830800 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6c2fb50-f70b-43cc-a493-b4ffa4292c64-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e6c2fb50-f70b-43cc-a493-b4ffa4292c64\") " pod="openstack/nova-metadata-0" Dec 08 09:26:07 crc kubenswrapper[4776]: I1208 09:26:07.840925 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72wzd\" (UniqueName: \"kubernetes.io/projected/e6c2fb50-f70b-43cc-a493-b4ffa4292c64-kube-api-access-72wzd\") pod \"nova-metadata-0\" (UID: 
\"e6c2fb50-f70b-43cc-a493-b4ffa4292c64\") " pod="openstack/nova-metadata-0" Dec 08 09:26:07 crc kubenswrapper[4776]: I1208 09:26:07.870063 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 08 09:26:08 crc kubenswrapper[4776]: I1208 09:26:08.330926 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 08 09:26:08 crc kubenswrapper[4776]: I1208 09:26:08.378820 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8647023-4573-46b1-a713-c153d75d160b" path="/var/lib/kubelet/pods/f8647023-4573-46b1-a713-c153d75d160b/volumes" Dec 08 09:26:08 crc kubenswrapper[4776]: I1208 09:26:08.442164 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e6c2fb50-f70b-43cc-a493-b4ffa4292c64","Type":"ContainerStarted","Data":"b9e741592fa325305a175e7ca3a1319db7ff7578360c162b6506c5f2fa64d759"} Dec 08 09:26:09 crc kubenswrapper[4776]: W1208 09:26:09.329625 4776 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc988dd0_b7f8_4793_8922_238ec7c3081b.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc988dd0_b7f8_4793_8922_238ec7c3081b.slice: no such file or directory Dec 08 09:26:09 crc kubenswrapper[4776]: W1208 09:26:09.360134 4776 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4bb5f9b1_fbbf_4950_a31d_8de3f994ddd2.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4bb5f9b1_fbbf_4950_a31d_8de3f994ddd2.slice: no such file or directory Dec 08 09:26:09 crc kubenswrapper[4776]: W1208 09:26:09.365470 4776 watcher.go:93] Error while processing event 
("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9434f83d_ef01_4184_b7e7_e26d478e0589.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9434f83d_ef01_4184_b7e7_e26d478e0589.slice: no such file or directory Dec 08 09:26:09 crc kubenswrapper[4776]: E1208 09:26:09.443895 4776 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff46e335_2c8f_4011_85ac_de45611f8e45.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8647023_4573_46b1_a713_c153d75d160b.slice/crio-conmon-2be87160fc4bd4b7e818817c80fc6b49383cb19dc8120ad2c77d39aa32b0bea8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod474cb911_9e81_43ed_a828_52d9f03eb4df.slice/crio-25ec6d64e4d550aad188f5bf9c1c6906cd4de1837719de1cdc3e17d679d5b853.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff46e335_2c8f_4011_85ac_de45611f8e45.slice/crio-e51b3546b58f53c9203b61c2ae4ad70e973d001366e664d69fa4752ba7f8148b\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91f1949a_9307_4999_b50c_2d5a747e4571.slice/crio-f67f3ab1a1668d9e04c5cd79e1030d334f5d8f0cd3b3f1af06d6b25d1e5bc706.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91f1949a_9307_4999_b50c_2d5a747e4571.slice/crio-conmon-a067de7097b95ef850013617661524d3759368af152b1652daee43af5022e6c4.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff46e335_2c8f_4011_85ac_de45611f8e45.slice/crio-conmon-b65eae25ec32cadfd6e7546c1297efa384bb5d0626ffa9c5dcb1b9ebe1dcb88d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8647023_4573_46b1_a713_c153d75d160b.slice/crio-conmon-79f3a895f3e200395ad04fd83a6567b267b9c69b94cebcb3229f29398657dd93.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91f1949a_9307_4999_b50c_2d5a747e4571.slice/crio-f246e6066e84c7582e111a2e7f327024e5a574ed6b2cb5b105ee762db911c7e3\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8647023_4573_46b1_a713_c153d75d160b.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91f1949a_9307_4999_b50c_2d5a747e4571.slice/crio-278c3db6ab4e3d2e6cb99faf31918c9c5eab103a82a74caa6c858958825f8d14.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8647023_4573_46b1_a713_c153d75d160b.slice/crio-79f3a895f3e200395ad04fd83a6567b267b9c69b94cebcb3229f29398657dd93.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fe2799e_2d16_4569_874b_a0066d38087b.slice/crio-e3f5c2b4235a2950e0e59795cc7b8912b807d9f4fd5a548962dea30abd744634\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91f1949a_9307_4999_b50c_2d5a747e4571.slice/crio-9dbce126bf8853edf4f1c03952ba57abdf8542625da28eb66fcc8d5b21342589.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff46e335_2c8f_4011_85ac_de45611f8e45.slice/crio-b65eae25ec32cadfd6e7546c1297efa384bb5d0626ffa9c5dcb1b9ebe1dcb88d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18e62c81_f058_4044_9bb3_e64e5892a4e6.slice/crio-conmon-64831e1fe3bff43093ab2b6e8712d8f7bd0ea7251240539c05884628e2e6a005.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod474cb911_9e81_43ed_a828_52d9f03eb4df.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91f1949a_9307_4999_b50c_2d5a747e4571.slice/crio-conmon-f67f3ab1a1668d9e04c5cd79e1030d334f5d8f0cd3b3f1af06d6b25d1e5bc706.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91f1949a_9307_4999_b50c_2d5a747e4571.slice/crio-conmon-9dbce126bf8853edf4f1c03952ba57abdf8542625da28eb66fcc8d5b21342589.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fe2799e_2d16_4569_874b_a0066d38087b.slice/crio-conmon-2e51c1ea01b38323d942e9c96fc1a6ccdffedfd437e17ebbca279d55f9072135.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod474cb911_9e81_43ed_a828_52d9f03eb4df.slice/crio-7fe3f2599b53a869df7e6c2dcb468e6f3ae530cb00400251edb047dc618c43c4\": RecentStats: unable to find data in memory cache], [\"/system.slice/rpm-ostreed.service\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod474cb911_9e81_43ed_a828_52d9f03eb4df.slice/crio-conmon-25ec6d64e4d550aad188f5bf9c1c6906cd4de1837719de1cdc3e17d679d5b853.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fe2799e_2d16_4569_874b_a0066d38087b.slice/crio-2e51c1ea01b38323d942e9c96fc1a6ccdffedfd437e17ebbca279d55f9072135.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fe2799e_2d16_4569_874b_a0066d38087b.slice/crio-conmon-3d0078c422c14eb90a83825f183764ed8c85b60f82f3c7def9c1d237c12d5859.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91f1949a_9307_4999_b50c_2d5a747e4571.slice/crio-conmon-278c3db6ab4e3d2e6cb99faf31918c9c5eab103a82a74caa6c858958825f8d14.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91f1949a_9307_4999_b50c_2d5a747e4571.slice\": RecentStats: unable to find data in memory cache]" Dec 08 09:26:09 crc kubenswrapper[4776]: E1208 09:26:09.444042 4776 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod474cb911_9e81_43ed_a828_52d9f03eb4df.slice/crio-conmon-25ec6d64e4d550aad188f5bf9c1c6906cd4de1837719de1cdc3e17d679d5b853.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fe2799e_2d16_4569_874b_a0066d38087b.slice/crio-3d0078c422c14eb90a83825f183764ed8c85b60f82f3c7def9c1d237c12d5859.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91f1949a_9307_4999_b50c_2d5a747e4571.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8647023_4573_46b1_a713_c153d75d160b.slice/crio-conmon-2be87160fc4bd4b7e818817c80fc6b49383cb19dc8120ad2c77d39aa32b0bea8.scope\": RecentStats: unable to find data in memory cache], 
[\"/system.slice/rpm-ostreed.service\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91f1949a_9307_4999_b50c_2d5a747e4571.slice/crio-9dbce126bf8853edf4f1c03952ba57abdf8542625da28eb66fcc8d5b21342589.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18e62c81_f058_4044_9bb3_e64e5892a4e6.slice/crio-conmon-64831e1fe3bff43093ab2b6e8712d8f7bd0ea7251240539c05884628e2e6a005.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fe2799e_2d16_4569_874b_a0066d38087b.slice/crio-2e51c1ea01b38323d942e9c96fc1a6ccdffedfd437e17ebbca279d55f9072135.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8647023_4573_46b1_a713_c153d75d160b.slice/crio-conmon-79f3a895f3e200395ad04fd83a6567b267b9c69b94cebcb3229f29398657dd93.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fe2799e_2d16_4569_874b_a0066d38087b.slice/crio-e3f5c2b4235a2950e0e59795cc7b8912b807d9f4fd5a548962dea30abd744634\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff46e335_2c8f_4011_85ac_de45611f8e45.slice/crio-e51b3546b58f53c9203b61c2ae4ad70e973d001366e664d69fa4752ba7f8148b\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff46e335_2c8f_4011_85ac_de45611f8e45.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fe2799e_2d16_4569_874b_a0066d38087b.slice/crio-conmon-2e51c1ea01b38323d942e9c96fc1a6ccdffedfd437e17ebbca279d55f9072135.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91f1949a_9307_4999_b50c_2d5a747e4571.slice/crio-conmon-a067de7097b95ef850013617661524d3759368af152b1652daee43af5022e6c4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18e62c81_f058_4044_9bb3_e64e5892a4e6.slice/crio-64831e1fe3bff43093ab2b6e8712d8f7bd0ea7251240539c05884628e2e6a005.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91f1949a_9307_4999_b50c_2d5a747e4571.slice/crio-a067de7097b95ef850013617661524d3759368af152b1652daee43af5022e6c4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91f1949a_9307_4999_b50c_2d5a747e4571.slice/crio-conmon-f67f3ab1a1668d9e04c5cd79e1030d334f5d8f0cd3b3f1af06d6b25d1e5bc706.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod474cb911_9e81_43ed_a828_52d9f03eb4df.slice/crio-7fe3f2599b53a869df7e6c2dcb468e6f3ae530cb00400251edb047dc618c43c4\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91f1949a_9307_4999_b50c_2d5a747e4571.slice/crio-278c3db6ab4e3d2e6cb99faf31918c9c5eab103a82a74caa6c858958825f8d14.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff46e335_2c8f_4011_85ac_de45611f8e45.slice/crio-conmon-b65eae25ec32cadfd6e7546c1297efa384bb5d0626ffa9c5dcb1b9ebe1dcb88d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod474cb911_9e81_43ed_a828_52d9f03eb4df.slice/crio-25ec6d64e4d550aad188f5bf9c1c6906cd4de1837719de1cdc3e17d679d5b853.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fe2799e_2d16_4569_874b_a0066d38087b.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8647023_4573_46b1_a713_c153d75d160b.slice/crio-2be87160fc4bd4b7e818817c80fc6b49383cb19dc8120ad2c77d39aa32b0bea8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod474cb911_9e81_43ed_a828_52d9f03eb4df.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91f1949a_9307_4999_b50c_2d5a747e4571.slice/crio-conmon-9dbce126bf8853edf4f1c03952ba57abdf8542625da28eb66fcc8d5b21342589.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91f1949a_9307_4999_b50c_2d5a747e4571.slice/crio-f246e6066e84c7582e111a2e7f327024e5a574ed6b2cb5b105ee762db911c7e3\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fe2799e_2d16_4569_874b_a0066d38087b.slice/crio-conmon-3d0078c422c14eb90a83825f183764ed8c85b60f82f3c7def9c1d237c12d5859.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91f1949a_9307_4999_b50c_2d5a747e4571.slice/crio-conmon-278c3db6ab4e3d2e6cb99faf31918c9c5eab103a82a74caa6c858958825f8d14.scope\": RecentStats: unable to find data in memory cache]" Dec 08 09:26:09 crc kubenswrapper[4776]: E1208 09:26:09.444298 4776 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91f1949a_9307_4999_b50c_2d5a747e4571.slice/crio-f246e6066e84c7582e111a2e7f327024e5a574ed6b2cb5b105ee762db911c7e3\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fe2799e_2d16_4569_874b_a0066d38087b.slice/crio-e3f5c2b4235a2950e0e59795cc7b8912b807d9f4fd5a548962dea30abd744634\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8647023_4573_46b1_a713_c153d75d160b.slice/crio-2be87160fc4bd4b7e818817c80fc6b49383cb19dc8120ad2c77d39aa32b0bea8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8647023_4573_46b1_a713_c153d75d160b.slice/crio-conmon-79f3a895f3e200395ad04fd83a6567b267b9c69b94cebcb3229f29398657dd93.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff46e335_2c8f_4011_85ac_de45611f8e45.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18e62c81_f058_4044_9bb3_e64e5892a4e6.slice/crio-64831e1fe3bff43093ab2b6e8712d8f7bd0ea7251240539c05884628e2e6a005.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91f1949a_9307_4999_b50c_2d5a747e4571.slice/crio-a067de7097b95ef850013617661524d3759368af152b1652daee43af5022e6c4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91f1949a_9307_4999_b50c_2d5a747e4571.slice/crio-conmon-9dbce126bf8853edf4f1c03952ba57abdf8542625da28eb66fcc8d5b21342589.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91f1949a_9307_4999_b50c_2d5a747e4571.slice/crio-9dbce126bf8853edf4f1c03952ba57abdf8542625da28eb66fcc8d5b21342589.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18e62c81_f058_4044_9bb3_e64e5892a4e6.slice/crio-conmon-64831e1fe3bff43093ab2b6e8712d8f7bd0ea7251240539c05884628e2e6a005.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91f1949a_9307_4999_b50c_2d5a747e4571.slice/crio-conmon-f67f3ab1a1668d9e04c5cd79e1030d334f5d8f0cd3b3f1af06d6b25d1e5bc706.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91f1949a_9307_4999_b50c_2d5a747e4571.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8647023_4573_46b1_a713_c153d75d160b.slice/crio-conmon-2be87160fc4bd4b7e818817c80fc6b49383cb19dc8120ad2c77d39aa32b0bea8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91f1949a_9307_4999_b50c_2d5a747e4571.slice/crio-conmon-a067de7097b95ef850013617661524d3759368af152b1652daee43af5022e6c4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod474cb911_9e81_43ed_a828_52d9f03eb4df.slice/crio-25ec6d64e4d550aad188f5bf9c1c6906cd4de1837719de1cdc3e17d679d5b853.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fe2799e_2d16_4569_874b_a0066d38087b.slice/crio-3d0078c422c14eb90a83825f183764ed8c85b60f82f3c7def9c1d237c12d5859.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod474cb911_9e81_43ed_a828_52d9f03eb4df.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8647023_4573_46b1_a713_c153d75d160b.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff46e335_2c8f_4011_85ac_de45611f8e45.slice/crio-e51b3546b58f53c9203b61c2ae4ad70e973d001366e664d69fa4752ba7f8148b\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fe2799e_2d16_4569_874b_a0066d38087b.slice/crio-conmon-3d0078c422c14eb90a83825f183764ed8c85b60f82f3c7def9c1d237c12d5859.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91f1949a_9307_4999_b50c_2d5a747e4571.slice/crio-278c3db6ab4e3d2e6cb99faf31918c9c5eab103a82a74caa6c858958825f8d14.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod474cb911_9e81_43ed_a828_52d9f03eb4df.slice/crio-7fe3f2599b53a869df7e6c2dcb468e6f3ae530cb00400251edb047dc618c43c4\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fe2799e_2d16_4569_874b_a0066d38087b.slice/crio-conmon-2e51c1ea01b38323d942e9c96fc1a6ccdffedfd437e17ebbca279d55f9072135.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff46e335_2c8f_4011_85ac_de45611f8e45.slice/crio-conmon-b65eae25ec32cadfd6e7546c1297efa384bb5d0626ffa9c5dcb1b9ebe1dcb88d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff46e335_2c8f_4011_85ac_de45611f8e45.slice/crio-b65eae25ec32cadfd6e7546c1297efa384bb5d0626ffa9c5dcb1b9ebe1dcb88d.scope\": RecentStats: unable to find data in memory cache], [\"/system.slice/rpm-ostreed.service\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fe2799e_2d16_4569_874b_a0066d38087b.slice/crio-2e51c1ea01b38323d942e9c96fc1a6ccdffedfd437e17ebbca279d55f9072135.scope\": RecentStats: unable to 
find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod474cb911_9e81_43ed_a828_52d9f03eb4df.slice/crio-conmon-25ec6d64e4d550aad188f5bf9c1c6906cd4de1837719de1cdc3e17d679d5b853.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91f1949a_9307_4999_b50c_2d5a747e4571.slice/crio-conmon-278c3db6ab4e3d2e6cb99faf31918c9c5eab103a82a74caa6c858958825f8d14.scope\": RecentStats: unable to find data in memory cache]" Dec 08 09:26:09 crc kubenswrapper[4776]: E1208 09:26:09.444559 4776 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fe2799e_2d16_4569_874b_a0066d38087b.slice/crio-conmon-2e51c1ea01b38323d942e9c96fc1a6ccdffedfd437e17ebbca279d55f9072135.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff46e335_2c8f_4011_85ac_de45611f8e45.slice/crio-conmon-b65eae25ec32cadfd6e7546c1297efa384bb5d0626ffa9c5dcb1b9ebe1dcb88d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fe2799e_2d16_4569_874b_a0066d38087b.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91f1949a_9307_4999_b50c_2d5a747e4571.slice/crio-f67f3ab1a1668d9e04c5cd79e1030d334f5d8f0cd3b3f1af06d6b25d1e5bc706.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8647023_4573_46b1_a713_c153d75d160b.slice/crio-2be87160fc4bd4b7e818817c80fc6b49383cb19dc8120ad2c77d39aa32b0bea8.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod474cb911_9e81_43ed_a828_52d9f03eb4df.slice/crio-conmon-25ec6d64e4d550aad188f5bf9c1c6906cd4de1837719de1cdc3e17d679d5b853.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff46e335_2c8f_4011_85ac_de45611f8e45.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91f1949a_9307_4999_b50c_2d5a747e4571.slice/crio-conmon-278c3db6ab4e3d2e6cb99faf31918c9c5eab103a82a74caa6c858958825f8d14.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fe2799e_2d16_4569_874b_a0066d38087b.slice/crio-conmon-3d0078c422c14eb90a83825f183764ed8c85b60f82f3c7def9c1d237c12d5859.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18e62c81_f058_4044_9bb3_e64e5892a4e6.slice/crio-conmon-64831e1fe3bff43093ab2b6e8712d8f7bd0ea7251240539c05884628e2e6a005.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91f1949a_9307_4999_b50c_2d5a747e4571.slice/crio-a067de7097b95ef850013617661524d3759368af152b1652daee43af5022e6c4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91f1949a_9307_4999_b50c_2d5a747e4571.slice/crio-conmon-f67f3ab1a1668d9e04c5cd79e1030d334f5d8f0cd3b3f1af06d6b25d1e5bc706.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91f1949a_9307_4999_b50c_2d5a747e4571.slice/crio-conmon-a067de7097b95ef850013617661524d3759368af152b1652daee43af5022e6c4.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod474cb911_9e81_43ed_a828_52d9f03eb4df.slice/crio-7fe3f2599b53a869df7e6c2dcb468e6f3ae530cb00400251edb047dc618c43c4\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff46e335_2c8f_4011_85ac_de45611f8e45.slice/crio-e51b3546b58f53c9203b61c2ae4ad70e973d001366e664d69fa4752ba7f8148b\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fe2799e_2d16_4569_874b_a0066d38087b.slice/crio-2e51c1ea01b38323d942e9c96fc1a6ccdffedfd437e17ebbca279d55f9072135.scope\": RecentStats: unable to find data in memory cache], [\"/system.slice/rpm-ostreed.service\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8647023_4573_46b1_a713_c153d75d160b.slice/crio-conmon-79f3a895f3e200395ad04fd83a6567b267b9c69b94cebcb3229f29398657dd93.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18e62c81_f058_4044_9bb3_e64e5892a4e6.slice/crio-64831e1fe3bff43093ab2b6e8712d8f7bd0ea7251240539c05884628e2e6a005.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91f1949a_9307_4999_b50c_2d5a747e4571.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fe2799e_2d16_4569_874b_a0066d38087b.slice/crio-3d0078c422c14eb90a83825f183764ed8c85b60f82f3c7def9c1d237c12d5859.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91f1949a_9307_4999_b50c_2d5a747e4571.slice/crio-9dbce126bf8853edf4f1c03952ba57abdf8542625da28eb66fcc8d5b21342589.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91f1949a_9307_4999_b50c_2d5a747e4571.slice/crio-conmon-9dbce126bf8853edf4f1c03952ba57abdf8542625da28eb66fcc8d5b21342589.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8647023_4573_46b1_a713_c153d75d160b.slice/crio-conmon-2be87160fc4bd4b7e818817c80fc6b49383cb19dc8120ad2c77d39aa32b0bea8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff46e335_2c8f_4011_85ac_de45611f8e45.slice/crio-b65eae25ec32cadfd6e7546c1297efa384bb5d0626ffa9c5dcb1b9ebe1dcb88d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fe2799e_2d16_4569_874b_a0066d38087b.slice/crio-e3f5c2b4235a2950e0e59795cc7b8912b807d9f4fd5a548962dea30abd744634\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8647023_4573_46b1_a713_c153d75d160b.slice\": RecentStats: unable to find data in memory cache]" Dec 08 09:26:09 crc kubenswrapper[4776]: E1208 09:26:09.448845 4776 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff46e335_2c8f_4011_85ac_de45611f8e45.slice/crio-b65eae25ec32cadfd6e7546c1297efa384bb5d0626ffa9c5dcb1b9ebe1dcb88d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91f1949a_9307_4999_b50c_2d5a747e4571.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fe2799e_2d16_4569_874b_a0066d38087b.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91f1949a_9307_4999_b50c_2d5a747e4571.slice/crio-9dbce126bf8853edf4f1c03952ba57abdf8542625da28eb66fcc8d5b21342589.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fe2799e_2d16_4569_874b_a0066d38087b.slice/crio-3d0078c422c14eb90a83825f183764ed8c85b60f82f3c7def9c1d237c12d5859.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff46e335_2c8f_4011_85ac_de45611f8e45.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91f1949a_9307_4999_b50c_2d5a747e4571.slice/crio-a067de7097b95ef850013617661524d3759368af152b1652daee43af5022e6c4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91f1949a_9307_4999_b50c_2d5a747e4571.slice/crio-conmon-a067de7097b95ef850013617661524d3759368af152b1652daee43af5022e6c4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91f1949a_9307_4999_b50c_2d5a747e4571.slice/crio-278c3db6ab4e3d2e6cb99faf31918c9c5eab103a82a74caa6c858958825f8d14.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91f1949a_9307_4999_b50c_2d5a747e4571.slice/crio-conmon-9dbce126bf8853edf4f1c03952ba57abdf8542625da28eb66fcc8d5b21342589.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod474cb911_9e81_43ed_a828_52d9f03eb4df.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91f1949a_9307_4999_b50c_2d5a747e4571.slice/crio-conmon-f67f3ab1a1668d9e04c5cd79e1030d334f5d8f0cd3b3f1af06d6b25d1e5bc706.scope\": RecentStats: 
unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8647023_4573_46b1_a713_c153d75d160b.slice/crio-conmon-2be87160fc4bd4b7e818817c80fc6b49383cb19dc8120ad2c77d39aa32b0bea8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff46e335_2c8f_4011_85ac_de45611f8e45.slice/crio-e51b3546b58f53c9203b61c2ae4ad70e973d001366e664d69fa4752ba7f8148b\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91f1949a_9307_4999_b50c_2d5a747e4571.slice/crio-f246e6066e84c7582e111a2e7f327024e5a574ed6b2cb5b105ee762db911c7e3\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91f1949a_9307_4999_b50c_2d5a747e4571.slice/crio-f67f3ab1a1668d9e04c5cd79e1030d334f5d8f0cd3b3f1af06d6b25d1e5bc706.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod474cb911_9e81_43ed_a828_52d9f03eb4df.slice/crio-conmon-25ec6d64e4d550aad188f5bf9c1c6906cd4de1837719de1cdc3e17d679d5b853.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod474cb911_9e81_43ed_a828_52d9f03eb4df.slice/crio-25ec6d64e4d550aad188f5bf9c1c6906cd4de1837719de1cdc3e17d679d5b853.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fe2799e_2d16_4569_874b_a0066d38087b.slice/crio-conmon-3d0078c422c14eb90a83825f183764ed8c85b60f82f3c7def9c1d237c12d5859.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fe2799e_2d16_4569_874b_a0066d38087b.slice/crio-conmon-2e51c1ea01b38323d942e9c96fc1a6ccdffedfd437e17ebbca279d55f9072135.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91f1949a_9307_4999_b50c_2d5a747e4571.slice/crio-conmon-278c3db6ab4e3d2e6cb99faf31918c9c5eab103a82a74caa6c858958825f8d14.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8647023_4573_46b1_a713_c153d75d160b.slice/crio-conmon-79f3a895f3e200395ad04fd83a6567b267b9c69b94cebcb3229f29398657dd93.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff46e335_2c8f_4011_85ac_de45611f8e45.slice/crio-conmon-b65eae25ec32cadfd6e7546c1297efa384bb5d0626ffa9c5dcb1b9ebe1dcb88d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8647023_4573_46b1_a713_c153d75d160b.slice/crio-2be87160fc4bd4b7e818817c80fc6b49383cb19dc8120ad2c77d39aa32b0bea8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod474cb911_9e81_43ed_a828_52d9f03eb4df.slice/crio-7fe3f2599b53a869df7e6c2dcb468e6f3ae530cb00400251edb047dc618c43c4\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18e62c81_f058_4044_9bb3_e64e5892a4e6.slice/crio-conmon-64831e1fe3bff43093ab2b6e8712d8f7bd0ea7251240539c05884628e2e6a005.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fe2799e_2d16_4569_874b_a0066d38087b.slice/crio-e3f5c2b4235a2950e0e59795cc7b8912b807d9f4fd5a548962dea30abd744634\": RecentStats: unable to find data in memory cache], [\"/system.slice/rpm-ostreed.service\": RecentStats: unable to find data in memory cache]" Dec 08 09:26:09 crc kubenswrapper[4776]: I1208 09:26:09.469445 4776 generic.go:334] "Generic (PLEG): container finished" podID="18e62c81-f058-4044-9bb3-e64e5892a4e6" 
containerID="64831e1fe3bff43093ab2b6e8712d8f7bd0ea7251240539c05884628e2e6a005" exitCode=137 Dec 08 09:26:09 crc kubenswrapper[4776]: I1208 09:26:09.469493 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"18e62c81-f058-4044-9bb3-e64e5892a4e6","Type":"ContainerDied","Data":"64831e1fe3bff43093ab2b6e8712d8f7bd0ea7251240539c05884628e2e6a005"} Dec 08 09:26:09 crc kubenswrapper[4776]: I1208 09:26:09.481030 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e6c2fb50-f70b-43cc-a493-b4ffa4292c64","Type":"ContainerStarted","Data":"6e05f1438b9edcb74cb34fb27b9e5f0fbccb2a03bc9607cd7ee38af05fca6e81"} Dec 08 09:26:09 crc kubenswrapper[4776]: I1208 09:26:09.481074 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e6c2fb50-f70b-43cc-a493-b4ffa4292c64","Type":"ContainerStarted","Data":"7d6a0401b3b3a8bd7b76b055d1b237293cffe3d150d925430018a69f99e4a079"} Dec 08 09:26:09 crc kubenswrapper[4776]: I1208 09:26:09.503070 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.50305585 podStartE2EDuration="2.50305585s" podCreationTimestamp="2025-12-08 09:26:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:26:09.501079116 +0000 UTC m=+1645.764304158" watchObservedRunningTime="2025-12-08 09:26:09.50305585 +0000 UTC m=+1645.766280872" Dec 08 09:26:09 crc kubenswrapper[4776]: I1208 09:26:09.835507 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 08 09:26:09 crc kubenswrapper[4776]: I1208 09:26:09.874638 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18e62c81-f058-4044-9bb3-e64e5892a4e6-combined-ca-bundle\") pod \"18e62c81-f058-4044-9bb3-e64e5892a4e6\" (UID: \"18e62c81-f058-4044-9bb3-e64e5892a4e6\") " Dec 08 09:26:09 crc kubenswrapper[4776]: I1208 09:26:09.874759 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18e62c81-f058-4044-9bb3-e64e5892a4e6-scripts\") pod \"18e62c81-f058-4044-9bb3-e64e5892a4e6\" (UID: \"18e62c81-f058-4044-9bb3-e64e5892a4e6\") " Dec 08 09:26:09 crc kubenswrapper[4776]: I1208 09:26:09.874840 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnsl9\" (UniqueName: \"kubernetes.io/projected/18e62c81-f058-4044-9bb3-e64e5892a4e6-kube-api-access-gnsl9\") pod \"18e62c81-f058-4044-9bb3-e64e5892a4e6\" (UID: \"18e62c81-f058-4044-9bb3-e64e5892a4e6\") " Dec 08 09:26:09 crc kubenswrapper[4776]: I1208 09:26:09.874928 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18e62c81-f058-4044-9bb3-e64e5892a4e6-config-data\") pod \"18e62c81-f058-4044-9bb3-e64e5892a4e6\" (UID: \"18e62c81-f058-4044-9bb3-e64e5892a4e6\") " Dec 08 09:26:09 crc kubenswrapper[4776]: I1208 09:26:09.881400 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18e62c81-f058-4044-9bb3-e64e5892a4e6-scripts" (OuterVolumeSpecName: "scripts") pod "18e62c81-f058-4044-9bb3-e64e5892a4e6" (UID: "18e62c81-f058-4044-9bb3-e64e5892a4e6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:26:09 crc kubenswrapper[4776]: I1208 09:26:09.882496 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18e62c81-f058-4044-9bb3-e64e5892a4e6-kube-api-access-gnsl9" (OuterVolumeSpecName: "kube-api-access-gnsl9") pod "18e62c81-f058-4044-9bb3-e64e5892a4e6" (UID: "18e62c81-f058-4044-9bb3-e64e5892a4e6"). InnerVolumeSpecName "kube-api-access-gnsl9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:26:09 crc kubenswrapper[4776]: I1208 09:26:09.979374 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18e62c81-f058-4044-9bb3-e64e5892a4e6-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 09:26:09 crc kubenswrapper[4776]: I1208 09:26:09.979405 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnsl9\" (UniqueName: \"kubernetes.io/projected/18e62c81-f058-4044-9bb3-e64e5892a4e6-kube-api-access-gnsl9\") on node \"crc\" DevicePath \"\"" Dec 08 09:26:10 crc kubenswrapper[4776]: I1208 09:26:10.010395 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18e62c81-f058-4044-9bb3-e64e5892a4e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "18e62c81-f058-4044-9bb3-e64e5892a4e6" (UID: "18e62c81-f058-4044-9bb3-e64e5892a4e6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:26:10 crc kubenswrapper[4776]: I1208 09:26:10.071240 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18e62c81-f058-4044-9bb3-e64e5892a4e6-config-data" (OuterVolumeSpecName: "config-data") pod "18e62c81-f058-4044-9bb3-e64e5892a4e6" (UID: "18e62c81-f058-4044-9bb3-e64e5892a4e6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:26:10 crc kubenswrapper[4776]: I1208 09:26:10.081275 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18e62c81-f058-4044-9bb3-e64e5892a4e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:26:10 crc kubenswrapper[4776]: I1208 09:26:10.081390 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18e62c81-f058-4044-9bb3-e64e5892a4e6-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 09:26:10 crc kubenswrapper[4776]: I1208 09:26:10.510622 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 08 09:26:10 crc kubenswrapper[4776]: I1208 09:26:10.511008 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"18e62c81-f058-4044-9bb3-e64e5892a4e6","Type":"ContainerDied","Data":"43f9ca7ae59440bd2eb87b148dac60c8753172b6abb7a34c331734c36b675dea"} Dec 08 09:26:10 crc kubenswrapper[4776]: I1208 09:26:10.511046 4776 scope.go:117] "RemoveContainer" containerID="64831e1fe3bff43093ab2b6e8712d8f7bd0ea7251240539c05884628e2e6a005" Dec 08 09:26:10 crc kubenswrapper[4776]: I1208 09:26:10.552835 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Dec 08 09:26:10 crc kubenswrapper[4776]: I1208 09:26:10.564406 4776 scope.go:117] "RemoveContainer" containerID="f514feeb66285d8fc9e0b189ba31e987d1f136c05518d2f1f6f19dd5b51bd824" Dec 08 09:26:10 crc kubenswrapper[4776]: I1208 09:26:10.568931 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Dec 08 09:26:10 crc kubenswrapper[4776]: I1208 09:26:10.586890 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Dec 08 09:26:10 crc kubenswrapper[4776]: E1208 09:26:10.587484 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18e62c81-f058-4044-9bb3-e64e5892a4e6" 
containerName="aodh-api" Dec 08 09:26:10 crc kubenswrapper[4776]: I1208 09:26:10.587506 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="18e62c81-f058-4044-9bb3-e64e5892a4e6" containerName="aodh-api" Dec 08 09:26:10 crc kubenswrapper[4776]: E1208 09:26:10.587564 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18e62c81-f058-4044-9bb3-e64e5892a4e6" containerName="aodh-evaluator" Dec 08 09:26:10 crc kubenswrapper[4776]: I1208 09:26:10.587572 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="18e62c81-f058-4044-9bb3-e64e5892a4e6" containerName="aodh-evaluator" Dec 08 09:26:10 crc kubenswrapper[4776]: E1208 09:26:10.587712 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18e62c81-f058-4044-9bb3-e64e5892a4e6" containerName="aodh-listener" Dec 08 09:26:10 crc kubenswrapper[4776]: I1208 09:26:10.587728 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="18e62c81-f058-4044-9bb3-e64e5892a4e6" containerName="aodh-listener" Dec 08 09:26:10 crc kubenswrapper[4776]: E1208 09:26:10.587740 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18e62c81-f058-4044-9bb3-e64e5892a4e6" containerName="aodh-notifier" Dec 08 09:26:10 crc kubenswrapper[4776]: I1208 09:26:10.587746 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="18e62c81-f058-4044-9bb3-e64e5892a4e6" containerName="aodh-notifier" Dec 08 09:26:10 crc kubenswrapper[4776]: I1208 09:26:10.588056 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="18e62c81-f058-4044-9bb3-e64e5892a4e6" containerName="aodh-listener" Dec 08 09:26:10 crc kubenswrapper[4776]: I1208 09:26:10.588088 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="18e62c81-f058-4044-9bb3-e64e5892a4e6" containerName="aodh-notifier" Dec 08 09:26:10 crc kubenswrapper[4776]: I1208 09:26:10.588109 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="18e62c81-f058-4044-9bb3-e64e5892a4e6" containerName="aodh-evaluator" Dec 
08 09:26:10 crc kubenswrapper[4776]: I1208 09:26:10.588128 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="18e62c81-f058-4044-9bb3-e64e5892a4e6" containerName="aodh-api" Dec 08 09:26:10 crc kubenswrapper[4776]: I1208 09:26:10.600871 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 08 09:26:10 crc kubenswrapper[4776]: I1208 09:26:10.607442 4776 scope.go:117] "RemoveContainer" containerID="7f5bdee8fb273e9371d024ad04a26b136a3700dce97ab99c9a229cb649912b5b" Dec 08 09:26:10 crc kubenswrapper[4776]: I1208 09:26:10.610283 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 08 09:26:10 crc kubenswrapper[4776]: I1208 09:26:10.610446 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Dec 08 09:26:10 crc kubenswrapper[4776]: I1208 09:26:10.614743 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 08 09:26:10 crc kubenswrapper[4776]: I1208 09:26:10.615281 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Dec 08 09:26:10 crc kubenswrapper[4776]: I1208 09:26:10.623424 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-rtn2h" Dec 08 09:26:10 crc kubenswrapper[4776]: I1208 09:26:10.647680 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 08 09:26:10 crc kubenswrapper[4776]: I1208 09:26:10.647940 4776 scope.go:117] "RemoveContainer" containerID="64b3b939b07a3a8fe0eb2b6fbefd5433d8dc0237a675fa60a0b7dcd76faeaa5a" Dec 08 09:26:10 crc kubenswrapper[4776]: I1208 09:26:10.719937 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e533f562-6dd5-4117-8d18-f2d222228480-config-data\") pod \"aodh-0\" (UID: 
\"e533f562-6dd5-4117-8d18-f2d222228480\") " pod="openstack/aodh-0" Dec 08 09:26:10 crc kubenswrapper[4776]: I1208 09:26:10.719989 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e533f562-6dd5-4117-8d18-f2d222228480-internal-tls-certs\") pod \"aodh-0\" (UID: \"e533f562-6dd5-4117-8d18-f2d222228480\") " pod="openstack/aodh-0" Dec 08 09:26:10 crc kubenswrapper[4776]: I1208 09:26:10.720320 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6zgx\" (UniqueName: \"kubernetes.io/projected/e533f562-6dd5-4117-8d18-f2d222228480-kube-api-access-q6zgx\") pod \"aodh-0\" (UID: \"e533f562-6dd5-4117-8d18-f2d222228480\") " pod="openstack/aodh-0" Dec 08 09:26:10 crc kubenswrapper[4776]: I1208 09:26:10.727590 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e533f562-6dd5-4117-8d18-f2d222228480-public-tls-certs\") pod \"aodh-0\" (UID: \"e533f562-6dd5-4117-8d18-f2d222228480\") " pod="openstack/aodh-0" Dec 08 09:26:10 crc kubenswrapper[4776]: I1208 09:26:10.727679 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e533f562-6dd5-4117-8d18-f2d222228480-combined-ca-bundle\") pod \"aodh-0\" (UID: \"e533f562-6dd5-4117-8d18-f2d222228480\") " pod="openstack/aodh-0" Dec 08 09:26:10 crc kubenswrapper[4776]: I1208 09:26:10.727848 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e533f562-6dd5-4117-8d18-f2d222228480-scripts\") pod \"aodh-0\" (UID: \"e533f562-6dd5-4117-8d18-f2d222228480\") " pod="openstack/aodh-0" Dec 08 09:26:10 crc kubenswrapper[4776]: I1208 09:26:10.829335 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e533f562-6dd5-4117-8d18-f2d222228480-scripts\") pod \"aodh-0\" (UID: \"e533f562-6dd5-4117-8d18-f2d222228480\") " pod="openstack/aodh-0" Dec 08 09:26:10 crc kubenswrapper[4776]: I1208 09:26:10.829413 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e533f562-6dd5-4117-8d18-f2d222228480-config-data\") pod \"aodh-0\" (UID: \"e533f562-6dd5-4117-8d18-f2d222228480\") " pod="openstack/aodh-0" Dec 08 09:26:10 crc kubenswrapper[4776]: I1208 09:26:10.829433 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e533f562-6dd5-4117-8d18-f2d222228480-internal-tls-certs\") pod \"aodh-0\" (UID: \"e533f562-6dd5-4117-8d18-f2d222228480\") " pod="openstack/aodh-0" Dec 08 09:26:10 crc kubenswrapper[4776]: I1208 09:26:10.829584 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6zgx\" (UniqueName: \"kubernetes.io/projected/e533f562-6dd5-4117-8d18-f2d222228480-kube-api-access-q6zgx\") pod \"aodh-0\" (UID: \"e533f562-6dd5-4117-8d18-f2d222228480\") " pod="openstack/aodh-0" Dec 08 09:26:10 crc kubenswrapper[4776]: I1208 09:26:10.829635 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e533f562-6dd5-4117-8d18-f2d222228480-public-tls-certs\") pod \"aodh-0\" (UID: \"e533f562-6dd5-4117-8d18-f2d222228480\") " pod="openstack/aodh-0" Dec 08 09:26:10 crc kubenswrapper[4776]: I1208 09:26:10.829667 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e533f562-6dd5-4117-8d18-f2d222228480-combined-ca-bundle\") pod \"aodh-0\" (UID: \"e533f562-6dd5-4117-8d18-f2d222228480\") " pod="openstack/aodh-0" Dec 08 09:26:10 crc 
kubenswrapper[4776]: I1208 09:26:10.834383 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e533f562-6dd5-4117-8d18-f2d222228480-scripts\") pod \"aodh-0\" (UID: \"e533f562-6dd5-4117-8d18-f2d222228480\") " pod="openstack/aodh-0" Dec 08 09:26:10 crc kubenswrapper[4776]: I1208 09:26:10.834398 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e533f562-6dd5-4117-8d18-f2d222228480-combined-ca-bundle\") pod \"aodh-0\" (UID: \"e533f562-6dd5-4117-8d18-f2d222228480\") " pod="openstack/aodh-0" Dec 08 09:26:10 crc kubenswrapper[4776]: I1208 09:26:10.834557 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e533f562-6dd5-4117-8d18-f2d222228480-internal-tls-certs\") pod \"aodh-0\" (UID: \"e533f562-6dd5-4117-8d18-f2d222228480\") " pod="openstack/aodh-0" Dec 08 09:26:10 crc kubenswrapper[4776]: I1208 09:26:10.837412 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e533f562-6dd5-4117-8d18-f2d222228480-public-tls-certs\") pod \"aodh-0\" (UID: \"e533f562-6dd5-4117-8d18-f2d222228480\") " pod="openstack/aodh-0" Dec 08 09:26:10 crc kubenswrapper[4776]: I1208 09:26:10.838955 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e533f562-6dd5-4117-8d18-f2d222228480-config-data\") pod \"aodh-0\" (UID: \"e533f562-6dd5-4117-8d18-f2d222228480\") " pod="openstack/aodh-0" Dec 08 09:26:10 crc kubenswrapper[4776]: I1208 09:26:10.848325 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6zgx\" (UniqueName: \"kubernetes.io/projected/e533f562-6dd5-4117-8d18-f2d222228480-kube-api-access-q6zgx\") pod \"aodh-0\" (UID: \"e533f562-6dd5-4117-8d18-f2d222228480\") " 
pod="openstack/aodh-0" Dec 08 09:26:10 crc kubenswrapper[4776]: I1208 09:26:10.865317 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 08 09:26:10 crc kubenswrapper[4776]: I1208 09:26:10.931438 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 08 09:26:11 crc kubenswrapper[4776]: I1208 09:26:11.513887 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 08 09:26:12 crc kubenswrapper[4776]: I1208 09:26:12.359198 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18e62c81-f058-4044-9bb3-e64e5892a4e6" path="/var/lib/kubelet/pods/18e62c81-f058-4044-9bb3-e64e5892a4e6/volumes" Dec 08 09:26:12 crc kubenswrapper[4776]: I1208 09:26:12.537935 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e533f562-6dd5-4117-8d18-f2d222228480","Type":"ContainerStarted","Data":"604f0b0b6825f943aa930e04542ae21e37a278b9e5e1123fed6619d9245ec23f"} Dec 08 09:26:12 crc kubenswrapper[4776]: I1208 09:26:12.538225 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e533f562-6dd5-4117-8d18-f2d222228480","Type":"ContainerStarted","Data":"d8e7588a5e6e682fb545c0ec007078c0a95599587bc3b61da7ad6abc04d2c2ae"} Dec 08 09:26:12 crc kubenswrapper[4776]: I1208 09:26:12.870379 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 08 09:26:12 crc kubenswrapper[4776]: I1208 09:26:12.870709 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 08 09:26:13 crc kubenswrapper[4776]: I1208 09:26:13.561628 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e533f562-6dd5-4117-8d18-f2d222228480","Type":"ContainerStarted","Data":"81d4884f440dc5ffde16685ff694e3ef6dfd968617a01d1bc7a0406b840060bf"} Dec 08 09:26:13 crc kubenswrapper[4776]: I1208 
09:26:13.561944 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e533f562-6dd5-4117-8d18-f2d222228480","Type":"ContainerStarted","Data":"a75bd7e41751d9eb41786a0d1e3838a9db01099b40b5389c62ea420502eb97f9"} Dec 08 09:26:14 crc kubenswrapper[4776]: I1208 09:26:14.359356 4776 scope.go:117] "RemoveContainer" containerID="bd00cf9685c68dc97ea68681aad0bfeeb2d56965a763cd3cffb3c7d3789a7341" Dec 08 09:26:14 crc kubenswrapper[4776]: E1208 09:26:14.361045 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:26:14 crc kubenswrapper[4776]: I1208 09:26:14.575969 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 08 09:26:14 crc kubenswrapper[4776]: I1208 09:26:14.576322 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 08 09:26:14 crc kubenswrapper[4776]: I1208 09:26:14.577914 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e533f562-6dd5-4117-8d18-f2d222228480","Type":"ContainerStarted","Data":"264a3383a36980d388776beba83c3af3e71c83b9da01be1a395720f49b840dbd"} Dec 08 09:26:14 crc kubenswrapper[4776]: I1208 09:26:14.617192 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.163109283 podStartE2EDuration="4.617159119s" podCreationTimestamp="2025-12-08 09:26:10 +0000 UTC" firstStartedPulling="2025-12-08 09:26:11.517720787 +0000 UTC m=+1647.780945809" lastFinishedPulling="2025-12-08 09:26:13.971770583 +0000 UTC m=+1650.234995645" 
observedRunningTime="2025-12-08 09:26:14.602373702 +0000 UTC m=+1650.865598744" watchObservedRunningTime="2025-12-08 09:26:14.617159119 +0000 UTC m=+1650.880384141" Dec 08 09:26:15 crc kubenswrapper[4776]: I1208 09:26:15.592455 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="56c71de4-c00f-47d6-87d7-c5eb97b88eef" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.255:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 08 09:26:15 crc kubenswrapper[4776]: I1208 09:26:15.593147 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="56c71de4-c00f-47d6-87d7-c5eb97b88eef" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.255:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 08 09:26:15 crc kubenswrapper[4776]: I1208 09:26:15.865306 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 08 09:26:15 crc kubenswrapper[4776]: I1208 09:26:15.904192 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 08 09:26:16 crc kubenswrapper[4776]: I1208 09:26:16.687007 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 08 09:26:17 crc kubenswrapper[4776]: I1208 09:26:17.871106 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 08 09:26:17 crc kubenswrapper[4776]: I1208 09:26:17.871164 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 08 09:26:18 crc kubenswrapper[4776]: I1208 09:26:18.887331 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e6c2fb50-f70b-43cc-a493-b4ffa4292c64" containerName="nova-metadata-metadata" 
probeResult="failure" output="Get \"https://10.217.1.1:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 08 09:26:18 crc kubenswrapper[4776]: I1208 09:26:18.887317 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e6c2fb50-f70b-43cc-a493-b4ffa4292c64" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.1:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 08 09:26:24 crc kubenswrapper[4776]: I1208 09:26:24.589777 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 08 09:26:24 crc kubenswrapper[4776]: I1208 09:26:24.590889 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 08 09:26:24 crc kubenswrapper[4776]: I1208 09:26:24.592441 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 08 09:26:24 crc kubenswrapper[4776]: I1208 09:26:24.592489 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 08 09:26:24 crc kubenswrapper[4776]: I1208 09:26:24.603484 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 08 09:26:24 crc kubenswrapper[4776]: I1208 09:26:24.606204 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 08 09:26:27 crc kubenswrapper[4776]: I1208 09:26:27.877851 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 08 09:26:27 crc kubenswrapper[4776]: I1208 09:26:27.881758 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 08 09:26:27 crc kubenswrapper[4776]: I1208 09:26:27.886948 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 08 
09:26:28 crc kubenswrapper[4776]: I1208 09:26:28.777275 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 08 09:26:29 crc kubenswrapper[4776]: I1208 09:26:29.344767 4776 scope.go:117] "RemoveContainer" containerID="bd00cf9685c68dc97ea68681aad0bfeeb2d56965a763cd3cffb3c7d3789a7341" Dec 08 09:26:29 crc kubenswrapper[4776]: E1208 09:26:29.346563 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:26:32 crc kubenswrapper[4776]: I1208 09:26:32.673441 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 08 09:26:37 crc kubenswrapper[4776]: I1208 09:26:37.822859 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 08 09:26:37 crc kubenswrapper[4776]: I1208 09:26:37.823656 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="c26a16d6-aae4-4ce2-b1cf-2a26ab0bfced" containerName="kube-state-metrics" containerID="cri-o://84c8fc46c043ccf7fee065dbe52ce4ffd0165e484045d856e3eb4532ff2ef2b7" gracePeriod=30 Dec 08 09:26:37 crc kubenswrapper[4776]: I1208 09:26:37.958196 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"] Dec 08 09:26:37 crc kubenswrapper[4776]: I1208 09:26:37.962553 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mysqld-exporter-0" podUID="54ed126e-a923-408f-9ab3-f939a1e74374" containerName="mysqld-exporter" 
containerID="cri-o://2389731669fe473d7a6e651ee6717ac54faf4f7cacce306be97cebc8a42c2d9e" gracePeriod=30 Dec 08 09:26:38 crc kubenswrapper[4776]: I1208 09:26:38.381840 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 08 09:26:38 crc kubenswrapper[4776]: I1208 09:26:38.394232 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnwx7\" (UniqueName: \"kubernetes.io/projected/c26a16d6-aae4-4ce2-b1cf-2a26ab0bfced-kube-api-access-wnwx7\") pod \"c26a16d6-aae4-4ce2-b1cf-2a26ab0bfced\" (UID: \"c26a16d6-aae4-4ce2-b1cf-2a26ab0bfced\") " Dec 08 09:26:38 crc kubenswrapper[4776]: I1208 09:26:38.406379 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c26a16d6-aae4-4ce2-b1cf-2a26ab0bfced-kube-api-access-wnwx7" (OuterVolumeSpecName: "kube-api-access-wnwx7") pod "c26a16d6-aae4-4ce2-b1cf-2a26ab0bfced" (UID: "c26a16d6-aae4-4ce2-b1cf-2a26ab0bfced"). InnerVolumeSpecName "kube-api-access-wnwx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:26:38 crc kubenswrapper[4776]: I1208 09:26:38.485134 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Dec 08 09:26:38 crc kubenswrapper[4776]: I1208 09:26:38.496858 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khmjg\" (UniqueName: \"kubernetes.io/projected/54ed126e-a923-408f-9ab3-f939a1e74374-kube-api-access-khmjg\") pod \"54ed126e-a923-408f-9ab3-f939a1e74374\" (UID: \"54ed126e-a923-408f-9ab3-f939a1e74374\") " Dec 08 09:26:38 crc kubenswrapper[4776]: I1208 09:26:38.496923 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54ed126e-a923-408f-9ab3-f939a1e74374-combined-ca-bundle\") pod \"54ed126e-a923-408f-9ab3-f939a1e74374\" (UID: \"54ed126e-a923-408f-9ab3-f939a1e74374\") " Dec 08 09:26:38 crc kubenswrapper[4776]: I1208 09:26:38.497031 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54ed126e-a923-408f-9ab3-f939a1e74374-config-data\") pod \"54ed126e-a923-408f-9ab3-f939a1e74374\" (UID: \"54ed126e-a923-408f-9ab3-f939a1e74374\") " Dec 08 09:26:38 crc kubenswrapper[4776]: I1208 09:26:38.497687 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnwx7\" (UniqueName: \"kubernetes.io/projected/c26a16d6-aae4-4ce2-b1cf-2a26ab0bfced-kube-api-access-wnwx7\") on node \"crc\" DevicePath \"\"" Dec 08 09:26:38 crc kubenswrapper[4776]: I1208 09:26:38.500217 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54ed126e-a923-408f-9ab3-f939a1e74374-kube-api-access-khmjg" (OuterVolumeSpecName: "kube-api-access-khmjg") pod "54ed126e-a923-408f-9ab3-f939a1e74374" (UID: "54ed126e-a923-408f-9ab3-f939a1e74374"). InnerVolumeSpecName "kube-api-access-khmjg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:26:38 crc kubenswrapper[4776]: I1208 09:26:38.534349 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54ed126e-a923-408f-9ab3-f939a1e74374-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "54ed126e-a923-408f-9ab3-f939a1e74374" (UID: "54ed126e-a923-408f-9ab3-f939a1e74374"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:26:38 crc kubenswrapper[4776]: I1208 09:26:38.581240 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54ed126e-a923-408f-9ab3-f939a1e74374-config-data" (OuterVolumeSpecName: "config-data") pod "54ed126e-a923-408f-9ab3-f939a1e74374" (UID: "54ed126e-a923-408f-9ab3-f939a1e74374"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:26:38 crc kubenswrapper[4776]: I1208 09:26:38.599834 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54ed126e-a923-408f-9ab3-f939a1e74374-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 09:26:38 crc kubenswrapper[4776]: I1208 09:26:38.600225 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khmjg\" (UniqueName: \"kubernetes.io/projected/54ed126e-a923-408f-9ab3-f939a1e74374-kube-api-access-khmjg\") on node \"crc\" DevicePath \"\"" Dec 08 09:26:38 crc kubenswrapper[4776]: I1208 09:26:38.600238 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54ed126e-a923-408f-9ab3-f939a1e74374-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:26:38 crc kubenswrapper[4776]: I1208 09:26:38.892047 4776 generic.go:334] "Generic (PLEG): container finished" podID="c26a16d6-aae4-4ce2-b1cf-2a26ab0bfced" containerID="84c8fc46c043ccf7fee065dbe52ce4ffd0165e484045d856e3eb4532ff2ef2b7" 
exitCode=2 Dec 08 09:26:38 crc kubenswrapper[4776]: I1208 09:26:38.892137 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c26a16d6-aae4-4ce2-b1cf-2a26ab0bfced","Type":"ContainerDied","Data":"84c8fc46c043ccf7fee065dbe52ce4ffd0165e484045d856e3eb4532ff2ef2b7"} Dec 08 09:26:38 crc kubenswrapper[4776]: I1208 09:26:38.892154 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 08 09:26:38 crc kubenswrapper[4776]: I1208 09:26:38.892195 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c26a16d6-aae4-4ce2-b1cf-2a26ab0bfced","Type":"ContainerDied","Data":"beca45767eb528dc45d690740fbcb90c6a2db5a9bff33c7d2ff81af1017e1e80"} Dec 08 09:26:38 crc kubenswrapper[4776]: I1208 09:26:38.892210 4776 scope.go:117] "RemoveContainer" containerID="84c8fc46c043ccf7fee065dbe52ce4ffd0165e484045d856e3eb4532ff2ef2b7" Dec 08 09:26:38 crc kubenswrapper[4776]: I1208 09:26:38.900609 4776 generic.go:334] "Generic (PLEG): container finished" podID="54ed126e-a923-408f-9ab3-f939a1e74374" containerID="2389731669fe473d7a6e651ee6717ac54faf4f7cacce306be97cebc8a42c2d9e" exitCode=2 Dec 08 09:26:38 crc kubenswrapper[4776]: I1208 09:26:38.900643 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"54ed126e-a923-408f-9ab3-f939a1e74374","Type":"ContainerDied","Data":"2389731669fe473d7a6e651ee6717ac54faf4f7cacce306be97cebc8a42c2d9e"} Dec 08 09:26:38 crc kubenswrapper[4776]: I1208 09:26:38.900668 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"54ed126e-a923-408f-9ab3-f939a1e74374","Type":"ContainerDied","Data":"343371fcd574618a02056362a1ab892606fc3b8b955ee394fb4318cc9089eb9f"} Dec 08 09:26:38 crc kubenswrapper[4776]: I1208 09:26:38.900726 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Dec 08 09:26:38 crc kubenswrapper[4776]: I1208 09:26:38.929752 4776 scope.go:117] "RemoveContainer" containerID="84c8fc46c043ccf7fee065dbe52ce4ffd0165e484045d856e3eb4532ff2ef2b7" Dec 08 09:26:38 crc kubenswrapper[4776]: E1208 09:26:38.930517 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84c8fc46c043ccf7fee065dbe52ce4ffd0165e484045d856e3eb4532ff2ef2b7\": container with ID starting with 84c8fc46c043ccf7fee065dbe52ce4ffd0165e484045d856e3eb4532ff2ef2b7 not found: ID does not exist" containerID="84c8fc46c043ccf7fee065dbe52ce4ffd0165e484045d856e3eb4532ff2ef2b7" Dec 08 09:26:38 crc kubenswrapper[4776]: I1208 09:26:38.930560 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84c8fc46c043ccf7fee065dbe52ce4ffd0165e484045d856e3eb4532ff2ef2b7"} err="failed to get container status \"84c8fc46c043ccf7fee065dbe52ce4ffd0165e484045d856e3eb4532ff2ef2b7\": rpc error: code = NotFound desc = could not find container \"84c8fc46c043ccf7fee065dbe52ce4ffd0165e484045d856e3eb4532ff2ef2b7\": container with ID starting with 84c8fc46c043ccf7fee065dbe52ce4ffd0165e484045d856e3eb4532ff2ef2b7 not found: ID does not exist" Dec 08 09:26:38 crc kubenswrapper[4776]: I1208 09:26:38.930582 4776 scope.go:117] "RemoveContainer" containerID="2389731669fe473d7a6e651ee6717ac54faf4f7cacce306be97cebc8a42c2d9e" Dec 08 09:26:38 crc kubenswrapper[4776]: I1208 09:26:38.932935 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 08 09:26:38 crc kubenswrapper[4776]: I1208 09:26:38.944131 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 08 09:26:38 crc kubenswrapper[4776]: I1208 09:26:38.967789 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 08 09:26:38 crc kubenswrapper[4776]: E1208 
09:26:38.968538 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c26a16d6-aae4-4ce2-b1cf-2a26ab0bfced" containerName="kube-state-metrics" Dec 08 09:26:38 crc kubenswrapper[4776]: I1208 09:26:38.968555 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="c26a16d6-aae4-4ce2-b1cf-2a26ab0bfced" containerName="kube-state-metrics" Dec 08 09:26:38 crc kubenswrapper[4776]: E1208 09:26:38.968591 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54ed126e-a923-408f-9ab3-f939a1e74374" containerName="mysqld-exporter" Dec 08 09:26:38 crc kubenswrapper[4776]: I1208 09:26:38.968598 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="54ed126e-a923-408f-9ab3-f939a1e74374" containerName="mysqld-exporter" Dec 08 09:26:38 crc kubenswrapper[4776]: I1208 09:26:38.968813 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="c26a16d6-aae4-4ce2-b1cf-2a26ab0bfced" containerName="kube-state-metrics" Dec 08 09:26:38 crc kubenswrapper[4776]: I1208 09:26:38.968838 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="54ed126e-a923-408f-9ab3-f939a1e74374" containerName="mysqld-exporter" Dec 08 09:26:38 crc kubenswrapper[4776]: I1208 09:26:38.969687 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 08 09:26:38 crc kubenswrapper[4776]: I1208 09:26:38.973481 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 08 09:26:38 crc kubenswrapper[4776]: I1208 09:26:38.973710 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 08 09:26:38 crc kubenswrapper[4776]: I1208 09:26:38.994393 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 08 09:26:39 crc kubenswrapper[4776]: I1208 09:26:39.008128 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/90e6d887-db3e-40c6-9411-0e2565e5994d-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"90e6d887-db3e-40c6-9411-0e2565e5994d\") " pod="openstack/kube-state-metrics-0" Dec 08 09:26:39 crc kubenswrapper[4776]: I1208 09:26:39.008233 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlxcc\" (UniqueName: \"kubernetes.io/projected/90e6d887-db3e-40c6-9411-0e2565e5994d-kube-api-access-wlxcc\") pod \"kube-state-metrics-0\" (UID: \"90e6d887-db3e-40c6-9411-0e2565e5994d\") " pod="openstack/kube-state-metrics-0" Dec 08 09:26:39 crc kubenswrapper[4776]: I1208 09:26:39.008320 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/90e6d887-db3e-40c6-9411-0e2565e5994d-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"90e6d887-db3e-40c6-9411-0e2565e5994d\") " pod="openstack/kube-state-metrics-0" Dec 08 09:26:39 crc kubenswrapper[4776]: I1208 09:26:39.008358 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/90e6d887-db3e-40c6-9411-0e2565e5994d-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"90e6d887-db3e-40c6-9411-0e2565e5994d\") " pod="openstack/kube-state-metrics-0" Dec 08 09:26:39 crc kubenswrapper[4776]: I1208 09:26:39.027747 4776 scope.go:117] "RemoveContainer" containerID="2389731669fe473d7a6e651ee6717ac54faf4f7cacce306be97cebc8a42c2d9e" Dec 08 09:26:39 crc kubenswrapper[4776]: E1208 09:26:39.030824 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2389731669fe473d7a6e651ee6717ac54faf4f7cacce306be97cebc8a42c2d9e\": container with ID starting with 2389731669fe473d7a6e651ee6717ac54faf4f7cacce306be97cebc8a42c2d9e not found: ID does not exist" containerID="2389731669fe473d7a6e651ee6717ac54faf4f7cacce306be97cebc8a42c2d9e" Dec 08 09:26:39 crc kubenswrapper[4776]: I1208 09:26:39.030906 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2389731669fe473d7a6e651ee6717ac54faf4f7cacce306be97cebc8a42c2d9e"} err="failed to get container status \"2389731669fe473d7a6e651ee6717ac54faf4f7cacce306be97cebc8a42c2d9e\": rpc error: code = NotFound desc = could not find container \"2389731669fe473d7a6e651ee6717ac54faf4f7cacce306be97cebc8a42c2d9e\": container with ID starting with 2389731669fe473d7a6e651ee6717ac54faf4f7cacce306be97cebc8a42c2d9e not found: ID does not exist" Dec 08 09:26:39 crc kubenswrapper[4776]: I1208 09:26:39.042540 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"] Dec 08 09:26:39 crc kubenswrapper[4776]: I1208 09:26:39.049901 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-0"] Dec 08 09:26:39 crc kubenswrapper[4776]: I1208 09:26:39.059525 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"] Dec 08 09:26:39 crc kubenswrapper[4776]: I1208 09:26:39.061837 4776 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Dec 08 09:26:39 crc kubenswrapper[4776]: I1208 09:26:39.068940 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data" Dec 08 09:26:39 crc kubenswrapper[4776]: I1208 09:26:39.069125 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-mysqld-exporter-svc" Dec 08 09:26:39 crc kubenswrapper[4776]: I1208 09:26:39.075430 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Dec 08 09:26:39 crc kubenswrapper[4776]: I1208 09:26:39.110207 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cce6b19-9d40-4957-8154-b4d3a50fe2f7-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"4cce6b19-9d40-4957-8154-b4d3a50fe2f7\") " pod="openstack/mysqld-exporter-0" Dec 08 09:26:39 crc kubenswrapper[4776]: I1208 09:26:39.110279 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/90e6d887-db3e-40c6-9411-0e2565e5994d-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"90e6d887-db3e-40c6-9411-0e2565e5994d\") " pod="openstack/kube-state-metrics-0" Dec 08 09:26:39 crc kubenswrapper[4776]: I1208 09:26:39.110327 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90e6d887-db3e-40c6-9411-0e2565e5994d-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"90e6d887-db3e-40c6-9411-0e2565e5994d\") " pod="openstack/kube-state-metrics-0" Dec 08 09:26:39 crc kubenswrapper[4776]: I1208 09:26:39.110420 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n62tp\" (UniqueName: 
\"kubernetes.io/projected/4cce6b19-9d40-4957-8154-b4d3a50fe2f7-kube-api-access-n62tp\") pod \"mysqld-exporter-0\" (UID: \"4cce6b19-9d40-4957-8154-b4d3a50fe2f7\") " pod="openstack/mysqld-exporter-0" Dec 08 09:26:39 crc kubenswrapper[4776]: I1208 09:26:39.110714 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/90e6d887-db3e-40c6-9411-0e2565e5994d-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"90e6d887-db3e-40c6-9411-0e2565e5994d\") " pod="openstack/kube-state-metrics-0" Dec 08 09:26:39 crc kubenswrapper[4776]: I1208 09:26:39.110823 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cce6b19-9d40-4957-8154-b4d3a50fe2f7-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"4cce6b19-9d40-4957-8154-b4d3a50fe2f7\") " pod="openstack/mysqld-exporter-0" Dec 08 09:26:39 crc kubenswrapper[4776]: I1208 09:26:39.110892 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlxcc\" (UniqueName: \"kubernetes.io/projected/90e6d887-db3e-40c6-9411-0e2565e5994d-kube-api-access-wlxcc\") pod \"kube-state-metrics-0\" (UID: \"90e6d887-db3e-40c6-9411-0e2565e5994d\") " pod="openstack/kube-state-metrics-0" Dec 08 09:26:39 crc kubenswrapper[4776]: I1208 09:26:39.110969 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cce6b19-9d40-4957-8154-b4d3a50fe2f7-config-data\") pod \"mysqld-exporter-0\" (UID: \"4cce6b19-9d40-4957-8154-b4d3a50fe2f7\") " pod="openstack/mysqld-exporter-0" Dec 08 09:26:39 crc kubenswrapper[4776]: I1208 09:26:39.115607 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/90e6d887-db3e-40c6-9411-0e2565e5994d-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"90e6d887-db3e-40c6-9411-0e2565e5994d\") " pod="openstack/kube-state-metrics-0" Dec 08 09:26:39 crc kubenswrapper[4776]: I1208 09:26:39.125754 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/90e6d887-db3e-40c6-9411-0e2565e5994d-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"90e6d887-db3e-40c6-9411-0e2565e5994d\") " pod="openstack/kube-state-metrics-0" Dec 08 09:26:39 crc kubenswrapper[4776]: I1208 09:26:39.126517 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlxcc\" (UniqueName: \"kubernetes.io/projected/90e6d887-db3e-40c6-9411-0e2565e5994d-kube-api-access-wlxcc\") pod \"kube-state-metrics-0\" (UID: \"90e6d887-db3e-40c6-9411-0e2565e5994d\") " pod="openstack/kube-state-metrics-0" Dec 08 09:26:39 crc kubenswrapper[4776]: I1208 09:26:39.127344 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/90e6d887-db3e-40c6-9411-0e2565e5994d-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"90e6d887-db3e-40c6-9411-0e2565e5994d\") " pod="openstack/kube-state-metrics-0" Dec 08 09:26:39 crc kubenswrapper[4776]: I1208 09:26:39.212727 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cce6b19-9d40-4957-8154-b4d3a50fe2f7-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"4cce6b19-9d40-4957-8154-b4d3a50fe2f7\") " pod="openstack/mysqld-exporter-0" Dec 08 09:26:39 crc kubenswrapper[4776]: I1208 09:26:39.212820 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cce6b19-9d40-4957-8154-b4d3a50fe2f7-config-data\") pod 
\"mysqld-exporter-0\" (UID: \"4cce6b19-9d40-4957-8154-b4d3a50fe2f7\") " pod="openstack/mysqld-exporter-0" Dec 08 09:26:39 crc kubenswrapper[4776]: I1208 09:26:39.212883 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cce6b19-9d40-4957-8154-b4d3a50fe2f7-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"4cce6b19-9d40-4957-8154-b4d3a50fe2f7\") " pod="openstack/mysqld-exporter-0" Dec 08 09:26:39 crc kubenswrapper[4776]: I1208 09:26:39.212961 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n62tp\" (UniqueName: \"kubernetes.io/projected/4cce6b19-9d40-4957-8154-b4d3a50fe2f7-kube-api-access-n62tp\") pod \"mysqld-exporter-0\" (UID: \"4cce6b19-9d40-4957-8154-b4d3a50fe2f7\") " pod="openstack/mysqld-exporter-0" Dec 08 09:26:39 crc kubenswrapper[4776]: I1208 09:26:39.218768 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cce6b19-9d40-4957-8154-b4d3a50fe2f7-config-data\") pod \"mysqld-exporter-0\" (UID: \"4cce6b19-9d40-4957-8154-b4d3a50fe2f7\") " pod="openstack/mysqld-exporter-0" Dec 08 09:26:39 crc kubenswrapper[4776]: I1208 09:26:39.219142 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cce6b19-9d40-4957-8154-b4d3a50fe2f7-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"4cce6b19-9d40-4957-8154-b4d3a50fe2f7\") " pod="openstack/mysqld-exporter-0" Dec 08 09:26:39 crc kubenswrapper[4776]: I1208 09:26:39.237686 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cce6b19-9d40-4957-8154-b4d3a50fe2f7-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"4cce6b19-9d40-4957-8154-b4d3a50fe2f7\") " pod="openstack/mysqld-exporter-0" Dec 08 09:26:39 crc kubenswrapper[4776]: I1208 
09:26:39.242633 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n62tp\" (UniqueName: \"kubernetes.io/projected/4cce6b19-9d40-4957-8154-b4d3a50fe2f7-kube-api-access-n62tp\") pod \"mysqld-exporter-0\" (UID: \"4cce6b19-9d40-4957-8154-b4d3a50fe2f7\") " pod="openstack/mysqld-exporter-0" Dec 08 09:26:39 crc kubenswrapper[4776]: I1208 09:26:39.324564 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 08 09:26:39 crc kubenswrapper[4776]: I1208 09:26:39.388588 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Dec 08 09:26:39 crc kubenswrapper[4776]: I1208 09:26:39.840652 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 08 09:26:39 crc kubenswrapper[4776]: I1208 09:26:39.914441 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"90e6d887-db3e-40c6-9411-0e2565e5994d","Type":"ContainerStarted","Data":"b3eaf597de8e141e94e34647cfbe56a114cc4f76ef6fda2b70de197043e17a3e"} Dec 08 09:26:39 crc kubenswrapper[4776]: I1208 09:26:39.966090 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Dec 08 09:26:39 crc kubenswrapper[4776]: W1208 09:26:39.972202 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4cce6b19_9d40_4957_8154_b4d3a50fe2f7.slice/crio-0717fc675a8d0a0d0460a0010115cd7ed3894656b6ee52278992712b9d12e210 WatchSource:0}: Error finding container 0717fc675a8d0a0d0460a0010115cd7ed3894656b6ee52278992712b9d12e210: Status 404 returned error can't find the container with id 0717fc675a8d0a0d0460a0010115cd7ed3894656b6ee52278992712b9d12e210 Dec 08 09:26:40 crc kubenswrapper[4776]: I1208 09:26:40.207153 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 08 09:26:40 crc 
kubenswrapper[4776]: I1208 09:26:40.208518 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a9beda06-9357-4e94-a243-78484ede0b97" containerName="ceilometer-central-agent" containerID="cri-o://e9bd9c9a55a952fafee823ba565ef3ae31d83086f77f47611292b05ab8a8cb79" gracePeriod=30 Dec 08 09:26:40 crc kubenswrapper[4776]: I1208 09:26:40.208814 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a9beda06-9357-4e94-a243-78484ede0b97" containerName="proxy-httpd" containerID="cri-o://0aa8064585618225c49d48e4d16d160546abacdf2ed8d5dde5383fe82da4e8cf" gracePeriod=30 Dec 08 09:26:40 crc kubenswrapper[4776]: I1208 09:26:40.208949 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a9beda06-9357-4e94-a243-78484ede0b97" containerName="sg-core" containerID="cri-o://51856807f513f51eb9f9a8ebbc14687300eaf8a6ba8e87b28783e2d333a12aa6" gracePeriod=30 Dec 08 09:26:40 crc kubenswrapper[4776]: I1208 09:26:40.208965 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a9beda06-9357-4e94-a243-78484ede0b97" containerName="ceilometer-notification-agent" containerID="cri-o://d26a58055f2eb86cbf5246a2502f9f12d8f99264fdcda9ed097cb1600fc9f4af" gracePeriod=30 Dec 08 09:26:40 crc kubenswrapper[4776]: I1208 09:26:40.378081 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54ed126e-a923-408f-9ab3-f939a1e74374" path="/var/lib/kubelet/pods/54ed126e-a923-408f-9ab3-f939a1e74374/volumes" Dec 08 09:26:40 crc kubenswrapper[4776]: I1208 09:26:40.379819 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c26a16d6-aae4-4ce2-b1cf-2a26ab0bfced" path="/var/lib/kubelet/pods/c26a16d6-aae4-4ce2-b1cf-2a26ab0bfced/volumes" Dec 08 09:26:40 crc kubenswrapper[4776]: I1208 09:26:40.934514 4776 generic.go:334] "Generic (PLEG): container finished" 
podID="a9beda06-9357-4e94-a243-78484ede0b97" containerID="0aa8064585618225c49d48e4d16d160546abacdf2ed8d5dde5383fe82da4e8cf" exitCode=0 Dec 08 09:26:40 crc kubenswrapper[4776]: I1208 09:26:40.934833 4776 generic.go:334] "Generic (PLEG): container finished" podID="a9beda06-9357-4e94-a243-78484ede0b97" containerID="51856807f513f51eb9f9a8ebbc14687300eaf8a6ba8e87b28783e2d333a12aa6" exitCode=2 Dec 08 09:26:40 crc kubenswrapper[4776]: I1208 09:26:40.934846 4776 generic.go:334] "Generic (PLEG): container finished" podID="a9beda06-9357-4e94-a243-78484ede0b97" containerID="e9bd9c9a55a952fafee823ba565ef3ae31d83086f77f47611292b05ab8a8cb79" exitCode=0 Dec 08 09:26:40 crc kubenswrapper[4776]: I1208 09:26:40.934940 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a9beda06-9357-4e94-a243-78484ede0b97","Type":"ContainerDied","Data":"0aa8064585618225c49d48e4d16d160546abacdf2ed8d5dde5383fe82da4e8cf"} Dec 08 09:26:40 crc kubenswrapper[4776]: I1208 09:26:40.935009 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a9beda06-9357-4e94-a243-78484ede0b97","Type":"ContainerDied","Data":"51856807f513f51eb9f9a8ebbc14687300eaf8a6ba8e87b28783e2d333a12aa6"} Dec 08 09:26:40 crc kubenswrapper[4776]: I1208 09:26:40.935024 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a9beda06-9357-4e94-a243-78484ede0b97","Type":"ContainerDied","Data":"e9bd9c9a55a952fafee823ba565ef3ae31d83086f77f47611292b05ab8a8cb79"} Dec 08 09:26:40 crc kubenswrapper[4776]: I1208 09:26:40.944771 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"4cce6b19-9d40-4957-8154-b4d3a50fe2f7","Type":"ContainerStarted","Data":"3360a594a7c8d55bbba39f68f8b48291bdda458f599b99d3f5ed23611bfc60a8"} Dec 08 09:26:40 crc kubenswrapper[4776]: I1208 09:26:40.944826 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" 
event={"ID":"4cce6b19-9d40-4957-8154-b4d3a50fe2f7","Type":"ContainerStarted","Data":"0717fc675a8d0a0d0460a0010115cd7ed3894656b6ee52278992712b9d12e210"} Dec 08 09:26:40 crc kubenswrapper[4776]: I1208 09:26:40.948317 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"90e6d887-db3e-40c6-9411-0e2565e5994d","Type":"ContainerStarted","Data":"a32204a8e8628c305139d449ee700623726664b95a76db10a59e008c567eb599"} Dec 08 09:26:40 crc kubenswrapper[4776]: I1208 09:26:40.948556 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 08 09:26:40 crc kubenswrapper[4776]: I1208 09:26:40.971532 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=1.3996245950000001 podStartE2EDuration="1.971510334s" podCreationTimestamp="2025-12-08 09:26:39 +0000 UTC" firstStartedPulling="2025-12-08 09:26:39.974539729 +0000 UTC m=+1676.237764751" lastFinishedPulling="2025-12-08 09:26:40.546425468 +0000 UTC m=+1676.809650490" observedRunningTime="2025-12-08 09:26:40.960528258 +0000 UTC m=+1677.223753300" watchObservedRunningTime="2025-12-08 09:26:40.971510334 +0000 UTC m=+1677.234735366" Dec 08 09:26:40 crc kubenswrapper[4776]: I1208 09:26:40.989885 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.550743014 podStartE2EDuration="2.989866416s" podCreationTimestamp="2025-12-08 09:26:38 +0000 UTC" firstStartedPulling="2025-12-08 09:26:39.840013813 +0000 UTC m=+1676.103238845" lastFinishedPulling="2025-12-08 09:26:40.279137215 +0000 UTC m=+1676.542362247" observedRunningTime="2025-12-08 09:26:40.977357821 +0000 UTC m=+1677.240582843" watchObservedRunningTime="2025-12-08 09:26:40.989866416 +0000 UTC m=+1677.253091438" Dec 08 09:26:41 crc kubenswrapper[4776]: I1208 09:26:41.630782 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 08 09:26:41 crc kubenswrapper[4776]: I1208 09:26:41.667557 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a9beda06-9357-4e94-a243-78484ede0b97-sg-core-conf-yaml\") pod \"a9beda06-9357-4e94-a243-78484ede0b97\" (UID: \"a9beda06-9357-4e94-a243-78484ede0b97\") " Dec 08 09:26:41 crc kubenswrapper[4776]: I1208 09:26:41.668948 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9beda06-9357-4e94-a243-78484ede0b97-combined-ca-bundle\") pod \"a9beda06-9357-4e94-a243-78484ede0b97\" (UID: \"a9beda06-9357-4e94-a243-78484ede0b97\") " Dec 08 09:26:41 crc kubenswrapper[4776]: I1208 09:26:41.669126 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjhcp\" (UniqueName: \"kubernetes.io/projected/a9beda06-9357-4e94-a243-78484ede0b97-kube-api-access-mjhcp\") pod \"a9beda06-9357-4e94-a243-78484ede0b97\" (UID: \"a9beda06-9357-4e94-a243-78484ede0b97\") " Dec 08 09:26:41 crc kubenswrapper[4776]: I1208 09:26:41.669279 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9beda06-9357-4e94-a243-78484ede0b97-config-data\") pod \"a9beda06-9357-4e94-a243-78484ede0b97\" (UID: \"a9beda06-9357-4e94-a243-78484ede0b97\") " Dec 08 09:26:41 crc kubenswrapper[4776]: I1208 09:26:41.669508 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9beda06-9357-4e94-a243-78484ede0b97-run-httpd\") pod \"a9beda06-9357-4e94-a243-78484ede0b97\" (UID: \"a9beda06-9357-4e94-a243-78484ede0b97\") " Dec 08 09:26:41 crc kubenswrapper[4776]: I1208 09:26:41.669656 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a9beda06-9357-4e94-a243-78484ede0b97-scripts\") pod \"a9beda06-9357-4e94-a243-78484ede0b97\" (UID: \"a9beda06-9357-4e94-a243-78484ede0b97\") " Dec 08 09:26:41 crc kubenswrapper[4776]: I1208 09:26:41.669773 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9beda06-9357-4e94-a243-78484ede0b97-log-httpd\") pod \"a9beda06-9357-4e94-a243-78484ede0b97\" (UID: \"a9beda06-9357-4e94-a243-78484ede0b97\") " Dec 08 09:26:41 crc kubenswrapper[4776]: I1208 09:26:41.671412 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9beda06-9357-4e94-a243-78484ede0b97-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a9beda06-9357-4e94-a243-78484ede0b97" (UID: "a9beda06-9357-4e94-a243-78484ede0b97"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:26:41 crc kubenswrapper[4776]: I1208 09:26:41.672938 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9beda06-9357-4e94-a243-78484ede0b97-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a9beda06-9357-4e94-a243-78484ede0b97" (UID: "a9beda06-9357-4e94-a243-78484ede0b97"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:26:41 crc kubenswrapper[4776]: I1208 09:26:41.694809 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9beda06-9357-4e94-a243-78484ede0b97-kube-api-access-mjhcp" (OuterVolumeSpecName: "kube-api-access-mjhcp") pod "a9beda06-9357-4e94-a243-78484ede0b97" (UID: "a9beda06-9357-4e94-a243-78484ede0b97"). InnerVolumeSpecName "kube-api-access-mjhcp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:26:41 crc kubenswrapper[4776]: I1208 09:26:41.696051 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9beda06-9357-4e94-a243-78484ede0b97-scripts" (OuterVolumeSpecName: "scripts") pod "a9beda06-9357-4e94-a243-78484ede0b97" (UID: "a9beda06-9357-4e94-a243-78484ede0b97"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:26:41 crc kubenswrapper[4776]: I1208 09:26:41.728858 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9beda06-9357-4e94-a243-78484ede0b97-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a9beda06-9357-4e94-a243-78484ede0b97" (UID: "a9beda06-9357-4e94-a243-78484ede0b97"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:26:41 crc kubenswrapper[4776]: I1208 09:26:41.772692 4776 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9beda06-9357-4e94-a243-78484ede0b97-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 08 09:26:41 crc kubenswrapper[4776]: I1208 09:26:41.772720 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9beda06-9357-4e94-a243-78484ede0b97-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 09:26:41 crc kubenswrapper[4776]: I1208 09:26:41.772753 4776 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9beda06-9357-4e94-a243-78484ede0b97-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 08 09:26:41 crc kubenswrapper[4776]: I1208 09:26:41.772762 4776 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a9beda06-9357-4e94-a243-78484ede0b97-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 08 09:26:41 crc kubenswrapper[4776]: 
I1208 09:26:41.772773 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjhcp\" (UniqueName: \"kubernetes.io/projected/a9beda06-9357-4e94-a243-78484ede0b97-kube-api-access-mjhcp\") on node \"crc\" DevicePath \"\"" Dec 08 09:26:41 crc kubenswrapper[4776]: I1208 09:26:41.812440 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9beda06-9357-4e94-a243-78484ede0b97-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a9beda06-9357-4e94-a243-78484ede0b97" (UID: "a9beda06-9357-4e94-a243-78484ede0b97"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:26:41 crc kubenswrapper[4776]: I1208 09:26:41.815346 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9beda06-9357-4e94-a243-78484ede0b97-config-data" (OuterVolumeSpecName: "config-data") pod "a9beda06-9357-4e94-a243-78484ede0b97" (UID: "a9beda06-9357-4e94-a243-78484ede0b97"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:26:41 crc kubenswrapper[4776]: I1208 09:26:41.874809 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9beda06-9357-4e94-a243-78484ede0b97-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:26:41 crc kubenswrapper[4776]: I1208 09:26:41.874841 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9beda06-9357-4e94-a243-78484ede0b97-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 09:26:41 crc kubenswrapper[4776]: I1208 09:26:41.966015 4776 generic.go:334] "Generic (PLEG): container finished" podID="a9beda06-9357-4e94-a243-78484ede0b97" containerID="d26a58055f2eb86cbf5246a2502f9f12d8f99264fdcda9ed097cb1600fc9f4af" exitCode=0 Dec 08 09:26:41 crc kubenswrapper[4776]: I1208 09:26:41.966112 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 08 09:26:41 crc kubenswrapper[4776]: I1208 09:26:41.966109 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a9beda06-9357-4e94-a243-78484ede0b97","Type":"ContainerDied","Data":"d26a58055f2eb86cbf5246a2502f9f12d8f99264fdcda9ed097cb1600fc9f4af"} Dec 08 09:26:41 crc kubenswrapper[4776]: I1208 09:26:41.968150 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a9beda06-9357-4e94-a243-78484ede0b97","Type":"ContainerDied","Data":"809c2d04a826e03946dedbf3ca8787df7827d029642207128a013ffb7812a07f"} Dec 08 09:26:41 crc kubenswrapper[4776]: I1208 09:26:41.968267 4776 scope.go:117] "RemoveContainer" containerID="0aa8064585618225c49d48e4d16d160546abacdf2ed8d5dde5383fe82da4e8cf" Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.031961 4776 scope.go:117] "RemoveContainer" containerID="51856807f513f51eb9f9a8ebbc14687300eaf8a6ba8e87b28783e2d333a12aa6" Dec 08 09:26:42 crc 
kubenswrapper[4776]: I1208 09:26:42.038384 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.056819 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.057527 4776 scope.go:117] "RemoveContainer" containerID="d26a58055f2eb86cbf5246a2502f9f12d8f99264fdcda9ed097cb1600fc9f4af" Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.072754 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 08 09:26:42 crc kubenswrapper[4776]: E1208 09:26:42.073330 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9beda06-9357-4e94-a243-78484ede0b97" containerName="sg-core" Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.073348 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9beda06-9357-4e94-a243-78484ede0b97" containerName="sg-core" Dec 08 09:26:42 crc kubenswrapper[4776]: E1208 09:26:42.073366 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9beda06-9357-4e94-a243-78484ede0b97" containerName="ceilometer-central-agent" Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.073374 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9beda06-9357-4e94-a243-78484ede0b97" containerName="ceilometer-central-agent" Dec 08 09:26:42 crc kubenswrapper[4776]: E1208 09:26:42.073383 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9beda06-9357-4e94-a243-78484ede0b97" containerName="ceilometer-notification-agent" Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.073389 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9beda06-9357-4e94-a243-78484ede0b97" containerName="ceilometer-notification-agent" Dec 08 09:26:42 crc kubenswrapper[4776]: E1208 09:26:42.073410 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9beda06-9357-4e94-a243-78484ede0b97" 
containerName="proxy-httpd" Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.073417 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9beda06-9357-4e94-a243-78484ede0b97" containerName="proxy-httpd" Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.073654 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9beda06-9357-4e94-a243-78484ede0b97" containerName="sg-core" Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.073671 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9beda06-9357-4e94-a243-78484ede0b97" containerName="ceilometer-notification-agent" Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.073687 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9beda06-9357-4e94-a243-78484ede0b97" containerName="proxy-httpd" Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.073708 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9beda06-9357-4e94-a243-78484ede0b97" containerName="ceilometer-central-agent" Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.075863 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.080490 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.080567 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.080694 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.082857 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.088001 4776 scope.go:117] "RemoveContainer" containerID="e9bd9c9a55a952fafee823ba565ef3ae31d83086f77f47611292b05ab8a8cb79" Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.140375 4776 scope.go:117] "RemoveContainer" containerID="0aa8064585618225c49d48e4d16d160546abacdf2ed8d5dde5383fe82da4e8cf" Dec 08 09:26:42 crc kubenswrapper[4776]: E1208 09:26:42.142449 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0aa8064585618225c49d48e4d16d160546abacdf2ed8d5dde5383fe82da4e8cf\": container with ID starting with 0aa8064585618225c49d48e4d16d160546abacdf2ed8d5dde5383fe82da4e8cf not found: ID does not exist" containerID="0aa8064585618225c49d48e4d16d160546abacdf2ed8d5dde5383fe82da4e8cf" Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.142495 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0aa8064585618225c49d48e4d16d160546abacdf2ed8d5dde5383fe82da4e8cf"} err="failed to get container status \"0aa8064585618225c49d48e4d16d160546abacdf2ed8d5dde5383fe82da4e8cf\": rpc error: code = NotFound desc = could not find container \"0aa8064585618225c49d48e4d16d160546abacdf2ed8d5dde5383fe82da4e8cf\": 
container with ID starting with 0aa8064585618225c49d48e4d16d160546abacdf2ed8d5dde5383fe82da4e8cf not found: ID does not exist" Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.142522 4776 scope.go:117] "RemoveContainer" containerID="51856807f513f51eb9f9a8ebbc14687300eaf8a6ba8e87b28783e2d333a12aa6" Dec 08 09:26:42 crc kubenswrapper[4776]: E1208 09:26:42.143131 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51856807f513f51eb9f9a8ebbc14687300eaf8a6ba8e87b28783e2d333a12aa6\": container with ID starting with 51856807f513f51eb9f9a8ebbc14687300eaf8a6ba8e87b28783e2d333a12aa6 not found: ID does not exist" containerID="51856807f513f51eb9f9a8ebbc14687300eaf8a6ba8e87b28783e2d333a12aa6" Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.143154 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51856807f513f51eb9f9a8ebbc14687300eaf8a6ba8e87b28783e2d333a12aa6"} err="failed to get container status \"51856807f513f51eb9f9a8ebbc14687300eaf8a6ba8e87b28783e2d333a12aa6\": rpc error: code = NotFound desc = could not find container \"51856807f513f51eb9f9a8ebbc14687300eaf8a6ba8e87b28783e2d333a12aa6\": container with ID starting with 51856807f513f51eb9f9a8ebbc14687300eaf8a6ba8e87b28783e2d333a12aa6 not found: ID does not exist" Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.143183 4776 scope.go:117] "RemoveContainer" containerID="d26a58055f2eb86cbf5246a2502f9f12d8f99264fdcda9ed097cb1600fc9f4af" Dec 08 09:26:42 crc kubenswrapper[4776]: E1208 09:26:42.143767 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d26a58055f2eb86cbf5246a2502f9f12d8f99264fdcda9ed097cb1600fc9f4af\": container with ID starting with d26a58055f2eb86cbf5246a2502f9f12d8f99264fdcda9ed097cb1600fc9f4af not found: ID does not exist" 
containerID="d26a58055f2eb86cbf5246a2502f9f12d8f99264fdcda9ed097cb1600fc9f4af" Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.143819 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d26a58055f2eb86cbf5246a2502f9f12d8f99264fdcda9ed097cb1600fc9f4af"} err="failed to get container status \"d26a58055f2eb86cbf5246a2502f9f12d8f99264fdcda9ed097cb1600fc9f4af\": rpc error: code = NotFound desc = could not find container \"d26a58055f2eb86cbf5246a2502f9f12d8f99264fdcda9ed097cb1600fc9f4af\": container with ID starting with d26a58055f2eb86cbf5246a2502f9f12d8f99264fdcda9ed097cb1600fc9f4af not found: ID does not exist" Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.143833 4776 scope.go:117] "RemoveContainer" containerID="e9bd9c9a55a952fafee823ba565ef3ae31d83086f77f47611292b05ab8a8cb79" Dec 08 09:26:42 crc kubenswrapper[4776]: E1208 09:26:42.145019 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9bd9c9a55a952fafee823ba565ef3ae31d83086f77f47611292b05ab8a8cb79\": container with ID starting with e9bd9c9a55a952fafee823ba565ef3ae31d83086f77f47611292b05ab8a8cb79 not found: ID does not exist" containerID="e9bd9c9a55a952fafee823ba565ef3ae31d83086f77f47611292b05ab8a8cb79" Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.145045 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9bd9c9a55a952fafee823ba565ef3ae31d83086f77f47611292b05ab8a8cb79"} err="failed to get container status \"e9bd9c9a55a952fafee823ba565ef3ae31d83086f77f47611292b05ab8a8cb79\": rpc error: code = NotFound desc = could not find container \"e9bd9c9a55a952fafee823ba565ef3ae31d83086f77f47611292b05ab8a8cb79\": container with ID starting with e9bd9c9a55a952fafee823ba565ef3ae31d83086f77f47611292b05ab8a8cb79 not found: ID does not exist" Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.181411 4776 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13099ea1-6af0-4664-b33b-318fd3a3dc74-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"13099ea1-6af0-4664-b33b-318fd3a3dc74\") " pod="openstack/ceilometer-0" Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.181460 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13099ea1-6af0-4664-b33b-318fd3a3dc74-log-httpd\") pod \"ceilometer-0\" (UID: \"13099ea1-6af0-4664-b33b-318fd3a3dc74\") " pod="openstack/ceilometer-0" Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.181497 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13099ea1-6af0-4664-b33b-318fd3a3dc74-config-data\") pod \"ceilometer-0\" (UID: \"13099ea1-6af0-4664-b33b-318fd3a3dc74\") " pod="openstack/ceilometer-0" Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.181524 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhlzm\" (UniqueName: \"kubernetes.io/projected/13099ea1-6af0-4664-b33b-318fd3a3dc74-kube-api-access-mhlzm\") pod \"ceilometer-0\" (UID: \"13099ea1-6af0-4664-b33b-318fd3a3dc74\") " pod="openstack/ceilometer-0" Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.181539 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/13099ea1-6af0-4664-b33b-318fd3a3dc74-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"13099ea1-6af0-4664-b33b-318fd3a3dc74\") " pod="openstack/ceilometer-0" Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.181590 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/13099ea1-6af0-4664-b33b-318fd3a3dc74-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"13099ea1-6af0-4664-b33b-318fd3a3dc74\") " pod="openstack/ceilometer-0" Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.181635 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13099ea1-6af0-4664-b33b-318fd3a3dc74-scripts\") pod \"ceilometer-0\" (UID: \"13099ea1-6af0-4664-b33b-318fd3a3dc74\") " pod="openstack/ceilometer-0" Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.181680 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13099ea1-6af0-4664-b33b-318fd3a3dc74-run-httpd\") pod \"ceilometer-0\" (UID: \"13099ea1-6af0-4664-b33b-318fd3a3dc74\") " pod="openstack/ceilometer-0" Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.283509 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13099ea1-6af0-4664-b33b-318fd3a3dc74-run-httpd\") pod \"ceilometer-0\" (UID: \"13099ea1-6af0-4664-b33b-318fd3a3dc74\") " pod="openstack/ceilometer-0" Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.283615 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13099ea1-6af0-4664-b33b-318fd3a3dc74-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"13099ea1-6af0-4664-b33b-318fd3a3dc74\") " pod="openstack/ceilometer-0" Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.283645 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13099ea1-6af0-4664-b33b-318fd3a3dc74-log-httpd\") pod \"ceilometer-0\" (UID: \"13099ea1-6af0-4664-b33b-318fd3a3dc74\") " pod="openstack/ceilometer-0" 
Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.283681 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13099ea1-6af0-4664-b33b-318fd3a3dc74-config-data\") pod \"ceilometer-0\" (UID: \"13099ea1-6af0-4664-b33b-318fd3a3dc74\") " pod="openstack/ceilometer-0" Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.283712 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhlzm\" (UniqueName: \"kubernetes.io/projected/13099ea1-6af0-4664-b33b-318fd3a3dc74-kube-api-access-mhlzm\") pod \"ceilometer-0\" (UID: \"13099ea1-6af0-4664-b33b-318fd3a3dc74\") " pod="openstack/ceilometer-0" Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.283732 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/13099ea1-6af0-4664-b33b-318fd3a3dc74-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"13099ea1-6af0-4664-b33b-318fd3a3dc74\") " pod="openstack/ceilometer-0" Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.283770 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/13099ea1-6af0-4664-b33b-318fd3a3dc74-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"13099ea1-6af0-4664-b33b-318fd3a3dc74\") " pod="openstack/ceilometer-0" Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.283845 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13099ea1-6af0-4664-b33b-318fd3a3dc74-scripts\") pod \"ceilometer-0\" (UID: \"13099ea1-6af0-4664-b33b-318fd3a3dc74\") " pod="openstack/ceilometer-0" Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.284628 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/13099ea1-6af0-4664-b33b-318fd3a3dc74-run-httpd\") pod \"ceilometer-0\" (UID: \"13099ea1-6af0-4664-b33b-318fd3a3dc74\") " pod="openstack/ceilometer-0" Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.284672 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13099ea1-6af0-4664-b33b-318fd3a3dc74-log-httpd\") pod \"ceilometer-0\" (UID: \"13099ea1-6af0-4664-b33b-318fd3a3dc74\") " pod="openstack/ceilometer-0" Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.289581 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13099ea1-6af0-4664-b33b-318fd3a3dc74-config-data\") pod \"ceilometer-0\" (UID: \"13099ea1-6af0-4664-b33b-318fd3a3dc74\") " pod="openstack/ceilometer-0" Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.290971 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/13099ea1-6af0-4664-b33b-318fd3a3dc74-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"13099ea1-6af0-4664-b33b-318fd3a3dc74\") " pod="openstack/ceilometer-0" Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.294579 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13099ea1-6af0-4664-b33b-318fd3a3dc74-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"13099ea1-6af0-4664-b33b-318fd3a3dc74\") " pod="openstack/ceilometer-0" Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.297568 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13099ea1-6af0-4664-b33b-318fd3a3dc74-scripts\") pod \"ceilometer-0\" (UID: \"13099ea1-6af0-4664-b33b-318fd3a3dc74\") " pod="openstack/ceilometer-0" Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.301265 4776 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/13099ea1-6af0-4664-b33b-318fd3a3dc74-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"13099ea1-6af0-4664-b33b-318fd3a3dc74\") " pod="openstack/ceilometer-0" Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.303466 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhlzm\" (UniqueName: \"kubernetes.io/projected/13099ea1-6af0-4664-b33b-318fd3a3dc74-kube-api-access-mhlzm\") pod \"ceilometer-0\" (UID: \"13099ea1-6af0-4664-b33b-318fd3a3dc74\") " pod="openstack/ceilometer-0" Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.358079 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9beda06-9357-4e94-a243-78484ede0b97" path="/var/lib/kubelet/pods/a9beda06-9357-4e94-a243-78484ede0b97/volumes" Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.403548 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 08 09:26:42 crc kubenswrapper[4776]: I1208 09:26:42.996935 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 08 09:26:43 crc kubenswrapper[4776]: W1208 09:26:43.012258 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13099ea1_6af0_4664_b33b_318fd3a3dc74.slice/crio-479dcb10388e32ca40b43df003227e812884e5bb7b2b3a54a3027b7a15bf39d7 WatchSource:0}: Error finding container 479dcb10388e32ca40b43df003227e812884e5bb7b2b3a54a3027b7a15bf39d7: Status 404 returned error can't find the container with id 479dcb10388e32ca40b43df003227e812884e5bb7b2b3a54a3027b7a15bf39d7 Dec 08 09:26:43 crc kubenswrapper[4776]: I1208 09:26:43.016862 4776 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 08 09:26:43 crc kubenswrapper[4776]: I1208 09:26:43.343636 4776 scope.go:117] "RemoveContainer" 
containerID="bd00cf9685c68dc97ea68681aad0bfeeb2d56965a763cd3cffb3c7d3789a7341" Dec 08 09:26:43 crc kubenswrapper[4776]: E1208 09:26:43.344018 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:26:44 crc kubenswrapper[4776]: I1208 09:26:44.003119 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13099ea1-6af0-4664-b33b-318fd3a3dc74","Type":"ContainerStarted","Data":"32955a3a389cc56dbfa9f90dba12ccec5081fa179697c64cee8f5086b7208ecf"} Dec 08 09:26:44 crc kubenswrapper[4776]: I1208 09:26:44.003673 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13099ea1-6af0-4664-b33b-318fd3a3dc74","Type":"ContainerStarted","Data":"479dcb10388e32ca40b43df003227e812884e5bb7b2b3a54a3027b7a15bf39d7"} Dec 08 09:26:45 crc kubenswrapper[4776]: I1208 09:26:45.017252 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13099ea1-6af0-4664-b33b-318fd3a3dc74","Type":"ContainerStarted","Data":"1a1a8eaa53949ea4e744335ad92ac7e6568e7c38311efb8de51a30a5349a636a"} Dec 08 09:26:46 crc kubenswrapper[4776]: I1208 09:26:46.028940 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13099ea1-6af0-4664-b33b-318fd3a3dc74","Type":"ContainerStarted","Data":"fa3cde7551ebec08916dab28b22cb9736a7f228604f3326d3f2a3b8145524b70"} Dec 08 09:26:47 crc kubenswrapper[4776]: I1208 09:26:47.051592 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"13099ea1-6af0-4664-b33b-318fd3a3dc74","Type":"ContainerStarted","Data":"a33f4b63c5de4bd7f723d722ea8b912efe698882e3cf26aa110c5daa218f2232"} Dec 08 09:26:47 crc kubenswrapper[4776]: I1208 09:26:47.052308 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 08 09:26:47 crc kubenswrapper[4776]: I1208 09:26:47.091570 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.717998269 podStartE2EDuration="5.091548499s" podCreationTimestamp="2025-12-08 09:26:42 +0000 UTC" firstStartedPulling="2025-12-08 09:26:43.016431774 +0000 UTC m=+1679.279656796" lastFinishedPulling="2025-12-08 09:26:46.389982004 +0000 UTC m=+1682.653207026" observedRunningTime="2025-12-08 09:26:47.081548701 +0000 UTC m=+1683.344773713" watchObservedRunningTime="2025-12-08 09:26:47.091548499 +0000 UTC m=+1683.354773521" Dec 08 09:26:47 crc kubenswrapper[4776]: I1208 09:26:47.542111 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-zhhw6"] Dec 08 09:26:47 crc kubenswrapper[4776]: I1208 09:26:47.551530 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-zhhw6"] Dec 08 09:26:47 crc kubenswrapper[4776]: I1208 09:26:47.648768 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-l6w2n"] Dec 08 09:26:47 crc kubenswrapper[4776]: I1208 09:26:47.650775 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-l6w2n" Dec 08 09:26:47 crc kubenswrapper[4776]: I1208 09:26:47.661400 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-l6w2n"] Dec 08 09:26:47 crc kubenswrapper[4776]: I1208 09:26:47.814744 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fqcb\" (UniqueName: \"kubernetes.io/projected/cd51a3a4-205b-4844-81db-439c7e1f0624-kube-api-access-9fqcb\") pod \"heat-db-sync-l6w2n\" (UID: \"cd51a3a4-205b-4844-81db-439c7e1f0624\") " pod="openstack/heat-db-sync-l6w2n" Dec 08 09:26:47 crc kubenswrapper[4776]: I1208 09:26:47.814811 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd51a3a4-205b-4844-81db-439c7e1f0624-config-data\") pod \"heat-db-sync-l6w2n\" (UID: \"cd51a3a4-205b-4844-81db-439c7e1f0624\") " pod="openstack/heat-db-sync-l6w2n" Dec 08 09:26:47 crc kubenswrapper[4776]: I1208 09:26:47.814851 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd51a3a4-205b-4844-81db-439c7e1f0624-combined-ca-bundle\") pod \"heat-db-sync-l6w2n\" (UID: \"cd51a3a4-205b-4844-81db-439c7e1f0624\") " pod="openstack/heat-db-sync-l6w2n" Dec 08 09:26:47 crc kubenswrapper[4776]: I1208 09:26:47.917323 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fqcb\" (UniqueName: \"kubernetes.io/projected/cd51a3a4-205b-4844-81db-439c7e1f0624-kube-api-access-9fqcb\") pod \"heat-db-sync-l6w2n\" (UID: \"cd51a3a4-205b-4844-81db-439c7e1f0624\") " pod="openstack/heat-db-sync-l6w2n" Dec 08 09:26:47 crc kubenswrapper[4776]: I1208 09:26:47.917390 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/cd51a3a4-205b-4844-81db-439c7e1f0624-config-data\") pod \"heat-db-sync-l6w2n\" (UID: \"cd51a3a4-205b-4844-81db-439c7e1f0624\") " pod="openstack/heat-db-sync-l6w2n" Dec 08 09:26:47 crc kubenswrapper[4776]: I1208 09:26:47.917431 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd51a3a4-205b-4844-81db-439c7e1f0624-combined-ca-bundle\") pod \"heat-db-sync-l6w2n\" (UID: \"cd51a3a4-205b-4844-81db-439c7e1f0624\") " pod="openstack/heat-db-sync-l6w2n" Dec 08 09:26:47 crc kubenswrapper[4776]: I1208 09:26:47.925855 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd51a3a4-205b-4844-81db-439c7e1f0624-combined-ca-bundle\") pod \"heat-db-sync-l6w2n\" (UID: \"cd51a3a4-205b-4844-81db-439c7e1f0624\") " pod="openstack/heat-db-sync-l6w2n" Dec 08 09:26:47 crc kubenswrapper[4776]: I1208 09:26:47.926111 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd51a3a4-205b-4844-81db-439c7e1f0624-config-data\") pod \"heat-db-sync-l6w2n\" (UID: \"cd51a3a4-205b-4844-81db-439c7e1f0624\") " pod="openstack/heat-db-sync-l6w2n" Dec 08 09:26:47 crc kubenswrapper[4776]: I1208 09:26:47.934994 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fqcb\" (UniqueName: \"kubernetes.io/projected/cd51a3a4-205b-4844-81db-439c7e1f0624-kube-api-access-9fqcb\") pod \"heat-db-sync-l6w2n\" (UID: \"cd51a3a4-205b-4844-81db-439c7e1f0624\") " pod="openstack/heat-db-sync-l6w2n" Dec 08 09:26:47 crc kubenswrapper[4776]: I1208 09:26:47.968204 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-l6w2n" Dec 08 09:26:48 crc kubenswrapper[4776]: I1208 09:26:48.357008 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4a95eb4-92c1-4eff-940b-37f74dd3dc18" path="/var/lib/kubelet/pods/f4a95eb4-92c1-4eff-940b-37f74dd3dc18/volumes" Dec 08 09:26:48 crc kubenswrapper[4776]: I1208 09:26:48.482855 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-l6w2n"] Dec 08 09:26:48 crc kubenswrapper[4776]: W1208 09:26:48.489467 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd51a3a4_205b_4844_81db_439c7e1f0624.slice/crio-5ebd1568b457a27974f4c4735c765ba91e84386efb324952908953814a9372c0 WatchSource:0}: Error finding container 5ebd1568b457a27974f4c4735c765ba91e84386efb324952908953814a9372c0: Status 404 returned error can't find the container with id 5ebd1568b457a27974f4c4735c765ba91e84386efb324952908953814a9372c0 Dec 08 09:26:49 crc kubenswrapper[4776]: I1208 09:26:49.083357 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-l6w2n" event={"ID":"cd51a3a4-205b-4844-81db-439c7e1f0624","Type":"ContainerStarted","Data":"5ebd1568b457a27974f4c4735c765ba91e84386efb324952908953814a9372c0"} Dec 08 09:26:49 crc kubenswrapper[4776]: I1208 09:26:49.346775 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 08 09:26:50 crc kubenswrapper[4776]: I1208 09:26:50.135525 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 08 09:26:51 crc kubenswrapper[4776]: I1208 09:26:51.034660 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 08 09:26:51 crc kubenswrapper[4776]: I1208 09:26:51.035216 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="13099ea1-6af0-4664-b33b-318fd3a3dc74" 
containerName="ceilometer-central-agent" containerID="cri-o://32955a3a389cc56dbfa9f90dba12ccec5081fa179697c64cee8f5086b7208ecf" gracePeriod=30 Dec 08 09:26:51 crc kubenswrapper[4776]: I1208 09:26:51.035676 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="13099ea1-6af0-4664-b33b-318fd3a3dc74" containerName="proxy-httpd" containerID="cri-o://a33f4b63c5de4bd7f723d722ea8b912efe698882e3cf26aa110c5daa218f2232" gracePeriod=30 Dec 08 09:26:51 crc kubenswrapper[4776]: I1208 09:26:51.035726 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="13099ea1-6af0-4664-b33b-318fd3a3dc74" containerName="sg-core" containerID="cri-o://fa3cde7551ebec08916dab28b22cb9736a7f228604f3326d3f2a3b8145524b70" gracePeriod=30 Dec 08 09:26:51 crc kubenswrapper[4776]: I1208 09:26:51.035760 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="13099ea1-6af0-4664-b33b-318fd3a3dc74" containerName="ceilometer-notification-agent" containerID="cri-o://1a1a8eaa53949ea4e744335ad92ac7e6568e7c38311efb8de51a30a5349a636a" gracePeriod=30 Dec 08 09:26:51 crc kubenswrapper[4776]: I1208 09:26:51.338668 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 08 09:26:52 crc kubenswrapper[4776]: I1208 09:26:52.175681 4776 generic.go:334] "Generic (PLEG): container finished" podID="13099ea1-6af0-4664-b33b-318fd3a3dc74" containerID="a33f4b63c5de4bd7f723d722ea8b912efe698882e3cf26aa110c5daa218f2232" exitCode=0 Dec 08 09:26:52 crc kubenswrapper[4776]: I1208 09:26:52.176124 4776 generic.go:334] "Generic (PLEG): container finished" podID="13099ea1-6af0-4664-b33b-318fd3a3dc74" containerID="fa3cde7551ebec08916dab28b22cb9736a7f228604f3326d3f2a3b8145524b70" exitCode=2 Dec 08 09:26:52 crc kubenswrapper[4776]: I1208 09:26:52.176134 4776 generic.go:334] "Generic (PLEG): container finished" 
podID="13099ea1-6af0-4664-b33b-318fd3a3dc74" containerID="1a1a8eaa53949ea4e744335ad92ac7e6568e7c38311efb8de51a30a5349a636a" exitCode=0 Dec 08 09:26:52 crc kubenswrapper[4776]: I1208 09:26:52.176144 4776 generic.go:334] "Generic (PLEG): container finished" podID="13099ea1-6af0-4664-b33b-318fd3a3dc74" containerID="32955a3a389cc56dbfa9f90dba12ccec5081fa179697c64cee8f5086b7208ecf" exitCode=0 Dec 08 09:26:52 crc kubenswrapper[4776]: I1208 09:26:52.176164 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13099ea1-6af0-4664-b33b-318fd3a3dc74","Type":"ContainerDied","Data":"a33f4b63c5de4bd7f723d722ea8b912efe698882e3cf26aa110c5daa218f2232"} Dec 08 09:26:52 crc kubenswrapper[4776]: I1208 09:26:52.176213 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13099ea1-6af0-4664-b33b-318fd3a3dc74","Type":"ContainerDied","Data":"fa3cde7551ebec08916dab28b22cb9736a7f228604f3326d3f2a3b8145524b70"} Dec 08 09:26:52 crc kubenswrapper[4776]: I1208 09:26:52.176239 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13099ea1-6af0-4664-b33b-318fd3a3dc74","Type":"ContainerDied","Data":"1a1a8eaa53949ea4e744335ad92ac7e6568e7c38311efb8de51a30a5349a636a"} Dec 08 09:26:52 crc kubenswrapper[4776]: I1208 09:26:52.176250 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13099ea1-6af0-4664-b33b-318fd3a3dc74","Type":"ContainerDied","Data":"32955a3a389cc56dbfa9f90dba12ccec5081fa179697c64cee8f5086b7208ecf"} Dec 08 09:26:52 crc kubenswrapper[4776]: I1208 09:26:52.261619 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 08 09:26:52 crc kubenswrapper[4776]: I1208 09:26:52.338776 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13099ea1-6af0-4664-b33b-318fd3a3dc74-log-httpd\") pod \"13099ea1-6af0-4664-b33b-318fd3a3dc74\" (UID: \"13099ea1-6af0-4664-b33b-318fd3a3dc74\") " Dec 08 09:26:52 crc kubenswrapper[4776]: I1208 09:26:52.338937 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/13099ea1-6af0-4664-b33b-318fd3a3dc74-ceilometer-tls-certs\") pod \"13099ea1-6af0-4664-b33b-318fd3a3dc74\" (UID: \"13099ea1-6af0-4664-b33b-318fd3a3dc74\") " Dec 08 09:26:52 crc kubenswrapper[4776]: I1208 09:26:52.339011 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13099ea1-6af0-4664-b33b-318fd3a3dc74-scripts\") pod \"13099ea1-6af0-4664-b33b-318fd3a3dc74\" (UID: \"13099ea1-6af0-4664-b33b-318fd3a3dc74\") " Dec 08 09:26:52 crc kubenswrapper[4776]: I1208 09:26:52.339082 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13099ea1-6af0-4664-b33b-318fd3a3dc74-config-data\") pod \"13099ea1-6af0-4664-b33b-318fd3a3dc74\" (UID: \"13099ea1-6af0-4664-b33b-318fd3a3dc74\") " Dec 08 09:26:52 crc kubenswrapper[4776]: I1208 09:26:52.339189 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13099ea1-6af0-4664-b33b-318fd3a3dc74-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "13099ea1-6af0-4664-b33b-318fd3a3dc74" (UID: "13099ea1-6af0-4664-b33b-318fd3a3dc74"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:26:52 crc kubenswrapper[4776]: I1208 09:26:52.339203 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhlzm\" (UniqueName: \"kubernetes.io/projected/13099ea1-6af0-4664-b33b-318fd3a3dc74-kube-api-access-mhlzm\") pod \"13099ea1-6af0-4664-b33b-318fd3a3dc74\" (UID: \"13099ea1-6af0-4664-b33b-318fd3a3dc74\") " Dec 08 09:26:52 crc kubenswrapper[4776]: I1208 09:26:52.339345 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13099ea1-6af0-4664-b33b-318fd3a3dc74-run-httpd\") pod \"13099ea1-6af0-4664-b33b-318fd3a3dc74\" (UID: \"13099ea1-6af0-4664-b33b-318fd3a3dc74\") " Dec 08 09:26:52 crc kubenswrapper[4776]: I1208 09:26:52.339500 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/13099ea1-6af0-4664-b33b-318fd3a3dc74-sg-core-conf-yaml\") pod \"13099ea1-6af0-4664-b33b-318fd3a3dc74\" (UID: \"13099ea1-6af0-4664-b33b-318fd3a3dc74\") " Dec 08 09:26:52 crc kubenswrapper[4776]: I1208 09:26:52.339584 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13099ea1-6af0-4664-b33b-318fd3a3dc74-combined-ca-bundle\") pod \"13099ea1-6af0-4664-b33b-318fd3a3dc74\" (UID: \"13099ea1-6af0-4664-b33b-318fd3a3dc74\") " Dec 08 09:26:52 crc kubenswrapper[4776]: I1208 09:26:52.340329 4776 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13099ea1-6af0-4664-b33b-318fd3a3dc74-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 08 09:26:52 crc kubenswrapper[4776]: I1208 09:26:52.349945 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13099ea1-6af0-4664-b33b-318fd3a3dc74-run-httpd" (OuterVolumeSpecName: "run-httpd") pod 
"13099ea1-6af0-4664-b33b-318fd3a3dc74" (UID: "13099ea1-6af0-4664-b33b-318fd3a3dc74"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:26:52 crc kubenswrapper[4776]: I1208 09:26:52.355686 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13099ea1-6af0-4664-b33b-318fd3a3dc74-kube-api-access-mhlzm" (OuterVolumeSpecName: "kube-api-access-mhlzm") pod "13099ea1-6af0-4664-b33b-318fd3a3dc74" (UID: "13099ea1-6af0-4664-b33b-318fd3a3dc74"). InnerVolumeSpecName "kube-api-access-mhlzm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:26:52 crc kubenswrapper[4776]: I1208 09:26:52.355673 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13099ea1-6af0-4664-b33b-318fd3a3dc74-scripts" (OuterVolumeSpecName: "scripts") pod "13099ea1-6af0-4664-b33b-318fd3a3dc74" (UID: "13099ea1-6af0-4664-b33b-318fd3a3dc74"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:26:52 crc kubenswrapper[4776]: I1208 09:26:52.427955 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13099ea1-6af0-4664-b33b-318fd3a3dc74-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "13099ea1-6af0-4664-b33b-318fd3a3dc74" (UID: "13099ea1-6af0-4664-b33b-318fd3a3dc74"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:26:52 crc kubenswrapper[4776]: I1208 09:26:52.443680 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhlzm\" (UniqueName: \"kubernetes.io/projected/13099ea1-6af0-4664-b33b-318fd3a3dc74-kube-api-access-mhlzm\") on node \"crc\" DevicePath \"\"" Dec 08 09:26:52 crc kubenswrapper[4776]: I1208 09:26:52.443714 4776 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13099ea1-6af0-4664-b33b-318fd3a3dc74-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 08 09:26:52 crc kubenswrapper[4776]: I1208 09:26:52.443724 4776 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/13099ea1-6af0-4664-b33b-318fd3a3dc74-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 08 09:26:52 crc kubenswrapper[4776]: I1208 09:26:52.443733 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13099ea1-6af0-4664-b33b-318fd3a3dc74-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 09:26:52 crc kubenswrapper[4776]: I1208 09:26:52.485141 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13099ea1-6af0-4664-b33b-318fd3a3dc74-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "13099ea1-6af0-4664-b33b-318fd3a3dc74" (UID: "13099ea1-6af0-4664-b33b-318fd3a3dc74"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:26:52 crc kubenswrapper[4776]: I1208 09:26:52.545414 4776 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/13099ea1-6af0-4664-b33b-318fd3a3dc74-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 08 09:26:52 crc kubenswrapper[4776]: I1208 09:26:52.566344 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13099ea1-6af0-4664-b33b-318fd3a3dc74-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "13099ea1-6af0-4664-b33b-318fd3a3dc74" (UID: "13099ea1-6af0-4664-b33b-318fd3a3dc74"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:26:52 crc kubenswrapper[4776]: I1208 09:26:52.582801 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13099ea1-6af0-4664-b33b-318fd3a3dc74-config-data" (OuterVolumeSpecName: "config-data") pod "13099ea1-6af0-4664-b33b-318fd3a3dc74" (UID: "13099ea1-6af0-4664-b33b-318fd3a3dc74"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:26:52 crc kubenswrapper[4776]: I1208 09:26:52.648020 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13099ea1-6af0-4664-b33b-318fd3a3dc74-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:26:52 crc kubenswrapper[4776]: I1208 09:26:52.648057 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13099ea1-6af0-4664-b33b-318fd3a3dc74-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 09:26:53 crc kubenswrapper[4776]: I1208 09:26:53.193452 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13099ea1-6af0-4664-b33b-318fd3a3dc74","Type":"ContainerDied","Data":"479dcb10388e32ca40b43df003227e812884e5bb7b2b3a54a3027b7a15bf39d7"} Dec 08 09:26:53 crc kubenswrapper[4776]: I1208 09:26:53.193504 4776 scope.go:117] "RemoveContainer" containerID="a33f4b63c5de4bd7f723d722ea8b912efe698882e3cf26aa110c5daa218f2232" Dec 08 09:26:53 crc kubenswrapper[4776]: I1208 09:26:53.193670 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 08 09:26:53 crc kubenswrapper[4776]: I1208 09:26:53.312122 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 08 09:26:53 crc kubenswrapper[4776]: I1208 09:26:53.326681 4776 scope.go:117] "RemoveContainer" containerID="fa3cde7551ebec08916dab28b22cb9736a7f228604f3326d3f2a3b8145524b70" Dec 08 09:26:53 crc kubenswrapper[4776]: I1208 09:26:53.342112 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 08 09:26:53 crc kubenswrapper[4776]: I1208 09:26:53.364632 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 08 09:26:53 crc kubenswrapper[4776]: E1208 09:26:53.365159 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13099ea1-6af0-4664-b33b-318fd3a3dc74" containerName="sg-core" Dec 08 09:26:53 crc kubenswrapper[4776]: I1208 09:26:53.365194 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="13099ea1-6af0-4664-b33b-318fd3a3dc74" containerName="sg-core" Dec 08 09:26:53 crc kubenswrapper[4776]: E1208 09:26:53.365249 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13099ea1-6af0-4664-b33b-318fd3a3dc74" containerName="ceilometer-central-agent" Dec 08 09:26:53 crc kubenswrapper[4776]: I1208 09:26:53.365257 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="13099ea1-6af0-4664-b33b-318fd3a3dc74" containerName="ceilometer-central-agent" Dec 08 09:26:53 crc kubenswrapper[4776]: E1208 09:26:53.365267 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13099ea1-6af0-4664-b33b-318fd3a3dc74" containerName="ceilometer-notification-agent" Dec 08 09:26:53 crc kubenswrapper[4776]: I1208 09:26:53.365275 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="13099ea1-6af0-4664-b33b-318fd3a3dc74" containerName="ceilometer-notification-agent" Dec 08 09:26:53 crc kubenswrapper[4776]: E1208 09:26:53.365295 4776 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="13099ea1-6af0-4664-b33b-318fd3a3dc74" containerName="proxy-httpd" Dec 08 09:26:53 crc kubenswrapper[4776]: I1208 09:26:53.365301 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="13099ea1-6af0-4664-b33b-318fd3a3dc74" containerName="proxy-httpd" Dec 08 09:26:53 crc kubenswrapper[4776]: I1208 09:26:53.371943 4776 scope.go:117] "RemoveContainer" containerID="1a1a8eaa53949ea4e744335ad92ac7e6568e7c38311efb8de51a30a5349a636a" Dec 08 09:26:53 crc kubenswrapper[4776]: I1208 09:26:53.372305 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="13099ea1-6af0-4664-b33b-318fd3a3dc74" containerName="sg-core" Dec 08 09:26:53 crc kubenswrapper[4776]: I1208 09:26:53.372366 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="13099ea1-6af0-4664-b33b-318fd3a3dc74" containerName="ceilometer-notification-agent" Dec 08 09:26:53 crc kubenswrapper[4776]: I1208 09:26:53.372402 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="13099ea1-6af0-4664-b33b-318fd3a3dc74" containerName="proxy-httpd" Dec 08 09:26:53 crc kubenswrapper[4776]: I1208 09:26:53.372426 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="13099ea1-6af0-4664-b33b-318fd3a3dc74" containerName="ceilometer-central-agent" Dec 08 09:26:53 crc kubenswrapper[4776]: I1208 09:26:53.379819 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 08 09:26:53 crc kubenswrapper[4776]: I1208 09:26:53.386162 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 08 09:26:53 crc kubenswrapper[4776]: I1208 09:26:53.387231 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 08 09:26:53 crc kubenswrapper[4776]: I1208 09:26:53.387256 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 08 09:26:53 crc kubenswrapper[4776]: I1208 09:26:53.413069 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 08 09:26:53 crc kubenswrapper[4776]: I1208 09:26:53.433548 4776 scope.go:117] "RemoveContainer" containerID="32955a3a389cc56dbfa9f90dba12ccec5081fa179697c64cee8f5086b7208ecf" Dec 08 09:26:53 crc kubenswrapper[4776]: I1208 09:26:53.466617 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7cf1c3e-6789-4ccd-894c-946f056f2d96-scripts\") pod \"ceilometer-0\" (UID: \"e7cf1c3e-6789-4ccd-894c-946f056f2d96\") " pod="openstack/ceilometer-0" Dec 08 09:26:53 crc kubenswrapper[4776]: I1208 09:26:53.466707 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e7cf1c3e-6789-4ccd-894c-946f056f2d96-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e7cf1c3e-6789-4ccd-894c-946f056f2d96\") " pod="openstack/ceilometer-0" Dec 08 09:26:53 crc kubenswrapper[4776]: I1208 09:26:53.466796 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7cf1c3e-6789-4ccd-894c-946f056f2d96-config-data\") pod \"ceilometer-0\" (UID: \"e7cf1c3e-6789-4ccd-894c-946f056f2d96\") " pod="openstack/ceilometer-0" Dec 08 
09:26:53 crc kubenswrapper[4776]: I1208 09:26:53.466823 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7cf1c3e-6789-4ccd-894c-946f056f2d96-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e7cf1c3e-6789-4ccd-894c-946f056f2d96\") " pod="openstack/ceilometer-0" Dec 08 09:26:53 crc kubenswrapper[4776]: I1208 09:26:53.466900 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7cf1c3e-6789-4ccd-894c-946f056f2d96-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e7cf1c3e-6789-4ccd-894c-946f056f2d96\") " pod="openstack/ceilometer-0" Dec 08 09:26:53 crc kubenswrapper[4776]: I1208 09:26:53.466935 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7cf1c3e-6789-4ccd-894c-946f056f2d96-run-httpd\") pod \"ceilometer-0\" (UID: \"e7cf1c3e-6789-4ccd-894c-946f056f2d96\") " pod="openstack/ceilometer-0" Dec 08 09:26:53 crc kubenswrapper[4776]: I1208 09:26:53.466957 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7cf1c3e-6789-4ccd-894c-946f056f2d96-log-httpd\") pod \"ceilometer-0\" (UID: \"e7cf1c3e-6789-4ccd-894c-946f056f2d96\") " pod="openstack/ceilometer-0" Dec 08 09:26:53 crc kubenswrapper[4776]: I1208 09:26:53.466982 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fnrx\" (UniqueName: \"kubernetes.io/projected/e7cf1c3e-6789-4ccd-894c-946f056f2d96-kube-api-access-8fnrx\") pod \"ceilometer-0\" (UID: \"e7cf1c3e-6789-4ccd-894c-946f056f2d96\") " pod="openstack/ceilometer-0" Dec 08 09:26:53 crc kubenswrapper[4776]: I1208 09:26:53.569828 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7cf1c3e-6789-4ccd-894c-946f056f2d96-config-data\") pod \"ceilometer-0\" (UID: \"e7cf1c3e-6789-4ccd-894c-946f056f2d96\") " pod="openstack/ceilometer-0" Dec 08 09:26:53 crc kubenswrapper[4776]: I1208 09:26:53.569919 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7cf1c3e-6789-4ccd-894c-946f056f2d96-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e7cf1c3e-6789-4ccd-894c-946f056f2d96\") " pod="openstack/ceilometer-0" Dec 08 09:26:53 crc kubenswrapper[4776]: I1208 09:26:53.570012 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7cf1c3e-6789-4ccd-894c-946f056f2d96-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e7cf1c3e-6789-4ccd-894c-946f056f2d96\") " pod="openstack/ceilometer-0" Dec 08 09:26:53 crc kubenswrapper[4776]: I1208 09:26:53.570036 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7cf1c3e-6789-4ccd-894c-946f056f2d96-run-httpd\") pod \"ceilometer-0\" (UID: \"e7cf1c3e-6789-4ccd-894c-946f056f2d96\") " pod="openstack/ceilometer-0" Dec 08 09:26:53 crc kubenswrapper[4776]: I1208 09:26:53.570058 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7cf1c3e-6789-4ccd-894c-946f056f2d96-log-httpd\") pod \"ceilometer-0\" (UID: \"e7cf1c3e-6789-4ccd-894c-946f056f2d96\") " pod="openstack/ceilometer-0" Dec 08 09:26:53 crc kubenswrapper[4776]: I1208 09:26:53.570079 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fnrx\" (UniqueName: \"kubernetes.io/projected/e7cf1c3e-6789-4ccd-894c-946f056f2d96-kube-api-access-8fnrx\") pod \"ceilometer-0\" (UID: 
\"e7cf1c3e-6789-4ccd-894c-946f056f2d96\") " pod="openstack/ceilometer-0" Dec 08 09:26:53 crc kubenswrapper[4776]: I1208 09:26:53.570103 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7cf1c3e-6789-4ccd-894c-946f056f2d96-scripts\") pod \"ceilometer-0\" (UID: \"e7cf1c3e-6789-4ccd-894c-946f056f2d96\") " pod="openstack/ceilometer-0" Dec 08 09:26:53 crc kubenswrapper[4776]: I1208 09:26:53.570532 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e7cf1c3e-6789-4ccd-894c-946f056f2d96-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e7cf1c3e-6789-4ccd-894c-946f056f2d96\") " pod="openstack/ceilometer-0" Dec 08 09:26:53 crc kubenswrapper[4776]: I1208 09:26:53.570667 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7cf1c3e-6789-4ccd-894c-946f056f2d96-run-httpd\") pod \"ceilometer-0\" (UID: \"e7cf1c3e-6789-4ccd-894c-946f056f2d96\") " pod="openstack/ceilometer-0" Dec 08 09:26:53 crc kubenswrapper[4776]: I1208 09:26:53.570716 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7cf1c3e-6789-4ccd-894c-946f056f2d96-log-httpd\") pod \"ceilometer-0\" (UID: \"e7cf1c3e-6789-4ccd-894c-946f056f2d96\") " pod="openstack/ceilometer-0" Dec 08 09:26:53 crc kubenswrapper[4776]: I1208 09:26:53.576906 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7cf1c3e-6789-4ccd-894c-946f056f2d96-scripts\") pod \"ceilometer-0\" (UID: \"e7cf1c3e-6789-4ccd-894c-946f056f2d96\") " pod="openstack/ceilometer-0" Dec 08 09:26:53 crc kubenswrapper[4776]: I1208 09:26:53.591050 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e7cf1c3e-6789-4ccd-894c-946f056f2d96-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e7cf1c3e-6789-4ccd-894c-946f056f2d96\") " pod="openstack/ceilometer-0" Dec 08 09:26:53 crc kubenswrapper[4776]: I1208 09:26:53.594985 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7cf1c3e-6789-4ccd-894c-946f056f2d96-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e7cf1c3e-6789-4ccd-894c-946f056f2d96\") " pod="openstack/ceilometer-0" Dec 08 09:26:53 crc kubenswrapper[4776]: I1208 09:26:53.595643 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7cf1c3e-6789-4ccd-894c-946f056f2d96-config-data\") pod \"ceilometer-0\" (UID: \"e7cf1c3e-6789-4ccd-894c-946f056f2d96\") " pod="openstack/ceilometer-0" Dec 08 09:26:53 crc kubenswrapper[4776]: I1208 09:26:53.596307 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e7cf1c3e-6789-4ccd-894c-946f056f2d96-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e7cf1c3e-6789-4ccd-894c-946f056f2d96\") " pod="openstack/ceilometer-0" Dec 08 09:26:53 crc kubenswrapper[4776]: I1208 09:26:53.596516 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fnrx\" (UniqueName: \"kubernetes.io/projected/e7cf1c3e-6789-4ccd-894c-946f056f2d96-kube-api-access-8fnrx\") pod \"ceilometer-0\" (UID: \"e7cf1c3e-6789-4ccd-894c-946f056f2d96\") " pod="openstack/ceilometer-0" Dec 08 09:26:53 crc kubenswrapper[4776]: I1208 09:26:53.726146 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 08 09:26:54 crc kubenswrapper[4776]: I1208 09:26:54.268345 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 08 09:26:54 crc kubenswrapper[4776]: W1208 09:26:54.271221 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7cf1c3e_6789_4ccd_894c_946f056f2d96.slice/crio-e1bc62361859d6ef7803265cd9fd8aeeecc7f500b937a266ef3ccfb193a724b7 WatchSource:0}: Error finding container e1bc62361859d6ef7803265cd9fd8aeeecc7f500b937a266ef3ccfb193a724b7: Status 404 returned error can't find the container with id e1bc62361859d6ef7803265cd9fd8aeeecc7f500b937a266ef3ccfb193a724b7 Dec 08 09:26:54 crc kubenswrapper[4776]: I1208 09:26:54.357541 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13099ea1-6af0-4664-b33b-318fd3a3dc74" path="/var/lib/kubelet/pods/13099ea1-6af0-4664-b33b-318fd3a3dc74/volumes" Dec 08 09:26:54 crc kubenswrapper[4776]: I1208 09:26:54.973390 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994" containerName="rabbitmq" containerID="cri-o://44a9f7ec71ea62b7d079dba7205175959bf86b790c3ffbd0bd9ec7d8f84229bb" gracePeriod=604796 Dec 08 09:26:55 crc kubenswrapper[4776]: I1208 09:26:55.126264 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.128:5671: connect: connection refused" Dec 08 09:26:55 crc kubenswrapper[4776]: I1208 09:26:55.229555 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e7cf1c3e-6789-4ccd-894c-946f056f2d96","Type":"ContainerStarted","Data":"e1bc62361859d6ef7803265cd9fd8aeeecc7f500b937a266ef3ccfb193a724b7"} Dec 08 09:26:56 crc kubenswrapper[4776]: I1208 
09:26:56.260863 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="a01574f0-d8c8-404a-b822-7ce8e0af6fd4" containerName="rabbitmq" containerID="cri-o://2890fe37e7829396223b419f85f8fdf135c4324b82cd2bab822c2cb2fb7fbd3a" gracePeriod=604796 Dec 08 09:26:58 crc kubenswrapper[4776]: I1208 09:26:58.344155 4776 scope.go:117] "RemoveContainer" containerID="bd00cf9685c68dc97ea68681aad0bfeeb2d56965a763cd3cffb3c7d3789a7341" Dec 08 09:26:58 crc kubenswrapper[4776]: E1208 09:26:58.345001 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:27:02 crc kubenswrapper[4776]: I1208 09:27:02.366956 4776 generic.go:334] "Generic (PLEG): container finished" podID="bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994" containerID="44a9f7ec71ea62b7d079dba7205175959bf86b790c3ffbd0bd9ec7d8f84229bb" exitCode=0 Dec 08 09:27:02 crc kubenswrapper[4776]: I1208 09:27:02.367066 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994","Type":"ContainerDied","Data":"44a9f7ec71ea62b7d079dba7205175959bf86b790c3ffbd0bd9ec7d8f84229bb"} Dec 08 09:27:03 crc kubenswrapper[4776]: I1208 09:27:03.385192 4776 generic.go:334] "Generic (PLEG): container finished" podID="a01574f0-d8c8-404a-b822-7ce8e0af6fd4" containerID="2890fe37e7829396223b419f85f8fdf135c4324b82cd2bab822c2cb2fb7fbd3a" exitCode=0 Dec 08 09:27:03 crc kubenswrapper[4776]: I1208 09:27:03.385641 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"a01574f0-d8c8-404a-b822-7ce8e0af6fd4","Type":"ContainerDied","Data":"2890fe37e7829396223b419f85f8fdf135c4324b82cd2bab822c2cb2fb7fbd3a"} Dec 08 09:27:04 crc kubenswrapper[4776]: I1208 09:27:04.846319 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="a01574f0-d8c8-404a-b822-7ce8e0af6fd4" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.129:5671: connect: connection refused" Dec 08 09:27:05 crc kubenswrapper[4776]: I1208 09:27:05.036909 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-4w5k5"] Dec 08 09:27:05 crc kubenswrapper[4776]: I1208 09:27:05.043928 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d84b4d45c-4w5k5" Dec 08 09:27:05 crc kubenswrapper[4776]: I1208 09:27:05.046083 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Dec 08 09:27:05 crc kubenswrapper[4776]: I1208 09:27:05.057673 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-4w5k5"] Dec 08 09:27:05 crc kubenswrapper[4776]: I1208 09:27:05.098439 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f23b8d9-87af-4550-9bb5-3668b935c6b7-ovsdbserver-nb\") pod \"dnsmasq-dns-7d84b4d45c-4w5k5\" (UID: \"1f23b8d9-87af-4550-9bb5-3668b935c6b7\") " pod="openstack/dnsmasq-dns-7d84b4d45c-4w5k5" Dec 08 09:27:05 crc kubenswrapper[4776]: I1208 09:27:05.098512 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1f23b8d9-87af-4550-9bb5-3668b935c6b7-dns-swift-storage-0\") pod \"dnsmasq-dns-7d84b4d45c-4w5k5\" (UID: \"1f23b8d9-87af-4550-9bb5-3668b935c6b7\") " pod="openstack/dnsmasq-dns-7d84b4d45c-4w5k5" Dec 08 09:27:05 crc 
kubenswrapper[4776]: I1208 09:27:05.098586 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f23b8d9-87af-4550-9bb5-3668b935c6b7-dns-svc\") pod \"dnsmasq-dns-7d84b4d45c-4w5k5\" (UID: \"1f23b8d9-87af-4550-9bb5-3668b935c6b7\") " pod="openstack/dnsmasq-dns-7d84b4d45c-4w5k5" Dec 08 09:27:05 crc kubenswrapper[4776]: I1208 09:27:05.098648 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1f23b8d9-87af-4550-9bb5-3668b935c6b7-openstack-edpm-ipam\") pod \"dnsmasq-dns-7d84b4d45c-4w5k5\" (UID: \"1f23b8d9-87af-4550-9bb5-3668b935c6b7\") " pod="openstack/dnsmasq-dns-7d84b4d45c-4w5k5" Dec 08 09:27:05 crc kubenswrapper[4776]: I1208 09:27:05.098696 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f23b8d9-87af-4550-9bb5-3668b935c6b7-config\") pod \"dnsmasq-dns-7d84b4d45c-4w5k5\" (UID: \"1f23b8d9-87af-4550-9bb5-3668b935c6b7\") " pod="openstack/dnsmasq-dns-7d84b4d45c-4w5k5" Dec 08 09:27:05 crc kubenswrapper[4776]: I1208 09:27:05.098829 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f23b8d9-87af-4550-9bb5-3668b935c6b7-ovsdbserver-sb\") pod \"dnsmasq-dns-7d84b4d45c-4w5k5\" (UID: \"1f23b8d9-87af-4550-9bb5-3668b935c6b7\") " pod="openstack/dnsmasq-dns-7d84b4d45c-4w5k5" Dec 08 09:27:05 crc kubenswrapper[4776]: I1208 09:27:05.098866 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktpv2\" (UniqueName: \"kubernetes.io/projected/1f23b8d9-87af-4550-9bb5-3668b935c6b7-kube-api-access-ktpv2\") pod \"dnsmasq-dns-7d84b4d45c-4w5k5\" (UID: \"1f23b8d9-87af-4550-9bb5-3668b935c6b7\") " 
pod="openstack/dnsmasq-dns-7d84b4d45c-4w5k5" Dec 08 09:27:05 crc kubenswrapper[4776]: I1208 09:27:05.126434 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.128:5671: connect: connection refused" Dec 08 09:27:05 crc kubenswrapper[4776]: I1208 09:27:05.200994 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1f23b8d9-87af-4550-9bb5-3668b935c6b7-dns-swift-storage-0\") pod \"dnsmasq-dns-7d84b4d45c-4w5k5\" (UID: \"1f23b8d9-87af-4550-9bb5-3668b935c6b7\") " pod="openstack/dnsmasq-dns-7d84b4d45c-4w5k5" Dec 08 09:27:05 crc kubenswrapper[4776]: I1208 09:27:05.201338 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f23b8d9-87af-4550-9bb5-3668b935c6b7-dns-svc\") pod \"dnsmasq-dns-7d84b4d45c-4w5k5\" (UID: \"1f23b8d9-87af-4550-9bb5-3668b935c6b7\") " pod="openstack/dnsmasq-dns-7d84b4d45c-4w5k5" Dec 08 09:27:05 crc kubenswrapper[4776]: I1208 09:27:05.201466 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1f23b8d9-87af-4550-9bb5-3668b935c6b7-openstack-edpm-ipam\") pod \"dnsmasq-dns-7d84b4d45c-4w5k5\" (UID: \"1f23b8d9-87af-4550-9bb5-3668b935c6b7\") " pod="openstack/dnsmasq-dns-7d84b4d45c-4w5k5" Dec 08 09:27:05 crc kubenswrapper[4776]: I1208 09:27:05.201580 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f23b8d9-87af-4550-9bb5-3668b935c6b7-config\") pod \"dnsmasq-dns-7d84b4d45c-4w5k5\" (UID: \"1f23b8d9-87af-4550-9bb5-3668b935c6b7\") " pod="openstack/dnsmasq-dns-7d84b4d45c-4w5k5" Dec 08 09:27:05 crc kubenswrapper[4776]: I1208 09:27:05.201869 4776 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1f23b8d9-87af-4550-9bb5-3668b935c6b7-dns-swift-storage-0\") pod \"dnsmasq-dns-7d84b4d45c-4w5k5\" (UID: \"1f23b8d9-87af-4550-9bb5-3668b935c6b7\") " pod="openstack/dnsmasq-dns-7d84b4d45c-4w5k5" Dec 08 09:27:05 crc kubenswrapper[4776]: I1208 09:27:05.202509 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1f23b8d9-87af-4550-9bb5-3668b935c6b7-openstack-edpm-ipam\") pod \"dnsmasq-dns-7d84b4d45c-4w5k5\" (UID: \"1f23b8d9-87af-4550-9bb5-3668b935c6b7\") " pod="openstack/dnsmasq-dns-7d84b4d45c-4w5k5" Dec 08 09:27:05 crc kubenswrapper[4776]: I1208 09:27:05.202342 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f23b8d9-87af-4550-9bb5-3668b935c6b7-config\") pod \"dnsmasq-dns-7d84b4d45c-4w5k5\" (UID: \"1f23b8d9-87af-4550-9bb5-3668b935c6b7\") " pod="openstack/dnsmasq-dns-7d84b4d45c-4w5k5" Dec 08 09:27:05 crc kubenswrapper[4776]: I1208 09:27:05.202100 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f23b8d9-87af-4550-9bb5-3668b935c6b7-dns-svc\") pod \"dnsmasq-dns-7d84b4d45c-4w5k5\" (UID: \"1f23b8d9-87af-4550-9bb5-3668b935c6b7\") " pod="openstack/dnsmasq-dns-7d84b4d45c-4w5k5" Dec 08 09:27:05 crc kubenswrapper[4776]: I1208 09:27:05.202408 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f23b8d9-87af-4550-9bb5-3668b935c6b7-ovsdbserver-sb\") pod \"dnsmasq-dns-7d84b4d45c-4w5k5\" (UID: \"1f23b8d9-87af-4550-9bb5-3668b935c6b7\") " pod="openstack/dnsmasq-dns-7d84b4d45c-4w5k5" Dec 08 09:27:05 crc kubenswrapper[4776]: I1208 09:27:05.202744 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktpv2\" (UniqueName: 
\"kubernetes.io/projected/1f23b8d9-87af-4550-9bb5-3668b935c6b7-kube-api-access-ktpv2\") pod \"dnsmasq-dns-7d84b4d45c-4w5k5\" (UID: \"1f23b8d9-87af-4550-9bb5-3668b935c6b7\") " pod="openstack/dnsmasq-dns-7d84b4d45c-4w5k5" Dec 08 09:27:05 crc kubenswrapper[4776]: I1208 09:27:05.203071 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f23b8d9-87af-4550-9bb5-3668b935c6b7-ovsdbserver-sb\") pod \"dnsmasq-dns-7d84b4d45c-4w5k5\" (UID: \"1f23b8d9-87af-4550-9bb5-3668b935c6b7\") " pod="openstack/dnsmasq-dns-7d84b4d45c-4w5k5" Dec 08 09:27:05 crc kubenswrapper[4776]: I1208 09:27:05.203410 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f23b8d9-87af-4550-9bb5-3668b935c6b7-ovsdbserver-nb\") pod \"dnsmasq-dns-7d84b4d45c-4w5k5\" (UID: \"1f23b8d9-87af-4550-9bb5-3668b935c6b7\") " pod="openstack/dnsmasq-dns-7d84b4d45c-4w5k5" Dec 08 09:27:05 crc kubenswrapper[4776]: I1208 09:27:05.204010 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f23b8d9-87af-4550-9bb5-3668b935c6b7-ovsdbserver-nb\") pod \"dnsmasq-dns-7d84b4d45c-4w5k5\" (UID: \"1f23b8d9-87af-4550-9bb5-3668b935c6b7\") " pod="openstack/dnsmasq-dns-7d84b4d45c-4w5k5" Dec 08 09:27:05 crc kubenswrapper[4776]: I1208 09:27:05.222377 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktpv2\" (UniqueName: \"kubernetes.io/projected/1f23b8d9-87af-4550-9bb5-3668b935c6b7-kube-api-access-ktpv2\") pod \"dnsmasq-dns-7d84b4d45c-4w5k5\" (UID: \"1f23b8d9-87af-4550-9bb5-3668b935c6b7\") " pod="openstack/dnsmasq-dns-7d84b4d45c-4w5k5" Dec 08 09:27:05 crc kubenswrapper[4776]: I1208 09:27:05.416474 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d84b4d45c-4w5k5" Dec 08 09:27:12 crc kubenswrapper[4776]: I1208 09:27:12.905098 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 08 09:27:12 crc kubenswrapper[4776]: I1208 09:27:12.997809 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzbxl\" (UniqueName: \"kubernetes.io/projected/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-kube-api-access-wzbxl\") pod \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\" (UID: \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\") " Dec 08 09:27:12 crc kubenswrapper[4776]: I1208 09:27:12.997890 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-rabbitmq-plugins\") pod \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\" (UID: \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\") " Dec 08 09:27:12 crc kubenswrapper[4776]: I1208 09:27:12.997926 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-server-conf\") pod \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\" (UID: \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\") " Dec 08 09:27:12 crc kubenswrapper[4776]: I1208 09:27:12.998010 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-rabbitmq-tls\") pod \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\" (UID: \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\") " Dec 08 09:27:12 crc kubenswrapper[4776]: I1208 09:27:12.998096 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-pod-info\") pod \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\" (UID: 
\"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\") " Dec 08 09:27:12 crc kubenswrapper[4776]: I1208 09:27:12.998146 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-rabbitmq-erlang-cookie\") pod \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\" (UID: \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\") " Dec 08 09:27:12 crc kubenswrapper[4776]: I1208 09:27:12.998228 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\" (UID: \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\") " Dec 08 09:27:12 crc kubenswrapper[4776]: I1208 09:27:12.999387 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "a01574f0-d8c8-404a-b822-7ce8e0af6fd4" (UID: "a01574f0-d8c8-404a-b822-7ce8e0af6fd4"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.005717 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "a01574f0-d8c8-404a-b822-7ce8e0af6fd4" (UID: "a01574f0-d8c8-404a-b822-7ce8e0af6fd4"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.006489 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-pod-info" (OuterVolumeSpecName: "pod-info") pod "a01574f0-d8c8-404a-b822-7ce8e0af6fd4" (UID: "a01574f0-d8c8-404a-b822-7ce8e0af6fd4"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.005958 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-kube-api-access-wzbxl" (OuterVolumeSpecName: "kube-api-access-wzbxl") pod "a01574f0-d8c8-404a-b822-7ce8e0af6fd4" (UID: "a01574f0-d8c8-404a-b822-7ce8e0af6fd4"). InnerVolumeSpecName "kube-api-access-wzbxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.007508 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "a01574f0-d8c8-404a-b822-7ce8e0af6fd4" (UID: "a01574f0-d8c8-404a-b822-7ce8e0af6fd4"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.018059 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "a01574f0-d8c8-404a-b822-7ce8e0af6fd4" (UID: "a01574f0-d8c8-404a-b822-7ce8e0af6fd4"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:12.998275 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-rabbitmq-confd\") pod \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\" (UID: \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\") " Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.018897 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-config-data\") pod \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\" (UID: \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\") " Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.019001 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-plugins-conf\") pod \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\" (UID: \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\") " Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.019085 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-erlang-cookie-secret\") pod \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\" (UID: \"a01574f0-d8c8-404a-b822-7ce8e0af6fd4\") " Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.020024 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzbxl\" (UniqueName: \"kubernetes.io/projected/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-kube-api-access-wzbxl\") on node \"crc\" DevicePath \"\"" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.020040 4776 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-rabbitmq-plugins\") on node 
\"crc\" DevicePath \"\"" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.020048 4776 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.020058 4776 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-pod-info\") on node \"crc\" DevicePath \"\"" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.020067 4776 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.020088 4776 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.021353 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "a01574f0-d8c8-404a-b822-7ce8e0af6fd4" (UID: "a01574f0-d8c8-404a-b822-7ce8e0af6fd4"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.029984 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "a01574f0-d8c8-404a-b822-7ce8e0af6fd4" (UID: "a01574f0-d8c8-404a-b822-7ce8e0af6fd4"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.066611 4776 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.112798 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-config-data" (OuterVolumeSpecName: "config-data") pod "a01574f0-d8c8-404a-b822-7ce8e0af6fd4" (UID: "a01574f0-d8c8-404a-b822-7ce8e0af6fd4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.125730 4776 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.125764 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.125775 4776 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.125784 4776 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.125807 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-server-conf" (OuterVolumeSpecName: 
"server-conf") pod "a01574f0-d8c8-404a-b822-7ce8e0af6fd4" (UID: "a01574f0-d8c8-404a-b822-7ce8e0af6fd4"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.207785 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "a01574f0-d8c8-404a-b822-7ce8e0af6fd4" (UID: "a01574f0-d8c8-404a-b822-7ce8e0af6fd4"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.228614 4776 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-server-conf\") on node \"crc\" DevicePath \"\"" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.228640 4776 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a01574f0-d8c8-404a-b822-7ce8e0af6fd4-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.344816 4776 scope.go:117] "RemoveContainer" containerID="bd00cf9685c68dc97ea68681aad0bfeeb2d56965a763cd3cffb3c7d3789a7341" Dec 08 09:27:13 crc kubenswrapper[4776]: E1208 09:27:13.345036 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:27:13 crc kubenswrapper[4776]: E1208 09:27:13.432910 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = 
copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested" Dec 08 09:27:13 crc kubenswrapper[4776]: E1208 09:27:13.432961 4776 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested" Dec 08 09:27:13 crc kubenswrapper[4776]: E1208 09:27:13.433082 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9fqcb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,I
magePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-l6w2n_openstack(cd51a3a4-205b-4844-81db-439c7e1f0624): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 08 09:27:13 crc kubenswrapper[4776]: E1208 09:27:13.434136 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-l6w2n" podUID="cd51a3a4-205b-4844-81db-439c7e1f0624" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.501407 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.501371 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a01574f0-d8c8-404a-b822-7ce8e0af6fd4","Type":"ContainerDied","Data":"72226f35dd79fbed00fd55d2c8fc6c7e59294d88693b0a6062f98e065f138be9"} Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.501607 4776 scope.go:117] "RemoveContainer" containerID="2890fe37e7829396223b419f85f8fdf135c4324b82cd2bab822c2cb2fb7fbd3a" Dec 08 09:27:13 crc kubenswrapper[4776]: E1208 09:27:13.504193 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested\\\"\"" pod="openstack/heat-db-sync-l6w2n" podUID="cd51a3a4-205b-4844-81db-439c7e1f0624" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.561752 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.575273 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.590392 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 08 09:27:13 crc kubenswrapper[4776]: E1208 09:27:13.591029 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a01574f0-d8c8-404a-b822-7ce8e0af6fd4" containerName="setup-container" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.591054 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="a01574f0-d8c8-404a-b822-7ce8e0af6fd4" containerName="setup-container" Dec 08 09:27:13 crc kubenswrapper[4776]: E1208 09:27:13.591124 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a01574f0-d8c8-404a-b822-7ce8e0af6fd4" 
containerName="rabbitmq" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.591135 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="a01574f0-d8c8-404a-b822-7ce8e0af6fd4" containerName="rabbitmq" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.591413 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="a01574f0-d8c8-404a-b822-7ce8e0af6fd4" containerName="rabbitmq" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.592757 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.597601 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-42dx6" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.597893 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.598048 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.598249 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.598370 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.598450 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.598540 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.599994 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 08 09:27:13 crc 
kubenswrapper[4776]: I1208 09:27:13.739875 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/87931091-7230-4451-9d94-20ac4b8458bc-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"87931091-7230-4451-9d94-20ac4b8458bc\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.740010 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/87931091-7230-4451-9d94-20ac4b8458bc-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"87931091-7230-4451-9d94-20ac4b8458bc\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.740101 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/87931091-7230-4451-9d94-20ac4b8458bc-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"87931091-7230-4451-9d94-20ac4b8458bc\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.740207 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/87931091-7230-4451-9d94-20ac4b8458bc-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"87931091-7230-4451-9d94-20ac4b8458bc\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.740239 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"87931091-7230-4451-9d94-20ac4b8458bc\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 
09:27:13.740275 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/87931091-7230-4451-9d94-20ac4b8458bc-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"87931091-7230-4451-9d94-20ac4b8458bc\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.740382 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hzqd\" (UniqueName: \"kubernetes.io/projected/87931091-7230-4451-9d94-20ac4b8458bc-kube-api-access-6hzqd\") pod \"rabbitmq-cell1-server-0\" (UID: \"87931091-7230-4451-9d94-20ac4b8458bc\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.740558 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/87931091-7230-4451-9d94-20ac4b8458bc-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"87931091-7230-4451-9d94-20ac4b8458bc\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.740685 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/87931091-7230-4451-9d94-20ac4b8458bc-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"87931091-7230-4451-9d94-20ac4b8458bc\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.740972 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/87931091-7230-4451-9d94-20ac4b8458bc-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"87931091-7230-4451-9d94-20ac4b8458bc\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 09:27:13 crc 
kubenswrapper[4776]: I1208 09:27:13.741109 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/87931091-7230-4451-9d94-20ac4b8458bc-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"87931091-7230-4451-9d94-20ac4b8458bc\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.844299 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/87931091-7230-4451-9d94-20ac4b8458bc-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"87931091-7230-4451-9d94-20ac4b8458bc\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.844380 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/87931091-7230-4451-9d94-20ac4b8458bc-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"87931091-7230-4451-9d94-20ac4b8458bc\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.844418 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/87931091-7230-4451-9d94-20ac4b8458bc-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"87931091-7230-4451-9d94-20ac4b8458bc\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.844443 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/87931091-7230-4451-9d94-20ac4b8458bc-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"87931091-7230-4451-9d94-20ac4b8458bc\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.844524 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/87931091-7230-4451-9d94-20ac4b8458bc-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"87931091-7230-4451-9d94-20ac4b8458bc\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.844583 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/87931091-7230-4451-9d94-20ac4b8458bc-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"87931091-7230-4451-9d94-20ac4b8458bc\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.844871 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"87931091-7230-4451-9d94-20ac4b8458bc\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.844909 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/87931091-7230-4451-9d94-20ac4b8458bc-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"87931091-7230-4451-9d94-20ac4b8458bc\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.844997 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/87931091-7230-4451-9d94-20ac4b8458bc-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"87931091-7230-4451-9d94-20ac4b8458bc\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.845341 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/87931091-7230-4451-9d94-20ac4b8458bc-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"87931091-7230-4451-9d94-20ac4b8458bc\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.845410 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hzqd\" (UniqueName: \"kubernetes.io/projected/87931091-7230-4451-9d94-20ac4b8458bc-kube-api-access-6hzqd\") pod \"rabbitmq-cell1-server-0\" (UID: \"87931091-7230-4451-9d94-20ac4b8458bc\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.845438 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"87931091-7230-4451-9d94-20ac4b8458bc\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.845459 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/87931091-7230-4451-9d94-20ac4b8458bc-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"87931091-7230-4451-9d94-20ac4b8458bc\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.845490 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/87931091-7230-4451-9d94-20ac4b8458bc-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"87931091-7230-4451-9d94-20ac4b8458bc\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.845561 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/87931091-7230-4451-9d94-20ac4b8458bc-server-conf\") 
pod \"rabbitmq-cell1-server-0\" (UID: \"87931091-7230-4451-9d94-20ac4b8458bc\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.845834 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/87931091-7230-4451-9d94-20ac4b8458bc-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"87931091-7230-4451-9d94-20ac4b8458bc\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.846374 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/87931091-7230-4451-9d94-20ac4b8458bc-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"87931091-7230-4451-9d94-20ac4b8458bc\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.850604 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/87931091-7230-4451-9d94-20ac4b8458bc-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"87931091-7230-4451-9d94-20ac4b8458bc\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.851040 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/87931091-7230-4451-9d94-20ac4b8458bc-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"87931091-7230-4451-9d94-20ac4b8458bc\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.851152 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/87931091-7230-4451-9d94-20ac4b8458bc-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"87931091-7230-4451-9d94-20ac4b8458bc\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 09:27:13 crc 
kubenswrapper[4776]: I1208 09:27:13.851199 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/87931091-7230-4451-9d94-20ac4b8458bc-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"87931091-7230-4451-9d94-20ac4b8458bc\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.864348 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hzqd\" (UniqueName: \"kubernetes.io/projected/87931091-7230-4451-9d94-20ac4b8458bc-kube-api-access-6hzqd\") pod \"rabbitmq-cell1-server-0\" (UID: \"87931091-7230-4451-9d94-20ac4b8458bc\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.889343 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"87931091-7230-4451-9d94-20ac4b8458bc\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.920606 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 08 09:27:13 crc kubenswrapper[4776]: I1208 09:27:13.934702 4776 scope.go:117] "RemoveContainer" containerID="b6fb7c3067a1dc9a57114d5f89cfbc05711c41bf27c557af79d1dbb9fcb89acd" Dec 08 09:27:13 crc kubenswrapper[4776]: E1208 09:27:13.947784 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested" Dec 08 09:27:13 crc kubenswrapper[4776]: E1208 09:27:13.947843 4776 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested" Dec 08 09:27:13 crc kubenswrapper[4776]: E1208 09:27:13.947974 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n78h699h88h66dh56bh65bh65dh674h669h695h649h556h8bh5bfh68fh5dch569hb4h567hf8h574h6fh585h5b6hb7h5bfhd7hc4h5cbhddhd7h667q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8fnrx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(e7cf1c3e-6789-4ccd-894c-946f056f2d96): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.097021 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.260863 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-rabbitmq-erlang-cookie\") pod \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\" (UID: \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\") " Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.260921 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-pod-info\") pod \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\" (UID: \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\") " Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.260950 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-config-data\") pod \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\" (UID: \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\") " Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.261042 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-plugins-conf\") pod \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\" (UID: \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\") " Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.261082 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\" (UID: \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\") " Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.261136 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-rabbitmq-plugins\") pod \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\" (UID: \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\") " Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.261264 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-erlang-cookie-secret\") pod \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\" (UID: \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\") " Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.261312 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7t66m\" (UniqueName: \"kubernetes.io/projected/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-kube-api-access-7t66m\") pod \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\" (UID: \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\") " Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.261338 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-server-conf\") pod \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\" (UID: \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\") " Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.261568 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994" (UID: "bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.261570 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-rabbitmq-confd\") pod \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\" (UID: \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\") " Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.261642 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-rabbitmq-tls\") pod \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\" (UID: \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\") " Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.261996 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994" (UID: "bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.262456 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994" (UID: "bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.263299 4776 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.263315 4776 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.263326 4776 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.267306 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994" (UID: "bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.268147 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994" (UID: "bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.268486 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994" (UID: "bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.270873 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-pod-info" (OuterVolumeSpecName: "pod-info") pod "bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994" (UID: "bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.271470 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-kube-api-access-7t66m" (OuterVolumeSpecName: "kube-api-access-7t66m") pod "bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994" (UID: "bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994"). InnerVolumeSpecName "kube-api-access-7t66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.305468 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-config-data" (OuterVolumeSpecName: "config-data") pod "bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994" (UID: "bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.333989 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-server-conf" (OuterVolumeSpecName: "server-conf") pod "bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994" (UID: "bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.367128 4776 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.367180 4776 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-pod-info\") on node \"crc\" DevicePath \"\"" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.367191 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.367220 4776 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.367230 4776 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.367239 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7t66m\" (UniqueName: 
\"kubernetes.io/projected/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-kube-api-access-7t66m\") on node \"crc\" DevicePath \"\"" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.367248 4776 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-server-conf\") on node \"crc\" DevicePath \"\"" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.410712 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a01574f0-d8c8-404a-b822-7ce8e0af6fd4" path="/var/lib/kubelet/pods/a01574f0-d8c8-404a-b822-7ce8e0af6fd4/volumes" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.428534 4776 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.474138 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994" (UID: "bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.478301 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-4w5k5"] Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.479037 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-rabbitmq-confd\") pod \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\" (UID: \"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994\") " Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.479795 4776 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Dec 08 09:27:14 crc kubenswrapper[4776]: W1208 09:27:14.479891 4776 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994/volumes/kubernetes.io~projected/rabbitmq-confd Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.479947 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994" (UID: "bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.513613 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-4w5k5" event={"ID":"1f23b8d9-87af-4550-9bb5-3668b935c6b7","Type":"ContainerStarted","Data":"70b27eb316efee88ec0d6996660d7101264005b2f481508af8f57e42814f8cf1"} Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.519268 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994","Type":"ContainerDied","Data":"cfce493ab73c718f447310fe516c8662a1077f8594cd914ec1ae9c90bb2656c2"} Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.519321 4776 scope.go:117] "RemoveContainer" containerID="44a9f7ec71ea62b7d079dba7205175959bf86b790c3ffbd0bd9ec7d8f84229bb" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.519450 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.581986 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.582623 4776 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.602706 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.610227 4776 scope.go:117] "RemoveContainer" containerID="4c055d7aed43594abdf15af1327c7f77c2e5b1d61e62e8c3406e155a3f4672e3" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.656248 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 
09:27:14.688766 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 08 09:27:14 crc kubenswrapper[4776]: E1208 09:27:14.689274 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994" containerName="setup-container" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.689292 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994" containerName="setup-container" Dec 08 09:27:14 crc kubenswrapper[4776]: E1208 09:27:14.689333 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994" containerName="rabbitmq" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.689340 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994" containerName="rabbitmq" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.689568 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994" containerName="rabbitmq" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.697394 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.711480 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-cmx8g" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.711748 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.711920 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.712108 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.712198 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.712370 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.712607 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.712741 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.798287 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ab6303ff-9104-40ed-babe-1445f4cd89e2-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ab6303ff-9104-40ed-babe-1445f4cd89e2\") " pod="openstack/rabbitmq-server-0" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.798345 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/ab6303ff-9104-40ed-babe-1445f4cd89e2-config-data\") pod \"rabbitmq-server-0\" (UID: \"ab6303ff-9104-40ed-babe-1445f4cd89e2\") " pod="openstack/rabbitmq-server-0" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.798372 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpjv5\" (UniqueName: \"kubernetes.io/projected/ab6303ff-9104-40ed-babe-1445f4cd89e2-kube-api-access-jpjv5\") pod \"rabbitmq-server-0\" (UID: \"ab6303ff-9104-40ed-babe-1445f4cd89e2\") " pod="openstack/rabbitmq-server-0" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.798418 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ab6303ff-9104-40ed-babe-1445f4cd89e2-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ab6303ff-9104-40ed-babe-1445f4cd89e2\") " pod="openstack/rabbitmq-server-0" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.798439 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"ab6303ff-9104-40ed-babe-1445f4cd89e2\") " pod="openstack/rabbitmq-server-0" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.798489 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ab6303ff-9104-40ed-babe-1445f4cd89e2-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ab6303ff-9104-40ed-babe-1445f4cd89e2\") " pod="openstack/rabbitmq-server-0" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.798511 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/ab6303ff-9104-40ed-babe-1445f4cd89e2-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ab6303ff-9104-40ed-babe-1445f4cd89e2\") " pod="openstack/rabbitmq-server-0" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.798550 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ab6303ff-9104-40ed-babe-1445f4cd89e2-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ab6303ff-9104-40ed-babe-1445f4cd89e2\") " pod="openstack/rabbitmq-server-0" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.798599 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ab6303ff-9104-40ed-babe-1445f4cd89e2-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ab6303ff-9104-40ed-babe-1445f4cd89e2\") " pod="openstack/rabbitmq-server-0" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.798796 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ab6303ff-9104-40ed-babe-1445f4cd89e2-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ab6303ff-9104-40ed-babe-1445f4cd89e2\") " pod="openstack/rabbitmq-server-0" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.798844 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ab6303ff-9104-40ed-babe-1445f4cd89e2-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ab6303ff-9104-40ed-babe-1445f4cd89e2\") " pod="openstack/rabbitmq-server-0" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.902942 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ab6303ff-9104-40ed-babe-1445f4cd89e2-rabbitmq-tls\") 
pod \"rabbitmq-server-0\" (UID: \"ab6303ff-9104-40ed-babe-1445f4cd89e2\") " pod="openstack/rabbitmq-server-0" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.905241 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ab6303ff-9104-40ed-babe-1445f4cd89e2-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ab6303ff-9104-40ed-babe-1445f4cd89e2\") " pod="openstack/rabbitmq-server-0" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.905338 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ab6303ff-9104-40ed-babe-1445f4cd89e2-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ab6303ff-9104-40ed-babe-1445f4cd89e2\") " pod="openstack/rabbitmq-server-0" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.905268 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ab6303ff-9104-40ed-babe-1445f4cd89e2-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ab6303ff-9104-40ed-babe-1445f4cd89e2\") " pod="openstack/rabbitmq-server-0" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.906057 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ab6303ff-9104-40ed-babe-1445f4cd89e2-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ab6303ff-9104-40ed-babe-1445f4cd89e2\") " pod="openstack/rabbitmq-server-0" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.906185 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ab6303ff-9104-40ed-babe-1445f4cd89e2-config-data\") pod \"rabbitmq-server-0\" (UID: \"ab6303ff-9104-40ed-babe-1445f4cd89e2\") " pod="openstack/rabbitmq-server-0" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.906242 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpjv5\" (UniqueName: \"kubernetes.io/projected/ab6303ff-9104-40ed-babe-1445f4cd89e2-kube-api-access-jpjv5\") pod \"rabbitmq-server-0\" (UID: \"ab6303ff-9104-40ed-babe-1445f4cd89e2\") " pod="openstack/rabbitmq-server-0" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.906349 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ab6303ff-9104-40ed-babe-1445f4cd89e2-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ab6303ff-9104-40ed-babe-1445f4cd89e2\") " pod="openstack/rabbitmq-server-0" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.906381 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"ab6303ff-9104-40ed-babe-1445f4cd89e2\") " pod="openstack/rabbitmq-server-0" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.906505 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ab6303ff-9104-40ed-babe-1445f4cd89e2-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ab6303ff-9104-40ed-babe-1445f4cd89e2\") " pod="openstack/rabbitmq-server-0" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.906544 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ab6303ff-9104-40ed-babe-1445f4cd89e2-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ab6303ff-9104-40ed-babe-1445f4cd89e2\") " pod="openstack/rabbitmq-server-0" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.906626 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/ab6303ff-9104-40ed-babe-1445f4cd89e2-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ab6303ff-9104-40ed-babe-1445f4cd89e2\") " pod="openstack/rabbitmq-server-0" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.906964 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ab6303ff-9104-40ed-babe-1445f4cd89e2-config-data\") pod \"rabbitmq-server-0\" (UID: \"ab6303ff-9104-40ed-babe-1445f4cd89e2\") " pod="openstack/rabbitmq-server-0" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.907331 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"ab6303ff-9104-40ed-babe-1445f4cd89e2\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-server-0" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.907938 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ab6303ff-9104-40ed-babe-1445f4cd89e2-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ab6303ff-9104-40ed-babe-1445f4cd89e2\") " pod="openstack/rabbitmq-server-0" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.908700 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ab6303ff-9104-40ed-babe-1445f4cd89e2-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ab6303ff-9104-40ed-babe-1445f4cd89e2\") " pod="openstack/rabbitmq-server-0" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.908985 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ab6303ff-9104-40ed-babe-1445f4cd89e2-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ab6303ff-9104-40ed-babe-1445f4cd89e2\") " 
pod="openstack/rabbitmq-server-0" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.910192 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ab6303ff-9104-40ed-babe-1445f4cd89e2-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ab6303ff-9104-40ed-babe-1445f4cd89e2\") " pod="openstack/rabbitmq-server-0" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.910255 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ab6303ff-9104-40ed-babe-1445f4cd89e2-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ab6303ff-9104-40ed-babe-1445f4cd89e2\") " pod="openstack/rabbitmq-server-0" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.910635 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ab6303ff-9104-40ed-babe-1445f4cd89e2-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ab6303ff-9104-40ed-babe-1445f4cd89e2\") " pod="openstack/rabbitmq-server-0" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.912097 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ab6303ff-9104-40ed-babe-1445f4cd89e2-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ab6303ff-9104-40ed-babe-1445f4cd89e2\") " pod="openstack/rabbitmq-server-0" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.931563 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpjv5\" (UniqueName: \"kubernetes.io/projected/ab6303ff-9104-40ed-babe-1445f4cd89e2-kube-api-access-jpjv5\") pod \"rabbitmq-server-0\" (UID: \"ab6303ff-9104-40ed-babe-1445f4cd89e2\") " pod="openstack/rabbitmq-server-0" Dec 08 09:27:14 crc kubenswrapper[4776]: I1208 09:27:14.949862 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"ab6303ff-9104-40ed-babe-1445f4cd89e2\") " pod="openstack/rabbitmq-server-0" Dec 08 09:27:15 crc kubenswrapper[4776]: I1208 09:27:15.037394 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 08 09:27:15 crc kubenswrapper[4776]: I1208 09:27:15.538309 4776 generic.go:334] "Generic (PLEG): container finished" podID="1f23b8d9-87af-4550-9bb5-3668b935c6b7" containerID="12d4a1cdc4a774aa398905cc714b5c65ffdbf7ed4e0805a23ca3427a1c2dba58" exitCode=0 Dec 08 09:27:15 crc kubenswrapper[4776]: I1208 09:27:15.538443 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-4w5k5" event={"ID":"1f23b8d9-87af-4550-9bb5-3668b935c6b7","Type":"ContainerDied","Data":"12d4a1cdc4a774aa398905cc714b5c65ffdbf7ed4e0805a23ca3427a1c2dba58"} Dec 08 09:27:15 crc kubenswrapper[4776]: I1208 09:27:15.542673 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"87931091-7230-4451-9d94-20ac4b8458bc","Type":"ContainerStarted","Data":"9d4f51031fbd60f1a1122c608c0493b6e33879d176a5feeedf5d14ae6b81c553"} Dec 08 09:27:15 crc kubenswrapper[4776]: I1208 09:27:15.548731 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e7cf1c3e-6789-4ccd-894c-946f056f2d96","Type":"ContainerStarted","Data":"067b8a94680ef916d334704e86bddd2244957ec63d81ab4887379b26ca0336ea"} Dec 08 09:27:15 crc kubenswrapper[4776]: I1208 09:27:15.548778 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e7cf1c3e-6789-4ccd-894c-946f056f2d96","Type":"ContainerStarted","Data":"2d900622e6b9bfbcc357aa3d6daeaba25d625c630058ead36b7aaab4c6b45522"} Dec 08 09:27:15 crc kubenswrapper[4776]: I1208 09:27:15.570308 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 08 09:27:15 crc 
kubenswrapper[4776]: W1208 09:27:15.577795 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab6303ff_9104_40ed_babe_1445f4cd89e2.slice/crio-22868439a03af52ab45edd05ca2b6c123c92d49ce7fbd815840e6b2474e8d363 WatchSource:0}: Error finding container 22868439a03af52ab45edd05ca2b6c123c92d49ce7fbd815840e6b2474e8d363: Status 404 returned error can't find the container with id 22868439a03af52ab45edd05ca2b6c123c92d49ce7fbd815840e6b2474e8d363 Dec 08 09:27:16 crc kubenswrapper[4776]: I1208 09:27:16.390987 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994" path="/var/lib/kubelet/pods/bc4ca0fd-48d8-4d6a-a7d6-2c0ad2d78994/volumes" Dec 08 09:27:16 crc kubenswrapper[4776]: I1208 09:27:16.562628 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-4w5k5" event={"ID":"1f23b8d9-87af-4550-9bb5-3668b935c6b7","Type":"ContainerStarted","Data":"6c561ec12c300804a72a4fc232f459a3589d54c852f47dd2e04bfa91ddc0770e"} Dec 08 09:27:16 crc kubenswrapper[4776]: I1208 09:27:16.563839 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d84b4d45c-4w5k5" Dec 08 09:27:16 crc kubenswrapper[4776]: I1208 09:27:16.565385 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ab6303ff-9104-40ed-babe-1445f4cd89e2","Type":"ContainerStarted","Data":"22868439a03af52ab45edd05ca2b6c123c92d49ce7fbd815840e6b2474e8d363"} Dec 08 09:27:16 crc kubenswrapper[4776]: I1208 09:27:16.589713 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d84b4d45c-4w5k5" podStartSLOduration=11.58969307 podStartE2EDuration="11.58969307s" podCreationTimestamp="2025-12-08 09:27:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-08 09:27:16.580607056 +0000 UTC m=+1712.843832078" watchObservedRunningTime="2025-12-08 09:27:16.58969307 +0000 UTC m=+1712.852918092" Dec 08 09:27:17 crc kubenswrapper[4776]: I1208 09:27:17.579199 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"87931091-7230-4451-9d94-20ac4b8458bc","Type":"ContainerStarted","Data":"11c9f9a0f1003ca3b3a35014c2bd8214596e154c9f7ded4722c2ab12857dc03f"} Dec 08 09:27:17 crc kubenswrapper[4776]: E1208 09:27:17.657120 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="e7cf1c3e-6789-4ccd-894c-946f056f2d96" Dec 08 09:27:18 crc kubenswrapper[4776]: I1208 09:27:18.592258 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ab6303ff-9104-40ed-babe-1445f4cd89e2","Type":"ContainerStarted","Data":"aa2fa07f20c60c99e5f3ea6f75b2b83cb7b5e204ecb6fb81fb11f41fead2d5ce"} Dec 08 09:27:18 crc kubenswrapper[4776]: I1208 09:27:18.597341 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e7cf1c3e-6789-4ccd-894c-946f056f2d96","Type":"ContainerStarted","Data":"14aaa5f4b417e897bd1a0227d0180849fc6529f8e039b9523d258cffeb30c71e"} Dec 08 09:27:18 crc kubenswrapper[4776]: I1208 09:27:18.597718 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 08 09:27:18 crc kubenswrapper[4776]: E1208 09:27:18.599881 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested\\\"\"" pod="openstack/ceilometer-0" podUID="e7cf1c3e-6789-4ccd-894c-946f056f2d96" 
Dec 08 09:27:19 crc kubenswrapper[4776]: E1208 09:27:19.609117 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested\\\"\"" pod="openstack/ceilometer-0" podUID="e7cf1c3e-6789-4ccd-894c-946f056f2d96" Dec 08 09:27:20 crc kubenswrapper[4776]: I1208 09:27:20.421925 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7d84b4d45c-4w5k5" Dec 08 09:27:20 crc kubenswrapper[4776]: I1208 09:27:20.509326 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-rgtmn"] Dec 08 09:27:20 crc kubenswrapper[4776]: I1208 09:27:20.509662 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b7bbf7cf9-rgtmn" podUID="9f6c0b05-3b5e-465b-8edc-f276b2ff8c42" containerName="dnsmasq-dns" containerID="cri-o://0341717a9debbbd6ed2e3f02feaee7c8666680fa5d4f9aa99ce1fdfeef78fb8c" gracePeriod=10 Dec 08 09:27:20 crc kubenswrapper[4776]: I1208 09:27:20.675566 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f6df4f56c-zwnwt"] Dec 08 09:27:20 crc kubenswrapper[4776]: I1208 09:27:20.681529 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f6df4f56c-zwnwt" Dec 08 09:27:20 crc kubenswrapper[4776]: I1208 09:27:20.696018 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f6df4f56c-zwnwt"] Dec 08 09:27:20 crc kubenswrapper[4776]: I1208 09:27:20.764959 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/569f45d2-4634-4246-873e-939ec98a0baf-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6df4f56c-zwnwt\" (UID: \"569f45d2-4634-4246-873e-939ec98a0baf\") " pod="openstack/dnsmasq-dns-6f6df4f56c-zwnwt" Dec 08 09:27:20 crc kubenswrapper[4776]: I1208 09:27:20.765408 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/569f45d2-4634-4246-873e-939ec98a0baf-config\") pod \"dnsmasq-dns-6f6df4f56c-zwnwt\" (UID: \"569f45d2-4634-4246-873e-939ec98a0baf\") " pod="openstack/dnsmasq-dns-6f6df4f56c-zwnwt" Dec 08 09:27:20 crc kubenswrapper[4776]: I1208 09:27:20.765533 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptrr8\" (UniqueName: \"kubernetes.io/projected/569f45d2-4634-4246-873e-939ec98a0baf-kube-api-access-ptrr8\") pod \"dnsmasq-dns-6f6df4f56c-zwnwt\" (UID: \"569f45d2-4634-4246-873e-939ec98a0baf\") " pod="openstack/dnsmasq-dns-6f6df4f56c-zwnwt" Dec 08 09:27:20 crc kubenswrapper[4776]: I1208 09:27:20.765603 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/569f45d2-4634-4246-873e-939ec98a0baf-dns-svc\") pod \"dnsmasq-dns-6f6df4f56c-zwnwt\" (UID: \"569f45d2-4634-4246-873e-939ec98a0baf\") " pod="openstack/dnsmasq-dns-6f6df4f56c-zwnwt" Dec 08 09:27:20 crc kubenswrapper[4776]: I1208 09:27:20.765690 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/569f45d2-4634-4246-873e-939ec98a0baf-openstack-edpm-ipam\") pod \"dnsmasq-dns-6f6df4f56c-zwnwt\" (UID: \"569f45d2-4634-4246-873e-939ec98a0baf\") " pod="openstack/dnsmasq-dns-6f6df4f56c-zwnwt" Dec 08 09:27:20 crc kubenswrapper[4776]: I1208 09:27:20.765738 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/569f45d2-4634-4246-873e-939ec98a0baf-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6df4f56c-zwnwt\" (UID: \"569f45d2-4634-4246-873e-939ec98a0baf\") " pod="openstack/dnsmasq-dns-6f6df4f56c-zwnwt" Dec 08 09:27:20 crc kubenswrapper[4776]: I1208 09:27:20.765758 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/569f45d2-4634-4246-873e-939ec98a0baf-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6df4f56c-zwnwt\" (UID: \"569f45d2-4634-4246-873e-939ec98a0baf\") " pod="openstack/dnsmasq-dns-6f6df4f56c-zwnwt" Dec 08 09:27:20 crc kubenswrapper[4776]: I1208 09:27:20.867458 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/569f45d2-4634-4246-873e-939ec98a0baf-dns-svc\") pod \"dnsmasq-dns-6f6df4f56c-zwnwt\" (UID: \"569f45d2-4634-4246-873e-939ec98a0baf\") " pod="openstack/dnsmasq-dns-6f6df4f56c-zwnwt" Dec 08 09:27:20 crc kubenswrapper[4776]: I1208 09:27:20.867557 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/569f45d2-4634-4246-873e-939ec98a0baf-openstack-edpm-ipam\") pod \"dnsmasq-dns-6f6df4f56c-zwnwt\" (UID: \"569f45d2-4634-4246-873e-939ec98a0baf\") " pod="openstack/dnsmasq-dns-6f6df4f56c-zwnwt" Dec 08 09:27:20 crc kubenswrapper[4776]: I1208 09:27:20.867597 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/569f45d2-4634-4246-873e-939ec98a0baf-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6df4f56c-zwnwt\" (UID: \"569f45d2-4634-4246-873e-939ec98a0baf\") " pod="openstack/dnsmasq-dns-6f6df4f56c-zwnwt" Dec 08 09:27:20 crc kubenswrapper[4776]: I1208 09:27:20.867614 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/569f45d2-4634-4246-873e-939ec98a0baf-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6df4f56c-zwnwt\" (UID: \"569f45d2-4634-4246-873e-939ec98a0baf\") " pod="openstack/dnsmasq-dns-6f6df4f56c-zwnwt" Dec 08 09:27:20 crc kubenswrapper[4776]: I1208 09:27:20.867660 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/569f45d2-4634-4246-873e-939ec98a0baf-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6df4f56c-zwnwt\" (UID: \"569f45d2-4634-4246-873e-939ec98a0baf\") " pod="openstack/dnsmasq-dns-6f6df4f56c-zwnwt" Dec 08 09:27:20 crc kubenswrapper[4776]: I1208 09:27:20.867702 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/569f45d2-4634-4246-873e-939ec98a0baf-config\") pod \"dnsmasq-dns-6f6df4f56c-zwnwt\" (UID: \"569f45d2-4634-4246-873e-939ec98a0baf\") " pod="openstack/dnsmasq-dns-6f6df4f56c-zwnwt" Dec 08 09:27:20 crc kubenswrapper[4776]: I1208 09:27:20.867769 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptrr8\" (UniqueName: \"kubernetes.io/projected/569f45d2-4634-4246-873e-939ec98a0baf-kube-api-access-ptrr8\") pod \"dnsmasq-dns-6f6df4f56c-zwnwt\" (UID: \"569f45d2-4634-4246-873e-939ec98a0baf\") " pod="openstack/dnsmasq-dns-6f6df4f56c-zwnwt" Dec 08 09:27:20 crc kubenswrapper[4776]: I1208 09:27:20.868995 4776 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/569f45d2-4634-4246-873e-939ec98a0baf-openstack-edpm-ipam\") pod \"dnsmasq-dns-6f6df4f56c-zwnwt\" (UID: \"569f45d2-4634-4246-873e-939ec98a0baf\") " pod="openstack/dnsmasq-dns-6f6df4f56c-zwnwt" Dec 08 09:27:20 crc kubenswrapper[4776]: I1208 09:27:20.869059 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/569f45d2-4634-4246-873e-939ec98a0baf-config\") pod \"dnsmasq-dns-6f6df4f56c-zwnwt\" (UID: \"569f45d2-4634-4246-873e-939ec98a0baf\") " pod="openstack/dnsmasq-dns-6f6df4f56c-zwnwt" Dec 08 09:27:20 crc kubenswrapper[4776]: I1208 09:27:20.869111 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/569f45d2-4634-4246-873e-939ec98a0baf-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6df4f56c-zwnwt\" (UID: \"569f45d2-4634-4246-873e-939ec98a0baf\") " pod="openstack/dnsmasq-dns-6f6df4f56c-zwnwt" Dec 08 09:27:20 crc kubenswrapper[4776]: I1208 09:27:20.869491 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/569f45d2-4634-4246-873e-939ec98a0baf-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6df4f56c-zwnwt\" (UID: \"569f45d2-4634-4246-873e-939ec98a0baf\") " pod="openstack/dnsmasq-dns-6f6df4f56c-zwnwt" Dec 08 09:27:20 crc kubenswrapper[4776]: I1208 09:27:20.869681 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/569f45d2-4634-4246-873e-939ec98a0baf-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6df4f56c-zwnwt\" (UID: \"569f45d2-4634-4246-873e-939ec98a0baf\") " pod="openstack/dnsmasq-dns-6f6df4f56c-zwnwt" Dec 08 09:27:20 crc kubenswrapper[4776]: I1208 09:27:20.870284 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/569f45d2-4634-4246-873e-939ec98a0baf-dns-svc\") pod \"dnsmasq-dns-6f6df4f56c-zwnwt\" (UID: \"569f45d2-4634-4246-873e-939ec98a0baf\") " pod="openstack/dnsmasq-dns-6f6df4f56c-zwnwt" Dec 08 09:27:20 crc kubenswrapper[4776]: I1208 09:27:20.912084 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptrr8\" (UniqueName: \"kubernetes.io/projected/569f45d2-4634-4246-873e-939ec98a0baf-kube-api-access-ptrr8\") pod \"dnsmasq-dns-6f6df4f56c-zwnwt\" (UID: \"569f45d2-4634-4246-873e-939ec98a0baf\") " pod="openstack/dnsmasq-dns-6f6df4f56c-zwnwt" Dec 08 09:27:21 crc kubenswrapper[4776]: I1208 09:27:21.018993 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f6df4f56c-zwnwt" Dec 08 09:27:21 crc kubenswrapper[4776]: I1208 09:27:21.221350 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-rgtmn" Dec 08 09:27:21 crc kubenswrapper[4776]: I1208 09:27:21.288448 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f6c0b05-3b5e-465b-8edc-f276b2ff8c42-dns-svc\") pod \"9f6c0b05-3b5e-465b-8edc-f276b2ff8c42\" (UID: \"9f6c0b05-3b5e-465b-8edc-f276b2ff8c42\") " Dec 08 09:27:21 crc kubenswrapper[4776]: I1208 09:27:21.288533 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f6c0b05-3b5e-465b-8edc-f276b2ff8c42-config\") pod \"9f6c0b05-3b5e-465b-8edc-f276b2ff8c42\" (UID: \"9f6c0b05-3b5e-465b-8edc-f276b2ff8c42\") " Dec 08 09:27:21 crc kubenswrapper[4776]: I1208 09:27:21.288589 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f6c0b05-3b5e-465b-8edc-f276b2ff8c42-ovsdbserver-nb\") pod \"9f6c0b05-3b5e-465b-8edc-f276b2ff8c42\" (UID: 
\"9f6c0b05-3b5e-465b-8edc-f276b2ff8c42\") " Dec 08 09:27:21 crc kubenswrapper[4776]: I1208 09:27:21.288640 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9f6c0b05-3b5e-465b-8edc-f276b2ff8c42-dns-swift-storage-0\") pod \"9f6c0b05-3b5e-465b-8edc-f276b2ff8c42\" (UID: \"9f6c0b05-3b5e-465b-8edc-f276b2ff8c42\") " Dec 08 09:27:21 crc kubenswrapper[4776]: I1208 09:27:21.288690 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f6c0b05-3b5e-465b-8edc-f276b2ff8c42-ovsdbserver-sb\") pod \"9f6c0b05-3b5e-465b-8edc-f276b2ff8c42\" (UID: \"9f6c0b05-3b5e-465b-8edc-f276b2ff8c42\") " Dec 08 09:27:21 crc kubenswrapper[4776]: I1208 09:27:21.288761 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s64h8\" (UniqueName: \"kubernetes.io/projected/9f6c0b05-3b5e-465b-8edc-f276b2ff8c42-kube-api-access-s64h8\") pod \"9f6c0b05-3b5e-465b-8edc-f276b2ff8c42\" (UID: \"9f6c0b05-3b5e-465b-8edc-f276b2ff8c42\") " Dec 08 09:27:21 crc kubenswrapper[4776]: I1208 09:27:21.316902 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f6c0b05-3b5e-465b-8edc-f276b2ff8c42-kube-api-access-s64h8" (OuterVolumeSpecName: "kube-api-access-s64h8") pod "9f6c0b05-3b5e-465b-8edc-f276b2ff8c42" (UID: "9f6c0b05-3b5e-465b-8edc-f276b2ff8c42"). InnerVolumeSpecName "kube-api-access-s64h8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:27:21 crc kubenswrapper[4776]: I1208 09:27:21.377111 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f6c0b05-3b5e-465b-8edc-f276b2ff8c42-config" (OuterVolumeSpecName: "config") pod "9f6c0b05-3b5e-465b-8edc-f276b2ff8c42" (UID: "9f6c0b05-3b5e-465b-8edc-f276b2ff8c42"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:27:21 crc kubenswrapper[4776]: I1208 09:27:21.392286 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s64h8\" (UniqueName: \"kubernetes.io/projected/9f6c0b05-3b5e-465b-8edc-f276b2ff8c42-kube-api-access-s64h8\") on node \"crc\" DevicePath \"\"" Dec 08 09:27:21 crc kubenswrapper[4776]: I1208 09:27:21.392312 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f6c0b05-3b5e-465b-8edc-f276b2ff8c42-config\") on node \"crc\" DevicePath \"\"" Dec 08 09:27:21 crc kubenswrapper[4776]: I1208 09:27:21.397086 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f6c0b05-3b5e-465b-8edc-f276b2ff8c42-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9f6c0b05-3b5e-465b-8edc-f276b2ff8c42" (UID: "9f6c0b05-3b5e-465b-8edc-f276b2ff8c42"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:27:21 crc kubenswrapper[4776]: I1208 09:27:21.406636 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f6c0b05-3b5e-465b-8edc-f276b2ff8c42-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9f6c0b05-3b5e-465b-8edc-f276b2ff8c42" (UID: "9f6c0b05-3b5e-465b-8edc-f276b2ff8c42"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:27:21 crc kubenswrapper[4776]: I1208 09:27:21.415194 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f6c0b05-3b5e-465b-8edc-f276b2ff8c42-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9f6c0b05-3b5e-465b-8edc-f276b2ff8c42" (UID: "9f6c0b05-3b5e-465b-8edc-f276b2ff8c42"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:27:21 crc kubenswrapper[4776]: I1208 09:27:21.441295 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f6c0b05-3b5e-465b-8edc-f276b2ff8c42-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9f6c0b05-3b5e-465b-8edc-f276b2ff8c42" (UID: "9f6c0b05-3b5e-465b-8edc-f276b2ff8c42"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:27:21 crc kubenswrapper[4776]: I1208 09:27:21.493954 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f6c0b05-3b5e-465b-8edc-f276b2ff8c42-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 08 09:27:21 crc kubenswrapper[4776]: I1208 09:27:21.494254 4776 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f6c0b05-3b5e-465b-8edc-f276b2ff8c42-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 08 09:27:21 crc kubenswrapper[4776]: I1208 09:27:21.494264 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f6c0b05-3b5e-465b-8edc-f276b2ff8c42-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 08 09:27:21 crc kubenswrapper[4776]: I1208 09:27:21.494273 4776 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9f6c0b05-3b5e-465b-8edc-f276b2ff8c42-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 08 09:27:21 crc kubenswrapper[4776]: I1208 09:27:21.646132 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f6df4f56c-zwnwt"] Dec 08 09:27:21 crc kubenswrapper[4776]: I1208 09:27:21.666365 4776 generic.go:334] "Generic (PLEG): container finished" podID="9f6c0b05-3b5e-465b-8edc-f276b2ff8c42" containerID="0341717a9debbbd6ed2e3f02feaee7c8666680fa5d4f9aa99ce1fdfeef78fb8c" 
exitCode=0 Dec 08 09:27:21 crc kubenswrapper[4776]: I1208 09:27:21.666406 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-rgtmn" event={"ID":"9f6c0b05-3b5e-465b-8edc-f276b2ff8c42","Type":"ContainerDied","Data":"0341717a9debbbd6ed2e3f02feaee7c8666680fa5d4f9aa99ce1fdfeef78fb8c"} Dec 08 09:27:21 crc kubenswrapper[4776]: I1208 09:27:21.666434 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-rgtmn" event={"ID":"9f6c0b05-3b5e-465b-8edc-f276b2ff8c42","Type":"ContainerDied","Data":"51d3ee02d4b53d9166424cae686014239ebb0846e93c617f0113c453ef823857"} Dec 08 09:27:21 crc kubenswrapper[4776]: I1208 09:27:21.666451 4776 scope.go:117] "RemoveContainer" containerID="0341717a9debbbd6ed2e3f02feaee7c8666680fa5d4f9aa99ce1fdfeef78fb8c" Dec 08 09:27:21 crc kubenswrapper[4776]: I1208 09:27:21.666566 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-rgtmn" Dec 08 09:27:21 crc kubenswrapper[4776]: I1208 09:27:21.835546 4776 scope.go:117] "RemoveContainer" containerID="40af2f431091aebd32f2fa532e0dde354c0315c8ecef6bc5bd3d8b323982e4f1" Dec 08 09:27:21 crc kubenswrapper[4776]: I1208 09:27:21.852602 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-rgtmn"] Dec 08 09:27:21 crc kubenswrapper[4776]: I1208 09:27:21.873939 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-rgtmn"] Dec 08 09:27:21 crc kubenswrapper[4776]: I1208 09:27:21.894942 4776 scope.go:117] "RemoveContainer" containerID="0341717a9debbbd6ed2e3f02feaee7c8666680fa5d4f9aa99ce1fdfeef78fb8c" Dec 08 09:27:21 crc kubenswrapper[4776]: E1208 09:27:21.895420 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0341717a9debbbd6ed2e3f02feaee7c8666680fa5d4f9aa99ce1fdfeef78fb8c\": container with ID starting with 
0341717a9debbbd6ed2e3f02feaee7c8666680fa5d4f9aa99ce1fdfeef78fb8c not found: ID does not exist" containerID="0341717a9debbbd6ed2e3f02feaee7c8666680fa5d4f9aa99ce1fdfeef78fb8c" Dec 08 09:27:21 crc kubenswrapper[4776]: I1208 09:27:21.895467 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0341717a9debbbd6ed2e3f02feaee7c8666680fa5d4f9aa99ce1fdfeef78fb8c"} err="failed to get container status \"0341717a9debbbd6ed2e3f02feaee7c8666680fa5d4f9aa99ce1fdfeef78fb8c\": rpc error: code = NotFound desc = could not find container \"0341717a9debbbd6ed2e3f02feaee7c8666680fa5d4f9aa99ce1fdfeef78fb8c\": container with ID starting with 0341717a9debbbd6ed2e3f02feaee7c8666680fa5d4f9aa99ce1fdfeef78fb8c not found: ID does not exist" Dec 08 09:27:21 crc kubenswrapper[4776]: I1208 09:27:21.895495 4776 scope.go:117] "RemoveContainer" containerID="40af2f431091aebd32f2fa532e0dde354c0315c8ecef6bc5bd3d8b323982e4f1" Dec 08 09:27:21 crc kubenswrapper[4776]: E1208 09:27:21.895895 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40af2f431091aebd32f2fa532e0dde354c0315c8ecef6bc5bd3d8b323982e4f1\": container with ID starting with 40af2f431091aebd32f2fa532e0dde354c0315c8ecef6bc5bd3d8b323982e4f1 not found: ID does not exist" containerID="40af2f431091aebd32f2fa532e0dde354c0315c8ecef6bc5bd3d8b323982e4f1" Dec 08 09:27:21 crc kubenswrapper[4776]: I1208 09:27:21.895930 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40af2f431091aebd32f2fa532e0dde354c0315c8ecef6bc5bd3d8b323982e4f1"} err="failed to get container status \"40af2f431091aebd32f2fa532e0dde354c0315c8ecef6bc5bd3d8b323982e4f1\": rpc error: code = NotFound desc = could not find container \"40af2f431091aebd32f2fa532e0dde354c0315c8ecef6bc5bd3d8b323982e4f1\": container with ID starting with 40af2f431091aebd32f2fa532e0dde354c0315c8ecef6bc5bd3d8b323982e4f1 not found: ID does not 
exist" Dec 08 09:27:22 crc kubenswrapper[4776]: I1208 09:27:22.370832 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f6c0b05-3b5e-465b-8edc-f276b2ff8c42" path="/var/lib/kubelet/pods/9f6c0b05-3b5e-465b-8edc-f276b2ff8c42/volumes" Dec 08 09:27:22 crc kubenswrapper[4776]: I1208 09:27:22.681216 4776 generic.go:334] "Generic (PLEG): container finished" podID="569f45d2-4634-4246-873e-939ec98a0baf" containerID="4e86b8a62647d2bdb5d0860ced307ebe09f7c252b4b6a31326dc59d34ba80295" exitCode=0 Dec 08 09:27:22 crc kubenswrapper[4776]: I1208 09:27:22.681267 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6df4f56c-zwnwt" event={"ID":"569f45d2-4634-4246-873e-939ec98a0baf","Type":"ContainerDied","Data":"4e86b8a62647d2bdb5d0860ced307ebe09f7c252b4b6a31326dc59d34ba80295"} Dec 08 09:27:22 crc kubenswrapper[4776]: I1208 09:27:22.681296 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6df4f56c-zwnwt" event={"ID":"569f45d2-4634-4246-873e-939ec98a0baf","Type":"ContainerStarted","Data":"960f731214959306f761be152e9b1b61a355391f3777308f215cf6f4e0c85103"} Dec 08 09:27:23 crc kubenswrapper[4776]: I1208 09:27:23.692110 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6df4f56c-zwnwt" event={"ID":"569f45d2-4634-4246-873e-939ec98a0baf","Type":"ContainerStarted","Data":"7e41ce18225749f689d3f50ea378a95c76572589acb27ec70625e8fdb028386c"} Dec 08 09:27:23 crc kubenswrapper[4776]: I1208 09:27:23.692662 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6f6df4f56c-zwnwt" Dec 08 09:27:23 crc kubenswrapper[4776]: I1208 09:27:23.710956 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6f6df4f56c-zwnwt" podStartSLOduration=3.710939604 podStartE2EDuration="3.710939604s" podCreationTimestamp="2025-12-08 09:27:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:27:23.709070005 +0000 UTC m=+1719.972295047" watchObservedRunningTime="2025-12-08 09:27:23.710939604 +0000 UTC m=+1719.974164626" Dec 08 09:27:27 crc kubenswrapper[4776]: I1208 09:27:27.344698 4776 scope.go:117] "RemoveContainer" containerID="bd00cf9685c68dc97ea68681aad0bfeeb2d56965a763cd3cffb3c7d3789a7341" Dec 08 09:27:27 crc kubenswrapper[4776]: E1208 09:27:27.346564 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:27:28 crc kubenswrapper[4776]: I1208 09:27:28.764964 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-l6w2n" event={"ID":"cd51a3a4-205b-4844-81db-439c7e1f0624","Type":"ContainerStarted","Data":"5b68070f1ed673b2e4199f75c4fb22993b7055298e67af89b496a5083846f009"} Dec 08 09:27:28 crc kubenswrapper[4776]: I1208 09:27:28.784067 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-l6w2n" podStartSLOduration=2.71840056 podStartE2EDuration="41.784042663s" podCreationTimestamp="2025-12-08 09:26:47 +0000 UTC" firstStartedPulling="2025-12-08 09:26:48.491918796 +0000 UTC m=+1684.755143818" lastFinishedPulling="2025-12-08 09:27:27.557560889 +0000 UTC m=+1723.820785921" observedRunningTime="2025-12-08 09:27:28.782571864 +0000 UTC m=+1725.045796896" watchObservedRunningTime="2025-12-08 09:27:28.784042663 +0000 UTC m=+1725.047267725" Dec 08 09:27:30 crc kubenswrapper[4776]: I1208 09:27:30.363300 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 08 09:27:30 crc 
kubenswrapper[4776]: E1208 09:27:30.531861 4776 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd51a3a4_205b_4844_81db_439c7e1f0624.slice/crio-conmon-5b68070f1ed673b2e4199f75c4fb22993b7055298e67af89b496a5083846f009.scope\": RecentStats: unable to find data in memory cache]" Dec 08 09:27:30 crc kubenswrapper[4776]: I1208 09:27:30.792754 4776 generic.go:334] "Generic (PLEG): container finished" podID="cd51a3a4-205b-4844-81db-439c7e1f0624" containerID="5b68070f1ed673b2e4199f75c4fb22993b7055298e67af89b496a5083846f009" exitCode=0 Dec 08 09:27:30 crc kubenswrapper[4776]: I1208 09:27:30.792967 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-l6w2n" event={"ID":"cd51a3a4-205b-4844-81db-439c7e1f0624","Type":"ContainerDied","Data":"5b68070f1ed673b2e4199f75c4fb22993b7055298e67af89b496a5083846f009"} Dec 08 09:27:31 crc kubenswrapper[4776]: I1208 09:27:31.021388 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6f6df4f56c-zwnwt" Dec 08 09:27:31 crc kubenswrapper[4776]: I1208 09:27:31.142380 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-4w5k5"] Dec 08 09:27:31 crc kubenswrapper[4776]: I1208 09:27:31.142631 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7d84b4d45c-4w5k5" podUID="1f23b8d9-87af-4550-9bb5-3668b935c6b7" containerName="dnsmasq-dns" containerID="cri-o://6c561ec12c300804a72a4fc232f459a3589d54c852f47dd2e04bfa91ddc0770e" gracePeriod=10 Dec 08 09:27:31 crc kubenswrapper[4776]: I1208 09:27:31.716796 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d84b4d45c-4w5k5" Dec 08 09:27:31 crc kubenswrapper[4776]: I1208 09:27:31.749416 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f23b8d9-87af-4550-9bb5-3668b935c6b7-ovsdbserver-sb\") pod \"1f23b8d9-87af-4550-9bb5-3668b935c6b7\" (UID: \"1f23b8d9-87af-4550-9bb5-3668b935c6b7\") " Dec 08 09:27:31 crc kubenswrapper[4776]: I1208 09:27:31.749477 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktpv2\" (UniqueName: \"kubernetes.io/projected/1f23b8d9-87af-4550-9bb5-3668b935c6b7-kube-api-access-ktpv2\") pod \"1f23b8d9-87af-4550-9bb5-3668b935c6b7\" (UID: \"1f23b8d9-87af-4550-9bb5-3668b935c6b7\") " Dec 08 09:27:31 crc kubenswrapper[4776]: I1208 09:27:31.749554 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1f23b8d9-87af-4550-9bb5-3668b935c6b7-openstack-edpm-ipam\") pod \"1f23b8d9-87af-4550-9bb5-3668b935c6b7\" (UID: \"1f23b8d9-87af-4550-9bb5-3668b935c6b7\") " Dec 08 09:27:31 crc kubenswrapper[4776]: I1208 09:27:31.749575 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f23b8d9-87af-4550-9bb5-3668b935c6b7-config\") pod \"1f23b8d9-87af-4550-9bb5-3668b935c6b7\" (UID: \"1f23b8d9-87af-4550-9bb5-3668b935c6b7\") " Dec 08 09:27:31 crc kubenswrapper[4776]: I1208 09:27:31.749754 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1f23b8d9-87af-4550-9bb5-3668b935c6b7-dns-swift-storage-0\") pod \"1f23b8d9-87af-4550-9bb5-3668b935c6b7\" (UID: \"1f23b8d9-87af-4550-9bb5-3668b935c6b7\") " Dec 08 09:27:31 crc kubenswrapper[4776]: I1208 09:27:31.749830 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f23b8d9-87af-4550-9bb5-3668b935c6b7-ovsdbserver-nb\") pod \"1f23b8d9-87af-4550-9bb5-3668b935c6b7\" (UID: \"1f23b8d9-87af-4550-9bb5-3668b935c6b7\") " Dec 08 09:27:31 crc kubenswrapper[4776]: I1208 09:27:31.749930 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f23b8d9-87af-4550-9bb5-3668b935c6b7-dns-svc\") pod \"1f23b8d9-87af-4550-9bb5-3668b935c6b7\" (UID: \"1f23b8d9-87af-4550-9bb5-3668b935c6b7\") " Dec 08 09:27:31 crc kubenswrapper[4776]: I1208 09:27:31.757520 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f23b8d9-87af-4550-9bb5-3668b935c6b7-kube-api-access-ktpv2" (OuterVolumeSpecName: "kube-api-access-ktpv2") pod "1f23b8d9-87af-4550-9bb5-3668b935c6b7" (UID: "1f23b8d9-87af-4550-9bb5-3668b935c6b7"). InnerVolumeSpecName "kube-api-access-ktpv2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:27:31 crc kubenswrapper[4776]: I1208 09:27:31.814614 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e7cf1c3e-6789-4ccd-894c-946f056f2d96","Type":"ContainerStarted","Data":"db748c0acc422f24e9cb2763e951ab939fa3aa0bbe3907fd5953f0158bc40f96"} Dec 08 09:27:31 crc kubenswrapper[4776]: I1208 09:27:31.823122 4776 generic.go:334] "Generic (PLEG): container finished" podID="1f23b8d9-87af-4550-9bb5-3668b935c6b7" containerID="6c561ec12c300804a72a4fc232f459a3589d54c852f47dd2e04bfa91ddc0770e" exitCode=0 Dec 08 09:27:31 crc kubenswrapper[4776]: I1208 09:27:31.823219 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d84b4d45c-4w5k5" Dec 08 09:27:31 crc kubenswrapper[4776]: I1208 09:27:31.823269 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-4w5k5" event={"ID":"1f23b8d9-87af-4550-9bb5-3668b935c6b7","Type":"ContainerDied","Data":"6c561ec12c300804a72a4fc232f459a3589d54c852f47dd2e04bfa91ddc0770e"} Dec 08 09:27:31 crc kubenswrapper[4776]: I1208 09:27:31.823296 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-4w5k5" event={"ID":"1f23b8d9-87af-4550-9bb5-3668b935c6b7","Type":"ContainerDied","Data":"70b27eb316efee88ec0d6996660d7101264005b2f481508af8f57e42814f8cf1"} Dec 08 09:27:31 crc kubenswrapper[4776]: I1208 09:27:31.823312 4776 scope.go:117] "RemoveContainer" containerID="6c561ec12c300804a72a4fc232f459a3589d54c852f47dd2e04bfa91ddc0770e" Dec 08 09:27:31 crc kubenswrapper[4776]: I1208 09:27:31.840258 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f23b8d9-87af-4550-9bb5-3668b935c6b7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1f23b8d9-87af-4550-9bb5-3668b935c6b7" (UID: "1f23b8d9-87af-4550-9bb5-3668b935c6b7"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:27:31 crc kubenswrapper[4776]: I1208 09:27:31.854545 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f23b8d9-87af-4550-9bb5-3668b935c6b7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 08 09:27:31 crc kubenswrapper[4776]: I1208 09:27:31.854572 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktpv2\" (UniqueName: \"kubernetes.io/projected/1f23b8d9-87af-4550-9bb5-3668b935c6b7-kube-api-access-ktpv2\") on node \"crc\" DevicePath \"\"" Dec 08 09:27:31 crc kubenswrapper[4776]: I1208 09:27:31.855464 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f23b8d9-87af-4550-9bb5-3668b935c6b7-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "1f23b8d9-87af-4550-9bb5-3668b935c6b7" (UID: "1f23b8d9-87af-4550-9bb5-3668b935c6b7"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:27:31 crc kubenswrapper[4776]: I1208 09:27:31.856799 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f23b8d9-87af-4550-9bb5-3668b935c6b7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1f23b8d9-87af-4550-9bb5-3668b935c6b7" (UID: "1f23b8d9-87af-4550-9bb5-3668b935c6b7"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:27:31 crc kubenswrapper[4776]: I1208 09:27:31.859408 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.5998523430000002 podStartE2EDuration="38.859385948s" podCreationTimestamp="2025-12-08 09:26:53 +0000 UTC" firstStartedPulling="2025-12-08 09:26:54.274361828 +0000 UTC m=+1690.537586850" lastFinishedPulling="2025-12-08 09:27:30.533895433 +0000 UTC m=+1726.797120455" observedRunningTime="2025-12-08 09:27:31.840934281 +0000 UTC m=+1728.104159323" watchObservedRunningTime="2025-12-08 09:27:31.859385948 +0000 UTC m=+1728.122610970" Dec 08 09:27:31 crc kubenswrapper[4776]: I1208 09:27:31.879503 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f23b8d9-87af-4550-9bb5-3668b935c6b7-config" (OuterVolumeSpecName: "config") pod "1f23b8d9-87af-4550-9bb5-3668b935c6b7" (UID: "1f23b8d9-87af-4550-9bb5-3668b935c6b7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:27:31 crc kubenswrapper[4776]: I1208 09:27:31.882793 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f23b8d9-87af-4550-9bb5-3668b935c6b7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1f23b8d9-87af-4550-9bb5-3668b935c6b7" (UID: "1f23b8d9-87af-4550-9bb5-3668b935c6b7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:27:31 crc kubenswrapper[4776]: I1208 09:27:31.895028 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f23b8d9-87af-4550-9bb5-3668b935c6b7-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1f23b8d9-87af-4550-9bb5-3668b935c6b7" (UID: "1f23b8d9-87af-4550-9bb5-3668b935c6b7"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:27:31 crc kubenswrapper[4776]: I1208 09:27:31.963017 4776 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1f23b8d9-87af-4550-9bb5-3668b935c6b7-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 08 09:27:31 crc kubenswrapper[4776]: I1208 09:27:31.963048 4776 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f23b8d9-87af-4550-9bb5-3668b935c6b7-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 08 09:27:31 crc kubenswrapper[4776]: I1208 09:27:31.963060 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f23b8d9-87af-4550-9bb5-3668b935c6b7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 08 09:27:31 crc kubenswrapper[4776]: I1208 09:27:31.963069 4776 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1f23b8d9-87af-4550-9bb5-3668b935c6b7-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 08 09:27:31 crc kubenswrapper[4776]: I1208 09:27:31.963077 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f23b8d9-87af-4550-9bb5-3668b935c6b7-config\") on node \"crc\" DevicePath \"\"" Dec 08 09:27:31 crc kubenswrapper[4776]: I1208 09:27:31.965115 4776 scope.go:117] "RemoveContainer" containerID="12d4a1cdc4a774aa398905cc714b5c65ffdbf7ed4e0805a23ca3427a1c2dba58" Dec 08 09:27:31 crc kubenswrapper[4776]: I1208 09:27:31.993308 4776 scope.go:117] "RemoveContainer" containerID="6c561ec12c300804a72a4fc232f459a3589d54c852f47dd2e04bfa91ddc0770e" Dec 08 09:27:31 crc kubenswrapper[4776]: E1208 09:27:31.994075 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6c561ec12c300804a72a4fc232f459a3589d54c852f47dd2e04bfa91ddc0770e\": container with ID starting with 6c561ec12c300804a72a4fc232f459a3589d54c852f47dd2e04bfa91ddc0770e not found: ID does not exist" containerID="6c561ec12c300804a72a4fc232f459a3589d54c852f47dd2e04bfa91ddc0770e" Dec 08 09:27:31 crc kubenswrapper[4776]: I1208 09:27:31.994112 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c561ec12c300804a72a4fc232f459a3589d54c852f47dd2e04bfa91ddc0770e"} err="failed to get container status \"6c561ec12c300804a72a4fc232f459a3589d54c852f47dd2e04bfa91ddc0770e\": rpc error: code = NotFound desc = could not find container \"6c561ec12c300804a72a4fc232f459a3589d54c852f47dd2e04bfa91ddc0770e\": container with ID starting with 6c561ec12c300804a72a4fc232f459a3589d54c852f47dd2e04bfa91ddc0770e not found: ID does not exist" Dec 08 09:27:31 crc kubenswrapper[4776]: I1208 09:27:31.994133 4776 scope.go:117] "RemoveContainer" containerID="12d4a1cdc4a774aa398905cc714b5c65ffdbf7ed4e0805a23ca3427a1c2dba58" Dec 08 09:27:31 crc kubenswrapper[4776]: E1208 09:27:31.994674 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12d4a1cdc4a774aa398905cc714b5c65ffdbf7ed4e0805a23ca3427a1c2dba58\": container with ID starting with 12d4a1cdc4a774aa398905cc714b5c65ffdbf7ed4e0805a23ca3427a1c2dba58 not found: ID does not exist" containerID="12d4a1cdc4a774aa398905cc714b5c65ffdbf7ed4e0805a23ca3427a1c2dba58" Dec 08 09:27:31 crc kubenswrapper[4776]: I1208 09:27:31.994717 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12d4a1cdc4a774aa398905cc714b5c65ffdbf7ed4e0805a23ca3427a1c2dba58"} err="failed to get container status \"12d4a1cdc4a774aa398905cc714b5c65ffdbf7ed4e0805a23ca3427a1c2dba58\": rpc error: code = NotFound desc = could not find container \"12d4a1cdc4a774aa398905cc714b5c65ffdbf7ed4e0805a23ca3427a1c2dba58\": container with ID 
starting with 12d4a1cdc4a774aa398905cc714b5c65ffdbf7ed4e0805a23ca3427a1c2dba58 not found: ID does not exist" Dec 08 09:27:32 crc kubenswrapper[4776]: I1208 09:27:32.156784 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-l6w2n" Dec 08 09:27:32 crc kubenswrapper[4776]: I1208 09:27:32.194636 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-4w5k5"] Dec 08 09:27:32 crc kubenswrapper[4776]: I1208 09:27:32.209388 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-4w5k5"] Dec 08 09:27:32 crc kubenswrapper[4776]: I1208 09:27:32.269618 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd51a3a4-205b-4844-81db-439c7e1f0624-combined-ca-bundle\") pod \"cd51a3a4-205b-4844-81db-439c7e1f0624\" (UID: \"cd51a3a4-205b-4844-81db-439c7e1f0624\") " Dec 08 09:27:32 crc kubenswrapper[4776]: I1208 09:27:32.269771 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fqcb\" (UniqueName: \"kubernetes.io/projected/cd51a3a4-205b-4844-81db-439c7e1f0624-kube-api-access-9fqcb\") pod \"cd51a3a4-205b-4844-81db-439c7e1f0624\" (UID: \"cd51a3a4-205b-4844-81db-439c7e1f0624\") " Dec 08 09:27:32 crc kubenswrapper[4776]: I1208 09:27:32.269865 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd51a3a4-205b-4844-81db-439c7e1f0624-config-data\") pod \"cd51a3a4-205b-4844-81db-439c7e1f0624\" (UID: \"cd51a3a4-205b-4844-81db-439c7e1f0624\") " Dec 08 09:27:32 crc kubenswrapper[4776]: I1208 09:27:32.278784 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd51a3a4-205b-4844-81db-439c7e1f0624-kube-api-access-9fqcb" (OuterVolumeSpecName: "kube-api-access-9fqcb") pod 
"cd51a3a4-205b-4844-81db-439c7e1f0624" (UID: "cd51a3a4-205b-4844-81db-439c7e1f0624"). InnerVolumeSpecName "kube-api-access-9fqcb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:27:32 crc kubenswrapper[4776]: I1208 09:27:32.308398 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd51a3a4-205b-4844-81db-439c7e1f0624-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cd51a3a4-205b-4844-81db-439c7e1f0624" (UID: "cd51a3a4-205b-4844-81db-439c7e1f0624"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:27:32 crc kubenswrapper[4776]: I1208 09:27:32.360265 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f23b8d9-87af-4550-9bb5-3668b935c6b7" path="/var/lib/kubelet/pods/1f23b8d9-87af-4550-9bb5-3668b935c6b7/volumes" Dec 08 09:27:32 crc kubenswrapper[4776]: I1208 09:27:32.368325 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd51a3a4-205b-4844-81db-439c7e1f0624-config-data" (OuterVolumeSpecName: "config-data") pod "cd51a3a4-205b-4844-81db-439c7e1f0624" (UID: "cd51a3a4-205b-4844-81db-439c7e1f0624"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:27:32 crc kubenswrapper[4776]: I1208 09:27:32.372920 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd51a3a4-205b-4844-81db-439c7e1f0624-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:27:32 crc kubenswrapper[4776]: I1208 09:27:32.372954 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fqcb\" (UniqueName: \"kubernetes.io/projected/cd51a3a4-205b-4844-81db-439c7e1f0624-kube-api-access-9fqcb\") on node \"crc\" DevicePath \"\"" Dec 08 09:27:32 crc kubenswrapper[4776]: I1208 09:27:32.372968 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd51a3a4-205b-4844-81db-439c7e1f0624-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 09:27:32 crc kubenswrapper[4776]: I1208 09:27:32.838825 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-l6w2n" Dec 08 09:27:32 crc kubenswrapper[4776]: I1208 09:27:32.838819 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-l6w2n" event={"ID":"cd51a3a4-205b-4844-81db-439c7e1f0624","Type":"ContainerDied","Data":"5ebd1568b457a27974f4c4735c765ba91e84386efb324952908953814a9372c0"} Dec 08 09:27:32 crc kubenswrapper[4776]: I1208 09:27:32.839277 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ebd1568b457a27974f4c4735c765ba91e84386efb324952908953814a9372c0" Dec 08 09:27:33 crc kubenswrapper[4776]: I1208 09:27:33.771927 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-589b85487f-7v8kk"] Dec 08 09:27:33 crc kubenswrapper[4776]: E1208 09:27:33.772531 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f6c0b05-3b5e-465b-8edc-f276b2ff8c42" containerName="init" Dec 08 09:27:33 crc kubenswrapper[4776]: I1208 09:27:33.772556 4776 
state_mem.go:107] "Deleted CPUSet assignment" podUID="9f6c0b05-3b5e-465b-8edc-f276b2ff8c42" containerName="init" Dec 08 09:27:33 crc kubenswrapper[4776]: E1208 09:27:33.772575 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f23b8d9-87af-4550-9bb5-3668b935c6b7" containerName="init" Dec 08 09:27:33 crc kubenswrapper[4776]: I1208 09:27:33.772583 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f23b8d9-87af-4550-9bb5-3668b935c6b7" containerName="init" Dec 08 09:27:33 crc kubenswrapper[4776]: E1208 09:27:33.772605 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f6c0b05-3b5e-465b-8edc-f276b2ff8c42" containerName="dnsmasq-dns" Dec 08 09:27:33 crc kubenswrapper[4776]: I1208 09:27:33.772613 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f6c0b05-3b5e-465b-8edc-f276b2ff8c42" containerName="dnsmasq-dns" Dec 08 09:27:33 crc kubenswrapper[4776]: E1208 09:27:33.772633 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd51a3a4-205b-4844-81db-439c7e1f0624" containerName="heat-db-sync" Dec 08 09:27:33 crc kubenswrapper[4776]: I1208 09:27:33.772640 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd51a3a4-205b-4844-81db-439c7e1f0624" containerName="heat-db-sync" Dec 08 09:27:33 crc kubenswrapper[4776]: E1208 09:27:33.772659 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f23b8d9-87af-4550-9bb5-3668b935c6b7" containerName="dnsmasq-dns" Dec 08 09:27:33 crc kubenswrapper[4776]: I1208 09:27:33.772668 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f23b8d9-87af-4550-9bb5-3668b935c6b7" containerName="dnsmasq-dns" Dec 08 09:27:33 crc kubenswrapper[4776]: I1208 09:27:33.772965 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd51a3a4-205b-4844-81db-439c7e1f0624" containerName="heat-db-sync" Dec 08 09:27:33 crc kubenswrapper[4776]: I1208 09:27:33.772998 4776 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1f23b8d9-87af-4550-9bb5-3668b935c6b7" containerName="dnsmasq-dns" Dec 08 09:27:33 crc kubenswrapper[4776]: I1208 09:27:33.773015 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f6c0b05-3b5e-465b-8edc-f276b2ff8c42" containerName="dnsmasq-dns" Dec 08 09:27:33 crc kubenswrapper[4776]: I1208 09:27:33.774226 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-589b85487f-7v8kk" Dec 08 09:27:33 crc kubenswrapper[4776]: I1208 09:27:33.790288 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-589b85487f-7v8kk"] Dec 08 09:27:33 crc kubenswrapper[4776]: I1208 09:27:33.825240 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-55fd4bf697-njsxk"] Dec 08 09:27:33 crc kubenswrapper[4776]: I1208 09:27:33.826648 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-55fd4bf697-njsxk" Dec 08 09:27:33 crc kubenswrapper[4776]: I1208 09:27:33.841791 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-5fdf94c698-qz6j8"] Dec 08 09:27:33 crc kubenswrapper[4776]: I1208 09:27:33.845783 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-5fdf94c698-qz6j8" Dec 08 09:27:33 crc kubenswrapper[4776]: I1208 09:27:33.864136 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-55fd4bf697-njsxk"] Dec 08 09:27:33 crc kubenswrapper[4776]: I1208 09:27:33.878322 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5fdf94c698-qz6j8"] Dec 08 09:27:33 crc kubenswrapper[4776]: I1208 09:27:33.911331 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7f47153-4e65-48b9-816d-4c83b0b0d8a4-internal-tls-certs\") pod \"heat-api-55fd4bf697-njsxk\" (UID: \"b7f47153-4e65-48b9-816d-4c83b0b0d8a4\") " pod="openstack/heat-api-55fd4bf697-njsxk" Dec 08 09:27:33 crc kubenswrapper[4776]: I1208 09:27:33.911374 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c891ff5-fbcf-46b6-bace-6ef62df3c0b9-config-data\") pod \"heat-engine-589b85487f-7v8kk\" (UID: \"5c891ff5-fbcf-46b6-bace-6ef62df3c0b9\") " pod="openstack/heat-engine-589b85487f-7v8kk" Dec 08 09:27:33 crc kubenswrapper[4776]: I1208 09:27:33.911401 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c891ff5-fbcf-46b6-bace-6ef62df3c0b9-config-data-custom\") pod \"heat-engine-589b85487f-7v8kk\" (UID: \"5c891ff5-fbcf-46b6-bace-6ef62df3c0b9\") " pod="openstack/heat-engine-589b85487f-7v8kk" Dec 08 09:27:33 crc kubenswrapper[4776]: I1208 09:27:33.911439 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5f5f6ae8-0ed6-49ee-afe5-ab6fd356b4e4-config-data-custom\") pod \"heat-cfnapi-5fdf94c698-qz6j8\" (UID: \"5f5f6ae8-0ed6-49ee-afe5-ab6fd356b4e4\") " 
pod="openstack/heat-cfnapi-5fdf94c698-qz6j8" Dec 08 09:27:33 crc kubenswrapper[4776]: I1208 09:27:33.911550 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f5f6ae8-0ed6-49ee-afe5-ab6fd356b4e4-public-tls-certs\") pod \"heat-cfnapi-5fdf94c698-qz6j8\" (UID: \"5f5f6ae8-0ed6-49ee-afe5-ab6fd356b4e4\") " pod="openstack/heat-cfnapi-5fdf94c698-qz6j8" Dec 08 09:27:33 crc kubenswrapper[4776]: I1208 09:27:33.911627 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7f47153-4e65-48b9-816d-4c83b0b0d8a4-public-tls-certs\") pod \"heat-api-55fd4bf697-njsxk\" (UID: \"b7f47153-4e65-48b9-816d-4c83b0b0d8a4\") " pod="openstack/heat-api-55fd4bf697-njsxk" Dec 08 09:27:33 crc kubenswrapper[4776]: I1208 09:27:33.911646 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f5f6ae8-0ed6-49ee-afe5-ab6fd356b4e4-internal-tls-certs\") pod \"heat-cfnapi-5fdf94c698-qz6j8\" (UID: \"5f5f6ae8-0ed6-49ee-afe5-ab6fd356b4e4\") " pod="openstack/heat-cfnapi-5fdf94c698-qz6j8" Dec 08 09:27:33 crc kubenswrapper[4776]: I1208 09:27:33.911720 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7f47153-4e65-48b9-816d-4c83b0b0d8a4-config-data-custom\") pod \"heat-api-55fd4bf697-njsxk\" (UID: \"b7f47153-4e65-48b9-816d-4c83b0b0d8a4\") " pod="openstack/heat-api-55fd4bf697-njsxk" Dec 08 09:27:33 crc kubenswrapper[4776]: I1208 09:27:33.911776 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qr4bm\" (UniqueName: \"kubernetes.io/projected/5f5f6ae8-0ed6-49ee-afe5-ab6fd356b4e4-kube-api-access-qr4bm\") pod 
\"heat-cfnapi-5fdf94c698-qz6j8\" (UID: \"5f5f6ae8-0ed6-49ee-afe5-ab6fd356b4e4\") " pod="openstack/heat-cfnapi-5fdf94c698-qz6j8" Dec 08 09:27:33 crc kubenswrapper[4776]: I1208 09:27:33.911886 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c891ff5-fbcf-46b6-bace-6ef62df3c0b9-combined-ca-bundle\") pod \"heat-engine-589b85487f-7v8kk\" (UID: \"5c891ff5-fbcf-46b6-bace-6ef62df3c0b9\") " pod="openstack/heat-engine-589b85487f-7v8kk" Dec 08 09:27:33 crc kubenswrapper[4776]: I1208 09:27:33.911969 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qn9cn\" (UniqueName: \"kubernetes.io/projected/5c891ff5-fbcf-46b6-bace-6ef62df3c0b9-kube-api-access-qn9cn\") pod \"heat-engine-589b85487f-7v8kk\" (UID: \"5c891ff5-fbcf-46b6-bace-6ef62df3c0b9\") " pod="openstack/heat-engine-589b85487f-7v8kk" Dec 08 09:27:33 crc kubenswrapper[4776]: I1208 09:27:33.912068 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7f47153-4e65-48b9-816d-4c83b0b0d8a4-combined-ca-bundle\") pod \"heat-api-55fd4bf697-njsxk\" (UID: \"b7f47153-4e65-48b9-816d-4c83b0b0d8a4\") " pod="openstack/heat-api-55fd4bf697-njsxk" Dec 08 09:27:33 crc kubenswrapper[4776]: I1208 09:27:33.912113 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f5f6ae8-0ed6-49ee-afe5-ab6fd356b4e4-combined-ca-bundle\") pod \"heat-cfnapi-5fdf94c698-qz6j8\" (UID: \"5f5f6ae8-0ed6-49ee-afe5-ab6fd356b4e4\") " pod="openstack/heat-cfnapi-5fdf94c698-qz6j8" Dec 08 09:27:33 crc kubenswrapper[4776]: I1208 09:27:33.912150 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhdx5\" (UniqueName: 
\"kubernetes.io/projected/b7f47153-4e65-48b9-816d-4c83b0b0d8a4-kube-api-access-qhdx5\") pod \"heat-api-55fd4bf697-njsxk\" (UID: \"b7f47153-4e65-48b9-816d-4c83b0b0d8a4\") " pod="openstack/heat-api-55fd4bf697-njsxk" Dec 08 09:27:33 crc kubenswrapper[4776]: I1208 09:27:33.912210 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7f47153-4e65-48b9-816d-4c83b0b0d8a4-config-data\") pod \"heat-api-55fd4bf697-njsxk\" (UID: \"b7f47153-4e65-48b9-816d-4c83b0b0d8a4\") " pod="openstack/heat-api-55fd4bf697-njsxk" Dec 08 09:27:33 crc kubenswrapper[4776]: I1208 09:27:33.912316 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f5f6ae8-0ed6-49ee-afe5-ab6fd356b4e4-config-data\") pod \"heat-cfnapi-5fdf94c698-qz6j8\" (UID: \"5f5f6ae8-0ed6-49ee-afe5-ab6fd356b4e4\") " pod="openstack/heat-cfnapi-5fdf94c698-qz6j8" Dec 08 09:27:34 crc kubenswrapper[4776]: I1208 09:27:34.014982 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f5f6ae8-0ed6-49ee-afe5-ab6fd356b4e4-public-tls-certs\") pod \"heat-cfnapi-5fdf94c698-qz6j8\" (UID: \"5f5f6ae8-0ed6-49ee-afe5-ab6fd356b4e4\") " pod="openstack/heat-cfnapi-5fdf94c698-qz6j8" Dec 08 09:27:34 crc kubenswrapper[4776]: I1208 09:27:34.015036 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7f47153-4e65-48b9-816d-4c83b0b0d8a4-public-tls-certs\") pod \"heat-api-55fd4bf697-njsxk\" (UID: \"b7f47153-4e65-48b9-816d-4c83b0b0d8a4\") " pod="openstack/heat-api-55fd4bf697-njsxk" Dec 08 09:27:34 crc kubenswrapper[4776]: I1208 09:27:34.015057 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5f5f6ae8-0ed6-49ee-afe5-ab6fd356b4e4-internal-tls-certs\") pod \"heat-cfnapi-5fdf94c698-qz6j8\" (UID: \"5f5f6ae8-0ed6-49ee-afe5-ab6fd356b4e4\") " pod="openstack/heat-cfnapi-5fdf94c698-qz6j8" Dec 08 09:27:34 crc kubenswrapper[4776]: I1208 09:27:34.015094 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7f47153-4e65-48b9-816d-4c83b0b0d8a4-config-data-custom\") pod \"heat-api-55fd4bf697-njsxk\" (UID: \"b7f47153-4e65-48b9-816d-4c83b0b0d8a4\") " pod="openstack/heat-api-55fd4bf697-njsxk" Dec 08 09:27:34 crc kubenswrapper[4776]: I1208 09:27:34.015125 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qr4bm\" (UniqueName: \"kubernetes.io/projected/5f5f6ae8-0ed6-49ee-afe5-ab6fd356b4e4-kube-api-access-qr4bm\") pod \"heat-cfnapi-5fdf94c698-qz6j8\" (UID: \"5f5f6ae8-0ed6-49ee-afe5-ab6fd356b4e4\") " pod="openstack/heat-cfnapi-5fdf94c698-qz6j8" Dec 08 09:27:34 crc kubenswrapper[4776]: I1208 09:27:34.015192 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c891ff5-fbcf-46b6-bace-6ef62df3c0b9-combined-ca-bundle\") pod \"heat-engine-589b85487f-7v8kk\" (UID: \"5c891ff5-fbcf-46b6-bace-6ef62df3c0b9\") " pod="openstack/heat-engine-589b85487f-7v8kk" Dec 08 09:27:34 crc kubenswrapper[4776]: I1208 09:27:34.015240 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qn9cn\" (UniqueName: \"kubernetes.io/projected/5c891ff5-fbcf-46b6-bace-6ef62df3c0b9-kube-api-access-qn9cn\") pod \"heat-engine-589b85487f-7v8kk\" (UID: \"5c891ff5-fbcf-46b6-bace-6ef62df3c0b9\") " pod="openstack/heat-engine-589b85487f-7v8kk" Dec 08 09:27:34 crc kubenswrapper[4776]: I1208 09:27:34.015289 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b7f47153-4e65-48b9-816d-4c83b0b0d8a4-combined-ca-bundle\") pod \"heat-api-55fd4bf697-njsxk\" (UID: \"b7f47153-4e65-48b9-816d-4c83b0b0d8a4\") " pod="openstack/heat-api-55fd4bf697-njsxk" Dec 08 09:27:34 crc kubenswrapper[4776]: I1208 09:27:34.015316 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f5f6ae8-0ed6-49ee-afe5-ab6fd356b4e4-combined-ca-bundle\") pod \"heat-cfnapi-5fdf94c698-qz6j8\" (UID: \"5f5f6ae8-0ed6-49ee-afe5-ab6fd356b4e4\") " pod="openstack/heat-cfnapi-5fdf94c698-qz6j8" Dec 08 09:27:34 crc kubenswrapper[4776]: I1208 09:27:34.015335 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhdx5\" (UniqueName: \"kubernetes.io/projected/b7f47153-4e65-48b9-816d-4c83b0b0d8a4-kube-api-access-qhdx5\") pod \"heat-api-55fd4bf697-njsxk\" (UID: \"b7f47153-4e65-48b9-816d-4c83b0b0d8a4\") " pod="openstack/heat-api-55fd4bf697-njsxk" Dec 08 09:27:34 crc kubenswrapper[4776]: I1208 09:27:34.015357 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7f47153-4e65-48b9-816d-4c83b0b0d8a4-config-data\") pod \"heat-api-55fd4bf697-njsxk\" (UID: \"b7f47153-4e65-48b9-816d-4c83b0b0d8a4\") " pod="openstack/heat-api-55fd4bf697-njsxk" Dec 08 09:27:34 crc kubenswrapper[4776]: I1208 09:27:34.015377 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f5f6ae8-0ed6-49ee-afe5-ab6fd356b4e4-config-data\") pod \"heat-cfnapi-5fdf94c698-qz6j8\" (UID: \"5f5f6ae8-0ed6-49ee-afe5-ab6fd356b4e4\") " pod="openstack/heat-cfnapi-5fdf94c698-qz6j8" Dec 08 09:27:34 crc kubenswrapper[4776]: I1208 09:27:34.015413 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b7f47153-4e65-48b9-816d-4c83b0b0d8a4-internal-tls-certs\") pod \"heat-api-55fd4bf697-njsxk\" (UID: \"b7f47153-4e65-48b9-816d-4c83b0b0d8a4\") " pod="openstack/heat-api-55fd4bf697-njsxk" Dec 08 09:27:34 crc kubenswrapper[4776]: I1208 09:27:34.015433 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c891ff5-fbcf-46b6-bace-6ef62df3c0b9-config-data\") pod \"heat-engine-589b85487f-7v8kk\" (UID: \"5c891ff5-fbcf-46b6-bace-6ef62df3c0b9\") " pod="openstack/heat-engine-589b85487f-7v8kk" Dec 08 09:27:34 crc kubenswrapper[4776]: I1208 09:27:34.015459 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c891ff5-fbcf-46b6-bace-6ef62df3c0b9-config-data-custom\") pod \"heat-engine-589b85487f-7v8kk\" (UID: \"5c891ff5-fbcf-46b6-bace-6ef62df3c0b9\") " pod="openstack/heat-engine-589b85487f-7v8kk" Dec 08 09:27:34 crc kubenswrapper[4776]: I1208 09:27:34.015491 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5f5f6ae8-0ed6-49ee-afe5-ab6fd356b4e4-config-data-custom\") pod \"heat-cfnapi-5fdf94c698-qz6j8\" (UID: \"5f5f6ae8-0ed6-49ee-afe5-ab6fd356b4e4\") " pod="openstack/heat-cfnapi-5fdf94c698-qz6j8" Dec 08 09:27:34 crc kubenswrapper[4776]: I1208 09:27:34.023983 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c891ff5-fbcf-46b6-bace-6ef62df3c0b9-config-data\") pod \"heat-engine-589b85487f-7v8kk\" (UID: \"5c891ff5-fbcf-46b6-bace-6ef62df3c0b9\") " pod="openstack/heat-engine-589b85487f-7v8kk" Dec 08 09:27:34 crc kubenswrapper[4776]: I1208 09:27:34.026950 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b7f47153-4e65-48b9-816d-4c83b0b0d8a4-internal-tls-certs\") pod \"heat-api-55fd4bf697-njsxk\" (UID: \"b7f47153-4e65-48b9-816d-4c83b0b0d8a4\") " pod="openstack/heat-api-55fd4bf697-njsxk" Dec 08 09:27:34 crc kubenswrapper[4776]: I1208 09:27:34.027310 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c891ff5-fbcf-46b6-bace-6ef62df3c0b9-combined-ca-bundle\") pod \"heat-engine-589b85487f-7v8kk\" (UID: \"5c891ff5-fbcf-46b6-bace-6ef62df3c0b9\") " pod="openstack/heat-engine-589b85487f-7v8kk" Dec 08 09:27:34 crc kubenswrapper[4776]: I1208 09:27:34.027389 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f5f6ae8-0ed6-49ee-afe5-ab6fd356b4e4-config-data\") pod \"heat-cfnapi-5fdf94c698-qz6j8\" (UID: \"5f5f6ae8-0ed6-49ee-afe5-ab6fd356b4e4\") " pod="openstack/heat-cfnapi-5fdf94c698-qz6j8" Dec 08 09:27:34 crc kubenswrapper[4776]: I1208 09:27:34.027529 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7f47153-4e65-48b9-816d-4c83b0b0d8a4-config-data-custom\") pod \"heat-api-55fd4bf697-njsxk\" (UID: \"b7f47153-4e65-48b9-816d-4c83b0b0d8a4\") " pod="openstack/heat-api-55fd4bf697-njsxk" Dec 08 09:27:34 crc kubenswrapper[4776]: I1208 09:27:34.027984 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f5f6ae8-0ed6-49ee-afe5-ab6fd356b4e4-public-tls-certs\") pod \"heat-cfnapi-5fdf94c698-qz6j8\" (UID: \"5f5f6ae8-0ed6-49ee-afe5-ab6fd356b4e4\") " pod="openstack/heat-cfnapi-5fdf94c698-qz6j8" Dec 08 09:27:34 crc kubenswrapper[4776]: I1208 09:27:34.028132 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f5f6ae8-0ed6-49ee-afe5-ab6fd356b4e4-combined-ca-bundle\") pod 
\"heat-cfnapi-5fdf94c698-qz6j8\" (UID: \"5f5f6ae8-0ed6-49ee-afe5-ab6fd356b4e4\") " pod="openstack/heat-cfnapi-5fdf94c698-qz6j8" Dec 08 09:27:34 crc kubenswrapper[4776]: I1208 09:27:34.028407 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7f47153-4e65-48b9-816d-4c83b0b0d8a4-public-tls-certs\") pod \"heat-api-55fd4bf697-njsxk\" (UID: \"b7f47153-4e65-48b9-816d-4c83b0b0d8a4\") " pod="openstack/heat-api-55fd4bf697-njsxk" Dec 08 09:27:34 crc kubenswrapper[4776]: I1208 09:27:34.029185 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c891ff5-fbcf-46b6-bace-6ef62df3c0b9-config-data-custom\") pod \"heat-engine-589b85487f-7v8kk\" (UID: \"5c891ff5-fbcf-46b6-bace-6ef62df3c0b9\") " pod="openstack/heat-engine-589b85487f-7v8kk" Dec 08 09:27:34 crc kubenswrapper[4776]: I1208 09:27:34.030121 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7f47153-4e65-48b9-816d-4c83b0b0d8a4-config-data\") pod \"heat-api-55fd4bf697-njsxk\" (UID: \"b7f47153-4e65-48b9-816d-4c83b0b0d8a4\") " pod="openstack/heat-api-55fd4bf697-njsxk" Dec 08 09:27:34 crc kubenswrapper[4776]: I1208 09:27:34.031026 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7f47153-4e65-48b9-816d-4c83b0b0d8a4-combined-ca-bundle\") pod \"heat-api-55fd4bf697-njsxk\" (UID: \"b7f47153-4e65-48b9-816d-4c83b0b0d8a4\") " pod="openstack/heat-api-55fd4bf697-njsxk" Dec 08 09:27:34 crc kubenswrapper[4776]: I1208 09:27:34.032928 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f5f6ae8-0ed6-49ee-afe5-ab6fd356b4e4-internal-tls-certs\") pod \"heat-cfnapi-5fdf94c698-qz6j8\" (UID: \"5f5f6ae8-0ed6-49ee-afe5-ab6fd356b4e4\") " 
pod="openstack/heat-cfnapi-5fdf94c698-qz6j8" Dec 08 09:27:34 crc kubenswrapper[4776]: I1208 09:27:34.034305 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhdx5\" (UniqueName: \"kubernetes.io/projected/b7f47153-4e65-48b9-816d-4c83b0b0d8a4-kube-api-access-qhdx5\") pod \"heat-api-55fd4bf697-njsxk\" (UID: \"b7f47153-4e65-48b9-816d-4c83b0b0d8a4\") " pod="openstack/heat-api-55fd4bf697-njsxk" Dec 08 09:27:34 crc kubenswrapper[4776]: I1208 09:27:34.036783 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5f5f6ae8-0ed6-49ee-afe5-ab6fd356b4e4-config-data-custom\") pod \"heat-cfnapi-5fdf94c698-qz6j8\" (UID: \"5f5f6ae8-0ed6-49ee-afe5-ab6fd356b4e4\") " pod="openstack/heat-cfnapi-5fdf94c698-qz6j8" Dec 08 09:27:34 crc kubenswrapper[4776]: I1208 09:27:34.037569 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qr4bm\" (UniqueName: \"kubernetes.io/projected/5f5f6ae8-0ed6-49ee-afe5-ab6fd356b4e4-kube-api-access-qr4bm\") pod \"heat-cfnapi-5fdf94c698-qz6j8\" (UID: \"5f5f6ae8-0ed6-49ee-afe5-ab6fd356b4e4\") " pod="openstack/heat-cfnapi-5fdf94c698-qz6j8" Dec 08 09:27:34 crc kubenswrapper[4776]: I1208 09:27:34.039464 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qn9cn\" (UniqueName: \"kubernetes.io/projected/5c891ff5-fbcf-46b6-bace-6ef62df3c0b9-kube-api-access-qn9cn\") pod \"heat-engine-589b85487f-7v8kk\" (UID: \"5c891ff5-fbcf-46b6-bace-6ef62df3c0b9\") " pod="openstack/heat-engine-589b85487f-7v8kk" Dec 08 09:27:34 crc kubenswrapper[4776]: I1208 09:27:34.096241 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-589b85487f-7v8kk" Dec 08 09:27:34 crc kubenswrapper[4776]: I1208 09:27:34.173501 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-55fd4bf697-njsxk" Dec 08 09:27:34 crc kubenswrapper[4776]: I1208 09:27:34.182267 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5fdf94c698-qz6j8" Dec 08 09:27:34 crc kubenswrapper[4776]: I1208 09:27:34.794141 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-589b85487f-7v8kk"] Dec 08 09:27:34 crc kubenswrapper[4776]: I1208 09:27:34.943078 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-589b85487f-7v8kk" event={"ID":"5c891ff5-fbcf-46b6-bace-6ef62df3c0b9","Type":"ContainerStarted","Data":"88ad7cc04724fbe12d9d7745a67e2b681e1b3a90e819557fe696fcdbf807cc66"} Dec 08 09:27:34 crc kubenswrapper[4776]: I1208 09:27:34.956951 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5fdf94c698-qz6j8"] Dec 08 09:27:35 crc kubenswrapper[4776]: I1208 09:27:35.120623 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-55fd4bf697-njsxk"] Dec 08 09:27:35 crc kubenswrapper[4776]: W1208 09:27:35.126136 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7f47153_4e65_48b9_816d_4c83b0b0d8a4.slice/crio-40a36af00a91e23c751a5def7ee195fa5a9545575dbf6cc0df9c6831f4d354cf WatchSource:0}: Error finding container 40a36af00a91e23c751a5def7ee195fa5a9545575dbf6cc0df9c6831f4d354cf: Status 404 returned error can't find the container with id 40a36af00a91e23c751a5def7ee195fa5a9545575dbf6cc0df9c6831f4d354cf Dec 08 09:27:35 crc kubenswrapper[4776]: I1208 09:27:35.956853 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-55fd4bf697-njsxk" event={"ID":"b7f47153-4e65-48b9-816d-4c83b0b0d8a4","Type":"ContainerStarted","Data":"40a36af00a91e23c751a5def7ee195fa5a9545575dbf6cc0df9c6831f4d354cf"} Dec 08 09:27:35 crc kubenswrapper[4776]: I1208 09:27:35.958475 4776 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5fdf94c698-qz6j8" event={"ID":"5f5f6ae8-0ed6-49ee-afe5-ab6fd356b4e4","Type":"ContainerStarted","Data":"22d86011c6db0435f7174b352716400f665a991b15f47497a2134ef1527ee2a0"} Dec 08 09:27:35 crc kubenswrapper[4776]: I1208 09:27:35.964289 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-589b85487f-7v8kk" event={"ID":"5c891ff5-fbcf-46b6-bace-6ef62df3c0b9","Type":"ContainerStarted","Data":"aaf49dc9b3f3e050d8055222b48b4db10f9135f9ba12e4a4e6fce06ad4fca431"} Dec 08 09:27:35 crc kubenswrapper[4776]: I1208 09:27:35.964878 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-589b85487f-7v8kk" Dec 08 09:27:35 crc kubenswrapper[4776]: I1208 09:27:35.987413 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-589b85487f-7v8kk" podStartSLOduration=2.987394955 podStartE2EDuration="2.987394955s" podCreationTimestamp="2025-12-08 09:27:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:27:35.981417623 +0000 UTC m=+1732.244642645" watchObservedRunningTime="2025-12-08 09:27:35.987394955 +0000 UTC m=+1732.250619977" Dec 08 09:27:36 crc kubenswrapper[4776]: I1208 09:27:36.983636 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5fdf94c698-qz6j8" event={"ID":"5f5f6ae8-0ed6-49ee-afe5-ab6fd356b4e4","Type":"ContainerStarted","Data":"ae2890e3d9d67a3cfeeba9523ca958fdd95c6399e3afeb50bbf440a7e6f1b2d2"} Dec 08 09:27:36 crc kubenswrapper[4776]: I1208 09:27:36.984420 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-5fdf94c698-qz6j8" Dec 08 09:27:36 crc kubenswrapper[4776]: I1208 09:27:36.988395 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-55fd4bf697-njsxk" 
event={"ID":"b7f47153-4e65-48b9-816d-4c83b0b0d8a4","Type":"ContainerStarted","Data":"172190aeefa1f0b2e376b86293ec058e677f6e0d54144e8d00ebeccf251d43a2"} Dec 08 09:27:37 crc kubenswrapper[4776]: I1208 09:27:37.008284 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-5fdf94c698-qz6j8" podStartSLOduration=2.683433274 podStartE2EDuration="4.008265792s" podCreationTimestamp="2025-12-08 09:27:33 +0000 UTC" firstStartedPulling="2025-12-08 09:27:34.981143459 +0000 UTC m=+1731.244368481" lastFinishedPulling="2025-12-08 09:27:36.305975977 +0000 UTC m=+1732.569200999" observedRunningTime="2025-12-08 09:27:37.006112854 +0000 UTC m=+1733.269337876" watchObservedRunningTime="2025-12-08 09:27:37.008265792 +0000 UTC m=+1733.271490814" Dec 08 09:27:37 crc kubenswrapper[4776]: I1208 09:27:37.022286 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-55fd4bf697-njsxk" podStartSLOduration=2.843485146 podStartE2EDuration="4.022270418s" podCreationTimestamp="2025-12-08 09:27:33 +0000 UTC" firstStartedPulling="2025-12-08 09:27:35.132036055 +0000 UTC m=+1731.395261077" lastFinishedPulling="2025-12-08 09:27:36.310821327 +0000 UTC m=+1732.574046349" observedRunningTime="2025-12-08 09:27:37.02157574 +0000 UTC m=+1733.284800752" watchObservedRunningTime="2025-12-08 09:27:37.022270418 +0000 UTC m=+1733.285495440" Dec 08 09:27:37 crc kubenswrapper[4776]: I1208 09:27:37.999342 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-55fd4bf697-njsxk" Dec 08 09:27:41 crc kubenswrapper[4776]: I1208 09:27:41.344312 4776 scope.go:117] "RemoveContainer" containerID="bd00cf9685c68dc97ea68681aad0bfeeb2d56965a763cd3cffb3c7d3789a7341" Dec 08 09:27:41 crc kubenswrapper[4776]: E1208 09:27:41.345009 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:27:45 crc kubenswrapper[4776]: I1208 09:27:45.085385 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h9hfq"] Dec 08 09:27:45 crc kubenswrapper[4776]: I1208 09:27:45.087877 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h9hfq" Dec 08 09:27:45 crc kubenswrapper[4776]: I1208 09:27:45.090096 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 08 09:27:45 crc kubenswrapper[4776]: I1208 09:27:45.090505 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 08 09:27:45 crc kubenswrapper[4776]: I1208 09:27:45.090662 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 08 09:27:45 crc kubenswrapper[4776]: I1208 09:27:45.090824 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tm845" Dec 08 09:27:45 crc kubenswrapper[4776]: I1208 09:27:45.109962 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h9hfq"] Dec 08 09:27:45 crc kubenswrapper[4776]: I1208 09:27:45.210674 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7cfc\" (UniqueName: \"kubernetes.io/projected/8b119f36-1ae0-4826-8043-4e038e4398a3-kube-api-access-p7cfc\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h9hfq\" (UID: \"8b119f36-1ae0-4826-8043-4e038e4398a3\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h9hfq" Dec 08 09:27:45 crc kubenswrapper[4776]: I1208 09:27:45.210727 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b119f36-1ae0-4826-8043-4e038e4398a3-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h9hfq\" (UID: \"8b119f36-1ae0-4826-8043-4e038e4398a3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h9hfq" Dec 08 09:27:45 crc kubenswrapper[4776]: I1208 09:27:45.210913 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b119f36-1ae0-4826-8043-4e038e4398a3-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h9hfq\" (UID: \"8b119f36-1ae0-4826-8043-4e038e4398a3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h9hfq" Dec 08 09:27:45 crc kubenswrapper[4776]: I1208 09:27:45.211079 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8b119f36-1ae0-4826-8043-4e038e4398a3-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h9hfq\" (UID: \"8b119f36-1ae0-4826-8043-4e038e4398a3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h9hfq" Dec 08 09:27:45 crc kubenswrapper[4776]: I1208 09:27:45.312721 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b119f36-1ae0-4826-8043-4e038e4398a3-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h9hfq\" (UID: \"8b119f36-1ae0-4826-8043-4e038e4398a3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h9hfq" Dec 08 09:27:45 crc kubenswrapper[4776]: I1208 09:27:45.312869 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b119f36-1ae0-4826-8043-4e038e4398a3-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h9hfq\" (UID: \"8b119f36-1ae0-4826-8043-4e038e4398a3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h9hfq" Dec 08 09:27:45 crc kubenswrapper[4776]: I1208 09:27:45.312968 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8b119f36-1ae0-4826-8043-4e038e4398a3-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h9hfq\" (UID: \"8b119f36-1ae0-4826-8043-4e038e4398a3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h9hfq" Dec 08 09:27:45 crc kubenswrapper[4776]: I1208 09:27:45.313024 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7cfc\" (UniqueName: \"kubernetes.io/projected/8b119f36-1ae0-4826-8043-4e038e4398a3-kube-api-access-p7cfc\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h9hfq\" (UID: \"8b119f36-1ae0-4826-8043-4e038e4398a3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h9hfq" Dec 08 09:27:45 crc kubenswrapper[4776]: I1208 09:27:45.318394 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b119f36-1ae0-4826-8043-4e038e4398a3-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h9hfq\" (UID: \"8b119f36-1ae0-4826-8043-4e038e4398a3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h9hfq" Dec 08 09:27:45 crc kubenswrapper[4776]: I1208 09:27:45.318784 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8b119f36-1ae0-4826-8043-4e038e4398a3-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h9hfq\" (UID: \"8b119f36-1ae0-4826-8043-4e038e4398a3\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h9hfq" Dec 08 09:27:45 crc kubenswrapper[4776]: I1208 09:27:45.326968 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b119f36-1ae0-4826-8043-4e038e4398a3-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h9hfq\" (UID: \"8b119f36-1ae0-4826-8043-4e038e4398a3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h9hfq" Dec 08 09:27:45 crc kubenswrapper[4776]: I1208 09:27:45.330144 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7cfc\" (UniqueName: \"kubernetes.io/projected/8b119f36-1ae0-4826-8043-4e038e4398a3-kube-api-access-p7cfc\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h9hfq\" (UID: \"8b119f36-1ae0-4826-8043-4e038e4398a3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h9hfq" Dec 08 09:27:45 crc kubenswrapper[4776]: I1208 09:27:45.408053 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h9hfq" Dec 08 09:27:45 crc kubenswrapper[4776]: I1208 09:27:45.719975 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-5fdf94c698-qz6j8" Dec 08 09:27:45 crc kubenswrapper[4776]: I1208 09:27:45.748454 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-55fd4bf697-njsxk" Dec 08 09:27:45 crc kubenswrapper[4776]: I1208 09:27:45.840218 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-58d55cd687-srfgj"] Dec 08 09:27:45 crc kubenswrapper[4776]: I1208 09:27:45.840465 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-58d55cd687-srfgj" podUID="c51b16c8-d0c8-4109-8cb3-f1799ce5f996" containerName="heat-cfnapi" containerID="cri-o://3b0ac0b8fd0ac6f80f7fc079e316e648e1a204fe4a69bd0cda6755200789a450" gracePeriod=60 Dec 08 09:27:45 crc kubenswrapper[4776]: I1208 09:27:45.862334 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-7b4dd4bc68-bwx4q"] Dec 08 09:27:45 crc kubenswrapper[4776]: I1208 09:27:45.862663 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-7b4dd4bc68-bwx4q" podUID="dcbcd000-25be-4f44-8114-7602c348b58d" containerName="heat-api" containerID="cri-o://7ac51cbf9a132bb867cd71c7f2264fe5626030d540e35cb954106c984aeff1bf" gracePeriod=60 Dec 08 09:27:46 crc kubenswrapper[4776]: I1208 09:27:46.089104 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h9hfq"] Dec 08 09:27:46 crc kubenswrapper[4776]: I1208 09:27:46.104821 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h9hfq" 
event={"ID":"8b119f36-1ae0-4826-8043-4e038e4398a3","Type":"ContainerStarted","Data":"1e4ec87896f4f67be5e681ccd3095108f2f476916e6961dbb38f43ad494b15bd"} Dec 08 09:27:49 crc kubenswrapper[4776]: I1208 09:27:49.000008 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-58d55cd687-srfgj" podUID="c51b16c8-d0c8-4109-8cb3-f1799ce5f996" containerName="heat-cfnapi" probeResult="failure" output="Get \"https://10.217.0.224:8000/healthcheck\": read tcp 10.217.0.2:48334->10.217.0.224:8000: read: connection reset by peer" Dec 08 09:27:49 crc kubenswrapper[4776]: I1208 09:27:49.020298 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-7b4dd4bc68-bwx4q" podUID="dcbcd000-25be-4f44-8114-7602c348b58d" containerName="heat-api" probeResult="failure" output="Get \"https://10.217.0.223:8004/healthcheck\": read tcp 10.217.0.2:59080->10.217.0.223:8004: read: connection reset by peer" Dec 08 09:27:49 crc kubenswrapper[4776]: I1208 09:27:49.143968 4776 generic.go:334] "Generic (PLEG): container finished" podID="87931091-7230-4451-9d94-20ac4b8458bc" containerID="11c9f9a0f1003ca3b3a35014c2bd8214596e154c9f7ded4722c2ab12857dc03f" exitCode=0 Dec 08 09:27:49 crc kubenswrapper[4776]: I1208 09:27:49.144042 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"87931091-7230-4451-9d94-20ac4b8458bc","Type":"ContainerDied","Data":"11c9f9a0f1003ca3b3a35014c2bd8214596e154c9f7ded4722c2ab12857dc03f"} Dec 08 09:27:49 crc kubenswrapper[4776]: I1208 09:27:49.146512 4776 generic.go:334] "Generic (PLEG): container finished" podID="c51b16c8-d0c8-4109-8cb3-f1799ce5f996" containerID="3b0ac0b8fd0ac6f80f7fc079e316e648e1a204fe4a69bd0cda6755200789a450" exitCode=0 Dec 08 09:27:49 crc kubenswrapper[4776]: I1208 09:27:49.146673 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-58d55cd687-srfgj" 
event={"ID":"c51b16c8-d0c8-4109-8cb3-f1799ce5f996","Type":"ContainerDied","Data":"3b0ac0b8fd0ac6f80f7fc079e316e648e1a204fe4a69bd0cda6755200789a450"} Dec 08 09:27:49 crc kubenswrapper[4776]: I1208 09:27:49.148657 4776 generic.go:334] "Generic (PLEG): container finished" podID="dcbcd000-25be-4f44-8114-7602c348b58d" containerID="7ac51cbf9a132bb867cd71c7f2264fe5626030d540e35cb954106c984aeff1bf" exitCode=0 Dec 08 09:27:49 crc kubenswrapper[4776]: I1208 09:27:49.148757 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7b4dd4bc68-bwx4q" event={"ID":"dcbcd000-25be-4f44-8114-7602c348b58d","Type":"ContainerDied","Data":"7ac51cbf9a132bb867cd71c7f2264fe5626030d540e35cb954106c984aeff1bf"} Dec 08 09:27:50 crc kubenswrapper[4776]: I1208 09:27:50.164297 4776 generic.go:334] "Generic (PLEG): container finished" podID="ab6303ff-9104-40ed-babe-1445f4cd89e2" containerID="aa2fa07f20c60c99e5f3ea6f75b2b83cb7b5e204ecb6fb81fb11f41fead2d5ce" exitCode=0 Dec 08 09:27:50 crc kubenswrapper[4776]: I1208 09:27:50.164373 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ab6303ff-9104-40ed-babe-1445f4cd89e2","Type":"ContainerDied","Data":"aa2fa07f20c60c99e5f3ea6f75b2b83cb7b5e204ecb6fb81fb11f41fead2d5ce"} Dec 08 09:27:54 crc kubenswrapper[4776]: I1208 09:27:54.139258 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-589b85487f-7v8kk" Dec 08 09:27:54 crc kubenswrapper[4776]: I1208 09:27:54.193615 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-5ccd9d555d-m9chd"] Dec 08 09:27:54 crc kubenswrapper[4776]: I1208 09:27:54.193810 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-5ccd9d555d-m9chd" podUID="40b6ce41-e108-47bf-bc38-34e8c475b413" containerName="heat-engine" containerID="cri-o://6aa12b16012116b86c2b0c97e985bec0ebaeb9a366799a44caf810d29f286456" gracePeriod=60 Dec 08 09:27:55 crc 
kubenswrapper[4776]: I1208 09:27:55.344322 4776 scope.go:117] "RemoveContainer" containerID="bd00cf9685c68dc97ea68681aad0bfeeb2d56965a763cd3cffb3c7d3789a7341" Dec 08 09:27:55 crc kubenswrapper[4776]: I1208 09:27:55.345472 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7b4dd4bc68-bwx4q" Dec 08 09:27:55 crc kubenswrapper[4776]: E1208 09:27:55.345622 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:27:55 crc kubenswrapper[4776]: I1208 09:27:55.374656 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-58d55cd687-srfgj" event={"ID":"c51b16c8-d0c8-4109-8cb3-f1799ce5f996","Type":"ContainerDied","Data":"8349833288b381f00a77a298518f5ada646e391d674bd52054cfc0ccdf689c08"} Dec 08 09:27:55 crc kubenswrapper[4776]: I1208 09:27:55.374695 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8349833288b381f00a77a298518f5ada646e391d674bd52054cfc0ccdf689c08" Dec 08 09:27:55 crc kubenswrapper[4776]: I1208 09:27:55.392864 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7b4dd4bc68-bwx4q" event={"ID":"dcbcd000-25be-4f44-8114-7602c348b58d","Type":"ContainerDied","Data":"31815974efb56de87ec196dc1a7062ef90234e7777529c7e62e57038509af189"} Dec 08 09:27:55 crc kubenswrapper[4776]: I1208 09:27:55.392942 4776 scope.go:117] "RemoveContainer" containerID="7ac51cbf9a132bb867cd71c7f2264fe5626030d540e35cb954106c984aeff1bf" Dec 08 09:27:55 crc kubenswrapper[4776]: I1208 09:27:55.393115 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-7b4dd4bc68-bwx4q" Dec 08 09:27:55 crc kubenswrapper[4776]: I1208 09:27:55.394940 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-58d55cd687-srfgj" Dec 08 09:27:55 crc kubenswrapper[4776]: I1208 09:27:55.439048 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bx6p7\" (UniqueName: \"kubernetes.io/projected/dcbcd000-25be-4f44-8114-7602c348b58d-kube-api-access-bx6p7\") pod \"dcbcd000-25be-4f44-8114-7602c348b58d\" (UID: \"dcbcd000-25be-4f44-8114-7602c348b58d\") " Dec 08 09:27:55 crc kubenswrapper[4776]: I1208 09:27:55.439084 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcbcd000-25be-4f44-8114-7602c348b58d-public-tls-certs\") pod \"dcbcd000-25be-4f44-8114-7602c348b58d\" (UID: \"dcbcd000-25be-4f44-8114-7602c348b58d\") " Dec 08 09:27:55 crc kubenswrapper[4776]: I1208 09:27:55.439122 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dcbcd000-25be-4f44-8114-7602c348b58d-config-data-custom\") pod \"dcbcd000-25be-4f44-8114-7602c348b58d\" (UID: \"dcbcd000-25be-4f44-8114-7602c348b58d\") " Dec 08 09:27:55 crc kubenswrapper[4776]: I1208 09:27:55.439313 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qhmd\" (UniqueName: \"kubernetes.io/projected/c51b16c8-d0c8-4109-8cb3-f1799ce5f996-kube-api-access-4qhmd\") pod \"c51b16c8-d0c8-4109-8cb3-f1799ce5f996\" (UID: \"c51b16c8-d0c8-4109-8cb3-f1799ce5f996\") " Dec 08 09:27:55 crc kubenswrapper[4776]: I1208 09:27:55.439333 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c51b16c8-d0c8-4109-8cb3-f1799ce5f996-internal-tls-certs\") pod 
\"c51b16c8-d0c8-4109-8cb3-f1799ce5f996\" (UID: \"c51b16c8-d0c8-4109-8cb3-f1799ce5f996\") " Dec 08 09:27:55 crc kubenswrapper[4776]: I1208 09:27:55.439348 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcbcd000-25be-4f44-8114-7602c348b58d-internal-tls-certs\") pod \"dcbcd000-25be-4f44-8114-7602c348b58d\" (UID: \"dcbcd000-25be-4f44-8114-7602c348b58d\") " Dec 08 09:27:55 crc kubenswrapper[4776]: I1208 09:27:55.439434 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c51b16c8-d0c8-4109-8cb3-f1799ce5f996-public-tls-certs\") pod \"c51b16c8-d0c8-4109-8cb3-f1799ce5f996\" (UID: \"c51b16c8-d0c8-4109-8cb3-f1799ce5f996\") " Dec 08 09:27:55 crc kubenswrapper[4776]: I1208 09:27:55.439474 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcbcd000-25be-4f44-8114-7602c348b58d-config-data\") pod \"dcbcd000-25be-4f44-8114-7602c348b58d\" (UID: \"dcbcd000-25be-4f44-8114-7602c348b58d\") " Dec 08 09:27:55 crc kubenswrapper[4776]: I1208 09:27:55.439506 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c51b16c8-d0c8-4109-8cb3-f1799ce5f996-config-data\") pod \"c51b16c8-d0c8-4109-8cb3-f1799ce5f996\" (UID: \"c51b16c8-d0c8-4109-8cb3-f1799ce5f996\") " Dec 08 09:27:55 crc kubenswrapper[4776]: I1208 09:27:55.439529 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c51b16c8-d0c8-4109-8cb3-f1799ce5f996-config-data-custom\") pod \"c51b16c8-d0c8-4109-8cb3-f1799ce5f996\" (UID: \"c51b16c8-d0c8-4109-8cb3-f1799ce5f996\") " Dec 08 09:27:55 crc kubenswrapper[4776]: I1208 09:27:55.439554 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcbcd000-25be-4f44-8114-7602c348b58d-combined-ca-bundle\") pod \"dcbcd000-25be-4f44-8114-7602c348b58d\" (UID: \"dcbcd000-25be-4f44-8114-7602c348b58d\") " Dec 08 09:27:55 crc kubenswrapper[4776]: I1208 09:27:55.439582 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c51b16c8-d0c8-4109-8cb3-f1799ce5f996-combined-ca-bundle\") pod \"c51b16c8-d0c8-4109-8cb3-f1799ce5f996\" (UID: \"c51b16c8-d0c8-4109-8cb3-f1799ce5f996\") " Dec 08 09:27:55 crc kubenswrapper[4776]: E1208 09:27:55.442552 4776 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6aa12b16012116b86c2b0c97e985bec0ebaeb9a366799a44caf810d29f286456" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 08 09:27:55 crc kubenswrapper[4776]: E1208 09:27:55.446085 4776 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6aa12b16012116b86c2b0c97e985bec0ebaeb9a366799a44caf810d29f286456" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 08 09:27:55 crc kubenswrapper[4776]: I1208 09:27:55.446720 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcbcd000-25be-4f44-8114-7602c348b58d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "dcbcd000-25be-4f44-8114-7602c348b58d" (UID: "dcbcd000-25be-4f44-8114-7602c348b58d"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:27:55 crc kubenswrapper[4776]: E1208 09:27:55.447337 4776 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6aa12b16012116b86c2b0c97e985bec0ebaeb9a366799a44caf810d29f286456" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 08 09:27:55 crc kubenswrapper[4776]: E1208 09:27:55.447379 4776 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-5ccd9d555d-m9chd" podUID="40b6ce41-e108-47bf-bc38-34e8c475b413" containerName="heat-engine" Dec 08 09:27:55 crc kubenswrapper[4776]: I1208 09:27:55.448943 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c51b16c8-d0c8-4109-8cb3-f1799ce5f996-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c51b16c8-d0c8-4109-8cb3-f1799ce5f996" (UID: "c51b16c8-d0c8-4109-8cb3-f1799ce5f996"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:27:55 crc kubenswrapper[4776]: I1208 09:27:55.453093 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c51b16c8-d0c8-4109-8cb3-f1799ce5f996-kube-api-access-4qhmd" (OuterVolumeSpecName: "kube-api-access-4qhmd") pod "c51b16c8-d0c8-4109-8cb3-f1799ce5f996" (UID: "c51b16c8-d0c8-4109-8cb3-f1799ce5f996"). InnerVolumeSpecName "kube-api-access-4qhmd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:27:55 crc kubenswrapper[4776]: I1208 09:27:55.459633 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcbcd000-25be-4f44-8114-7602c348b58d-kube-api-access-bx6p7" (OuterVolumeSpecName: "kube-api-access-bx6p7") pod "dcbcd000-25be-4f44-8114-7602c348b58d" (UID: "dcbcd000-25be-4f44-8114-7602c348b58d"). InnerVolumeSpecName "kube-api-access-bx6p7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:27:55 crc kubenswrapper[4776]: I1208 09:27:55.542537 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bx6p7\" (UniqueName: \"kubernetes.io/projected/dcbcd000-25be-4f44-8114-7602c348b58d-kube-api-access-bx6p7\") on node \"crc\" DevicePath \"\"" Dec 08 09:27:55 crc kubenswrapper[4776]: I1208 09:27:55.542564 4776 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dcbcd000-25be-4f44-8114-7602c348b58d-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 08 09:27:55 crc kubenswrapper[4776]: I1208 09:27:55.542574 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qhmd\" (UniqueName: \"kubernetes.io/projected/c51b16c8-d0c8-4109-8cb3-f1799ce5f996-kube-api-access-4qhmd\") on node \"crc\" DevicePath \"\"" Dec 08 09:27:55 crc kubenswrapper[4776]: I1208 09:27:55.542584 4776 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c51b16c8-d0c8-4109-8cb3-f1799ce5f996-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 08 09:27:55 crc kubenswrapper[4776]: I1208 09:27:55.664688 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c51b16c8-d0c8-4109-8cb3-f1799ce5f996-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c51b16c8-d0c8-4109-8cb3-f1799ce5f996" (UID: "c51b16c8-d0c8-4109-8cb3-f1799ce5f996"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:27:55 crc kubenswrapper[4776]: I1208 09:27:55.700353 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcbcd000-25be-4f44-8114-7602c348b58d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dcbcd000-25be-4f44-8114-7602c348b58d" (UID: "dcbcd000-25be-4f44-8114-7602c348b58d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:27:55 crc kubenswrapper[4776]: I1208 09:27:55.705215 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcbcd000-25be-4f44-8114-7602c348b58d-config-data" (OuterVolumeSpecName: "config-data") pod "dcbcd000-25be-4f44-8114-7602c348b58d" (UID: "dcbcd000-25be-4f44-8114-7602c348b58d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:27:55 crc kubenswrapper[4776]: I1208 09:27:55.712410 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c51b16c8-d0c8-4109-8cb3-f1799ce5f996-config-data" (OuterVolumeSpecName: "config-data") pod "c51b16c8-d0c8-4109-8cb3-f1799ce5f996" (UID: "c51b16c8-d0c8-4109-8cb3-f1799ce5f996"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:27:55 crc kubenswrapper[4776]: I1208 09:27:55.718229 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcbcd000-25be-4f44-8114-7602c348b58d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "dcbcd000-25be-4f44-8114-7602c348b58d" (UID: "dcbcd000-25be-4f44-8114-7602c348b58d"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:27:55 crc kubenswrapper[4776]: I1208 09:27:55.738406 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcbcd000-25be-4f44-8114-7602c348b58d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "dcbcd000-25be-4f44-8114-7602c348b58d" (UID: "dcbcd000-25be-4f44-8114-7602c348b58d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:27:55 crc kubenswrapper[4776]: I1208 09:27:55.747892 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcbcd000-25be-4f44-8114-7602c348b58d-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 09:27:55 crc kubenswrapper[4776]: I1208 09:27:55.747925 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c51b16c8-d0c8-4109-8cb3-f1799ce5f996-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 09:27:55 crc kubenswrapper[4776]: I1208 09:27:55.747937 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcbcd000-25be-4f44-8114-7602c348b58d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:27:55 crc kubenswrapper[4776]: I1208 09:27:55.747948 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c51b16c8-d0c8-4109-8cb3-f1799ce5f996-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:27:55 crc kubenswrapper[4776]: I1208 09:27:55.747958 4776 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcbcd000-25be-4f44-8114-7602c348b58d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 08 09:27:55 crc kubenswrapper[4776]: I1208 09:27:55.747966 4776 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/dcbcd000-25be-4f44-8114-7602c348b58d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 08 09:27:55 crc kubenswrapper[4776]: I1208 09:27:55.765794 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c51b16c8-d0c8-4109-8cb3-f1799ce5f996-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c51b16c8-d0c8-4109-8cb3-f1799ce5f996" (UID: "c51b16c8-d0c8-4109-8cb3-f1799ce5f996"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:27:55 crc kubenswrapper[4776]: I1208 09:27:55.768120 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c51b16c8-d0c8-4109-8cb3-f1799ce5f996-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c51b16c8-d0c8-4109-8cb3-f1799ce5f996" (UID: "c51b16c8-d0c8-4109-8cb3-f1799ce5f996"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:27:55 crc kubenswrapper[4776]: I1208 09:27:55.849907 4776 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c51b16c8-d0c8-4109-8cb3-f1799ce5f996-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 08 09:27:55 crc kubenswrapper[4776]: I1208 09:27:55.849936 4776 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c51b16c8-d0c8-4109-8cb3-f1799ce5f996-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 08 09:27:56 crc kubenswrapper[4776]: I1208 09:27:56.152445 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-7b4dd4bc68-bwx4q"] Dec 08 09:27:56 crc kubenswrapper[4776]: I1208 09:27:56.169243 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-7b4dd4bc68-bwx4q"] Dec 08 09:27:56 crc kubenswrapper[4776]: I1208 09:27:56.356851 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="dcbcd000-25be-4f44-8114-7602c348b58d" path="/var/lib/kubelet/pods/dcbcd000-25be-4f44-8114-7602c348b58d/volumes" Dec 08 09:27:56 crc kubenswrapper[4776]: I1208 09:27:56.405430 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"87931091-7230-4451-9d94-20ac4b8458bc","Type":"ContainerStarted","Data":"7de2cd265f11eaf55214a8c085a5effc820c4ff674928a3ea29a4cb1bc312fe3"} Dec 08 09:27:56 crc kubenswrapper[4776]: I1208 09:27:56.406451 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 08 09:27:56 crc kubenswrapper[4776]: I1208 09:27:56.409515 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h9hfq" event={"ID":"8b119f36-1ae0-4826-8043-4e038e4398a3","Type":"ContainerStarted","Data":"30c4e16130370d67ff944f7e92d94f8a046bdb255843794fbc6d6869d70d79c7"} Dec 08 09:27:56 crc kubenswrapper[4776]: I1208 09:27:56.411720 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-58d55cd687-srfgj" Dec 08 09:27:56 crc kubenswrapper[4776]: I1208 09:27:56.411726 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ab6303ff-9104-40ed-babe-1445f4cd89e2","Type":"ContainerStarted","Data":"9d5585698aa2f6a45d63dff32daf999b3473869f46a18bb94bcc01b2c5440ad3"} Dec 08 09:27:56 crc kubenswrapper[4776]: I1208 09:27:56.412088 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 08 09:27:56 crc kubenswrapper[4776]: I1208 09:27:56.438502 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=43.438481771 podStartE2EDuration="43.438481771s" podCreationTimestamp="2025-12-08 09:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:27:56.432671855 +0000 UTC m=+1752.695896897" watchObservedRunningTime="2025-12-08 09:27:56.438481771 +0000 UTC m=+1752.701706793" Dec 08 09:27:56 crc kubenswrapper[4776]: I1208 09:27:56.452071 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h9hfq" podStartSLOduration=2.38441382 podStartE2EDuration="11.452051026s" podCreationTimestamp="2025-12-08 09:27:45 +0000 UTC" firstStartedPulling="2025-12-08 09:27:46.080589026 +0000 UTC m=+1742.343814048" lastFinishedPulling="2025-12-08 09:27:55.148226232 +0000 UTC m=+1751.411451254" observedRunningTime="2025-12-08 09:27:56.448711786 +0000 UTC m=+1752.711936808" watchObservedRunningTime="2025-12-08 09:27:56.452051026 +0000 UTC m=+1752.715276048" Dec 08 09:27:56 crc kubenswrapper[4776]: I1208 09:27:56.473717 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-58d55cd687-srfgj"] Dec 08 09:27:56 crc kubenswrapper[4776]: I1208 09:27:56.484360 4776 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-58d55cd687-srfgj"] Dec 08 09:27:56 crc kubenswrapper[4776]: I1208 09:27:56.496837 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=42.496821429 podStartE2EDuration="42.496821429s" podCreationTimestamp="2025-12-08 09:27:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 09:27:56.4875787 +0000 UTC m=+1752.750803722" watchObservedRunningTime="2025-12-08 09:27:56.496821429 +0000 UTC m=+1752.760046451" Dec 08 09:27:57 crc kubenswrapper[4776]: I1208 09:27:57.867463 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-47z2b"] Dec 08 09:27:57 crc kubenswrapper[4776]: I1208 09:27:57.882811 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-47z2b"] Dec 08 09:27:57 crc kubenswrapper[4776]: I1208 09:27:57.915698 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-x7p68"] Dec 08 09:27:57 crc kubenswrapper[4776]: E1208 09:27:57.916269 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c51b16c8-d0c8-4109-8cb3-f1799ce5f996" containerName="heat-cfnapi" Dec 08 09:27:57 crc kubenswrapper[4776]: I1208 09:27:57.916286 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="c51b16c8-d0c8-4109-8cb3-f1799ce5f996" containerName="heat-cfnapi" Dec 08 09:27:57 crc kubenswrapper[4776]: E1208 09:27:57.916342 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcbcd000-25be-4f44-8114-7602c348b58d" containerName="heat-api" Dec 08 09:27:57 crc kubenswrapper[4776]: I1208 09:27:57.916348 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcbcd000-25be-4f44-8114-7602c348b58d" containerName="heat-api" Dec 08 09:27:57 crc kubenswrapper[4776]: I1208 09:27:57.916588 4776 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="dcbcd000-25be-4f44-8114-7602c348b58d" containerName="heat-api" Dec 08 09:27:57 crc kubenswrapper[4776]: I1208 09:27:57.916610 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="c51b16c8-d0c8-4109-8cb3-f1799ce5f996" containerName="heat-cfnapi" Dec 08 09:27:57 crc kubenswrapper[4776]: I1208 09:27:57.917627 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-x7p68" Dec 08 09:27:57 crc kubenswrapper[4776]: I1208 09:27:57.919637 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 08 09:27:57 crc kubenswrapper[4776]: I1208 09:27:57.928129 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-x7p68"] Dec 08 09:27:58 crc kubenswrapper[4776]: I1208 09:27:58.023367 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwt7m\" (UniqueName: \"kubernetes.io/projected/ec813763-b43e-4a53-a048-615c313d130a-kube-api-access-zwt7m\") pod \"aodh-db-sync-x7p68\" (UID: \"ec813763-b43e-4a53-a048-615c313d130a\") " pod="openstack/aodh-db-sync-x7p68" Dec 08 09:27:58 crc kubenswrapper[4776]: I1208 09:27:58.023433 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec813763-b43e-4a53-a048-615c313d130a-scripts\") pod \"aodh-db-sync-x7p68\" (UID: \"ec813763-b43e-4a53-a048-615c313d130a\") " pod="openstack/aodh-db-sync-x7p68" Dec 08 09:27:58 crc kubenswrapper[4776]: I1208 09:27:58.023587 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec813763-b43e-4a53-a048-615c313d130a-config-data\") pod \"aodh-db-sync-x7p68\" (UID: \"ec813763-b43e-4a53-a048-615c313d130a\") " pod="openstack/aodh-db-sync-x7p68" Dec 08 09:27:58 crc kubenswrapper[4776]: I1208 09:27:58.023671 4776 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec813763-b43e-4a53-a048-615c313d130a-combined-ca-bundle\") pod \"aodh-db-sync-x7p68\" (UID: \"ec813763-b43e-4a53-a048-615c313d130a\") " pod="openstack/aodh-db-sync-x7p68" Dec 08 09:27:58 crc kubenswrapper[4776]: I1208 09:27:58.126236 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec813763-b43e-4a53-a048-615c313d130a-config-data\") pod \"aodh-db-sync-x7p68\" (UID: \"ec813763-b43e-4a53-a048-615c313d130a\") " pod="openstack/aodh-db-sync-x7p68" Dec 08 09:27:58 crc kubenswrapper[4776]: I1208 09:27:58.127140 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec813763-b43e-4a53-a048-615c313d130a-combined-ca-bundle\") pod \"aodh-db-sync-x7p68\" (UID: \"ec813763-b43e-4a53-a048-615c313d130a\") " pod="openstack/aodh-db-sync-x7p68" Dec 08 09:27:58 crc kubenswrapper[4776]: I1208 09:27:58.127382 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwt7m\" (UniqueName: \"kubernetes.io/projected/ec813763-b43e-4a53-a048-615c313d130a-kube-api-access-zwt7m\") pod \"aodh-db-sync-x7p68\" (UID: \"ec813763-b43e-4a53-a048-615c313d130a\") " pod="openstack/aodh-db-sync-x7p68" Dec 08 09:27:58 crc kubenswrapper[4776]: I1208 09:27:58.127456 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec813763-b43e-4a53-a048-615c313d130a-scripts\") pod \"aodh-db-sync-x7p68\" (UID: \"ec813763-b43e-4a53-a048-615c313d130a\") " pod="openstack/aodh-db-sync-x7p68" Dec 08 09:27:58 crc kubenswrapper[4776]: I1208 09:27:58.132247 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ec813763-b43e-4a53-a048-615c313d130a-config-data\") pod \"aodh-db-sync-x7p68\" (UID: \"ec813763-b43e-4a53-a048-615c313d130a\") " pod="openstack/aodh-db-sync-x7p68" Dec 08 09:27:58 crc kubenswrapper[4776]: I1208 09:27:58.147141 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwt7m\" (UniqueName: \"kubernetes.io/projected/ec813763-b43e-4a53-a048-615c313d130a-kube-api-access-zwt7m\") pod \"aodh-db-sync-x7p68\" (UID: \"ec813763-b43e-4a53-a048-615c313d130a\") " pod="openstack/aodh-db-sync-x7p68" Dec 08 09:27:58 crc kubenswrapper[4776]: I1208 09:27:58.147705 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec813763-b43e-4a53-a048-615c313d130a-scripts\") pod \"aodh-db-sync-x7p68\" (UID: \"ec813763-b43e-4a53-a048-615c313d130a\") " pod="openstack/aodh-db-sync-x7p68" Dec 08 09:27:58 crc kubenswrapper[4776]: I1208 09:27:58.150374 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec813763-b43e-4a53-a048-615c313d130a-combined-ca-bundle\") pod \"aodh-db-sync-x7p68\" (UID: \"ec813763-b43e-4a53-a048-615c313d130a\") " pod="openstack/aodh-db-sync-x7p68" Dec 08 09:27:58 crc kubenswrapper[4776]: I1208 09:27:58.241276 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-x7p68" Dec 08 09:27:58 crc kubenswrapper[4776]: I1208 09:27:58.395648 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f3c7425-49ed-4491-9422-4d50616e53c4" path="/var/lib/kubelet/pods/4f3c7425-49ed-4491-9422-4d50616e53c4/volumes" Dec 08 09:27:58 crc kubenswrapper[4776]: I1208 09:27:58.396739 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c51b16c8-d0c8-4109-8cb3-f1799ce5f996" path="/var/lib/kubelet/pods/c51b16c8-d0c8-4109-8cb3-f1799ce5f996/volumes" Dec 08 09:27:58 crc kubenswrapper[4776]: W1208 09:27:58.757562 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec813763_b43e_4a53_a048_615c313d130a.slice/crio-4e7c83c397e29508263fdbaf7d23fb599b7b77f9c5a899c59d6869e9050302e7 WatchSource:0}: Error finding container 4e7c83c397e29508263fdbaf7d23fb599b7b77f9c5a899c59d6869e9050302e7: Status 404 returned error can't find the container with id 4e7c83c397e29508263fdbaf7d23fb599b7b77f9c5a899c59d6869e9050302e7 Dec 08 09:27:58 crc kubenswrapper[4776]: I1208 09:27:58.769272 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-x7p68"] Dec 08 09:27:59 crc kubenswrapper[4776]: I1208 09:27:59.465549 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-x7p68" event={"ID":"ec813763-b43e-4a53-a048-615c313d130a","Type":"ContainerStarted","Data":"4e7c83c397e29508263fdbaf7d23fb599b7b77f9c5a899c59d6869e9050302e7"} Dec 08 09:28:01 crc kubenswrapper[4776]: I1208 09:28:01.284673 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-7b4dd4bc68-bwx4q" podUID="dcbcd000-25be-4f44-8114-7602c348b58d" containerName="heat-api" probeResult="failure" output="Get \"https://10.217.0.223:8004/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 08 09:28:01 crc 
kubenswrapper[4776]: I1208 09:28:01.376524 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-58d55cd687-srfgj" podUID="c51b16c8-d0c8-4109-8cb3-f1799ce5f996" containerName="heat-cfnapi" probeResult="failure" output="Get \"https://10.217.0.224:8000/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 08 09:28:04 crc kubenswrapper[4776]: I1208 09:28:04.555257 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-x7p68" event={"ID":"ec813763-b43e-4a53-a048-615c313d130a","Type":"ContainerStarted","Data":"d85a5b1b9612ad13bce283313fe73d9e689f5e6f3d170da35fef9e8755b654bf"} Dec 08 09:28:05 crc kubenswrapper[4776]: I1208 09:28:05.040531 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="ab6303ff-9104-40ed-babe-1445f4cd89e2" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.10:5671: connect: connection refused" Dec 08 09:28:05 crc kubenswrapper[4776]: E1208 09:28:05.437072 4776 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6aa12b16012116b86c2b0c97e985bec0ebaeb9a366799a44caf810d29f286456" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 08 09:28:05 crc kubenswrapper[4776]: E1208 09:28:05.439061 4776 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6aa12b16012116b86c2b0c97e985bec0ebaeb9a366799a44caf810d29f286456" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 08 09:28:05 crc kubenswrapper[4776]: E1208 09:28:05.440927 4776 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: 
container is stopping, stdout: , stderr: , exit code -1" containerID="6aa12b16012116b86c2b0c97e985bec0ebaeb9a366799a44caf810d29f286456" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 08 09:28:05 crc kubenswrapper[4776]: E1208 09:28:05.441023 4776 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-5ccd9d555d-m9chd" podUID="40b6ce41-e108-47bf-bc38-34e8c475b413" containerName="heat-engine" Dec 08 09:28:06 crc kubenswrapper[4776]: I1208 09:28:06.344357 4776 scope.go:117] "RemoveContainer" containerID="bd00cf9685c68dc97ea68681aad0bfeeb2d56965a763cd3cffb3c7d3789a7341" Dec 08 09:28:06 crc kubenswrapper[4776]: E1208 09:28:06.345395 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:28:06 crc kubenswrapper[4776]: I1208 09:28:06.582689 4776 generic.go:334] "Generic (PLEG): container finished" podID="8b119f36-1ae0-4826-8043-4e038e4398a3" containerID="30c4e16130370d67ff944f7e92d94f8a046bdb255843794fbc6d6869d70d79c7" exitCode=0 Dec 08 09:28:06 crc kubenswrapper[4776]: I1208 09:28:06.582796 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h9hfq" event={"ID":"8b119f36-1ae0-4826-8043-4e038e4398a3","Type":"ContainerDied","Data":"30c4e16130370d67ff944f7e92d94f8a046bdb255843794fbc6d6869d70d79c7"} Dec 08 09:28:06 crc kubenswrapper[4776]: I1208 09:28:06.585064 4776 generic.go:334] "Generic (PLEG): container finished" podID="ec813763-b43e-4a53-a048-615c313d130a" 
containerID="d85a5b1b9612ad13bce283313fe73d9e689f5e6f3d170da35fef9e8755b654bf" exitCode=0 Dec 08 09:28:06 crc kubenswrapper[4776]: I1208 09:28:06.585096 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-x7p68" event={"ID":"ec813763-b43e-4a53-a048-615c313d130a","Type":"ContainerDied","Data":"d85a5b1b9612ad13bce283313fe73d9e689f5e6f3d170da35fef9e8755b654bf"} Dec 08 09:28:06 crc kubenswrapper[4776]: I1208 09:28:06.610549 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-x7p68" podStartSLOduration=4.500998709 podStartE2EDuration="9.610528966s" podCreationTimestamp="2025-12-08 09:27:57 +0000 UTC" firstStartedPulling="2025-12-08 09:27:58.759883043 +0000 UTC m=+1755.023108055" lastFinishedPulling="2025-12-08 09:28:03.86941329 +0000 UTC m=+1760.132638312" observedRunningTime="2025-12-08 09:28:04.573973622 +0000 UTC m=+1760.837198684" watchObservedRunningTime="2025-12-08 09:28:06.610528966 +0000 UTC m=+1762.873753988" Dec 08 09:28:08 crc kubenswrapper[4776]: I1208 09:28:08.169834 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-x7p68" Dec 08 09:28:08 crc kubenswrapper[4776]: I1208 09:28:08.173639 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h9hfq" Dec 08 09:28:08 crc kubenswrapper[4776]: I1208 09:28:08.192093 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwt7m\" (UniqueName: \"kubernetes.io/projected/ec813763-b43e-4a53-a048-615c313d130a-kube-api-access-zwt7m\") pod \"ec813763-b43e-4a53-a048-615c313d130a\" (UID: \"ec813763-b43e-4a53-a048-615c313d130a\") " Dec 08 09:28:08 crc kubenswrapper[4776]: I1208 09:28:08.192306 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec813763-b43e-4a53-a048-615c313d130a-scripts\") pod \"ec813763-b43e-4a53-a048-615c313d130a\" (UID: \"ec813763-b43e-4a53-a048-615c313d130a\") " Dec 08 09:28:08 crc kubenswrapper[4776]: I1208 09:28:08.192361 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b119f36-1ae0-4826-8043-4e038e4398a3-repo-setup-combined-ca-bundle\") pod \"8b119f36-1ae0-4826-8043-4e038e4398a3\" (UID: \"8b119f36-1ae0-4826-8043-4e038e4398a3\") " Dec 08 09:28:08 crc kubenswrapper[4776]: I1208 09:28:08.192425 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec813763-b43e-4a53-a048-615c313d130a-combined-ca-bundle\") pod \"ec813763-b43e-4a53-a048-615c313d130a\" (UID: \"ec813763-b43e-4a53-a048-615c313d130a\") " Dec 08 09:28:08 crc kubenswrapper[4776]: I1208 09:28:08.192453 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b119f36-1ae0-4826-8043-4e038e4398a3-inventory\") pod \"8b119f36-1ae0-4826-8043-4e038e4398a3\" (UID: \"8b119f36-1ae0-4826-8043-4e038e4398a3\") " Dec 08 09:28:08 crc kubenswrapper[4776]: I1208 09:28:08.192494 4776 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-p7cfc\" (UniqueName: \"kubernetes.io/projected/8b119f36-1ae0-4826-8043-4e038e4398a3-kube-api-access-p7cfc\") pod \"8b119f36-1ae0-4826-8043-4e038e4398a3\" (UID: \"8b119f36-1ae0-4826-8043-4e038e4398a3\") " Dec 08 09:28:08 crc kubenswrapper[4776]: I1208 09:28:08.192557 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8b119f36-1ae0-4826-8043-4e038e4398a3-ssh-key\") pod \"8b119f36-1ae0-4826-8043-4e038e4398a3\" (UID: \"8b119f36-1ae0-4826-8043-4e038e4398a3\") " Dec 08 09:28:08 crc kubenswrapper[4776]: I1208 09:28:08.193556 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec813763-b43e-4a53-a048-615c313d130a-config-data\") pod \"ec813763-b43e-4a53-a048-615c313d130a\" (UID: \"ec813763-b43e-4a53-a048-615c313d130a\") " Dec 08 09:28:08 crc kubenswrapper[4776]: I1208 09:28:08.200267 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b119f36-1ae0-4826-8043-4e038e4398a3-kube-api-access-p7cfc" (OuterVolumeSpecName: "kube-api-access-p7cfc") pod "8b119f36-1ae0-4826-8043-4e038e4398a3" (UID: "8b119f36-1ae0-4826-8043-4e038e4398a3"). InnerVolumeSpecName "kube-api-access-p7cfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:28:08 crc kubenswrapper[4776]: I1208 09:28:08.201334 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b119f36-1ae0-4826-8043-4e038e4398a3-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "8b119f36-1ae0-4826-8043-4e038e4398a3" (UID: "8b119f36-1ae0-4826-8043-4e038e4398a3"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:28:08 crc kubenswrapper[4776]: I1208 09:28:08.203699 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec813763-b43e-4a53-a048-615c313d130a-kube-api-access-zwt7m" (OuterVolumeSpecName: "kube-api-access-zwt7m") pod "ec813763-b43e-4a53-a048-615c313d130a" (UID: "ec813763-b43e-4a53-a048-615c313d130a"). InnerVolumeSpecName "kube-api-access-zwt7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:28:08 crc kubenswrapper[4776]: I1208 09:28:08.217520 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec813763-b43e-4a53-a048-615c313d130a-scripts" (OuterVolumeSpecName: "scripts") pod "ec813763-b43e-4a53-a048-615c313d130a" (UID: "ec813763-b43e-4a53-a048-615c313d130a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:28:08 crc kubenswrapper[4776]: I1208 09:28:08.253066 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b119f36-1ae0-4826-8043-4e038e4398a3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8b119f36-1ae0-4826-8043-4e038e4398a3" (UID: "8b119f36-1ae0-4826-8043-4e038e4398a3"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:28:08 crc kubenswrapper[4776]: I1208 09:28:08.260412 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b119f36-1ae0-4826-8043-4e038e4398a3-inventory" (OuterVolumeSpecName: "inventory") pod "8b119f36-1ae0-4826-8043-4e038e4398a3" (UID: "8b119f36-1ae0-4826-8043-4e038e4398a3"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:28:08 crc kubenswrapper[4776]: I1208 09:28:08.278741 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec813763-b43e-4a53-a048-615c313d130a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec813763-b43e-4a53-a048-615c313d130a" (UID: "ec813763-b43e-4a53-a048-615c313d130a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:28:08 crc kubenswrapper[4776]: I1208 09:28:08.281371 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec813763-b43e-4a53-a048-615c313d130a-config-data" (OuterVolumeSpecName: "config-data") pod "ec813763-b43e-4a53-a048-615c313d130a" (UID: "ec813763-b43e-4a53-a048-615c313d130a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:28:08 crc kubenswrapper[4776]: I1208 09:28:08.297304 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwt7m\" (UniqueName: \"kubernetes.io/projected/ec813763-b43e-4a53-a048-615c313d130a-kube-api-access-zwt7m\") on node \"crc\" DevicePath \"\"" Dec 08 09:28:08 crc kubenswrapper[4776]: I1208 09:28:08.297353 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec813763-b43e-4a53-a048-615c313d130a-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 09:28:08 crc kubenswrapper[4776]: I1208 09:28:08.297369 4776 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b119f36-1ae0-4826-8043-4e038e4398a3-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:28:08 crc kubenswrapper[4776]: I1208 09:28:08.297389 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec813763-b43e-4a53-a048-615c313d130a-combined-ca-bundle\") on 
node \"crc\" DevicePath \"\"" Dec 08 09:28:08 crc kubenswrapper[4776]: I1208 09:28:08.297404 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b119f36-1ae0-4826-8043-4e038e4398a3-inventory\") on node \"crc\" DevicePath \"\"" Dec 08 09:28:08 crc kubenswrapper[4776]: I1208 09:28:08.297416 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7cfc\" (UniqueName: \"kubernetes.io/projected/8b119f36-1ae0-4826-8043-4e038e4398a3-kube-api-access-p7cfc\") on node \"crc\" DevicePath \"\"" Dec 08 09:28:08 crc kubenswrapper[4776]: I1208 09:28:08.297429 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8b119f36-1ae0-4826-8043-4e038e4398a3-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 08 09:28:08 crc kubenswrapper[4776]: I1208 09:28:08.297440 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec813763-b43e-4a53-a048-615c313d130a-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 09:28:08 crc kubenswrapper[4776]: I1208 09:28:08.626309 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h9hfq" event={"ID":"8b119f36-1ae0-4826-8043-4e038e4398a3","Type":"ContainerDied","Data":"1e4ec87896f4f67be5e681ccd3095108f2f476916e6961dbb38f43ad494b15bd"} Dec 08 09:28:08 crc kubenswrapper[4776]: I1208 09:28:08.626356 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e4ec87896f4f67be5e681ccd3095108f2f476916e6961dbb38f43ad494b15bd" Dec 08 09:28:08 crc kubenswrapper[4776]: I1208 09:28:08.626417 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h9hfq" Dec 08 09:28:08 crc kubenswrapper[4776]: I1208 09:28:08.633100 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-x7p68" event={"ID":"ec813763-b43e-4a53-a048-615c313d130a","Type":"ContainerDied","Data":"4e7c83c397e29508263fdbaf7d23fb599b7b77f9c5a899c59d6869e9050302e7"} Dec 08 09:28:08 crc kubenswrapper[4776]: I1208 09:28:08.633146 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e7c83c397e29508263fdbaf7d23fb599b7b77f9c5a899c59d6869e9050302e7" Dec 08 09:28:08 crc kubenswrapper[4776]: I1208 09:28:08.633233 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-x7p68" Dec 08 09:28:08 crc kubenswrapper[4776]: I1208 09:28:08.733625 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-xbms7"] Dec 08 09:28:08 crc kubenswrapper[4776]: E1208 09:28:08.734540 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec813763-b43e-4a53-a048-615c313d130a" containerName="aodh-db-sync" Dec 08 09:28:08 crc kubenswrapper[4776]: I1208 09:28:08.734574 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec813763-b43e-4a53-a048-615c313d130a" containerName="aodh-db-sync" Dec 08 09:28:08 crc kubenswrapper[4776]: E1208 09:28:08.734610 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b119f36-1ae0-4826-8043-4e038e4398a3" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 08 09:28:08 crc kubenswrapper[4776]: I1208 09:28:08.734623 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b119f36-1ae0-4826-8043-4e038e4398a3" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 08 09:28:08 crc kubenswrapper[4776]: I1208 09:28:08.735079 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec813763-b43e-4a53-a048-615c313d130a" 
containerName="aodh-db-sync" Dec 08 09:28:08 crc kubenswrapper[4776]: I1208 09:28:08.735111 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b119f36-1ae0-4826-8043-4e038e4398a3" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 08 09:28:08 crc kubenswrapper[4776]: I1208 09:28:08.736462 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xbms7" Dec 08 09:28:08 crc kubenswrapper[4776]: I1208 09:28:08.738548 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tm845" Dec 08 09:28:08 crc kubenswrapper[4776]: I1208 09:28:08.739920 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 08 09:28:08 crc kubenswrapper[4776]: I1208 09:28:08.740311 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 08 09:28:08 crc kubenswrapper[4776]: I1208 09:28:08.740502 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 08 09:28:08 crc kubenswrapper[4776]: I1208 09:28:08.746956 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-xbms7"] Dec 08 09:28:08 crc kubenswrapper[4776]: I1208 09:28:08.817085 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9419c01b-956b-4781-a8bf-e2e1472ad2cf-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-xbms7\" (UID: \"9419c01b-956b-4781-a8bf-e2e1472ad2cf\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xbms7" Dec 08 09:28:08 crc kubenswrapper[4776]: I1208 09:28:08.817531 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hs9p\" 
(UniqueName: \"kubernetes.io/projected/9419c01b-956b-4781-a8bf-e2e1472ad2cf-kube-api-access-7hs9p\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-xbms7\" (UID: \"9419c01b-956b-4781-a8bf-e2e1472ad2cf\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xbms7" Dec 08 09:28:08 crc kubenswrapper[4776]: I1208 09:28:08.817574 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9419c01b-956b-4781-a8bf-e2e1472ad2cf-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-xbms7\" (UID: \"9419c01b-956b-4781-a8bf-e2e1472ad2cf\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xbms7" Dec 08 09:28:08 crc kubenswrapper[4776]: I1208 09:28:08.920340 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9419c01b-956b-4781-a8bf-e2e1472ad2cf-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-xbms7\" (UID: \"9419c01b-956b-4781-a8bf-e2e1472ad2cf\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xbms7" Dec 08 09:28:08 crc kubenswrapper[4776]: I1208 09:28:08.920473 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hs9p\" (UniqueName: \"kubernetes.io/projected/9419c01b-956b-4781-a8bf-e2e1472ad2cf-kube-api-access-7hs9p\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-xbms7\" (UID: \"9419c01b-956b-4781-a8bf-e2e1472ad2cf\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xbms7" Dec 08 09:28:08 crc kubenswrapper[4776]: I1208 09:28:08.920562 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9419c01b-956b-4781-a8bf-e2e1472ad2cf-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-xbms7\" (UID: \"9419c01b-956b-4781-a8bf-e2e1472ad2cf\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xbms7" Dec 08 09:28:08 crc 
kubenswrapper[4776]: I1208 09:28:08.926981 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9419c01b-956b-4781-a8bf-e2e1472ad2cf-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-xbms7\" (UID: \"9419c01b-956b-4781-a8bf-e2e1472ad2cf\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xbms7" Dec 08 09:28:08 crc kubenswrapper[4776]: I1208 09:28:08.933670 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9419c01b-956b-4781-a8bf-e2e1472ad2cf-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-xbms7\" (UID: \"9419c01b-956b-4781-a8bf-e2e1472ad2cf\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xbms7" Dec 08 09:28:08 crc kubenswrapper[4776]: I1208 09:28:08.942601 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hs9p\" (UniqueName: \"kubernetes.io/projected/9419c01b-956b-4781-a8bf-e2e1472ad2cf-kube-api-access-7hs9p\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-xbms7\" (UID: \"9419c01b-956b-4781-a8bf-e2e1472ad2cf\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xbms7" Dec 08 09:28:09 crc kubenswrapper[4776]: I1208 09:28:09.065025 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xbms7" Dec 08 09:28:09 crc kubenswrapper[4776]: I1208 09:28:09.655130 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-xbms7"] Dec 08 09:28:10 crc kubenswrapper[4776]: I1208 09:28:10.666369 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xbms7" event={"ID":"9419c01b-956b-4781-a8bf-e2e1472ad2cf","Type":"ContainerStarted","Data":"383255c2689533341929353a6a884c3caba0b5cca3ea9a1faa163bac3d8b920e"} Dec 08 09:28:10 crc kubenswrapper[4776]: I1208 09:28:10.666867 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xbms7" event={"ID":"9419c01b-956b-4781-a8bf-e2e1472ad2cf","Type":"ContainerStarted","Data":"6ec2e5aced00467792b6b58023d2a05c6252b05062c12c8049e31be07d75a862"} Dec 08 09:28:10 crc kubenswrapper[4776]: I1208 09:28:10.688200 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xbms7" podStartSLOduration=2.211026546 podStartE2EDuration="2.688165195s" podCreationTimestamp="2025-12-08 09:28:08 +0000 UTC" firstStartedPulling="2025-12-08 09:28:09.663768378 +0000 UTC m=+1765.926993400" lastFinishedPulling="2025-12-08 09:28:10.140906997 +0000 UTC m=+1766.404132049" observedRunningTime="2025-12-08 09:28:10.687574808 +0000 UTC m=+1766.950799850" watchObservedRunningTime="2025-12-08 09:28:10.688165195 +0000 UTC m=+1766.951390237" Dec 08 09:28:12 crc kubenswrapper[4776]: I1208 09:28:12.934152 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Dec 08 09:28:12 crc kubenswrapper[4776]: I1208 09:28:12.938112 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="e533f562-6dd5-4117-8d18-f2d222228480" containerName="aodh-api" 
containerID="cri-o://604f0b0b6825f943aa930e04542ae21e37a278b9e5e1123fed6619d9245ec23f" gracePeriod=30 Dec 08 09:28:12 crc kubenswrapper[4776]: I1208 09:28:12.938187 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="e533f562-6dd5-4117-8d18-f2d222228480" containerName="aodh-notifier" containerID="cri-o://81d4884f440dc5ffde16685ff694e3ef6dfd968617a01d1bc7a0406b840060bf" gracePeriod=30 Dec 08 09:28:12 crc kubenswrapper[4776]: I1208 09:28:12.938168 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="e533f562-6dd5-4117-8d18-f2d222228480" containerName="aodh-evaluator" containerID="cri-o://a75bd7e41751d9eb41786a0d1e3838a9db01099b40b5389c62ea420502eb97f9" gracePeriod=30 Dec 08 09:28:12 crc kubenswrapper[4776]: I1208 09:28:12.938241 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="e533f562-6dd5-4117-8d18-f2d222228480" containerName="aodh-listener" containerID="cri-o://264a3383a36980d388776beba83c3af3e71c83b9da01be1a395720f49b840dbd" gracePeriod=30 Dec 08 09:28:13 crc kubenswrapper[4776]: I1208 09:28:13.704963 4776 generic.go:334] "Generic (PLEG): container finished" podID="e533f562-6dd5-4117-8d18-f2d222228480" containerID="a75bd7e41751d9eb41786a0d1e3838a9db01099b40b5389c62ea420502eb97f9" exitCode=0 Dec 08 09:28:13 crc kubenswrapper[4776]: I1208 09:28:13.705316 4776 generic.go:334] "Generic (PLEG): container finished" podID="e533f562-6dd5-4117-8d18-f2d222228480" containerID="604f0b0b6825f943aa930e04542ae21e37a278b9e5e1123fed6619d9245ec23f" exitCode=0 Dec 08 09:28:13 crc kubenswrapper[4776]: I1208 09:28:13.705052 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e533f562-6dd5-4117-8d18-f2d222228480","Type":"ContainerDied","Data":"a75bd7e41751d9eb41786a0d1e3838a9db01099b40b5389c62ea420502eb97f9"} Dec 08 09:28:13 crc kubenswrapper[4776]: I1208 09:28:13.705391 4776 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e533f562-6dd5-4117-8d18-f2d222228480","Type":"ContainerDied","Data":"604f0b0b6825f943aa930e04542ae21e37a278b9e5e1123fed6619d9245ec23f"} Dec 08 09:28:13 crc kubenswrapper[4776]: I1208 09:28:13.707132 4776 generic.go:334] "Generic (PLEG): container finished" podID="40b6ce41-e108-47bf-bc38-34e8c475b413" containerID="6aa12b16012116b86c2b0c97e985bec0ebaeb9a366799a44caf810d29f286456" exitCode=0 Dec 08 09:28:13 crc kubenswrapper[4776]: I1208 09:28:13.707216 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5ccd9d555d-m9chd" event={"ID":"40b6ce41-e108-47bf-bc38-34e8c475b413","Type":"ContainerDied","Data":"6aa12b16012116b86c2b0c97e985bec0ebaeb9a366799a44caf810d29f286456"} Dec 08 09:28:13 crc kubenswrapper[4776]: I1208 09:28:13.707250 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5ccd9d555d-m9chd" event={"ID":"40b6ce41-e108-47bf-bc38-34e8c475b413","Type":"ContainerDied","Data":"7bfdc9e480299420f87cbe5b57642b47aa8cc68291d1cae955a6e9e78baa4707"} Dec 08 09:28:13 crc kubenswrapper[4776]: I1208 09:28:13.707264 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7bfdc9e480299420f87cbe5b57642b47aa8cc68291d1cae955a6e9e78baa4707" Dec 08 09:28:13 crc kubenswrapper[4776]: I1208 09:28:13.708625 4776 generic.go:334] "Generic (PLEG): container finished" podID="9419c01b-956b-4781-a8bf-e2e1472ad2cf" containerID="383255c2689533341929353a6a884c3caba0b5cca3ea9a1faa163bac3d8b920e" exitCode=0 Dec 08 09:28:13 crc kubenswrapper[4776]: I1208 09:28:13.708664 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xbms7" event={"ID":"9419c01b-956b-4781-a8bf-e2e1472ad2cf","Type":"ContainerDied","Data":"383255c2689533341929353a6a884c3caba0b5cca3ea9a1faa163bac3d8b920e"} Dec 08 09:28:13 crc kubenswrapper[4776]: I1208 09:28:13.758740 4776 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-5ccd9d555d-m9chd" Dec 08 09:28:13 crc kubenswrapper[4776]: I1208 09:28:13.854968 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/40b6ce41-e108-47bf-bc38-34e8c475b413-config-data-custom\") pod \"40b6ce41-e108-47bf-bc38-34e8c475b413\" (UID: \"40b6ce41-e108-47bf-bc38-34e8c475b413\") " Dec 08 09:28:13 crc kubenswrapper[4776]: I1208 09:28:13.855063 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40b6ce41-e108-47bf-bc38-34e8c475b413-config-data\") pod \"40b6ce41-e108-47bf-bc38-34e8c475b413\" (UID: \"40b6ce41-e108-47bf-bc38-34e8c475b413\") " Dec 08 09:28:13 crc kubenswrapper[4776]: I1208 09:28:13.855099 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40b6ce41-e108-47bf-bc38-34e8c475b413-combined-ca-bundle\") pod \"40b6ce41-e108-47bf-bc38-34e8c475b413\" (UID: \"40b6ce41-e108-47bf-bc38-34e8c475b413\") " Dec 08 09:28:13 crc kubenswrapper[4776]: I1208 09:28:13.855222 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qw7v\" (UniqueName: \"kubernetes.io/projected/40b6ce41-e108-47bf-bc38-34e8c475b413-kube-api-access-9qw7v\") pod \"40b6ce41-e108-47bf-bc38-34e8c475b413\" (UID: \"40b6ce41-e108-47bf-bc38-34e8c475b413\") " Dec 08 09:28:13 crc kubenswrapper[4776]: I1208 09:28:13.861949 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40b6ce41-e108-47bf-bc38-34e8c475b413-kube-api-access-9qw7v" (OuterVolumeSpecName: "kube-api-access-9qw7v") pod "40b6ce41-e108-47bf-bc38-34e8c475b413" (UID: "40b6ce41-e108-47bf-bc38-34e8c475b413"). InnerVolumeSpecName "kube-api-access-9qw7v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:28:13 crc kubenswrapper[4776]: I1208 09:28:13.864366 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40b6ce41-e108-47bf-bc38-34e8c475b413-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "40b6ce41-e108-47bf-bc38-34e8c475b413" (UID: "40b6ce41-e108-47bf-bc38-34e8c475b413"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:28:13 crc kubenswrapper[4776]: I1208 09:28:13.899674 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40b6ce41-e108-47bf-bc38-34e8c475b413-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "40b6ce41-e108-47bf-bc38-34e8c475b413" (UID: "40b6ce41-e108-47bf-bc38-34e8c475b413"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:28:13 crc kubenswrapper[4776]: I1208 09:28:13.917840 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40b6ce41-e108-47bf-bc38-34e8c475b413-config-data" (OuterVolumeSpecName: "config-data") pod "40b6ce41-e108-47bf-bc38-34e8c475b413" (UID: "40b6ce41-e108-47bf-bc38-34e8c475b413"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:28:13 crc kubenswrapper[4776]: I1208 09:28:13.926078 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 08 09:28:13 crc kubenswrapper[4776]: I1208 09:28:13.958854 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qw7v\" (UniqueName: \"kubernetes.io/projected/40b6ce41-e108-47bf-bc38-34e8c475b413-kube-api-access-9qw7v\") on node \"crc\" DevicePath \"\"" Dec 08 09:28:13 crc kubenswrapper[4776]: I1208 09:28:13.958895 4776 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/40b6ce41-e108-47bf-bc38-34e8c475b413-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 08 09:28:13 crc kubenswrapper[4776]: I1208 09:28:13.958906 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40b6ce41-e108-47bf-bc38-34e8c475b413-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 09:28:13 crc kubenswrapper[4776]: I1208 09:28:13.958916 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40b6ce41-e108-47bf-bc38-34e8c475b413-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:28:14 crc kubenswrapper[4776]: I1208 09:28:14.088751 4776 scope.go:117] "RemoveContainer" containerID="e4db7bb2f46397a744e23d7ff7362310822592c317b2c16c4bbc3a47b8ca9134" Dec 08 09:28:14 crc kubenswrapper[4776]: I1208 09:28:14.139768 4776 scope.go:117] "RemoveContainer" containerID="71bf9bcd196e3a24f9f5769f157fb8a856da68c96d13c89c446ba0afcf74d052" Dec 08 09:28:14 crc kubenswrapper[4776]: I1208 09:28:14.165741 4776 scope.go:117] "RemoveContainer" containerID="60f3369aeb65ec7d79d738bd4cc339f62dfff7705f56cba17ee799e904726704" Dec 08 09:28:14 crc kubenswrapper[4776]: I1208 09:28:14.719598 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-5ccd9d555d-m9chd" Dec 08 09:28:14 crc kubenswrapper[4776]: I1208 09:28:14.756650 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-5ccd9d555d-m9chd"] Dec 08 09:28:14 crc kubenswrapper[4776]: I1208 09:28:14.770694 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-5ccd9d555d-m9chd"] Dec 08 09:28:15 crc kubenswrapper[4776]: I1208 09:28:15.041281 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 08 09:28:15 crc kubenswrapper[4776]: I1208 09:28:15.256944 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xbms7" Dec 08 09:28:15 crc kubenswrapper[4776]: I1208 09:28:15.296556 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9419c01b-956b-4781-a8bf-e2e1472ad2cf-ssh-key\") pod \"9419c01b-956b-4781-a8bf-e2e1472ad2cf\" (UID: \"9419c01b-956b-4781-a8bf-e2e1472ad2cf\") " Dec 08 09:28:15 crc kubenswrapper[4776]: I1208 09:28:15.297405 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hs9p\" (UniqueName: \"kubernetes.io/projected/9419c01b-956b-4781-a8bf-e2e1472ad2cf-kube-api-access-7hs9p\") pod \"9419c01b-956b-4781-a8bf-e2e1472ad2cf\" (UID: \"9419c01b-956b-4781-a8bf-e2e1472ad2cf\") " Dec 08 09:28:15 crc kubenswrapper[4776]: I1208 09:28:15.297444 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9419c01b-956b-4781-a8bf-e2e1472ad2cf-inventory\") pod \"9419c01b-956b-4781-a8bf-e2e1472ad2cf\" (UID: \"9419c01b-956b-4781-a8bf-e2e1472ad2cf\") " Dec 08 09:28:15 crc kubenswrapper[4776]: I1208 09:28:15.316537 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/9419c01b-956b-4781-a8bf-e2e1472ad2cf-kube-api-access-7hs9p" (OuterVolumeSpecName: "kube-api-access-7hs9p") pod "9419c01b-956b-4781-a8bf-e2e1472ad2cf" (UID: "9419c01b-956b-4781-a8bf-e2e1472ad2cf"). InnerVolumeSpecName "kube-api-access-7hs9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:28:15 crc kubenswrapper[4776]: I1208 09:28:15.371782 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9419c01b-956b-4781-a8bf-e2e1472ad2cf-inventory" (OuterVolumeSpecName: "inventory") pod "9419c01b-956b-4781-a8bf-e2e1472ad2cf" (UID: "9419c01b-956b-4781-a8bf-e2e1472ad2cf"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:28:15 crc kubenswrapper[4776]: I1208 09:28:15.388381 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9419c01b-956b-4781-a8bf-e2e1472ad2cf-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9419c01b-956b-4781-a8bf-e2e1472ad2cf" (UID: "9419c01b-956b-4781-a8bf-e2e1472ad2cf"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:28:15 crc kubenswrapper[4776]: I1208 09:28:15.400307 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hs9p\" (UniqueName: \"kubernetes.io/projected/9419c01b-956b-4781-a8bf-e2e1472ad2cf-kube-api-access-7hs9p\") on node \"crc\" DevicePath \"\"" Dec 08 09:28:15 crc kubenswrapper[4776]: I1208 09:28:15.400331 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9419c01b-956b-4781-a8bf-e2e1472ad2cf-inventory\") on node \"crc\" DevicePath \"\"" Dec 08 09:28:15 crc kubenswrapper[4776]: I1208 09:28:15.400340 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9419c01b-956b-4781-a8bf-e2e1472ad2cf-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 08 09:28:15 crc kubenswrapper[4776]: I1208 09:28:15.734652 4776 generic.go:334] "Generic (PLEG): container finished" podID="e533f562-6dd5-4117-8d18-f2d222228480" containerID="264a3383a36980d388776beba83c3af3e71c83b9da01be1a395720f49b840dbd" exitCode=0 Dec 08 09:28:15 crc kubenswrapper[4776]: I1208 09:28:15.734743 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e533f562-6dd5-4117-8d18-f2d222228480","Type":"ContainerDied","Data":"264a3383a36980d388776beba83c3af3e71c83b9da01be1a395720f49b840dbd"} Dec 08 09:28:15 crc kubenswrapper[4776]: I1208 09:28:15.736835 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xbms7" event={"ID":"9419c01b-956b-4781-a8bf-e2e1472ad2cf","Type":"ContainerDied","Data":"6ec2e5aced00467792b6b58023d2a05c6252b05062c12c8049e31be07d75a862"} Dec 08 09:28:15 crc kubenswrapper[4776]: I1208 09:28:15.736888 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ec2e5aced00467792b6b58023d2a05c6252b05062c12c8049e31be07d75a862" Dec 08 09:28:15 crc kubenswrapper[4776]: I1208 09:28:15.736970 
4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xbms7" Dec 08 09:28:15 crc kubenswrapper[4776]: I1208 09:28:15.824310 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mncj4"] Dec 08 09:28:15 crc kubenswrapper[4776]: E1208 09:28:15.824839 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40b6ce41-e108-47bf-bc38-34e8c475b413" containerName="heat-engine" Dec 08 09:28:15 crc kubenswrapper[4776]: I1208 09:28:15.824856 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="40b6ce41-e108-47bf-bc38-34e8c475b413" containerName="heat-engine" Dec 08 09:28:15 crc kubenswrapper[4776]: E1208 09:28:15.824892 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9419c01b-956b-4781-a8bf-e2e1472ad2cf" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 08 09:28:15 crc kubenswrapper[4776]: I1208 09:28:15.824900 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="9419c01b-956b-4781-a8bf-e2e1472ad2cf" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 08 09:28:15 crc kubenswrapper[4776]: I1208 09:28:15.825102 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="9419c01b-956b-4781-a8bf-e2e1472ad2cf" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 08 09:28:15 crc kubenswrapper[4776]: I1208 09:28:15.825122 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="40b6ce41-e108-47bf-bc38-34e8c475b413" containerName="heat-engine" Dec 08 09:28:15 crc kubenswrapper[4776]: I1208 09:28:15.826000 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mncj4" Dec 08 09:28:15 crc kubenswrapper[4776]: I1208 09:28:15.828070 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 08 09:28:15 crc kubenswrapper[4776]: I1208 09:28:15.828769 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tm845" Dec 08 09:28:15 crc kubenswrapper[4776]: I1208 09:28:15.829742 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 08 09:28:15 crc kubenswrapper[4776]: I1208 09:28:15.830332 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 08 09:28:15 crc kubenswrapper[4776]: I1208 09:28:15.835051 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mncj4"] Dec 08 09:28:15 crc kubenswrapper[4776]: I1208 09:28:15.914604 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs2cw\" (UniqueName: \"kubernetes.io/projected/2304e249-86bc-4b0a-a222-e8c2ba39a0bb-kube-api-access-bs2cw\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mncj4\" (UID: \"2304e249-86bc-4b0a-a222-e8c2ba39a0bb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mncj4" Dec 08 09:28:15 crc kubenswrapper[4776]: I1208 09:28:15.915017 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2304e249-86bc-4b0a-a222-e8c2ba39a0bb-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mncj4\" (UID: \"2304e249-86bc-4b0a-a222-e8c2ba39a0bb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mncj4" Dec 08 09:28:15 crc kubenswrapper[4776]: I1208 09:28:15.915079 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2304e249-86bc-4b0a-a222-e8c2ba39a0bb-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mncj4\" (UID: \"2304e249-86bc-4b0a-a222-e8c2ba39a0bb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mncj4" Dec 08 09:28:15 crc kubenswrapper[4776]: I1208 09:28:15.915304 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2304e249-86bc-4b0a-a222-e8c2ba39a0bb-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mncj4\" (UID: \"2304e249-86bc-4b0a-a222-e8c2ba39a0bb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mncj4" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.017804 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2304e249-86bc-4b0a-a222-e8c2ba39a0bb-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mncj4\" (UID: \"2304e249-86bc-4b0a-a222-e8c2ba39a0bb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mncj4" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.018145 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2304e249-86bc-4b0a-a222-e8c2ba39a0bb-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mncj4\" (UID: \"2304e249-86bc-4b0a-a222-e8c2ba39a0bb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mncj4" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.018247 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2304e249-86bc-4b0a-a222-e8c2ba39a0bb-bootstrap-combined-ca-bundle\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-mncj4\" (UID: \"2304e249-86bc-4b0a-a222-e8c2ba39a0bb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mncj4" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.018279 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bs2cw\" (UniqueName: \"kubernetes.io/projected/2304e249-86bc-4b0a-a222-e8c2ba39a0bb-kube-api-access-bs2cw\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mncj4\" (UID: \"2304e249-86bc-4b0a-a222-e8c2ba39a0bb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mncj4" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.023232 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2304e249-86bc-4b0a-a222-e8c2ba39a0bb-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mncj4\" (UID: \"2304e249-86bc-4b0a-a222-e8c2ba39a0bb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mncj4" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.023239 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2304e249-86bc-4b0a-a222-e8c2ba39a0bb-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mncj4\" (UID: \"2304e249-86bc-4b0a-a222-e8c2ba39a0bb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mncj4" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.023349 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2304e249-86bc-4b0a-a222-e8c2ba39a0bb-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mncj4\" (UID: \"2304e249-86bc-4b0a-a222-e8c2ba39a0bb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mncj4" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.035932 4776 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-bs2cw\" (UniqueName: \"kubernetes.io/projected/2304e249-86bc-4b0a-a222-e8c2ba39a0bb-kube-api-access-bs2cw\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mncj4\" (UID: \"2304e249-86bc-4b0a-a222-e8c2ba39a0bb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mncj4" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.147808 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mncj4" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.300898 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.324820 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6zgx\" (UniqueName: \"kubernetes.io/projected/e533f562-6dd5-4117-8d18-f2d222228480-kube-api-access-q6zgx\") pod \"e533f562-6dd5-4117-8d18-f2d222228480\" (UID: \"e533f562-6dd5-4117-8d18-f2d222228480\") " Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.324898 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e533f562-6dd5-4117-8d18-f2d222228480-combined-ca-bundle\") pod \"e533f562-6dd5-4117-8d18-f2d222228480\" (UID: \"e533f562-6dd5-4117-8d18-f2d222228480\") " Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.324922 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e533f562-6dd5-4117-8d18-f2d222228480-scripts\") pod \"e533f562-6dd5-4117-8d18-f2d222228480\" (UID: \"e533f562-6dd5-4117-8d18-f2d222228480\") " Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.324962 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e533f562-6dd5-4117-8d18-f2d222228480-config-data\") pod \"e533f562-6dd5-4117-8d18-f2d222228480\" (UID: \"e533f562-6dd5-4117-8d18-f2d222228480\") " Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.325068 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e533f562-6dd5-4117-8d18-f2d222228480-internal-tls-certs\") pod \"e533f562-6dd5-4117-8d18-f2d222228480\" (UID: \"e533f562-6dd5-4117-8d18-f2d222228480\") " Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.325097 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e533f562-6dd5-4117-8d18-f2d222228480-public-tls-certs\") pod \"e533f562-6dd5-4117-8d18-f2d222228480\" (UID: \"e533f562-6dd5-4117-8d18-f2d222228480\") " Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.331610 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e533f562-6dd5-4117-8d18-f2d222228480-scripts" (OuterVolumeSpecName: "scripts") pod "e533f562-6dd5-4117-8d18-f2d222228480" (UID: "e533f562-6dd5-4117-8d18-f2d222228480"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.345454 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e533f562-6dd5-4117-8d18-f2d222228480-kube-api-access-q6zgx" (OuterVolumeSpecName: "kube-api-access-q6zgx") pod "e533f562-6dd5-4117-8d18-f2d222228480" (UID: "e533f562-6dd5-4117-8d18-f2d222228480"). InnerVolumeSpecName "kube-api-access-q6zgx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.385853 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40b6ce41-e108-47bf-bc38-34e8c475b413" path="/var/lib/kubelet/pods/40b6ce41-e108-47bf-bc38-34e8c475b413/volumes" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.427126 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6zgx\" (UniqueName: \"kubernetes.io/projected/e533f562-6dd5-4117-8d18-f2d222228480-kube-api-access-q6zgx\") on node \"crc\" DevicePath \"\"" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.427156 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e533f562-6dd5-4117-8d18-f2d222228480-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.457462 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e533f562-6dd5-4117-8d18-f2d222228480-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e533f562-6dd5-4117-8d18-f2d222228480" (UID: "e533f562-6dd5-4117-8d18-f2d222228480"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.486409 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e533f562-6dd5-4117-8d18-f2d222228480-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e533f562-6dd5-4117-8d18-f2d222228480" (UID: "e533f562-6dd5-4117-8d18-f2d222228480"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.494048 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e533f562-6dd5-4117-8d18-f2d222228480-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e533f562-6dd5-4117-8d18-f2d222228480" (UID: "e533f562-6dd5-4117-8d18-f2d222228480"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.507431 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e533f562-6dd5-4117-8d18-f2d222228480-config-data" (OuterVolumeSpecName: "config-data") pod "e533f562-6dd5-4117-8d18-f2d222228480" (UID: "e533f562-6dd5-4117-8d18-f2d222228480"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.528440 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e533f562-6dd5-4117-8d18-f2d222228480-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.528474 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e533f562-6dd5-4117-8d18-f2d222228480-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.528483 4776 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e533f562-6dd5-4117-8d18-f2d222228480-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.528492 4776 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e533f562-6dd5-4117-8d18-f2d222228480-public-tls-certs\") on node \"crc\" DevicePath 
\"\"" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.757803 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mncj4"] Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.759968 4776 generic.go:334] "Generic (PLEG): container finished" podID="e533f562-6dd5-4117-8d18-f2d222228480" containerID="81d4884f440dc5ffde16685ff694e3ef6dfd968617a01d1bc7a0406b840060bf" exitCode=0 Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.760025 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e533f562-6dd5-4117-8d18-f2d222228480","Type":"ContainerDied","Data":"81d4884f440dc5ffde16685ff694e3ef6dfd968617a01d1bc7a0406b840060bf"} Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.760076 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e533f562-6dd5-4117-8d18-f2d222228480","Type":"ContainerDied","Data":"d8e7588a5e6e682fb545c0ec007078c0a95599587bc3b61da7ad6abc04d2c2ae"} Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.760100 4776 scope.go:117] "RemoveContainer" containerID="264a3383a36980d388776beba83c3af3e71c83b9da01be1a395720f49b840dbd" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.760034 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 08 09:28:16 crc kubenswrapper[4776]: W1208 09:28:16.774766 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2304e249_86bc_4b0a_a222_e8c2ba39a0bb.slice/crio-237ab68c88e2ef274f74cb3d0be5842e5c803d3febcecf471da2af9b40a374ff WatchSource:0}: Error finding container 237ab68c88e2ef274f74cb3d0be5842e5c803d3febcecf471da2af9b40a374ff: Status 404 returned error can't find the container with id 237ab68c88e2ef274f74cb3d0be5842e5c803d3febcecf471da2af9b40a374ff Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.789113 4776 scope.go:117] "RemoveContainer" containerID="81d4884f440dc5ffde16685ff694e3ef6dfd968617a01d1bc7a0406b840060bf" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.798293 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.808733 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.822806 4776 scope.go:117] "RemoveContainer" containerID="a75bd7e41751d9eb41786a0d1e3838a9db01099b40b5389c62ea420502eb97f9" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.824682 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Dec 08 09:28:16 crc kubenswrapper[4776]: E1208 09:28:16.825251 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e533f562-6dd5-4117-8d18-f2d222228480" containerName="aodh-listener" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.825265 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="e533f562-6dd5-4117-8d18-f2d222228480" containerName="aodh-listener" Dec 08 09:28:16 crc kubenswrapper[4776]: E1208 09:28:16.825274 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e533f562-6dd5-4117-8d18-f2d222228480" containerName="aodh-notifier" Dec 08 09:28:16 crc 
kubenswrapper[4776]: I1208 09:28:16.825281 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="e533f562-6dd5-4117-8d18-f2d222228480" containerName="aodh-notifier" Dec 08 09:28:16 crc kubenswrapper[4776]: E1208 09:28:16.825300 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e533f562-6dd5-4117-8d18-f2d222228480" containerName="aodh-evaluator" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.825305 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="e533f562-6dd5-4117-8d18-f2d222228480" containerName="aodh-evaluator" Dec 08 09:28:16 crc kubenswrapper[4776]: E1208 09:28:16.825333 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e533f562-6dd5-4117-8d18-f2d222228480" containerName="aodh-api" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.825339 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="e533f562-6dd5-4117-8d18-f2d222228480" containerName="aodh-api" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.825555 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="e533f562-6dd5-4117-8d18-f2d222228480" containerName="aodh-notifier" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.825575 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="e533f562-6dd5-4117-8d18-f2d222228480" containerName="aodh-listener" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.825593 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="e533f562-6dd5-4117-8d18-f2d222228480" containerName="aodh-api" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.825609 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="e533f562-6dd5-4117-8d18-f2d222228480" containerName="aodh-evaluator" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.828028 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.833712 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-rtn2h" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.833961 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.834074 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.834207 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.834330 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.836502 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.843158 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a024b29b-cad1-489c-88ea-efc9558b2da0-public-tls-certs\") pod \"aodh-0\" (UID: \"a024b29b-cad1-489c-88ea-efc9558b2da0\") " pod="openstack/aodh-0" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.843289 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a024b29b-cad1-489c-88ea-efc9558b2da0-scripts\") pod \"aodh-0\" (UID: \"a024b29b-cad1-489c-88ea-efc9558b2da0\") " pod="openstack/aodh-0" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.843314 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a024b29b-cad1-489c-88ea-efc9558b2da0-internal-tls-certs\") pod \"aodh-0\" (UID: \"a024b29b-cad1-489c-88ea-efc9558b2da0\") " pod="openstack/aodh-0" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.843442 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwl4v\" (UniqueName: \"kubernetes.io/projected/a024b29b-cad1-489c-88ea-efc9558b2da0-kube-api-access-zwl4v\") pod \"aodh-0\" (UID: \"a024b29b-cad1-489c-88ea-efc9558b2da0\") " pod="openstack/aodh-0" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.843536 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a024b29b-cad1-489c-88ea-efc9558b2da0-config-data\") pod \"aodh-0\" (UID: \"a024b29b-cad1-489c-88ea-efc9558b2da0\") " pod="openstack/aodh-0" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.843578 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a024b29b-cad1-489c-88ea-efc9558b2da0-combined-ca-bundle\") pod \"aodh-0\" (UID: \"a024b29b-cad1-489c-88ea-efc9558b2da0\") " pod="openstack/aodh-0" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.879937 4776 scope.go:117] "RemoveContainer" containerID="604f0b0b6825f943aa930e04542ae21e37a278b9e5e1123fed6619d9245ec23f" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.902525 4776 scope.go:117] "RemoveContainer" containerID="264a3383a36980d388776beba83c3af3e71c83b9da01be1a395720f49b840dbd" Dec 08 09:28:16 crc kubenswrapper[4776]: E1208 09:28:16.903034 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"264a3383a36980d388776beba83c3af3e71c83b9da01be1a395720f49b840dbd\": container with ID starting with 264a3383a36980d388776beba83c3af3e71c83b9da01be1a395720f49b840dbd not found: ID 
does not exist" containerID="264a3383a36980d388776beba83c3af3e71c83b9da01be1a395720f49b840dbd" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.903085 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"264a3383a36980d388776beba83c3af3e71c83b9da01be1a395720f49b840dbd"} err="failed to get container status \"264a3383a36980d388776beba83c3af3e71c83b9da01be1a395720f49b840dbd\": rpc error: code = NotFound desc = could not find container \"264a3383a36980d388776beba83c3af3e71c83b9da01be1a395720f49b840dbd\": container with ID starting with 264a3383a36980d388776beba83c3af3e71c83b9da01be1a395720f49b840dbd not found: ID does not exist" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.903118 4776 scope.go:117] "RemoveContainer" containerID="81d4884f440dc5ffde16685ff694e3ef6dfd968617a01d1bc7a0406b840060bf" Dec 08 09:28:16 crc kubenswrapper[4776]: E1208 09:28:16.903428 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81d4884f440dc5ffde16685ff694e3ef6dfd968617a01d1bc7a0406b840060bf\": container with ID starting with 81d4884f440dc5ffde16685ff694e3ef6dfd968617a01d1bc7a0406b840060bf not found: ID does not exist" containerID="81d4884f440dc5ffde16685ff694e3ef6dfd968617a01d1bc7a0406b840060bf" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.903476 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81d4884f440dc5ffde16685ff694e3ef6dfd968617a01d1bc7a0406b840060bf"} err="failed to get container status \"81d4884f440dc5ffde16685ff694e3ef6dfd968617a01d1bc7a0406b840060bf\": rpc error: code = NotFound desc = could not find container \"81d4884f440dc5ffde16685ff694e3ef6dfd968617a01d1bc7a0406b840060bf\": container with ID starting with 81d4884f440dc5ffde16685ff694e3ef6dfd968617a01d1bc7a0406b840060bf not found: ID does not exist" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.903505 4776 
scope.go:117] "RemoveContainer" containerID="a75bd7e41751d9eb41786a0d1e3838a9db01099b40b5389c62ea420502eb97f9" Dec 08 09:28:16 crc kubenswrapper[4776]: E1208 09:28:16.903995 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a75bd7e41751d9eb41786a0d1e3838a9db01099b40b5389c62ea420502eb97f9\": container with ID starting with a75bd7e41751d9eb41786a0d1e3838a9db01099b40b5389c62ea420502eb97f9 not found: ID does not exist" containerID="a75bd7e41751d9eb41786a0d1e3838a9db01099b40b5389c62ea420502eb97f9" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.904028 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a75bd7e41751d9eb41786a0d1e3838a9db01099b40b5389c62ea420502eb97f9"} err="failed to get container status \"a75bd7e41751d9eb41786a0d1e3838a9db01099b40b5389c62ea420502eb97f9\": rpc error: code = NotFound desc = could not find container \"a75bd7e41751d9eb41786a0d1e3838a9db01099b40b5389c62ea420502eb97f9\": container with ID starting with a75bd7e41751d9eb41786a0d1e3838a9db01099b40b5389c62ea420502eb97f9 not found: ID does not exist" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.904066 4776 scope.go:117] "RemoveContainer" containerID="604f0b0b6825f943aa930e04542ae21e37a278b9e5e1123fed6619d9245ec23f" Dec 08 09:28:16 crc kubenswrapper[4776]: E1208 09:28:16.904391 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"604f0b0b6825f943aa930e04542ae21e37a278b9e5e1123fed6619d9245ec23f\": container with ID starting with 604f0b0b6825f943aa930e04542ae21e37a278b9e5e1123fed6619d9245ec23f not found: ID does not exist" containerID="604f0b0b6825f943aa930e04542ae21e37a278b9e5e1123fed6619d9245ec23f" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.904436 4776 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"604f0b0b6825f943aa930e04542ae21e37a278b9e5e1123fed6619d9245ec23f"} err="failed to get container status \"604f0b0b6825f943aa930e04542ae21e37a278b9e5e1123fed6619d9245ec23f\": rpc error: code = NotFound desc = could not find container \"604f0b0b6825f943aa930e04542ae21e37a278b9e5e1123fed6619d9245ec23f\": container with ID starting with 604f0b0b6825f943aa930e04542ae21e37a278b9e5e1123fed6619d9245ec23f not found: ID does not exist" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.944335 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a024b29b-cad1-489c-88ea-efc9558b2da0-scripts\") pod \"aodh-0\" (UID: \"a024b29b-cad1-489c-88ea-efc9558b2da0\") " pod="openstack/aodh-0" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.944377 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a024b29b-cad1-489c-88ea-efc9558b2da0-internal-tls-certs\") pod \"aodh-0\" (UID: \"a024b29b-cad1-489c-88ea-efc9558b2da0\") " pod="openstack/aodh-0" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.944467 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwl4v\" (UniqueName: \"kubernetes.io/projected/a024b29b-cad1-489c-88ea-efc9558b2da0-kube-api-access-zwl4v\") pod \"aodh-0\" (UID: \"a024b29b-cad1-489c-88ea-efc9558b2da0\") " pod="openstack/aodh-0" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.944514 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a024b29b-cad1-489c-88ea-efc9558b2da0-config-data\") pod \"aodh-0\" (UID: \"a024b29b-cad1-489c-88ea-efc9558b2da0\") " pod="openstack/aodh-0" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.944537 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a024b29b-cad1-489c-88ea-efc9558b2da0-combined-ca-bundle\") pod \"aodh-0\" (UID: \"a024b29b-cad1-489c-88ea-efc9558b2da0\") " pod="openstack/aodh-0" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.944560 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a024b29b-cad1-489c-88ea-efc9558b2da0-public-tls-certs\") pod \"aodh-0\" (UID: \"a024b29b-cad1-489c-88ea-efc9558b2da0\") " pod="openstack/aodh-0" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.952121 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a024b29b-cad1-489c-88ea-efc9558b2da0-combined-ca-bundle\") pod \"aodh-0\" (UID: \"a024b29b-cad1-489c-88ea-efc9558b2da0\") " pod="openstack/aodh-0" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.953184 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a024b29b-cad1-489c-88ea-efc9558b2da0-config-data\") pod \"aodh-0\" (UID: \"a024b29b-cad1-489c-88ea-efc9558b2da0\") " pod="openstack/aodh-0" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.953418 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a024b29b-cad1-489c-88ea-efc9558b2da0-internal-tls-certs\") pod \"aodh-0\" (UID: \"a024b29b-cad1-489c-88ea-efc9558b2da0\") " pod="openstack/aodh-0" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.954277 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a024b29b-cad1-489c-88ea-efc9558b2da0-scripts\") pod \"aodh-0\" (UID: \"a024b29b-cad1-489c-88ea-efc9558b2da0\") " pod="openstack/aodh-0" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.957308 4776 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a024b29b-cad1-489c-88ea-efc9558b2da0-public-tls-certs\") pod \"aodh-0\" (UID: \"a024b29b-cad1-489c-88ea-efc9558b2da0\") " pod="openstack/aodh-0" Dec 08 09:28:16 crc kubenswrapper[4776]: I1208 09:28:16.969085 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwl4v\" (UniqueName: \"kubernetes.io/projected/a024b29b-cad1-489c-88ea-efc9558b2da0-kube-api-access-zwl4v\") pod \"aodh-0\" (UID: \"a024b29b-cad1-489c-88ea-efc9558b2da0\") " pod="openstack/aodh-0" Dec 08 09:28:17 crc kubenswrapper[4776]: I1208 09:28:17.154006 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 08 09:28:17 crc kubenswrapper[4776]: W1208 09:28:17.687081 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda024b29b_cad1_489c_88ea_efc9558b2da0.slice/crio-db53985e5bbc10571daa4e5bccfe82760e76a9a8a943f46511fa79863127be13 WatchSource:0}: Error finding container db53985e5bbc10571daa4e5bccfe82760e76a9a8a943f46511fa79863127be13: Status 404 returned error can't find the container with id db53985e5bbc10571daa4e5bccfe82760e76a9a8a943f46511fa79863127be13 Dec 08 09:28:17 crc kubenswrapper[4776]: I1208 09:28:17.687731 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 08 09:28:17 crc kubenswrapper[4776]: I1208 09:28:17.771873 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mncj4" event={"ID":"2304e249-86bc-4b0a-a222-e8c2ba39a0bb","Type":"ContainerStarted","Data":"33a4358b8b865e34fc629cb2ecfb9c330a59bd88eee0ef8bd437671924a2ffe6"} Dec 08 09:28:17 crc kubenswrapper[4776]: I1208 09:28:17.771915 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mncj4" 
event={"ID":"2304e249-86bc-4b0a-a222-e8c2ba39a0bb","Type":"ContainerStarted","Data":"237ab68c88e2ef274f74cb3d0be5842e5c803d3febcecf471da2af9b40a374ff"} Dec 08 09:28:17 crc kubenswrapper[4776]: I1208 09:28:17.774484 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a024b29b-cad1-489c-88ea-efc9558b2da0","Type":"ContainerStarted","Data":"db53985e5bbc10571daa4e5bccfe82760e76a9a8a943f46511fa79863127be13"} Dec 08 09:28:17 crc kubenswrapper[4776]: I1208 09:28:17.795643 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mncj4" podStartSLOduration=2.309088833 podStartE2EDuration="2.795625794s" podCreationTimestamp="2025-12-08 09:28:15 +0000 UTC" firstStartedPulling="2025-12-08 09:28:16.789379687 +0000 UTC m=+1773.052604709" lastFinishedPulling="2025-12-08 09:28:17.275916648 +0000 UTC m=+1773.539141670" observedRunningTime="2025-12-08 09:28:17.784880105 +0000 UTC m=+1774.048105127" watchObservedRunningTime="2025-12-08 09:28:17.795625794 +0000 UTC m=+1774.058850816" Dec 08 09:28:18 crc kubenswrapper[4776]: I1208 09:28:18.376644 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e533f562-6dd5-4117-8d18-f2d222228480" path="/var/lib/kubelet/pods/e533f562-6dd5-4117-8d18-f2d222228480/volumes" Dec 08 09:28:18 crc kubenswrapper[4776]: I1208 09:28:18.800748 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a024b29b-cad1-489c-88ea-efc9558b2da0","Type":"ContainerStarted","Data":"2c93ccd691164704cb437cbed7877ebf2be276f711bb7045bc9d5b50b385c359"} Dec 08 09:28:19 crc kubenswrapper[4776]: I1208 09:28:19.816588 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a024b29b-cad1-489c-88ea-efc9558b2da0","Type":"ContainerStarted","Data":"14c0cf8b74c6e62e869056036d3eed691d68861250e3a4cfcf9204e2d93406e0"} Dec 08 09:28:20 crc kubenswrapper[4776]: I1208 09:28:20.830966 4776 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a024b29b-cad1-489c-88ea-efc9558b2da0","Type":"ContainerStarted","Data":"c5172aa99001ec1f7df7a9a14b0f2424bc00a55ba2f816f19fde3f40b7f489a9"} Dec 08 09:28:21 crc kubenswrapper[4776]: I1208 09:28:21.343423 4776 scope.go:117] "RemoveContainer" containerID="bd00cf9685c68dc97ea68681aad0bfeeb2d56965a763cd3cffb3c7d3789a7341" Dec 08 09:28:21 crc kubenswrapper[4776]: E1208 09:28:21.343820 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:28:21 crc kubenswrapper[4776]: I1208 09:28:21.847168 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a024b29b-cad1-489c-88ea-efc9558b2da0","Type":"ContainerStarted","Data":"ebe39b73cc2aecba0e316f7bfd7a5bda439c9323c6b0273bba702d039210b49e"} Dec 08 09:28:21 crc kubenswrapper[4776]: I1208 09:28:21.882444 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.059690346 podStartE2EDuration="5.88241762s" podCreationTimestamp="2025-12-08 09:28:16 +0000 UTC" firstStartedPulling="2025-12-08 09:28:17.690155084 +0000 UTC m=+1773.953380096" lastFinishedPulling="2025-12-08 09:28:21.512882348 +0000 UTC m=+1777.776107370" observedRunningTime="2025-12-08 09:28:21.874233729 +0000 UTC m=+1778.137458771" watchObservedRunningTime="2025-12-08 09:28:21.88241762 +0000 UTC m=+1778.145642652" Dec 08 09:28:35 crc kubenswrapper[4776]: I1208 09:28:35.344235 4776 scope.go:117] "RemoveContainer" containerID="bd00cf9685c68dc97ea68681aad0bfeeb2d56965a763cd3cffb3c7d3789a7341" Dec 08 09:28:35 crc 
kubenswrapper[4776]: E1208 09:28:35.345041 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:28:47 crc kubenswrapper[4776]: I1208 09:28:47.344933 4776 scope.go:117] "RemoveContainer" containerID="bd00cf9685c68dc97ea68681aad0bfeeb2d56965a763cd3cffb3c7d3789a7341" Dec 08 09:28:47 crc kubenswrapper[4776]: E1208 09:28:47.346205 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:29:01 crc kubenswrapper[4776]: I1208 09:29:01.344447 4776 scope.go:117] "RemoveContainer" containerID="bd00cf9685c68dc97ea68681aad0bfeeb2d56965a763cd3cffb3c7d3789a7341" Dec 08 09:29:01 crc kubenswrapper[4776]: E1208 09:29:01.345315 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:29:14 crc kubenswrapper[4776]: I1208 09:29:14.341548 4776 scope.go:117] "RemoveContainer" containerID="8833fa7626a819ff1247acadca4b9b2a9355a6d206fca9d0ed67d2f0cae0998b" Dec 
08 09:29:14 crc kubenswrapper[4776]: I1208 09:29:14.359270 4776 scope.go:117] "RemoveContainer" containerID="bd00cf9685c68dc97ea68681aad0bfeeb2d56965a763cd3cffb3c7d3789a7341" Dec 08 09:29:14 crc kubenswrapper[4776]: E1208 09:29:14.360641 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:29:14 crc kubenswrapper[4776]: I1208 09:29:14.385527 4776 scope.go:117] "RemoveContainer" containerID="9fe6f989e54710e3f545abe625646a7377c16128fb3ee99a4312085f9bc6cc7b" Dec 08 09:29:14 crc kubenswrapper[4776]: I1208 09:29:14.432854 4776 scope.go:117] "RemoveContainer" containerID="6ac391ca9825822a2c218f1f30e3333d2154ab78fce7cfd6da43a7327afe0a0d" Dec 08 09:29:14 crc kubenswrapper[4776]: I1208 09:29:14.475799 4776 scope.go:117] "RemoveContainer" containerID="b87dab899deb8f6d16bdbf6c9ef247958c47eecf16405da6fccc045cbf52b0d0" Dec 08 09:29:14 crc kubenswrapper[4776]: I1208 09:29:14.510371 4776 scope.go:117] "RemoveContainer" containerID="5eceb729a54e6a551f0a4e42d035d262d6b36f74757166a5eac6738b8de66051" Dec 08 09:29:28 crc kubenswrapper[4776]: I1208 09:29:28.344128 4776 scope.go:117] "RemoveContainer" containerID="bd00cf9685c68dc97ea68681aad0bfeeb2d56965a763cd3cffb3c7d3789a7341" Dec 08 09:29:28 crc kubenswrapper[4776]: E1208 09:29:28.344893 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:29:42 crc kubenswrapper[4776]: I1208 09:29:42.344380 4776 scope.go:117] "RemoveContainer" containerID="bd00cf9685c68dc97ea68681aad0bfeeb2d56965a763cd3cffb3c7d3789a7341" Dec 08 09:29:42 crc kubenswrapper[4776]: E1208 09:29:42.345025 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:29:57 crc kubenswrapper[4776]: I1208 09:29:57.343875 4776 scope.go:117] "RemoveContainer" containerID="bd00cf9685c68dc97ea68681aad0bfeeb2d56965a763cd3cffb3c7d3789a7341" Dec 08 09:29:57 crc kubenswrapper[4776]: E1208 09:29:57.344611 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:30:00 crc kubenswrapper[4776]: I1208 09:30:00.170007 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29419770-58r5q"] Dec 08 09:30:00 crc kubenswrapper[4776]: I1208 09:30:00.177349 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29419770-58r5q" Dec 08 09:30:00 crc kubenswrapper[4776]: I1208 09:30:00.181286 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 08 09:30:00 crc kubenswrapper[4776]: I1208 09:30:00.181964 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 08 09:30:00 crc kubenswrapper[4776]: I1208 09:30:00.189928 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29419770-58r5q"] Dec 08 09:30:00 crc kubenswrapper[4776]: I1208 09:30:00.245076 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/54152b91-017c-44e9-8c79-f0cf0befb065-secret-volume\") pod \"collect-profiles-29419770-58r5q\" (UID: \"54152b91-017c-44e9-8c79-f0cf0befb065\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419770-58r5q" Dec 08 09:30:00 crc kubenswrapper[4776]: I1208 09:30:00.245138 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxvg8\" (UniqueName: \"kubernetes.io/projected/54152b91-017c-44e9-8c79-f0cf0befb065-kube-api-access-sxvg8\") pod \"collect-profiles-29419770-58r5q\" (UID: \"54152b91-017c-44e9-8c79-f0cf0befb065\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419770-58r5q" Dec 08 09:30:00 crc kubenswrapper[4776]: I1208 09:30:00.245534 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54152b91-017c-44e9-8c79-f0cf0befb065-config-volume\") pod \"collect-profiles-29419770-58r5q\" (UID: \"54152b91-017c-44e9-8c79-f0cf0befb065\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29419770-58r5q" Dec 08 09:30:00 crc kubenswrapper[4776]: I1208 09:30:00.347571 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxvg8\" (UniqueName: \"kubernetes.io/projected/54152b91-017c-44e9-8c79-f0cf0befb065-kube-api-access-sxvg8\") pod \"collect-profiles-29419770-58r5q\" (UID: \"54152b91-017c-44e9-8c79-f0cf0befb065\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419770-58r5q" Dec 08 09:30:00 crc kubenswrapper[4776]: I1208 09:30:00.347998 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54152b91-017c-44e9-8c79-f0cf0befb065-config-volume\") pod \"collect-profiles-29419770-58r5q\" (UID: \"54152b91-017c-44e9-8c79-f0cf0befb065\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419770-58r5q" Dec 08 09:30:00 crc kubenswrapper[4776]: I1208 09:30:00.348315 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/54152b91-017c-44e9-8c79-f0cf0befb065-secret-volume\") pod \"collect-profiles-29419770-58r5q\" (UID: \"54152b91-017c-44e9-8c79-f0cf0befb065\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419770-58r5q" Dec 08 09:30:00 crc kubenswrapper[4776]: I1208 09:30:00.349673 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54152b91-017c-44e9-8c79-f0cf0befb065-config-volume\") pod \"collect-profiles-29419770-58r5q\" (UID: \"54152b91-017c-44e9-8c79-f0cf0befb065\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419770-58r5q" Dec 08 09:30:00 crc kubenswrapper[4776]: I1208 09:30:00.363506 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/54152b91-017c-44e9-8c79-f0cf0befb065-secret-volume\") pod \"collect-profiles-29419770-58r5q\" (UID: \"54152b91-017c-44e9-8c79-f0cf0befb065\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419770-58r5q" Dec 08 09:30:00 crc kubenswrapper[4776]: I1208 09:30:00.364145 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxvg8\" (UniqueName: \"kubernetes.io/projected/54152b91-017c-44e9-8c79-f0cf0befb065-kube-api-access-sxvg8\") pod \"collect-profiles-29419770-58r5q\" (UID: \"54152b91-017c-44e9-8c79-f0cf0befb065\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419770-58r5q" Dec 08 09:30:00 crc kubenswrapper[4776]: I1208 09:30:00.506878 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29419770-58r5q" Dec 08 09:30:00 crc kubenswrapper[4776]: I1208 09:30:00.984845 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29419770-58r5q"] Dec 08 09:30:01 crc kubenswrapper[4776]: I1208 09:30:01.511753 4776 generic.go:334] "Generic (PLEG): container finished" podID="54152b91-017c-44e9-8c79-f0cf0befb065" containerID="61e1263b8f4ce9bcca325447358f5ff8611804f1cac07fd9c2079efc299187e8" exitCode=0 Dec 08 09:30:01 crc kubenswrapper[4776]: I1208 09:30:01.511798 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29419770-58r5q" event={"ID":"54152b91-017c-44e9-8c79-f0cf0befb065","Type":"ContainerDied","Data":"61e1263b8f4ce9bcca325447358f5ff8611804f1cac07fd9c2079efc299187e8"} Dec 08 09:30:01 crc kubenswrapper[4776]: I1208 09:30:01.512136 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29419770-58r5q" 
event={"ID":"54152b91-017c-44e9-8c79-f0cf0befb065","Type":"ContainerStarted","Data":"f27c3133a70091dd5cbd4e3eef87964f8df4d5ce228d28c58cf24e03af5aa9d9"} Dec 08 09:30:02 crc kubenswrapper[4776]: I1208 09:30:02.942441 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29419770-58r5q" Dec 08 09:30:03 crc kubenswrapper[4776]: I1208 09:30:03.020093 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/54152b91-017c-44e9-8c79-f0cf0befb065-secret-volume\") pod \"54152b91-017c-44e9-8c79-f0cf0befb065\" (UID: \"54152b91-017c-44e9-8c79-f0cf0befb065\") " Dec 08 09:30:03 crc kubenswrapper[4776]: I1208 09:30:03.020351 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54152b91-017c-44e9-8c79-f0cf0befb065-config-volume\") pod \"54152b91-017c-44e9-8c79-f0cf0befb065\" (UID: \"54152b91-017c-44e9-8c79-f0cf0befb065\") " Dec 08 09:30:03 crc kubenswrapper[4776]: I1208 09:30:03.020535 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxvg8\" (UniqueName: \"kubernetes.io/projected/54152b91-017c-44e9-8c79-f0cf0befb065-kube-api-access-sxvg8\") pod \"54152b91-017c-44e9-8c79-f0cf0befb065\" (UID: \"54152b91-017c-44e9-8c79-f0cf0befb065\") " Dec 08 09:30:03 crc kubenswrapper[4776]: I1208 09:30:03.021500 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54152b91-017c-44e9-8c79-f0cf0befb065-config-volume" (OuterVolumeSpecName: "config-volume") pod "54152b91-017c-44e9-8c79-f0cf0befb065" (UID: "54152b91-017c-44e9-8c79-f0cf0befb065"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:30:03 crc kubenswrapper[4776]: I1208 09:30:03.069837 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54152b91-017c-44e9-8c79-f0cf0befb065-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "54152b91-017c-44e9-8c79-f0cf0befb065" (UID: "54152b91-017c-44e9-8c79-f0cf0befb065"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:30:03 crc kubenswrapper[4776]: I1208 09:30:03.069850 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54152b91-017c-44e9-8c79-f0cf0befb065-kube-api-access-sxvg8" (OuterVolumeSpecName: "kube-api-access-sxvg8") pod "54152b91-017c-44e9-8c79-f0cf0befb065" (UID: "54152b91-017c-44e9-8c79-f0cf0befb065"). InnerVolumeSpecName "kube-api-access-sxvg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:30:03 crc kubenswrapper[4776]: I1208 09:30:03.122562 4776 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/54152b91-017c-44e9-8c79-f0cf0befb065-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 08 09:30:03 crc kubenswrapper[4776]: I1208 09:30:03.122594 4776 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54152b91-017c-44e9-8c79-f0cf0befb065-config-volume\") on node \"crc\" DevicePath \"\"" Dec 08 09:30:03 crc kubenswrapper[4776]: I1208 09:30:03.122605 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxvg8\" (UniqueName: \"kubernetes.io/projected/54152b91-017c-44e9-8c79-f0cf0befb065-kube-api-access-sxvg8\") on node \"crc\" DevicePath \"\"" Dec 08 09:30:03 crc kubenswrapper[4776]: I1208 09:30:03.542004 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29419770-58r5q" 
event={"ID":"54152b91-017c-44e9-8c79-f0cf0befb065","Type":"ContainerDied","Data":"f27c3133a70091dd5cbd4e3eef87964f8df4d5ce228d28c58cf24e03af5aa9d9"} Dec 08 09:30:03 crc kubenswrapper[4776]: I1208 09:30:03.542053 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f27c3133a70091dd5cbd4e3eef87964f8df4d5ce228d28c58cf24e03af5aa9d9" Dec 08 09:30:03 crc kubenswrapper[4776]: I1208 09:30:03.542127 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29419770-58r5q" Dec 08 09:30:08 crc kubenswrapper[4776]: I1208 09:30:08.343639 4776 scope.go:117] "RemoveContainer" containerID="bd00cf9685c68dc97ea68681aad0bfeeb2d56965a763cd3cffb3c7d3789a7341" Dec 08 09:30:08 crc kubenswrapper[4776]: E1208 09:30:08.345588 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:30:14 crc kubenswrapper[4776]: I1208 09:30:14.688035 4776 scope.go:117] "RemoveContainer" containerID="f4ea81c1c16b42e732fc7c8826f147f7d03faf0335b9cdfcafbeb2c392206d3d" Dec 08 09:30:14 crc kubenswrapper[4776]: I1208 09:30:14.726036 4776 scope.go:117] "RemoveContainer" containerID="8dd0b38b6b507cba9588499d3306ebee7fa547826a872bdce4fd32cf701505f0" Dec 08 09:30:22 crc kubenswrapper[4776]: I1208 09:30:22.344435 4776 scope.go:117] "RemoveContainer" containerID="bd00cf9685c68dc97ea68681aad0bfeeb2d56965a763cd3cffb3c7d3789a7341" Dec 08 09:30:22 crc kubenswrapper[4776]: I1208 09:30:22.779601 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" 
event={"ID":"c9788ab1-1031-4103-a769-a4b3177c7268","Type":"ContainerStarted","Data":"7d8d2a2902024cbe40e4f1f7b98f5ca2ce52445ed3e9c1d7d0e196722965b4ad"} Dec 08 09:31:14 crc kubenswrapper[4776]: I1208 09:31:14.817439 4776 scope.go:117] "RemoveContainer" containerID="3b0ac0b8fd0ac6f80f7fc079e316e648e1a204fe4a69bd0cda6755200789a450" Dec 08 09:31:14 crc kubenswrapper[4776]: I1208 09:31:14.966026 4776 scope.go:117] "RemoveContainer" containerID="6aa12b16012116b86c2b0c97e985bec0ebaeb9a366799a44caf810d29f286456" Dec 08 09:31:36 crc kubenswrapper[4776]: I1208 09:31:36.559094 4776 generic.go:334] "Generic (PLEG): container finished" podID="2304e249-86bc-4b0a-a222-e8c2ba39a0bb" containerID="33a4358b8b865e34fc629cb2ecfb9c330a59bd88eee0ef8bd437671924a2ffe6" exitCode=0 Dec 08 09:31:36 crc kubenswrapper[4776]: I1208 09:31:36.559217 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mncj4" event={"ID":"2304e249-86bc-4b0a-a222-e8c2ba39a0bb","Type":"ContainerDied","Data":"33a4358b8b865e34fc629cb2ecfb9c330a59bd88eee0ef8bd437671924a2ffe6"} Dec 08 09:31:38 crc kubenswrapper[4776]: I1208 09:31:38.026538 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mncj4" Dec 08 09:31:38 crc kubenswrapper[4776]: I1208 09:31:38.155799 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2304e249-86bc-4b0a-a222-e8c2ba39a0bb-ssh-key\") pod \"2304e249-86bc-4b0a-a222-e8c2ba39a0bb\" (UID: \"2304e249-86bc-4b0a-a222-e8c2ba39a0bb\") " Dec 08 09:31:38 crc kubenswrapper[4776]: I1208 09:31:38.156215 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bs2cw\" (UniqueName: \"kubernetes.io/projected/2304e249-86bc-4b0a-a222-e8c2ba39a0bb-kube-api-access-bs2cw\") pod \"2304e249-86bc-4b0a-a222-e8c2ba39a0bb\" (UID: \"2304e249-86bc-4b0a-a222-e8c2ba39a0bb\") " Dec 08 09:31:38 crc kubenswrapper[4776]: I1208 09:31:38.157413 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2304e249-86bc-4b0a-a222-e8c2ba39a0bb-bootstrap-combined-ca-bundle\") pod \"2304e249-86bc-4b0a-a222-e8c2ba39a0bb\" (UID: \"2304e249-86bc-4b0a-a222-e8c2ba39a0bb\") " Dec 08 09:31:38 crc kubenswrapper[4776]: I1208 09:31:38.157465 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2304e249-86bc-4b0a-a222-e8c2ba39a0bb-inventory\") pod \"2304e249-86bc-4b0a-a222-e8c2ba39a0bb\" (UID: \"2304e249-86bc-4b0a-a222-e8c2ba39a0bb\") " Dec 08 09:31:38 crc kubenswrapper[4776]: I1208 09:31:38.163797 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2304e249-86bc-4b0a-a222-e8c2ba39a0bb-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "2304e249-86bc-4b0a-a222-e8c2ba39a0bb" (UID: "2304e249-86bc-4b0a-a222-e8c2ba39a0bb"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:31:38 crc kubenswrapper[4776]: I1208 09:31:38.166077 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2304e249-86bc-4b0a-a222-e8c2ba39a0bb-kube-api-access-bs2cw" (OuterVolumeSpecName: "kube-api-access-bs2cw") pod "2304e249-86bc-4b0a-a222-e8c2ba39a0bb" (UID: "2304e249-86bc-4b0a-a222-e8c2ba39a0bb"). InnerVolumeSpecName "kube-api-access-bs2cw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:31:38 crc kubenswrapper[4776]: I1208 09:31:38.210790 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2304e249-86bc-4b0a-a222-e8c2ba39a0bb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2304e249-86bc-4b0a-a222-e8c2ba39a0bb" (UID: "2304e249-86bc-4b0a-a222-e8c2ba39a0bb"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:31:38 crc kubenswrapper[4776]: I1208 09:31:38.219289 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2304e249-86bc-4b0a-a222-e8c2ba39a0bb-inventory" (OuterVolumeSpecName: "inventory") pod "2304e249-86bc-4b0a-a222-e8c2ba39a0bb" (UID: "2304e249-86bc-4b0a-a222-e8c2ba39a0bb"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:31:38 crc kubenswrapper[4776]: I1208 09:31:38.260484 4776 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2304e249-86bc-4b0a-a222-e8c2ba39a0bb-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:31:38 crc kubenswrapper[4776]: I1208 09:31:38.260517 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2304e249-86bc-4b0a-a222-e8c2ba39a0bb-inventory\") on node \"crc\" DevicePath \"\"" Dec 08 09:31:38 crc kubenswrapper[4776]: I1208 09:31:38.260526 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2304e249-86bc-4b0a-a222-e8c2ba39a0bb-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 08 09:31:38 crc kubenswrapper[4776]: I1208 09:31:38.260538 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bs2cw\" (UniqueName: \"kubernetes.io/projected/2304e249-86bc-4b0a-a222-e8c2ba39a0bb-kube-api-access-bs2cw\") on node \"crc\" DevicePath \"\"" Dec 08 09:31:38 crc kubenswrapper[4776]: I1208 09:31:38.585118 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mncj4" event={"ID":"2304e249-86bc-4b0a-a222-e8c2ba39a0bb","Type":"ContainerDied","Data":"237ab68c88e2ef274f74cb3d0be5842e5c803d3febcecf471da2af9b40a374ff"} Dec 08 09:31:38 crc kubenswrapper[4776]: I1208 09:31:38.585418 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="237ab68c88e2ef274f74cb3d0be5842e5c803d3febcecf471da2af9b40a374ff" Dec 08 09:31:38 crc kubenswrapper[4776]: I1208 09:31:38.585383 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mncj4" Dec 08 09:31:38 crc kubenswrapper[4776]: I1208 09:31:38.684866 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-75cmb"] Dec 08 09:31:38 crc kubenswrapper[4776]: E1208 09:31:38.685438 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54152b91-017c-44e9-8c79-f0cf0befb065" containerName="collect-profiles" Dec 08 09:31:38 crc kubenswrapper[4776]: I1208 09:31:38.685458 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="54152b91-017c-44e9-8c79-f0cf0befb065" containerName="collect-profiles" Dec 08 09:31:38 crc kubenswrapper[4776]: E1208 09:31:38.685509 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2304e249-86bc-4b0a-a222-e8c2ba39a0bb" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 08 09:31:38 crc kubenswrapper[4776]: I1208 09:31:38.685516 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="2304e249-86bc-4b0a-a222-e8c2ba39a0bb" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 08 09:31:38 crc kubenswrapper[4776]: I1208 09:31:38.685734 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="2304e249-86bc-4b0a-a222-e8c2ba39a0bb" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 08 09:31:38 crc kubenswrapper[4776]: I1208 09:31:38.685749 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="54152b91-017c-44e9-8c79-f0cf0befb065" containerName="collect-profiles" Dec 08 09:31:38 crc kubenswrapper[4776]: I1208 09:31:38.686530 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-75cmb" Dec 08 09:31:38 crc kubenswrapper[4776]: I1208 09:31:38.688563 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tm845" Dec 08 09:31:38 crc kubenswrapper[4776]: I1208 09:31:38.689530 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 08 09:31:38 crc kubenswrapper[4776]: I1208 09:31:38.690574 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 08 09:31:38 crc kubenswrapper[4776]: I1208 09:31:38.700441 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 08 09:31:38 crc kubenswrapper[4776]: I1208 09:31:38.719025 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-75cmb"] Dec 08 09:31:38 crc kubenswrapper[4776]: I1208 09:31:38.874843 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7af7dfaf-3db0-4c5d-b7fc-671893276afc-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-75cmb\" (UID: \"7af7dfaf-3db0-4c5d-b7fc-671893276afc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-75cmb" Dec 08 09:31:38 crc kubenswrapper[4776]: I1208 09:31:38.874960 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7af7dfaf-3db0-4c5d-b7fc-671893276afc-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-75cmb\" (UID: \"7af7dfaf-3db0-4c5d-b7fc-671893276afc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-75cmb" Dec 08 09:31:38 crc kubenswrapper[4776]: I1208 09:31:38.875121 4776 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhjz6\" (UniqueName: \"kubernetes.io/projected/7af7dfaf-3db0-4c5d-b7fc-671893276afc-kube-api-access-nhjz6\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-75cmb\" (UID: \"7af7dfaf-3db0-4c5d-b7fc-671893276afc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-75cmb" Dec 08 09:31:38 crc kubenswrapper[4776]: I1208 09:31:38.976841 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhjz6\" (UniqueName: \"kubernetes.io/projected/7af7dfaf-3db0-4c5d-b7fc-671893276afc-kube-api-access-nhjz6\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-75cmb\" (UID: \"7af7dfaf-3db0-4c5d-b7fc-671893276afc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-75cmb" Dec 08 09:31:38 crc kubenswrapper[4776]: I1208 09:31:38.976966 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7af7dfaf-3db0-4c5d-b7fc-671893276afc-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-75cmb\" (UID: \"7af7dfaf-3db0-4c5d-b7fc-671893276afc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-75cmb" Dec 08 09:31:38 crc kubenswrapper[4776]: I1208 09:31:38.977025 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7af7dfaf-3db0-4c5d-b7fc-671893276afc-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-75cmb\" (UID: \"7af7dfaf-3db0-4c5d-b7fc-671893276afc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-75cmb" Dec 08 09:31:38 crc kubenswrapper[4776]: I1208 09:31:38.981125 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7af7dfaf-3db0-4c5d-b7fc-671893276afc-ssh-key\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-75cmb\" (UID: \"7af7dfaf-3db0-4c5d-b7fc-671893276afc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-75cmb" Dec 08 09:31:38 crc kubenswrapper[4776]: I1208 09:31:38.993874 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7af7dfaf-3db0-4c5d-b7fc-671893276afc-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-75cmb\" (UID: \"7af7dfaf-3db0-4c5d-b7fc-671893276afc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-75cmb" Dec 08 09:31:38 crc kubenswrapper[4776]: I1208 09:31:38.997693 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhjz6\" (UniqueName: \"kubernetes.io/projected/7af7dfaf-3db0-4c5d-b7fc-671893276afc-kube-api-access-nhjz6\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-75cmb\" (UID: \"7af7dfaf-3db0-4c5d-b7fc-671893276afc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-75cmb" Dec 08 09:31:39 crc kubenswrapper[4776]: I1208 09:31:39.009492 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-75cmb" Dec 08 09:31:39 crc kubenswrapper[4776]: I1208 09:31:39.758557 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-75cmb"] Dec 08 09:31:40 crc kubenswrapper[4776]: I1208 09:31:40.621997 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-75cmb" event={"ID":"7af7dfaf-3db0-4c5d-b7fc-671893276afc","Type":"ContainerStarted","Data":"5b66156565686ea50e62faec22cb2feaec07448e730e54635078b08c5d8f1f30"} Dec 08 09:31:40 crc kubenswrapper[4776]: I1208 09:31:40.622834 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-75cmb" event={"ID":"7af7dfaf-3db0-4c5d-b7fc-671893276afc","Type":"ContainerStarted","Data":"faa0cc973044806982ae8365d0fa8b1ec7101e4091b51e7ed3307ffa2401674b"} Dec 08 09:31:40 crc kubenswrapper[4776]: I1208 09:31:40.644326 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-75cmb" podStartSLOduration=2.195404241 podStartE2EDuration="2.644301529s" podCreationTimestamp="2025-12-08 09:31:38 +0000 UTC" firstStartedPulling="2025-12-08 09:31:39.786595412 +0000 UTC m=+1976.049820434" lastFinishedPulling="2025-12-08 09:31:40.23549269 +0000 UTC m=+1976.498717722" observedRunningTime="2025-12-08 09:31:40.638844322 +0000 UTC m=+1976.902069344" watchObservedRunningTime="2025-12-08 09:31:40.644301529 +0000 UTC m=+1976.907526571" Dec 08 09:31:54 crc kubenswrapper[4776]: I1208 09:31:54.046115 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-3e4f-account-create-update-27jsm"] Dec 08 09:31:54 crc kubenswrapper[4776]: I1208 09:31:54.061050 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-rfbws"] Dec 08 
09:31:54 crc kubenswrapper[4776]: I1208 09:31:54.072521 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-3e4f-account-create-update-27jsm"] Dec 08 09:31:54 crc kubenswrapper[4776]: I1208 09:31:54.084349 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-rfbws"] Dec 08 09:31:54 crc kubenswrapper[4776]: I1208 09:31:54.360921 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="285be34f-c8bd-45c4-9e7c-da900aaa3fd2" path="/var/lib/kubelet/pods/285be34f-c8bd-45c4-9e7c-da900aaa3fd2/volumes" Dec 08 09:31:54 crc kubenswrapper[4776]: I1208 09:31:54.363630 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81ff1f5b-522f-4e63-84b6-2462e19419e7" path="/var/lib/kubelet/pods/81ff1f5b-522f-4e63-84b6-2462e19419e7/volumes" Dec 08 09:32:04 crc kubenswrapper[4776]: I1208 09:32:04.076760 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-ps8vp"] Dec 08 09:32:04 crc kubenswrapper[4776]: I1208 09:32:04.093122 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-ww8jk"] Dec 08 09:32:04 crc kubenswrapper[4776]: I1208 09:32:04.106698 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-3646-account-create-update-9fgcf"] Dec 08 09:32:04 crc kubenswrapper[4776]: I1208 09:32:04.118025 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-xlh2r"] Dec 08 09:32:04 crc kubenswrapper[4776]: I1208 09:32:04.129814 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-ps8vp"] Dec 08 09:32:04 crc kubenswrapper[4776]: I1208 09:32:04.141911 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-827c-account-create-update-2zj8b"] Dec 08 09:32:04 crc kubenswrapper[4776]: I1208 09:32:04.152816 4776 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/keystone-3646-account-create-update-9fgcf"] Dec 08 09:32:04 crc kubenswrapper[4776]: I1208 09:32:04.164083 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-ww8jk"] Dec 08 09:32:04 crc kubenswrapper[4776]: I1208 09:32:04.174966 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-827c-account-create-update-2zj8b"] Dec 08 09:32:04 crc kubenswrapper[4776]: I1208 09:32:04.189620 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-xlh2r"] Dec 08 09:32:04 crc kubenswrapper[4776]: I1208 09:32:04.360498 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0920722c-0e43-40c4-8c74-63d6bbb9b419" path="/var/lib/kubelet/pods/0920722c-0e43-40c4-8c74-63d6bbb9b419/volumes" Dec 08 09:32:04 crc kubenswrapper[4776]: I1208 09:32:04.362513 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="189aea5a-3eb2-41b7-9431-21a0acf13db7" path="/var/lib/kubelet/pods/189aea5a-3eb2-41b7-9431-21a0acf13db7/volumes" Dec 08 09:32:04 crc kubenswrapper[4776]: I1208 09:32:04.364836 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c45c98e-e12c-404f-8684-6d34481e2cee" path="/var/lib/kubelet/pods/5c45c98e-e12c-404f-8684-6d34481e2cee/volumes" Dec 08 09:32:04 crc kubenswrapper[4776]: I1208 09:32:04.366654 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de62cc1f-e51f-4ae9-9de7-df5d8cb27cdf" path="/var/lib/kubelet/pods/de62cc1f-e51f-4ae9-9de7-df5d8cb27cdf/volumes" Dec 08 09:32:04 crc kubenswrapper[4776]: I1208 09:32:04.369308 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8b6a12b-b8f7-4754-9bc0-5f0ba81fe746" path="/var/lib/kubelet/pods/e8b6a12b-b8f7-4754-9bc0-5f0ba81fe746/volumes" Dec 08 09:32:05 crc kubenswrapper[4776]: I1208 09:32:05.032258 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/placement-7b88-account-create-update-kxcml"] Dec 08 09:32:05 crc kubenswrapper[4776]: I1208 09:32:05.045609 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-7b88-account-create-update-kxcml"] Dec 08 09:32:06 crc kubenswrapper[4776]: I1208 09:32:06.359736 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76ca76af-146b-4bb6-b676-1f9c8fd7f512" path="/var/lib/kubelet/pods/76ca76af-146b-4bb6-b676-1f9c8fd7f512/volumes" Dec 08 09:32:13 crc kubenswrapper[4776]: I1208 09:32:13.046917 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-xm2wh"] Dec 08 09:32:13 crc kubenswrapper[4776]: I1208 09:32:13.062348 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-d26a-account-create-update-wr6lj"] Dec 08 09:32:13 crc kubenswrapper[4776]: I1208 09:32:13.073859 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-xm2wh"] Dec 08 09:32:13 crc kubenswrapper[4776]: I1208 09:32:13.087045 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-d26a-account-create-update-wr6lj"] Dec 08 09:32:14 crc kubenswrapper[4776]: I1208 09:32:14.383357 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a94fb6fb-5b7e-4034-a8e4-9c40e269409e" path="/var/lib/kubelet/pods/a94fb6fb-5b7e-4034-a8e4-9c40e269409e/volumes" Dec 08 09:32:14 crc kubenswrapper[4776]: I1208 09:32:14.386429 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0571661-99e6-43e0-b2ea-1924e5437a7f" path="/var/lib/kubelet/pods/e0571661-99e6-43e0-b2ea-1924e5437a7f/volumes" Dec 08 09:32:15 crc kubenswrapper[4776]: I1208 09:32:15.071683 4776 scope.go:117] "RemoveContainer" containerID="215c94b1341337698ac9fc9c0e9326d513646925688a984dd263fd1fb1f219e0" Dec 08 09:32:15 crc kubenswrapper[4776]: I1208 09:32:15.099478 4776 scope.go:117] "RemoveContainer" 
containerID="90a74cf7f85524f89d537da4e7d9238ea9b3c5ee80872b88825143f5ca3c333c" Dec 08 09:32:15 crc kubenswrapper[4776]: I1208 09:32:15.188644 4776 scope.go:117] "RemoveContainer" containerID="154c3c886ca8ad01f326fa76abb60ed578bd04ef518c989b81fa61e986454cee" Dec 08 09:32:15 crc kubenswrapper[4776]: I1208 09:32:15.242637 4776 scope.go:117] "RemoveContainer" containerID="79235d51ab147737e701d48cff1bddb761b498511a609c2fdd70ef52f335b620" Dec 08 09:32:15 crc kubenswrapper[4776]: I1208 09:32:15.298716 4776 scope.go:117] "RemoveContainer" containerID="dc78358dc9ace872fae809d857c55d4d957daddc58146d5b31fe46d84b67bf5c" Dec 08 09:32:15 crc kubenswrapper[4776]: I1208 09:32:15.354302 4776 scope.go:117] "RemoveContainer" containerID="b7b95498f53d7901e339f9c83deba1b8e748ae9b78778385c70e68e7ac9e203b" Dec 08 09:32:15 crc kubenswrapper[4776]: I1208 09:32:15.400501 4776 scope.go:117] "RemoveContainer" containerID="418601c2d5742bbf0373b37729b1a080a517f3ed1ef2a0d4bfcbd4ec1ad1d0c8" Dec 08 09:32:15 crc kubenswrapper[4776]: I1208 09:32:15.426127 4776 scope.go:117] "RemoveContainer" containerID="7436f875e146e77ea0960ce63ed08ec6c5456dcab6a8bfc87d62e0c1da12f314" Dec 08 09:32:15 crc kubenswrapper[4776]: I1208 09:32:15.448895 4776 scope.go:117] "RemoveContainer" containerID="6ae3cc48666cc8b589590c920969cd7ed45e832b0fd68e32e257c45dac047dc8" Dec 08 09:32:15 crc kubenswrapper[4776]: I1208 09:32:15.475700 4776 scope.go:117] "RemoveContainer" containerID="3bffbe9698f2e2ccf2cad57c49c83041fd2508e934573699704b5affd85f6027" Dec 08 09:32:15 crc kubenswrapper[4776]: I1208 09:32:15.508212 4776 scope.go:117] "RemoveContainer" containerID="6f68a56f22b1c104f29ecde6f5bfdc13dba2bb4af424c0415f68b59480d396e3" Dec 08 09:32:17 crc kubenswrapper[4776]: I1208 09:32:17.745282 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wkhft"] Dec 08 09:32:17 crc kubenswrapper[4776]: I1208 09:32:17.749576 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wkhft" Dec 08 09:32:17 crc kubenswrapper[4776]: I1208 09:32:17.762846 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wkhft"] Dec 08 09:32:17 crc kubenswrapper[4776]: I1208 09:32:17.834650 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqvs2\" (UniqueName: \"kubernetes.io/projected/65dfa143-cdae-4009-9f9d-ec37dec2711a-kube-api-access-sqvs2\") pod \"redhat-operators-wkhft\" (UID: \"65dfa143-cdae-4009-9f9d-ec37dec2711a\") " pod="openshift-marketplace/redhat-operators-wkhft" Dec 08 09:32:17 crc kubenswrapper[4776]: I1208 09:32:17.834808 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65dfa143-cdae-4009-9f9d-ec37dec2711a-utilities\") pod \"redhat-operators-wkhft\" (UID: \"65dfa143-cdae-4009-9f9d-ec37dec2711a\") " pod="openshift-marketplace/redhat-operators-wkhft" Dec 08 09:32:17 crc kubenswrapper[4776]: I1208 09:32:17.835150 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65dfa143-cdae-4009-9f9d-ec37dec2711a-catalog-content\") pod \"redhat-operators-wkhft\" (UID: \"65dfa143-cdae-4009-9f9d-ec37dec2711a\") " pod="openshift-marketplace/redhat-operators-wkhft" Dec 08 09:32:17 crc kubenswrapper[4776]: I1208 09:32:17.937983 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65dfa143-cdae-4009-9f9d-ec37dec2711a-catalog-content\") pod \"redhat-operators-wkhft\" (UID: \"65dfa143-cdae-4009-9f9d-ec37dec2711a\") " pod="openshift-marketplace/redhat-operators-wkhft" Dec 08 09:32:17 crc kubenswrapper[4776]: I1208 09:32:17.938050 4776 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-sqvs2\" (UniqueName: \"kubernetes.io/projected/65dfa143-cdae-4009-9f9d-ec37dec2711a-kube-api-access-sqvs2\") pod \"redhat-operators-wkhft\" (UID: \"65dfa143-cdae-4009-9f9d-ec37dec2711a\") " pod="openshift-marketplace/redhat-operators-wkhft" Dec 08 09:32:17 crc kubenswrapper[4776]: I1208 09:32:17.938123 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65dfa143-cdae-4009-9f9d-ec37dec2711a-utilities\") pod \"redhat-operators-wkhft\" (UID: \"65dfa143-cdae-4009-9f9d-ec37dec2711a\") " pod="openshift-marketplace/redhat-operators-wkhft" Dec 08 09:32:17 crc kubenswrapper[4776]: I1208 09:32:17.938592 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65dfa143-cdae-4009-9f9d-ec37dec2711a-utilities\") pod \"redhat-operators-wkhft\" (UID: \"65dfa143-cdae-4009-9f9d-ec37dec2711a\") " pod="openshift-marketplace/redhat-operators-wkhft" Dec 08 09:32:17 crc kubenswrapper[4776]: I1208 09:32:17.939884 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65dfa143-cdae-4009-9f9d-ec37dec2711a-catalog-content\") pod \"redhat-operators-wkhft\" (UID: \"65dfa143-cdae-4009-9f9d-ec37dec2711a\") " pod="openshift-marketplace/redhat-operators-wkhft" Dec 08 09:32:17 crc kubenswrapper[4776]: I1208 09:32:17.962573 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqvs2\" (UniqueName: \"kubernetes.io/projected/65dfa143-cdae-4009-9f9d-ec37dec2711a-kube-api-access-sqvs2\") pod \"redhat-operators-wkhft\" (UID: \"65dfa143-cdae-4009-9f9d-ec37dec2711a\") " pod="openshift-marketplace/redhat-operators-wkhft" Dec 08 09:32:18 crc kubenswrapper[4776]: I1208 09:32:18.077963 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wkhft" Dec 08 09:32:18 crc kubenswrapper[4776]: I1208 09:32:18.622853 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wkhft"] Dec 08 09:32:19 crc kubenswrapper[4776]: I1208 09:32:19.104048 4776 generic.go:334] "Generic (PLEG): container finished" podID="65dfa143-cdae-4009-9f9d-ec37dec2711a" containerID="3142d94fd6e0c2cc0ce5b0d356886b77b149691ea585f9bf9b472f80238bab1e" exitCode=0 Dec 08 09:32:19 crc kubenswrapper[4776]: I1208 09:32:19.104090 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wkhft" event={"ID":"65dfa143-cdae-4009-9f9d-ec37dec2711a","Type":"ContainerDied","Data":"3142d94fd6e0c2cc0ce5b0d356886b77b149691ea585f9bf9b472f80238bab1e"} Dec 08 09:32:19 crc kubenswrapper[4776]: I1208 09:32:19.104115 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wkhft" event={"ID":"65dfa143-cdae-4009-9f9d-ec37dec2711a","Type":"ContainerStarted","Data":"b6b23dc6168fd44c5a6c975ff992f248251b51845b2674994a462cdb0dfa49a2"} Dec 08 09:32:19 crc kubenswrapper[4776]: I1208 09:32:19.106815 4776 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 08 09:32:22 crc kubenswrapper[4776]: I1208 09:32:22.050502 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-wk8pc"] Dec 08 09:32:22 crc kubenswrapper[4776]: I1208 09:32:22.061658 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-wk8pc"] Dec 08 09:32:22 crc kubenswrapper[4776]: I1208 09:32:22.360503 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dbd4182-75a0-42dd-97c8-a1cb8fee96f2" path="/var/lib/kubelet/pods/3dbd4182-75a0-42dd-97c8-a1cb8fee96f2/volumes" Dec 08 09:32:23 crc kubenswrapper[4776]: I1208 09:32:23.038154 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/cinder-f2b3-account-create-update-7jtlp"] Dec 08 09:32:23 crc kubenswrapper[4776]: I1208 09:32:23.055593 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-gjfw9"] Dec 08 09:32:23 crc kubenswrapper[4776]: I1208 09:32:23.067615 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-hxwvw"] Dec 08 09:32:23 crc kubenswrapper[4776]: I1208 09:32:23.081664 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-gjfw9"] Dec 08 09:32:23 crc kubenswrapper[4776]: I1208 09:32:23.092530 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-hxwvw"] Dec 08 09:32:23 crc kubenswrapper[4776]: I1208 09:32:23.102616 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-a752-account-create-update-c65jn"] Dec 08 09:32:23 crc kubenswrapper[4776]: I1208 09:32:23.115092 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-f2b3-account-create-update-7jtlp"] Dec 08 09:32:23 crc kubenswrapper[4776]: I1208 09:32:23.128015 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-a752-account-create-update-c65jn"] Dec 08 09:32:24 crc kubenswrapper[4776]: I1208 09:32:24.060124 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-732a-account-create-update-mm924"] Dec 08 09:32:24 crc kubenswrapper[4776]: I1208 09:32:24.072647 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-b17b-account-create-update-2sxng"] Dec 08 09:32:24 crc kubenswrapper[4776]: I1208 09:32:24.085787 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-wxkbx"] Dec 08 09:32:24 crc kubenswrapper[4776]: I1208 09:32:24.097823 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-732a-account-create-update-mm924"] Dec 08 09:32:24 crc kubenswrapper[4776]: I1208 09:32:24.111930 4776 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/barbican-b17b-account-create-update-2sxng"] Dec 08 09:32:24 crc kubenswrapper[4776]: I1208 09:32:24.123669 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-wxkbx"] Dec 08 09:32:24 crc kubenswrapper[4776]: I1208 09:32:24.365551 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4811e2fe-5855-47ee-b742-ec6c481936a2" path="/var/lib/kubelet/pods/4811e2fe-5855-47ee-b742-ec6c481936a2/volumes" Dec 08 09:32:24 crc kubenswrapper[4776]: I1208 09:32:24.366946 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54af9994-75b9-457a-8b67-5687e91d698a" path="/var/lib/kubelet/pods/54af9994-75b9-457a-8b67-5687e91d698a/volumes" Dec 08 09:32:24 crc kubenswrapper[4776]: I1208 09:32:24.368040 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6677467f-5abc-4914-949d-bd6541aadeef" path="/var/lib/kubelet/pods/6677467f-5abc-4914-949d-bd6541aadeef/volumes" Dec 08 09:32:24 crc kubenswrapper[4776]: I1208 09:32:24.369589 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ffd8d26-fc44-453f-ad6b-9bce2b83252e" path="/var/lib/kubelet/pods/7ffd8d26-fc44-453f-ad6b-9bce2b83252e/volumes" Dec 08 09:32:24 crc kubenswrapper[4776]: I1208 09:32:24.372373 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a862b37a-f96b-495a-8d8e-b3640d2f0609" path="/var/lib/kubelet/pods/a862b37a-f96b-495a-8d8e-b3640d2f0609/volumes" Dec 08 09:32:24 crc kubenswrapper[4776]: I1208 09:32:24.373901 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd8cf615-20fc-42d9-bb77-cdeebbfcdb64" path="/var/lib/kubelet/pods/dd8cf615-20fc-42d9-bb77-cdeebbfcdb64/volumes" Dec 08 09:32:24 crc kubenswrapper[4776]: I1208 09:32:24.374830 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de008089-a913-4a62-85e1-f0ec597514ab" path="/var/lib/kubelet/pods/de008089-a913-4a62-85e1-f0ec597514ab/volumes" Dec 08 
09:32:29 crc kubenswrapper[4776]: I1208 09:32:29.246546 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wkhft" event={"ID":"65dfa143-cdae-4009-9f9d-ec37dec2711a","Type":"ContainerStarted","Data":"e4f0fd5b4d4b13c28b71913919420afc99b2725f3ede74f5a894fce9374686e2"} Dec 08 09:32:31 crc kubenswrapper[4776]: I1208 09:32:31.282446 4776 generic.go:334] "Generic (PLEG): container finished" podID="65dfa143-cdae-4009-9f9d-ec37dec2711a" containerID="e4f0fd5b4d4b13c28b71913919420afc99b2725f3ede74f5a894fce9374686e2" exitCode=0 Dec 08 09:32:31 crc kubenswrapper[4776]: I1208 09:32:31.282493 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wkhft" event={"ID":"65dfa143-cdae-4009-9f9d-ec37dec2711a","Type":"ContainerDied","Data":"e4f0fd5b4d4b13c28b71913919420afc99b2725f3ede74f5a894fce9374686e2"} Dec 08 09:32:32 crc kubenswrapper[4776]: I1208 09:32:32.299377 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wkhft" event={"ID":"65dfa143-cdae-4009-9f9d-ec37dec2711a","Type":"ContainerStarted","Data":"cf5a18071920634401ef02e95baaa683aa47c59f4431bb34853d75df96ae35b2"} Dec 08 09:32:32 crc kubenswrapper[4776]: I1208 09:32:32.332508 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wkhft" podStartSLOduration=2.511784559 podStartE2EDuration="15.332488832s" podCreationTimestamp="2025-12-08 09:32:17 +0000 UTC" firstStartedPulling="2025-12-08 09:32:19.106422402 +0000 UTC m=+2015.369647424" lastFinishedPulling="2025-12-08 09:32:31.927126655 +0000 UTC m=+2028.190351697" observedRunningTime="2025-12-08 09:32:32.327560689 +0000 UTC m=+2028.590785721" watchObservedRunningTime="2025-12-08 09:32:32.332488832 +0000 UTC m=+2028.595713854" Dec 08 09:32:36 crc kubenswrapper[4776]: I1208 09:32:36.029389 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-tz8ks"] 
Dec 08 09:32:36 crc kubenswrapper[4776]: I1208 09:32:36.043222 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-tz8ks"] Dec 08 09:32:36 crc kubenswrapper[4776]: I1208 09:32:36.367394 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d1decbe-db5e-4910-9604-aca62ec47099" path="/var/lib/kubelet/pods/1d1decbe-db5e-4910-9604-aca62ec47099/volumes" Dec 08 09:32:38 crc kubenswrapper[4776]: I1208 09:32:38.078400 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wkhft" Dec 08 09:32:38 crc kubenswrapper[4776]: I1208 09:32:38.078656 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wkhft" Dec 08 09:32:39 crc kubenswrapper[4776]: I1208 09:32:39.142730 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wkhft" podUID="65dfa143-cdae-4009-9f9d-ec37dec2711a" containerName="registry-server" probeResult="failure" output=< Dec 08 09:32:39 crc kubenswrapper[4776]: timeout: failed to connect service ":50051" within 1s Dec 08 09:32:39 crc kubenswrapper[4776]: > Dec 08 09:32:41 crc kubenswrapper[4776]: I1208 09:32:41.399321 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 09:32:41 crc kubenswrapper[4776]: I1208 09:32:41.400946 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 09:32:48 crc kubenswrapper[4776]: I1208 
09:32:48.137437 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wkhft" Dec 08 09:32:48 crc kubenswrapper[4776]: I1208 09:32:48.207109 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wkhft" Dec 08 09:32:48 crc kubenswrapper[4776]: I1208 09:32:48.769497 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wkhft"] Dec 08 09:32:48 crc kubenswrapper[4776]: I1208 09:32:48.949560 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c4zt9"] Dec 08 09:32:48 crc kubenswrapper[4776]: I1208 09:32:48.949840 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-c4zt9" podUID="30120f8b-a384-40f4-9211-ba3d8b3154f0" containerName="registry-server" containerID="cri-o://c072bb9ea89a1695150db0de7b3edd5f8bea6c8418fceda2d002495cdc036101" gracePeriod=2 Dec 08 09:32:49 crc kubenswrapper[4776]: I1208 09:32:49.486358 4776 generic.go:334] "Generic (PLEG): container finished" podID="30120f8b-a384-40f4-9211-ba3d8b3154f0" containerID="c072bb9ea89a1695150db0de7b3edd5f8bea6c8418fceda2d002495cdc036101" exitCode=0 Dec 08 09:32:49 crc kubenswrapper[4776]: I1208 09:32:49.486437 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c4zt9" event={"ID":"30120f8b-a384-40f4-9211-ba3d8b3154f0","Type":"ContainerDied","Data":"c072bb9ea89a1695150db0de7b3edd5f8bea6c8418fceda2d002495cdc036101"} Dec 08 09:32:49 crc kubenswrapper[4776]: I1208 09:32:49.486932 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c4zt9" event={"ID":"30120f8b-a384-40f4-9211-ba3d8b3154f0","Type":"ContainerDied","Data":"44cdef27d42388ce65767f113bc5ed355de2283b2ed98299f78df76829d54677"} Dec 08 09:32:49 crc kubenswrapper[4776]: I1208 09:32:49.486960 4776 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44cdef27d42388ce65767f113bc5ed355de2283b2ed98299f78df76829d54677" Dec 08 09:32:49 crc kubenswrapper[4776]: I1208 09:32:49.576147 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c4zt9" Dec 08 09:32:49 crc kubenswrapper[4776]: I1208 09:32:49.698729 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30120f8b-a384-40f4-9211-ba3d8b3154f0-utilities\") pod \"30120f8b-a384-40f4-9211-ba3d8b3154f0\" (UID: \"30120f8b-a384-40f4-9211-ba3d8b3154f0\") " Dec 08 09:32:49 crc kubenswrapper[4776]: I1208 09:32:49.698899 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xx9dt\" (UniqueName: \"kubernetes.io/projected/30120f8b-a384-40f4-9211-ba3d8b3154f0-kube-api-access-xx9dt\") pod \"30120f8b-a384-40f4-9211-ba3d8b3154f0\" (UID: \"30120f8b-a384-40f4-9211-ba3d8b3154f0\") " Dec 08 09:32:49 crc kubenswrapper[4776]: I1208 09:32:49.698988 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30120f8b-a384-40f4-9211-ba3d8b3154f0-catalog-content\") pod \"30120f8b-a384-40f4-9211-ba3d8b3154f0\" (UID: \"30120f8b-a384-40f4-9211-ba3d8b3154f0\") " Dec 08 09:32:49 crc kubenswrapper[4776]: I1208 09:32:49.748645 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30120f8b-a384-40f4-9211-ba3d8b3154f0-utilities" (OuterVolumeSpecName: "utilities") pod "30120f8b-a384-40f4-9211-ba3d8b3154f0" (UID: "30120f8b-a384-40f4-9211-ba3d8b3154f0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:32:49 crc kubenswrapper[4776]: I1208 09:32:49.752323 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30120f8b-a384-40f4-9211-ba3d8b3154f0-kube-api-access-xx9dt" (OuterVolumeSpecName: "kube-api-access-xx9dt") pod "30120f8b-a384-40f4-9211-ba3d8b3154f0" (UID: "30120f8b-a384-40f4-9211-ba3d8b3154f0"). InnerVolumeSpecName "kube-api-access-xx9dt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:32:49 crc kubenswrapper[4776]: I1208 09:32:49.800976 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30120f8b-a384-40f4-9211-ba3d8b3154f0-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 09:32:49 crc kubenswrapper[4776]: I1208 09:32:49.801019 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xx9dt\" (UniqueName: \"kubernetes.io/projected/30120f8b-a384-40f4-9211-ba3d8b3154f0-kube-api-access-xx9dt\") on node \"crc\" DevicePath \"\"" Dec 08 09:32:49 crc kubenswrapper[4776]: I1208 09:32:49.965370 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30120f8b-a384-40f4-9211-ba3d8b3154f0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "30120f8b-a384-40f4-9211-ba3d8b3154f0" (UID: "30120f8b-a384-40f4-9211-ba3d8b3154f0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:32:50 crc kubenswrapper[4776]: I1208 09:32:50.004604 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30120f8b-a384-40f4-9211-ba3d8b3154f0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 09:32:50 crc kubenswrapper[4776]: I1208 09:32:50.496046 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c4zt9" Dec 08 09:32:50 crc kubenswrapper[4776]: I1208 09:32:50.527513 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c4zt9"] Dec 08 09:32:50 crc kubenswrapper[4776]: I1208 09:32:50.537550 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-c4zt9"] Dec 08 09:32:52 crc kubenswrapper[4776]: I1208 09:32:52.361921 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30120f8b-a384-40f4-9211-ba3d8b3154f0" path="/var/lib/kubelet/pods/30120f8b-a384-40f4-9211-ba3d8b3154f0/volumes" Dec 08 09:33:08 crc kubenswrapper[4776]: I1208 09:33:08.046711 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-9hw46"] Dec 08 09:33:08 crc kubenswrapper[4776]: I1208 09:33:08.060139 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-9hw46"] Dec 08 09:33:08 crc kubenswrapper[4776]: I1208 09:33:08.356937 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a920788-f8d6-4c42-84f6-d842d9bf9a17" path="/var/lib/kubelet/pods/4a920788-f8d6-4c42-84f6-d842d9bf9a17/volumes" Dec 08 09:33:09 crc kubenswrapper[4776]: I1208 09:33:09.030399 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-8tjn6"] Dec 08 09:33:09 crc kubenswrapper[4776]: I1208 09:33:09.040357 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-8tjn6"] Dec 08 09:33:10 crc kubenswrapper[4776]: I1208 09:33:10.360191 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9d1e39a-4040-4f14-819f-f41e85a35143" path="/var/lib/kubelet/pods/f9d1e39a-4040-4f14-819f-f41e85a35143/volumes" Dec 08 09:33:11 crc kubenswrapper[4776]: I1208 09:33:11.398984 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 09:33:11 crc kubenswrapper[4776]: I1208 09:33:11.399341 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 09:33:15 crc kubenswrapper[4776]: I1208 09:33:15.743041 4776 scope.go:117] "RemoveContainer" containerID="8c65a5eb0be9c3f321f7345766a15b2e8c3efebf6526552db0b83edf93856a8b" Dec 08 09:33:15 crc kubenswrapper[4776]: I1208 09:33:15.787949 4776 scope.go:117] "RemoveContainer" containerID="ed082df2e4827cb316173d4dc8d75bf7d652cf36cf29300dbeee48c9a9899bdc" Dec 08 09:33:15 crc kubenswrapper[4776]: I1208 09:33:15.922239 4776 scope.go:117] "RemoveContainer" containerID="cc9a7cc6e55b19ccf3c2d1dc9540054478bcf15926f146c619b2f7febda98012" Dec 08 09:33:15 crc kubenswrapper[4776]: I1208 09:33:15.943645 4776 scope.go:117] "RemoveContainer" containerID="945643d714eb1ac8ac03804db9a4c0f27171a18f9c47fd32e202391e0f5fee43" Dec 08 09:33:15 crc kubenswrapper[4776]: I1208 09:33:15.996455 4776 scope.go:117] "RemoveContainer" containerID="ca7a98e4014a1bb8047ff9843f80b3e17c9f36c0a0a3326dcb9f034987c8ab25" Dec 08 09:33:16 crc kubenswrapper[4776]: I1208 09:33:16.052131 4776 scope.go:117] "RemoveContainer" containerID="ca6d12594b8c2f492e325734dda23a6c1d318ec30ab337e92d5788e763e8d5c8" Dec 08 09:33:16 crc kubenswrapper[4776]: I1208 09:33:16.109152 4776 scope.go:117] "RemoveContainer" containerID="aae7f6b2532aadc449b85d4c38c7b97e85329898aeab7778a386cfdb502c04ab" Dec 08 09:33:16 crc kubenswrapper[4776]: I1208 09:33:16.136870 4776 scope.go:117] "RemoveContainer" 
containerID="4f24f8968ed17fa81f6f8869195d3cc6621d449566b91c6fe0cb95b1375dcc9d" Dec 08 09:33:16 crc kubenswrapper[4776]: I1208 09:33:16.163729 4776 scope.go:117] "RemoveContainer" containerID="8559708363f99af1440c9831e86bb5db9591cbe956ef5e50280734a830e99d11" Dec 08 09:33:16 crc kubenswrapper[4776]: I1208 09:33:16.184526 4776 scope.go:117] "RemoveContainer" containerID="c072bb9ea89a1695150db0de7b3edd5f8bea6c8418fceda2d002495cdc036101" Dec 08 09:33:16 crc kubenswrapper[4776]: I1208 09:33:16.208130 4776 scope.go:117] "RemoveContainer" containerID="3db8d71ba1e6f8c4cf863e874a05573315ee07ed461d095d5ea9928a00e6b73e" Dec 08 09:33:16 crc kubenswrapper[4776]: I1208 09:33:16.242131 4776 scope.go:117] "RemoveContainer" containerID="bb58957decf1944799406833c849e43b523968c3344c72a1efe230f6ed07b9b5" Dec 08 09:33:16 crc kubenswrapper[4776]: I1208 09:33:16.282601 4776 scope.go:117] "RemoveContainer" containerID="6912c00417baf631d68734ffcb5f0478237df070eac8d6f5935f9907788219b2" Dec 08 09:33:16 crc kubenswrapper[4776]: I1208 09:33:16.312476 4776 scope.go:117] "RemoveContainer" containerID="961e427d177fd18f93264df8ac77d85ab8e45051a0265c2a83716c5a6d5bcd60" Dec 08 09:33:17 crc kubenswrapper[4776]: I1208 09:33:17.045548 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-ns7rc"] Dec 08 09:33:17 crc kubenswrapper[4776]: I1208 09:33:17.077261 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-ns7rc"] Dec 08 09:33:18 crc kubenswrapper[4776]: I1208 09:33:18.358625 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="913d2881-9323-4503-b364-05de889fd095" path="/var/lib/kubelet/pods/913d2881-9323-4503-b364-05de889fd095/volumes" Dec 08 09:33:25 crc kubenswrapper[4776]: I1208 09:33:25.036936 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-xksjb"] Dec 08 09:33:25 crc kubenswrapper[4776]: I1208 09:33:25.050879 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/cinder-db-sync-xksjb"] Dec 08 09:33:26 crc kubenswrapper[4776]: I1208 09:33:26.372351 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d7c64ff-eec0-48d3-bba8-724158787096" path="/var/lib/kubelet/pods/6d7c64ff-eec0-48d3-bba8-724158787096/volumes" Dec 08 09:33:29 crc kubenswrapper[4776]: I1208 09:33:29.028195 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-2s7n9"] Dec 08 09:33:29 crc kubenswrapper[4776]: I1208 09:33:29.037884 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-626nj"] Dec 08 09:33:29 crc kubenswrapper[4776]: I1208 09:33:29.048511 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-2s7n9"] Dec 08 09:33:29 crc kubenswrapper[4776]: I1208 09:33:29.058459 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-626nj"] Dec 08 09:33:30 crc kubenswrapper[4776]: I1208 09:33:30.357352 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c962dc3-3c64-4b5d-a740-a790a5fa10f9" path="/var/lib/kubelet/pods/7c962dc3-3c64-4b5d-a740-a790a5fa10f9/volumes" Dec 08 09:33:30 crc kubenswrapper[4776]: I1208 09:33:30.359546 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9dff1e28-5d80-48af-b348-cfd6080d3e37" path="/var/lib/kubelet/pods/9dff1e28-5d80-48af-b348-cfd6080d3e37/volumes" Dec 08 09:33:38 crc kubenswrapper[4776]: I1208 09:33:38.058490 4776 generic.go:334] "Generic (PLEG): container finished" podID="7af7dfaf-3db0-4c5d-b7fc-671893276afc" containerID="5b66156565686ea50e62faec22cb2feaec07448e730e54635078b08c5d8f1f30" exitCode=0 Dec 08 09:33:38 crc kubenswrapper[4776]: I1208 09:33:38.058586 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-75cmb" 
event={"ID":"7af7dfaf-3db0-4c5d-b7fc-671893276afc","Type":"ContainerDied","Data":"5b66156565686ea50e62faec22cb2feaec07448e730e54635078b08c5d8f1f30"} Dec 08 09:33:39 crc kubenswrapper[4776]: I1208 09:33:39.636360 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-75cmb" Dec 08 09:33:39 crc kubenswrapper[4776]: I1208 09:33:39.765342 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhjz6\" (UniqueName: \"kubernetes.io/projected/7af7dfaf-3db0-4c5d-b7fc-671893276afc-kube-api-access-nhjz6\") pod \"7af7dfaf-3db0-4c5d-b7fc-671893276afc\" (UID: \"7af7dfaf-3db0-4c5d-b7fc-671893276afc\") " Dec 08 09:33:39 crc kubenswrapper[4776]: I1208 09:33:39.765421 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7af7dfaf-3db0-4c5d-b7fc-671893276afc-ssh-key\") pod \"7af7dfaf-3db0-4c5d-b7fc-671893276afc\" (UID: \"7af7dfaf-3db0-4c5d-b7fc-671893276afc\") " Dec 08 09:33:39 crc kubenswrapper[4776]: I1208 09:33:39.765697 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7af7dfaf-3db0-4c5d-b7fc-671893276afc-inventory\") pod \"7af7dfaf-3db0-4c5d-b7fc-671893276afc\" (UID: \"7af7dfaf-3db0-4c5d-b7fc-671893276afc\") " Dec 08 09:33:39 crc kubenswrapper[4776]: I1208 09:33:39.777364 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7af7dfaf-3db0-4c5d-b7fc-671893276afc-kube-api-access-nhjz6" (OuterVolumeSpecName: "kube-api-access-nhjz6") pod "7af7dfaf-3db0-4c5d-b7fc-671893276afc" (UID: "7af7dfaf-3db0-4c5d-b7fc-671893276afc"). InnerVolumeSpecName "kube-api-access-nhjz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:33:39 crc kubenswrapper[4776]: I1208 09:33:39.804767 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7af7dfaf-3db0-4c5d-b7fc-671893276afc-inventory" (OuterVolumeSpecName: "inventory") pod "7af7dfaf-3db0-4c5d-b7fc-671893276afc" (UID: "7af7dfaf-3db0-4c5d-b7fc-671893276afc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:33:39 crc kubenswrapper[4776]: I1208 09:33:39.831705 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7af7dfaf-3db0-4c5d-b7fc-671893276afc-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7af7dfaf-3db0-4c5d-b7fc-671893276afc" (UID: "7af7dfaf-3db0-4c5d-b7fc-671893276afc"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:33:39 crc kubenswrapper[4776]: I1208 09:33:39.873788 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7af7dfaf-3db0-4c5d-b7fc-671893276afc-inventory\") on node \"crc\" DevicePath \"\"" Dec 08 09:33:39 crc kubenswrapper[4776]: I1208 09:33:39.873825 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhjz6\" (UniqueName: \"kubernetes.io/projected/7af7dfaf-3db0-4c5d-b7fc-671893276afc-kube-api-access-nhjz6\") on node \"crc\" DevicePath \"\"" Dec 08 09:33:39 crc kubenswrapper[4776]: I1208 09:33:39.873839 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7af7dfaf-3db0-4c5d-b7fc-671893276afc-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 08 09:33:40 crc kubenswrapper[4776]: I1208 09:33:40.085507 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-75cmb" 
event={"ID":"7af7dfaf-3db0-4c5d-b7fc-671893276afc","Type":"ContainerDied","Data":"faa0cc973044806982ae8365d0fa8b1ec7101e4091b51e7ed3307ffa2401674b"} Dec 08 09:33:40 crc kubenswrapper[4776]: I1208 09:33:40.085558 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="faa0cc973044806982ae8365d0fa8b1ec7101e4091b51e7ed3307ffa2401674b" Dec 08 09:33:40 crc kubenswrapper[4776]: I1208 09:33:40.085624 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-75cmb" Dec 08 09:33:40 crc kubenswrapper[4776]: I1208 09:33:40.199113 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kzwn2"] Dec 08 09:33:40 crc kubenswrapper[4776]: E1208 09:33:40.199610 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30120f8b-a384-40f4-9211-ba3d8b3154f0" containerName="extract-content" Dec 08 09:33:40 crc kubenswrapper[4776]: I1208 09:33:40.199626 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="30120f8b-a384-40f4-9211-ba3d8b3154f0" containerName="extract-content" Dec 08 09:33:40 crc kubenswrapper[4776]: E1208 09:33:40.199637 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7af7dfaf-3db0-4c5d-b7fc-671893276afc" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 08 09:33:40 crc kubenswrapper[4776]: I1208 09:33:40.199645 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="7af7dfaf-3db0-4c5d-b7fc-671893276afc" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 08 09:33:40 crc kubenswrapper[4776]: E1208 09:33:40.199677 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30120f8b-a384-40f4-9211-ba3d8b3154f0" containerName="registry-server" Dec 08 09:33:40 crc kubenswrapper[4776]: I1208 09:33:40.199683 4776 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="30120f8b-a384-40f4-9211-ba3d8b3154f0" containerName="registry-server" Dec 08 09:33:40 crc kubenswrapper[4776]: E1208 09:33:40.199704 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30120f8b-a384-40f4-9211-ba3d8b3154f0" containerName="extract-utilities" Dec 08 09:33:40 crc kubenswrapper[4776]: I1208 09:33:40.199710 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="30120f8b-a384-40f4-9211-ba3d8b3154f0" containerName="extract-utilities" Dec 08 09:33:40 crc kubenswrapper[4776]: I1208 09:33:40.199921 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="7af7dfaf-3db0-4c5d-b7fc-671893276afc" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 08 09:33:40 crc kubenswrapper[4776]: I1208 09:33:40.199944 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="30120f8b-a384-40f4-9211-ba3d8b3154f0" containerName="registry-server" Dec 08 09:33:40 crc kubenswrapper[4776]: I1208 09:33:40.200730 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kzwn2" Dec 08 09:33:40 crc kubenswrapper[4776]: I1208 09:33:40.204115 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 08 09:33:40 crc kubenswrapper[4776]: I1208 09:33:40.204626 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tm845" Dec 08 09:33:40 crc kubenswrapper[4776]: I1208 09:33:40.204838 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 08 09:33:40 crc kubenswrapper[4776]: I1208 09:33:40.210022 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 08 09:33:40 crc kubenswrapper[4776]: I1208 09:33:40.218149 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kzwn2"] Dec 08 09:33:40 crc kubenswrapper[4776]: I1208 09:33:40.281801 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htlhk\" (UniqueName: \"kubernetes.io/projected/6367602c-669d-474f-bd56-97c1b58659b4-kube-api-access-htlhk\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kzwn2\" (UID: \"6367602c-669d-474f-bd56-97c1b58659b4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kzwn2" Dec 08 09:33:40 crc kubenswrapper[4776]: I1208 09:33:40.281910 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6367602c-669d-474f-bd56-97c1b58659b4-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kzwn2\" (UID: \"6367602c-669d-474f-bd56-97c1b58659b4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kzwn2" Dec 08 09:33:40 crc kubenswrapper[4776]: I1208 
09:33:40.281980 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6367602c-669d-474f-bd56-97c1b58659b4-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kzwn2\" (UID: \"6367602c-669d-474f-bd56-97c1b58659b4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kzwn2" Dec 08 09:33:40 crc kubenswrapper[4776]: I1208 09:33:40.384017 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htlhk\" (UniqueName: \"kubernetes.io/projected/6367602c-669d-474f-bd56-97c1b58659b4-kube-api-access-htlhk\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kzwn2\" (UID: \"6367602c-669d-474f-bd56-97c1b58659b4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kzwn2" Dec 08 09:33:40 crc kubenswrapper[4776]: I1208 09:33:40.384485 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6367602c-669d-474f-bd56-97c1b58659b4-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kzwn2\" (UID: \"6367602c-669d-474f-bd56-97c1b58659b4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kzwn2" Dec 08 09:33:40 crc kubenswrapper[4776]: I1208 09:33:40.384679 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6367602c-669d-474f-bd56-97c1b58659b4-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kzwn2\" (UID: \"6367602c-669d-474f-bd56-97c1b58659b4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kzwn2" Dec 08 09:33:40 crc kubenswrapper[4776]: I1208 09:33:40.398770 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6367602c-669d-474f-bd56-97c1b58659b4-ssh-key\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-kzwn2\" (UID: \"6367602c-669d-474f-bd56-97c1b58659b4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kzwn2" Dec 08 09:33:40 crc kubenswrapper[4776]: I1208 09:33:40.400727 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6367602c-669d-474f-bd56-97c1b58659b4-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kzwn2\" (UID: \"6367602c-669d-474f-bd56-97c1b58659b4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kzwn2" Dec 08 09:33:40 crc kubenswrapper[4776]: I1208 09:33:40.405475 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htlhk\" (UniqueName: \"kubernetes.io/projected/6367602c-669d-474f-bd56-97c1b58659b4-kube-api-access-htlhk\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kzwn2\" (UID: \"6367602c-669d-474f-bd56-97c1b58659b4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kzwn2" Dec 08 09:33:40 crc kubenswrapper[4776]: I1208 09:33:40.523616 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kzwn2" Dec 08 09:33:41 crc kubenswrapper[4776]: I1208 09:33:41.169586 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kzwn2"] Dec 08 09:33:41 crc kubenswrapper[4776]: I1208 09:33:41.398943 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 09:33:41 crc kubenswrapper[4776]: I1208 09:33:41.399039 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 09:33:41 crc kubenswrapper[4776]: I1208 09:33:41.399123 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" Dec 08 09:33:41 crc kubenswrapper[4776]: I1208 09:33:41.400490 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7d8d2a2902024cbe40e4f1f7b98f5ca2ce52445ed3e9c1d7d0e196722965b4ad"} pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 08 09:33:41 crc kubenswrapper[4776]: I1208 09:33:41.400608 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" 
containerID="cri-o://7d8d2a2902024cbe40e4f1f7b98f5ca2ce52445ed3e9c1d7d0e196722965b4ad" gracePeriod=600 Dec 08 09:33:42 crc kubenswrapper[4776]: I1208 09:33:42.117537 4776 generic.go:334] "Generic (PLEG): container finished" podID="c9788ab1-1031-4103-a769-a4b3177c7268" containerID="7d8d2a2902024cbe40e4f1f7b98f5ca2ce52445ed3e9c1d7d0e196722965b4ad" exitCode=0 Dec 08 09:33:42 crc kubenswrapper[4776]: I1208 09:33:42.117604 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" event={"ID":"c9788ab1-1031-4103-a769-a4b3177c7268","Type":"ContainerDied","Data":"7d8d2a2902024cbe40e4f1f7b98f5ca2ce52445ed3e9c1d7d0e196722965b4ad"} Dec 08 09:33:42 crc kubenswrapper[4776]: I1208 09:33:42.118206 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" event={"ID":"c9788ab1-1031-4103-a769-a4b3177c7268","Type":"ContainerStarted","Data":"eb1b0e925baf44c975c69a0034d44e251745d7626b9fb33d2f97e1c293dbd296"} Dec 08 09:33:42 crc kubenswrapper[4776]: I1208 09:33:42.118235 4776 scope.go:117] "RemoveContainer" containerID="bd00cf9685c68dc97ea68681aad0bfeeb2d56965a763cd3cffb3c7d3789a7341" Dec 08 09:33:42 crc kubenswrapper[4776]: I1208 09:33:42.121930 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kzwn2" event={"ID":"6367602c-669d-474f-bd56-97c1b58659b4","Type":"ContainerStarted","Data":"f2cb760cdc26eba5b312b1ba4530848943cbf29dbd3a71b1457bd6f2bef37998"} Dec 08 09:33:43 crc kubenswrapper[4776]: I1208 09:33:43.138250 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kzwn2" event={"ID":"6367602c-669d-474f-bd56-97c1b58659b4","Type":"ContainerStarted","Data":"18ceac23d51697e862c9df85c85aba0e95396cba027073e9ed5d2410201f63a8"} Dec 08 09:33:43 crc kubenswrapper[4776]: I1208 09:33:43.167066 4776 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kzwn2" podStartSLOduration=2.464853904 podStartE2EDuration="3.167043714s" podCreationTimestamp="2025-12-08 09:33:40 +0000 UTC" firstStartedPulling="2025-12-08 09:33:41.175230446 +0000 UTC m=+2097.438455468" lastFinishedPulling="2025-12-08 09:33:41.877420236 +0000 UTC m=+2098.140645278" observedRunningTime="2025-12-08 09:33:43.155601596 +0000 UTC m=+2099.418826618" watchObservedRunningTime="2025-12-08 09:33:43.167043714 +0000 UTC m=+2099.430268746" Dec 08 09:34:16 crc kubenswrapper[4776]: I1208 09:34:16.623883 4776 scope.go:117] "RemoveContainer" containerID="29fd9693fc3d83cc2e5ce55a3d87e9bcb1804c6861d8e11c13aa0af1ba9823d9" Dec 08 09:34:16 crc kubenswrapper[4776]: I1208 09:34:16.667627 4776 scope.go:117] "RemoveContainer" containerID="11d7debca6d75173ba761b687f25eb76e43517200472766b13c3590c38eeb450" Dec 08 09:34:16 crc kubenswrapper[4776]: I1208 09:34:16.704151 4776 scope.go:117] "RemoveContainer" containerID="c6fadcb238ee53d36512092bee0f72b706c31a0875b3785872e7aac1b82a72da" Dec 08 09:34:16 crc kubenswrapper[4776]: I1208 09:34:16.766735 4776 scope.go:117] "RemoveContainer" containerID="e1e837bebb9bb5b35dabc32043d3992098ffa114681c7c6d190559d35cb0ab70" Dec 08 09:34:17 crc kubenswrapper[4776]: I1208 09:34:17.053901 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-nngf6"] Dec 08 09:34:17 crc kubenswrapper[4776]: I1208 09:34:17.068417 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-nngf6"] Dec 08 09:34:18 crc kubenswrapper[4776]: I1208 09:34:18.034810 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-60a1-account-create-update-6j76b"] Dec 08 09:34:18 crc kubenswrapper[4776]: I1208 09:34:18.047080 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-jr6cc"] Dec 08 09:34:18 crc 
kubenswrapper[4776]: I1208 09:34:18.057378 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-98lsp"] Dec 08 09:34:18 crc kubenswrapper[4776]: I1208 09:34:18.067290 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-60a1-account-create-update-6j76b"] Dec 08 09:34:18 crc kubenswrapper[4776]: I1208 09:34:18.076980 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-jr6cc"] Dec 08 09:34:18 crc kubenswrapper[4776]: I1208 09:34:18.091523 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-98lsp"] Dec 08 09:34:18 crc kubenswrapper[4776]: I1208 09:34:18.356679 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="349ea677-2d9d-4506-b515-d5b03946ea88" path="/var/lib/kubelet/pods/349ea677-2d9d-4506-b515-d5b03946ea88/volumes" Dec 08 09:34:18 crc kubenswrapper[4776]: I1208 09:34:18.388300 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d642614-8b52-4d92-ae93-d281a37339df" path="/var/lib/kubelet/pods/5d642614-8b52-4d92-ae93-d281a37339df/volumes" Dec 08 09:34:18 crc kubenswrapper[4776]: I1208 09:34:18.389592 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="694530fd-2851-4a16-b642-071a7fca1ec0" path="/var/lib/kubelet/pods/694530fd-2851-4a16-b642-071a7fca1ec0/volumes" Dec 08 09:34:18 crc kubenswrapper[4776]: I1208 09:34:18.390341 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5bb00ac-bb46-44ce-b2b1-537573b86f6e" path="/var/lib/kubelet/pods/e5bb00ac-bb46-44ce-b2b1-537573b86f6e/volumes" Dec 08 09:34:19 crc kubenswrapper[4776]: I1208 09:34:19.041441 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-45e9-account-create-update-8lsk4"] Dec 08 09:34:19 crc kubenswrapper[4776]: I1208 09:34:19.053436 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell0-45e9-account-create-update-8lsk4"] Dec 08 09:34:20 crc kubenswrapper[4776]: I1208 09:34:20.026608 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-1a23-account-create-update-cldmj"] Dec 08 09:34:20 crc kubenswrapper[4776]: I1208 09:34:20.035974 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-1a23-account-create-update-cldmj"] Dec 08 09:34:20 crc kubenswrapper[4776]: I1208 09:34:20.363159 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6923b91-d6e2-4672-be62-2531342086e1" path="/var/lib/kubelet/pods/c6923b91-d6e2-4672-be62-2531342086e1/volumes" Dec 08 09:34:20 crc kubenswrapper[4776]: I1208 09:34:20.365076 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6727a0d-5792-4bf3-9d9b-a84ad470ba82" path="/var/lib/kubelet/pods/f6727a0d-5792-4bf3-9d9b-a84ad470ba82/volumes" Dec 08 09:34:43 crc kubenswrapper[4776]: I1208 09:34:43.652271 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7rjn4"] Dec 08 09:34:43 crc kubenswrapper[4776]: I1208 09:34:43.655417 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7rjn4" Dec 08 09:34:43 crc kubenswrapper[4776]: I1208 09:34:43.676413 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7rjn4"] Dec 08 09:34:43 crc kubenswrapper[4776]: I1208 09:34:43.699254 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06483596-1371-43c1-897a-5ff9af3639ad-catalog-content\") pod \"redhat-marketplace-7rjn4\" (UID: \"06483596-1371-43c1-897a-5ff9af3639ad\") " pod="openshift-marketplace/redhat-marketplace-7rjn4" Dec 08 09:34:43 crc kubenswrapper[4776]: I1208 09:34:43.699340 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2ldz\" (UniqueName: \"kubernetes.io/projected/06483596-1371-43c1-897a-5ff9af3639ad-kube-api-access-w2ldz\") pod \"redhat-marketplace-7rjn4\" (UID: \"06483596-1371-43c1-897a-5ff9af3639ad\") " pod="openshift-marketplace/redhat-marketplace-7rjn4" Dec 08 09:34:43 crc kubenswrapper[4776]: I1208 09:34:43.699535 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06483596-1371-43c1-897a-5ff9af3639ad-utilities\") pod \"redhat-marketplace-7rjn4\" (UID: \"06483596-1371-43c1-897a-5ff9af3639ad\") " pod="openshift-marketplace/redhat-marketplace-7rjn4" Dec 08 09:34:43 crc kubenswrapper[4776]: I1208 09:34:43.801781 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06483596-1371-43c1-897a-5ff9af3639ad-utilities\") pod \"redhat-marketplace-7rjn4\" (UID: \"06483596-1371-43c1-897a-5ff9af3639ad\") " pod="openshift-marketplace/redhat-marketplace-7rjn4" Dec 08 09:34:43 crc kubenswrapper[4776]: I1208 09:34:43.801896 4776 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06483596-1371-43c1-897a-5ff9af3639ad-catalog-content\") pod \"redhat-marketplace-7rjn4\" (UID: \"06483596-1371-43c1-897a-5ff9af3639ad\") " pod="openshift-marketplace/redhat-marketplace-7rjn4" Dec 08 09:34:43 crc kubenswrapper[4776]: I1208 09:34:43.801931 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2ldz\" (UniqueName: \"kubernetes.io/projected/06483596-1371-43c1-897a-5ff9af3639ad-kube-api-access-w2ldz\") pod \"redhat-marketplace-7rjn4\" (UID: \"06483596-1371-43c1-897a-5ff9af3639ad\") " pod="openshift-marketplace/redhat-marketplace-7rjn4" Dec 08 09:34:43 crc kubenswrapper[4776]: I1208 09:34:43.802457 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06483596-1371-43c1-897a-5ff9af3639ad-catalog-content\") pod \"redhat-marketplace-7rjn4\" (UID: \"06483596-1371-43c1-897a-5ff9af3639ad\") " pod="openshift-marketplace/redhat-marketplace-7rjn4" Dec 08 09:34:43 crc kubenswrapper[4776]: I1208 09:34:43.802458 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06483596-1371-43c1-897a-5ff9af3639ad-utilities\") pod \"redhat-marketplace-7rjn4\" (UID: \"06483596-1371-43c1-897a-5ff9af3639ad\") " pod="openshift-marketplace/redhat-marketplace-7rjn4" Dec 08 09:34:43 crc kubenswrapper[4776]: I1208 09:34:43.821637 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2ldz\" (UniqueName: \"kubernetes.io/projected/06483596-1371-43c1-897a-5ff9af3639ad-kube-api-access-w2ldz\") pod \"redhat-marketplace-7rjn4\" (UID: \"06483596-1371-43c1-897a-5ff9af3639ad\") " pod="openshift-marketplace/redhat-marketplace-7rjn4" Dec 08 09:34:43 crc kubenswrapper[4776]: I1208 09:34:43.974783 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7rjn4" Dec 08 09:34:44 crc kubenswrapper[4776]: I1208 09:34:44.567914 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7rjn4"] Dec 08 09:34:44 crc kubenswrapper[4776]: I1208 09:34:44.921318 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7rjn4" event={"ID":"06483596-1371-43c1-897a-5ff9af3639ad","Type":"ContainerStarted","Data":"d4e3bfed2b0a273166d35a46a3ff87755eb19acba5a226c38b7680864163af7b"} Dec 08 09:34:45 crc kubenswrapper[4776]: I1208 09:34:45.932742 4776 generic.go:334] "Generic (PLEG): container finished" podID="06483596-1371-43c1-897a-5ff9af3639ad" containerID="d673cd1c187f52dc35f3c426e8a51ae91ab889ab83df5c7f5504b9368ab90044" exitCode=0 Dec 08 09:34:45 crc kubenswrapper[4776]: I1208 09:34:45.932854 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7rjn4" event={"ID":"06483596-1371-43c1-897a-5ff9af3639ad","Type":"ContainerDied","Data":"d673cd1c187f52dc35f3c426e8a51ae91ab889ab83df5c7f5504b9368ab90044"} Dec 08 09:34:50 crc kubenswrapper[4776]: I1208 09:34:50.040813 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zw8wj"] Dec 08 09:34:50 crc kubenswrapper[4776]: I1208 09:34:50.052933 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zw8wj"] Dec 08 09:34:50 crc kubenswrapper[4776]: I1208 09:34:50.360449 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2c03694-357a-4838-8202-7e3d3196f9ca" path="/var/lib/kubelet/pods/c2c03694-357a-4838-8202-7e3d3196f9ca/volumes" Dec 08 09:34:51 crc kubenswrapper[4776]: I1208 09:34:51.698832 4776 patch_prober.go:28] interesting pod/monitoring-plugin-787d456dd8-9svrh container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get 
\"https://10.217.0.77:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 08 09:34:51 crc kubenswrapper[4776]: I1208 09:34:51.699264 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-787d456dd8-9svrh" podUID="d10744d9-5819-4ca3-815c-5a8782037204" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.77:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 08 09:34:54 crc kubenswrapper[4776]: I1208 09:34:54.041189 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7rjn4" event={"ID":"06483596-1371-43c1-897a-5ff9af3639ad","Type":"ContainerStarted","Data":"1107f85faa03e52fb1e2f2662285039b758f3487e1795fec3f28514935b89f26"} Dec 08 09:34:56 crc kubenswrapper[4776]: I1208 09:34:56.065167 4776 generic.go:334] "Generic (PLEG): container finished" podID="06483596-1371-43c1-897a-5ff9af3639ad" containerID="1107f85faa03e52fb1e2f2662285039b758f3487e1795fec3f28514935b89f26" exitCode=0 Dec 08 09:34:56 crc kubenswrapper[4776]: I1208 09:34:56.065291 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7rjn4" event={"ID":"06483596-1371-43c1-897a-5ff9af3639ad","Type":"ContainerDied","Data":"1107f85faa03e52fb1e2f2662285039b758f3487e1795fec3f28514935b89f26"} Dec 08 09:35:02 crc kubenswrapper[4776]: I1208 09:35:02.126934 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7rjn4" event={"ID":"06483596-1371-43c1-897a-5ff9af3639ad","Type":"ContainerStarted","Data":"22653f47f72fdd9b405d4c80e308eb15ed0af3575f571e91dfcae01c840a36f0"} Dec 08 09:35:02 crc kubenswrapper[4776]: I1208 09:35:02.152266 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-marketplace-7rjn4" podStartSLOduration=4.39115207 podStartE2EDuration="19.152247328s" podCreationTimestamp="2025-12-08 09:34:43 +0000 UTC" firstStartedPulling="2025-12-08 09:34:45.934974997 +0000 UTC m=+2162.198200009" lastFinishedPulling="2025-12-08 09:35:00.696070255 +0000 UTC m=+2176.959295267" observedRunningTime="2025-12-08 09:35:02.142829455 +0000 UTC m=+2178.406054477" watchObservedRunningTime="2025-12-08 09:35:02.152247328 +0000 UTC m=+2178.415472350" Dec 08 09:35:03 crc kubenswrapper[4776]: I1208 09:35:03.975819 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7rjn4" Dec 08 09:35:03 crc kubenswrapper[4776]: I1208 09:35:03.976923 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7rjn4" Dec 08 09:35:04 crc kubenswrapper[4776]: I1208 09:35:04.021355 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7rjn4" Dec 08 09:35:05 crc kubenswrapper[4776]: I1208 09:35:05.036008 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-b033-account-create-update-dzl9x"] Dec 08 09:35:05 crc kubenswrapper[4776]: I1208 09:35:05.048609 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-b033-account-create-update-dzl9x"] Dec 08 09:35:05 crc kubenswrapper[4776]: I1208 09:35:05.057486 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-qcnxq"] Dec 08 09:35:05 crc kubenswrapper[4776]: I1208 09:35:05.067488 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-qcnxq"] Dec 08 09:35:06 crc kubenswrapper[4776]: I1208 09:35:06.360930 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0bfa985-0dfe-42bb-95ea-2e40830c7a23" path="/var/lib/kubelet/pods/b0bfa985-0dfe-42bb-95ea-2e40830c7a23/volumes" Dec 08 09:35:06 crc 
kubenswrapper[4776]: I1208 09:35:06.362937 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa674a56-b583-453e-9c66-e8ff93895b50" path="/var/lib/kubelet/pods/fa674a56-b583-453e-9c66-e8ff93895b50/volumes" Dec 08 09:35:14 crc kubenswrapper[4776]: I1208 09:35:14.026231 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7rjn4" Dec 08 09:35:14 crc kubenswrapper[4776]: I1208 09:35:14.091896 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7rjn4"] Dec 08 09:35:14 crc kubenswrapper[4776]: I1208 09:35:14.266920 4776 generic.go:334] "Generic (PLEG): container finished" podID="6367602c-669d-474f-bd56-97c1b58659b4" containerID="18ceac23d51697e862c9df85c85aba0e95396cba027073e9ed5d2410201f63a8" exitCode=0 Dec 08 09:35:14 crc kubenswrapper[4776]: I1208 09:35:14.267011 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kzwn2" event={"ID":"6367602c-669d-474f-bd56-97c1b58659b4","Type":"ContainerDied","Data":"18ceac23d51697e862c9df85c85aba0e95396cba027073e9ed5d2410201f63a8"} Dec 08 09:35:14 crc kubenswrapper[4776]: I1208 09:35:14.267491 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7rjn4" podUID="06483596-1371-43c1-897a-5ff9af3639ad" containerName="registry-server" containerID="cri-o://22653f47f72fdd9b405d4c80e308eb15ed0af3575f571e91dfcae01c840a36f0" gracePeriod=2 Dec 08 09:35:15 crc kubenswrapper[4776]: I1208 09:35:15.287343 4776 generic.go:334] "Generic (PLEG): container finished" podID="06483596-1371-43c1-897a-5ff9af3639ad" containerID="22653f47f72fdd9b405d4c80e308eb15ed0af3575f571e91dfcae01c840a36f0" exitCode=0 Dec 08 09:35:15 crc kubenswrapper[4776]: I1208 09:35:15.287411 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7rjn4" 
event={"ID":"06483596-1371-43c1-897a-5ff9af3639ad","Type":"ContainerDied","Data":"22653f47f72fdd9b405d4c80e308eb15ed0af3575f571e91dfcae01c840a36f0"} Dec 08 09:35:15 crc kubenswrapper[4776]: I1208 09:35:15.456314 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7rjn4" Dec 08 09:35:15 crc kubenswrapper[4776]: I1208 09:35:15.495825 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06483596-1371-43c1-897a-5ff9af3639ad-catalog-content\") pod \"06483596-1371-43c1-897a-5ff9af3639ad\" (UID: \"06483596-1371-43c1-897a-5ff9af3639ad\") " Dec 08 09:35:15 crc kubenswrapper[4776]: I1208 09:35:15.496040 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06483596-1371-43c1-897a-5ff9af3639ad-utilities\") pod \"06483596-1371-43c1-897a-5ff9af3639ad\" (UID: \"06483596-1371-43c1-897a-5ff9af3639ad\") " Dec 08 09:35:15 crc kubenswrapper[4776]: I1208 09:35:15.496066 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2ldz\" (UniqueName: \"kubernetes.io/projected/06483596-1371-43c1-897a-5ff9af3639ad-kube-api-access-w2ldz\") pod \"06483596-1371-43c1-897a-5ff9af3639ad\" (UID: \"06483596-1371-43c1-897a-5ff9af3639ad\") " Dec 08 09:35:15 crc kubenswrapper[4776]: I1208 09:35:15.496539 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06483596-1371-43c1-897a-5ff9af3639ad-utilities" (OuterVolumeSpecName: "utilities") pod "06483596-1371-43c1-897a-5ff9af3639ad" (UID: "06483596-1371-43c1-897a-5ff9af3639ad"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:35:15 crc kubenswrapper[4776]: I1208 09:35:15.496889 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06483596-1371-43c1-897a-5ff9af3639ad-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 09:35:15 crc kubenswrapper[4776]: I1208 09:35:15.506467 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06483596-1371-43c1-897a-5ff9af3639ad-kube-api-access-w2ldz" (OuterVolumeSpecName: "kube-api-access-w2ldz") pod "06483596-1371-43c1-897a-5ff9af3639ad" (UID: "06483596-1371-43c1-897a-5ff9af3639ad"). InnerVolumeSpecName "kube-api-access-w2ldz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:35:15 crc kubenswrapper[4776]: I1208 09:35:15.521305 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06483596-1371-43c1-897a-5ff9af3639ad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "06483596-1371-43c1-897a-5ff9af3639ad" (UID: "06483596-1371-43c1-897a-5ff9af3639ad"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:35:15 crc kubenswrapper[4776]: I1208 09:35:15.599484 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06483596-1371-43c1-897a-5ff9af3639ad-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 09:35:15 crc kubenswrapper[4776]: I1208 09:35:15.599526 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2ldz\" (UniqueName: \"kubernetes.io/projected/06483596-1371-43c1-897a-5ff9af3639ad-kube-api-access-w2ldz\") on node \"crc\" DevicePath \"\"" Dec 08 09:35:15 crc kubenswrapper[4776]: I1208 09:35:15.794020 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kzwn2" Dec 08 09:35:15 crc kubenswrapper[4776]: I1208 09:35:15.905888 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htlhk\" (UniqueName: \"kubernetes.io/projected/6367602c-669d-474f-bd56-97c1b58659b4-kube-api-access-htlhk\") pod \"6367602c-669d-474f-bd56-97c1b58659b4\" (UID: \"6367602c-669d-474f-bd56-97c1b58659b4\") " Dec 08 09:35:15 crc kubenswrapper[4776]: I1208 09:35:15.906037 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6367602c-669d-474f-bd56-97c1b58659b4-inventory\") pod \"6367602c-669d-474f-bd56-97c1b58659b4\" (UID: \"6367602c-669d-474f-bd56-97c1b58659b4\") " Dec 08 09:35:15 crc kubenswrapper[4776]: I1208 09:35:15.906219 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6367602c-669d-474f-bd56-97c1b58659b4-ssh-key\") pod \"6367602c-669d-474f-bd56-97c1b58659b4\" (UID: \"6367602c-669d-474f-bd56-97c1b58659b4\") " Dec 08 09:35:15 crc kubenswrapper[4776]: I1208 09:35:15.910740 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6367602c-669d-474f-bd56-97c1b58659b4-kube-api-access-htlhk" (OuterVolumeSpecName: "kube-api-access-htlhk") pod "6367602c-669d-474f-bd56-97c1b58659b4" (UID: "6367602c-669d-474f-bd56-97c1b58659b4"). InnerVolumeSpecName "kube-api-access-htlhk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:35:15 crc kubenswrapper[4776]: I1208 09:35:15.942770 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6367602c-669d-474f-bd56-97c1b58659b4-inventory" (OuterVolumeSpecName: "inventory") pod "6367602c-669d-474f-bd56-97c1b58659b4" (UID: "6367602c-669d-474f-bd56-97c1b58659b4"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:35:15 crc kubenswrapper[4776]: I1208 09:35:15.943278 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6367602c-669d-474f-bd56-97c1b58659b4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6367602c-669d-474f-bd56-97c1b58659b4" (UID: "6367602c-669d-474f-bd56-97c1b58659b4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:35:16 crc kubenswrapper[4776]: I1208 09:35:16.009399 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6367602c-669d-474f-bd56-97c1b58659b4-inventory\") on node \"crc\" DevicePath \"\"" Dec 08 09:35:16 crc kubenswrapper[4776]: I1208 09:35:16.009631 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6367602c-669d-474f-bd56-97c1b58659b4-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 08 09:35:16 crc kubenswrapper[4776]: I1208 09:35:16.009710 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htlhk\" (UniqueName: \"kubernetes.io/projected/6367602c-669d-474f-bd56-97c1b58659b4-kube-api-access-htlhk\") on node \"crc\" DevicePath \"\"" Dec 08 09:35:16 crc kubenswrapper[4776]: I1208 09:35:16.304384 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7rjn4" event={"ID":"06483596-1371-43c1-897a-5ff9af3639ad","Type":"ContainerDied","Data":"d4e3bfed2b0a273166d35a46a3ff87755eb19acba5a226c38b7680864163af7b"} Dec 08 09:35:16 crc kubenswrapper[4776]: I1208 09:35:16.304422 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7rjn4" Dec 08 09:35:16 crc kubenswrapper[4776]: I1208 09:35:16.304452 4776 scope.go:117] "RemoveContainer" containerID="22653f47f72fdd9b405d4c80e308eb15ed0af3575f571e91dfcae01c840a36f0" Dec 08 09:35:16 crc kubenswrapper[4776]: I1208 09:35:16.307812 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kzwn2" event={"ID":"6367602c-669d-474f-bd56-97c1b58659b4","Type":"ContainerDied","Data":"f2cb760cdc26eba5b312b1ba4530848943cbf29dbd3a71b1457bd6f2bef37998"} Dec 08 09:35:16 crc kubenswrapper[4776]: I1208 09:35:16.307843 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2cb760cdc26eba5b312b1ba4530848943cbf29dbd3a71b1457bd6f2bef37998" Dec 08 09:35:16 crc kubenswrapper[4776]: I1208 09:35:16.307903 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kzwn2" Dec 08 09:35:16 crc kubenswrapper[4776]: I1208 09:35:16.343272 4776 scope.go:117] "RemoveContainer" containerID="1107f85faa03e52fb1e2f2662285039b758f3487e1795fec3f28514935b89f26" Dec 08 09:35:16 crc kubenswrapper[4776]: I1208 09:35:16.370870 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7rjn4"] Dec 08 09:35:16 crc kubenswrapper[4776]: I1208 09:35:16.371059 4776 scope.go:117] "RemoveContainer" containerID="d673cd1c187f52dc35f3c426e8a51ae91ab889ab83df5c7f5504b9368ab90044" Dec 08 09:35:16 crc kubenswrapper[4776]: I1208 09:35:16.393626 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7rjn4"] Dec 08 09:35:16 crc kubenswrapper[4776]: I1208 09:35:16.415253 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-js6sf"] Dec 08 09:35:16 crc kubenswrapper[4776]: E1208 09:35:16.415910 4776 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06483596-1371-43c1-897a-5ff9af3639ad" containerName="extract-utilities" Dec 08 09:35:16 crc kubenswrapper[4776]: I1208 09:35:16.415936 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="06483596-1371-43c1-897a-5ff9af3639ad" containerName="extract-utilities" Dec 08 09:35:16 crc kubenswrapper[4776]: E1208 09:35:16.415968 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06483596-1371-43c1-897a-5ff9af3639ad" containerName="extract-content" Dec 08 09:35:16 crc kubenswrapper[4776]: I1208 09:35:16.415979 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="06483596-1371-43c1-897a-5ff9af3639ad" containerName="extract-content" Dec 08 09:35:16 crc kubenswrapper[4776]: E1208 09:35:16.416002 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6367602c-669d-474f-bd56-97c1b58659b4" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 08 09:35:16 crc kubenswrapper[4776]: I1208 09:35:16.416013 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="6367602c-669d-474f-bd56-97c1b58659b4" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 08 09:35:16 crc kubenswrapper[4776]: E1208 09:35:16.416062 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06483596-1371-43c1-897a-5ff9af3639ad" containerName="registry-server" Dec 08 09:35:16 crc kubenswrapper[4776]: I1208 09:35:16.416072 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="06483596-1371-43c1-897a-5ff9af3639ad" containerName="registry-server" Dec 08 09:35:16 crc kubenswrapper[4776]: I1208 09:35:16.416429 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="6367602c-669d-474f-bd56-97c1b58659b4" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 08 09:35:16 crc kubenswrapper[4776]: I1208 09:35:16.416453 4776 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="06483596-1371-43c1-897a-5ff9af3639ad" containerName="registry-server" Dec 08 09:35:16 crc kubenswrapper[4776]: I1208 09:35:16.417479 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-js6sf" Dec 08 09:35:16 crc kubenswrapper[4776]: I1208 09:35:16.420442 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tm845" Dec 08 09:35:16 crc kubenswrapper[4776]: I1208 09:35:16.420510 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 08 09:35:16 crc kubenswrapper[4776]: I1208 09:35:16.420769 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 08 09:35:16 crc kubenswrapper[4776]: I1208 09:35:16.420888 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 08 09:35:16 crc kubenswrapper[4776]: I1208 09:35:16.424231 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-js6sf"] Dec 08 09:35:16 crc kubenswrapper[4776]: I1208 09:35:16.524515 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8284a3c-c72c-41f5-aefe-bbc881bf969b-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-js6sf\" (UID: \"d8284a3c-c72c-41f5-aefe-bbc881bf969b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-js6sf" Dec 08 09:35:16 crc kubenswrapper[4776]: I1208 09:35:16.524700 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdrrn\" (UniqueName: \"kubernetes.io/projected/d8284a3c-c72c-41f5-aefe-bbc881bf969b-kube-api-access-cdrrn\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-js6sf\" (UID: \"d8284a3c-c72c-41f5-aefe-bbc881bf969b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-js6sf" Dec 08 09:35:16 crc kubenswrapper[4776]: I1208 09:35:16.524753 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d8284a3c-c72c-41f5-aefe-bbc881bf969b-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-js6sf\" (UID: \"d8284a3c-c72c-41f5-aefe-bbc881bf969b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-js6sf" Dec 08 09:35:16 crc kubenswrapper[4776]: I1208 09:35:16.626710 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8284a3c-c72c-41f5-aefe-bbc881bf969b-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-js6sf\" (UID: \"d8284a3c-c72c-41f5-aefe-bbc881bf969b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-js6sf" Dec 08 09:35:16 crc kubenswrapper[4776]: I1208 09:35:16.626899 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdrrn\" (UniqueName: \"kubernetes.io/projected/d8284a3c-c72c-41f5-aefe-bbc881bf969b-kube-api-access-cdrrn\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-js6sf\" (UID: \"d8284a3c-c72c-41f5-aefe-bbc881bf969b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-js6sf" Dec 08 09:35:16 crc kubenswrapper[4776]: I1208 09:35:16.626959 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d8284a3c-c72c-41f5-aefe-bbc881bf969b-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-js6sf\" (UID: \"d8284a3c-c72c-41f5-aefe-bbc881bf969b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-js6sf" Dec 08 09:35:16 crc 
kubenswrapper[4776]: I1208 09:35:16.631401 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8284a3c-c72c-41f5-aefe-bbc881bf969b-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-js6sf\" (UID: \"d8284a3c-c72c-41f5-aefe-bbc881bf969b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-js6sf" Dec 08 09:35:16 crc kubenswrapper[4776]: I1208 09:35:16.636800 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d8284a3c-c72c-41f5-aefe-bbc881bf969b-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-js6sf\" (UID: \"d8284a3c-c72c-41f5-aefe-bbc881bf969b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-js6sf" Dec 08 09:35:16 crc kubenswrapper[4776]: I1208 09:35:16.654215 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdrrn\" (UniqueName: \"kubernetes.io/projected/d8284a3c-c72c-41f5-aefe-bbc881bf969b-kube-api-access-cdrrn\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-js6sf\" (UID: \"d8284a3c-c72c-41f5-aefe-bbc881bf969b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-js6sf" Dec 08 09:35:16 crc kubenswrapper[4776]: I1208 09:35:16.792022 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-js6sf" Dec 08 09:35:16 crc kubenswrapper[4776]: I1208 09:35:16.935288 4776 scope.go:117] "RemoveContainer" containerID="d1862aea8ebe37a2abba47d91654e83e009dcecd2a5485b55265dd485cc62a2f" Dec 08 09:35:16 crc kubenswrapper[4776]: I1208 09:35:16.995430 4776 scope.go:117] "RemoveContainer" containerID="35ec05672b585df9de15c49b45532bbf2ada66b7ec860aa22ae4af9d93e7c753" Dec 08 09:35:17 crc kubenswrapper[4776]: I1208 09:35:17.043574 4776 scope.go:117] "RemoveContainer" containerID="205e021d4497e71dfc63f8de58525faaed65ab904a2219d4ef0a157f7eb5b478" Dec 08 09:35:17 crc kubenswrapper[4776]: I1208 09:35:17.085040 4776 scope.go:117] "RemoveContainer" containerID="73991a1a2851bba40093ac2a4e54caa630682b8f3818c2abd7fd856925f66b75" Dec 08 09:35:17 crc kubenswrapper[4776]: I1208 09:35:17.112564 4776 scope.go:117] "RemoveContainer" containerID="169847b89dab5b9ed6111372c125a1d8e045c63c643ef8c2308b0723a3d90f5b" Dec 08 09:35:17 crc kubenswrapper[4776]: I1208 09:35:17.143579 4776 scope.go:117] "RemoveContainer" containerID="84333277d15abd7eb0d265bb97fe6f0126f6e880dcaf9f91bed4b0c84924c642" Dec 08 09:35:17 crc kubenswrapper[4776]: I1208 09:35:17.165664 4776 scope.go:117] "RemoveContainer" containerID="84a61abbbed46adc7d811a570f5178ad218c9fa9fd7845d2b2933812c18fcde8" Dec 08 09:35:17 crc kubenswrapper[4776]: I1208 09:35:17.185540 4776 scope.go:117] "RemoveContainer" containerID="cc0c566bd52ec36e88d85048dbc79441de63453e7cb604cd8b86a985de6b084e" Dec 08 09:35:17 crc kubenswrapper[4776]: I1208 09:35:17.209281 4776 scope.go:117] "RemoveContainer" containerID="0f2480a5437008e2df0d4f7b21b8d4db2fc224e3fe12a6cd0d32952ed3dd83b0" Dec 08 09:35:17 crc kubenswrapper[4776]: I1208 09:35:17.500839 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-js6sf"] Dec 08 09:35:18 crc kubenswrapper[4776]: I1208 09:35:18.362371 4776 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="06483596-1371-43c1-897a-5ff9af3639ad" path="/var/lib/kubelet/pods/06483596-1371-43c1-897a-5ff9af3639ad/volumes" Dec 08 09:35:18 crc kubenswrapper[4776]: I1208 09:35:18.363653 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-js6sf" event={"ID":"d8284a3c-c72c-41f5-aefe-bbc881bf969b","Type":"ContainerStarted","Data":"a913de70503982873c90649fd4ceb1a34708104a1ef5d9f28b569d0425b9ff09"} Dec 08 09:35:19 crc kubenswrapper[4776]: I1208 09:35:19.743235 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dl2hd"] Dec 08 09:35:19 crc kubenswrapper[4776]: I1208 09:35:19.752688 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dl2hd"] Dec 08 09:35:20 crc kubenswrapper[4776]: I1208 09:35:20.040830 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-v54bb"] Dec 08 09:35:20 crc kubenswrapper[4776]: I1208 09:35:20.053877 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-v54bb"] Dec 08 09:35:20 crc kubenswrapper[4776]: I1208 09:35:20.360872 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5756d118-f614-4000-82d2-ffa1623179cd" path="/var/lib/kubelet/pods/5756d118-f614-4000-82d2-ffa1623179cd/volumes" Dec 08 09:35:20 crc kubenswrapper[4776]: I1208 09:35:20.362529 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d88c7e2b-caa4-4d68-acc2-1483da2dfef3" path="/var/lib/kubelet/pods/d88c7e2b-caa4-4d68-acc2-1483da2dfef3/volumes" Dec 08 09:35:20 crc kubenswrapper[4776]: I1208 09:35:20.380346 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-js6sf" 
event={"ID":"d8284a3c-c72c-41f5-aefe-bbc881bf969b","Type":"ContainerStarted","Data":"b906ec9d91e94c099978556f468c062fd592d59e8ee5795d168be899613e2cd2"} Dec 08 09:35:26 crc kubenswrapper[4776]: I1208 09:35:26.440687 4776 generic.go:334] "Generic (PLEG): container finished" podID="d8284a3c-c72c-41f5-aefe-bbc881bf969b" containerID="b906ec9d91e94c099978556f468c062fd592d59e8ee5795d168be899613e2cd2" exitCode=0 Dec 08 09:35:26 crc kubenswrapper[4776]: I1208 09:35:26.440772 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-js6sf" event={"ID":"d8284a3c-c72c-41f5-aefe-bbc881bf969b","Type":"ContainerDied","Data":"b906ec9d91e94c099978556f468c062fd592d59e8ee5795d168be899613e2cd2"} Dec 08 09:35:27 crc kubenswrapper[4776]: I1208 09:35:27.928791 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-js6sf" Dec 08 09:35:28 crc kubenswrapper[4776]: I1208 09:35:28.026665 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdrrn\" (UniqueName: \"kubernetes.io/projected/d8284a3c-c72c-41f5-aefe-bbc881bf969b-kube-api-access-cdrrn\") pod \"d8284a3c-c72c-41f5-aefe-bbc881bf969b\" (UID: \"d8284a3c-c72c-41f5-aefe-bbc881bf969b\") " Dec 08 09:35:28 crc kubenswrapper[4776]: I1208 09:35:28.027088 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d8284a3c-c72c-41f5-aefe-bbc881bf969b-ssh-key\") pod \"d8284a3c-c72c-41f5-aefe-bbc881bf969b\" (UID: \"d8284a3c-c72c-41f5-aefe-bbc881bf969b\") " Dec 08 09:35:28 crc kubenswrapper[4776]: I1208 09:35:28.027287 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8284a3c-c72c-41f5-aefe-bbc881bf969b-inventory\") pod \"d8284a3c-c72c-41f5-aefe-bbc881bf969b\" (UID: 
\"d8284a3c-c72c-41f5-aefe-bbc881bf969b\") " Dec 08 09:35:28 crc kubenswrapper[4776]: I1208 09:35:28.032593 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8284a3c-c72c-41f5-aefe-bbc881bf969b-kube-api-access-cdrrn" (OuterVolumeSpecName: "kube-api-access-cdrrn") pod "d8284a3c-c72c-41f5-aefe-bbc881bf969b" (UID: "d8284a3c-c72c-41f5-aefe-bbc881bf969b"). InnerVolumeSpecName "kube-api-access-cdrrn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:35:28 crc kubenswrapper[4776]: I1208 09:35:28.061093 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8284a3c-c72c-41f5-aefe-bbc881bf969b-inventory" (OuterVolumeSpecName: "inventory") pod "d8284a3c-c72c-41f5-aefe-bbc881bf969b" (UID: "d8284a3c-c72c-41f5-aefe-bbc881bf969b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:35:28 crc kubenswrapper[4776]: I1208 09:35:28.068416 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8284a3c-c72c-41f5-aefe-bbc881bf969b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d8284a3c-c72c-41f5-aefe-bbc881bf969b" (UID: "d8284a3c-c72c-41f5-aefe-bbc881bf969b"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:35:28 crc kubenswrapper[4776]: I1208 09:35:28.130307 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8284a3c-c72c-41f5-aefe-bbc881bf969b-inventory\") on node \"crc\" DevicePath \"\"" Dec 08 09:35:28 crc kubenswrapper[4776]: I1208 09:35:28.130340 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdrrn\" (UniqueName: \"kubernetes.io/projected/d8284a3c-c72c-41f5-aefe-bbc881bf969b-kube-api-access-cdrrn\") on node \"crc\" DevicePath \"\"" Dec 08 09:35:28 crc kubenswrapper[4776]: I1208 09:35:28.130355 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d8284a3c-c72c-41f5-aefe-bbc881bf969b-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 08 09:35:28 crc kubenswrapper[4776]: I1208 09:35:28.466203 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-js6sf" event={"ID":"d8284a3c-c72c-41f5-aefe-bbc881bf969b","Type":"ContainerDied","Data":"a913de70503982873c90649fd4ceb1a34708104a1ef5d9f28b569d0425b9ff09"} Dec 08 09:35:28 crc kubenswrapper[4776]: I1208 09:35:28.466249 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a913de70503982873c90649fd4ceb1a34708104a1ef5d9f28b569d0425b9ff09" Dec 08 09:35:28 crc kubenswrapper[4776]: I1208 09:35:28.466276 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-js6sf" Dec 08 09:35:28 crc kubenswrapper[4776]: I1208 09:35:28.546422 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-tknt7"] Dec 08 09:35:28 crc kubenswrapper[4776]: E1208 09:35:28.547147 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8284a3c-c72c-41f5-aefe-bbc881bf969b" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 08 09:35:28 crc kubenswrapper[4776]: I1208 09:35:28.547246 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8284a3c-c72c-41f5-aefe-bbc881bf969b" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 08 09:35:28 crc kubenswrapper[4776]: I1208 09:35:28.547586 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8284a3c-c72c-41f5-aefe-bbc881bf969b" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 08 09:35:28 crc kubenswrapper[4776]: I1208 09:35:28.548564 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tknt7" Dec 08 09:35:28 crc kubenswrapper[4776]: I1208 09:35:28.550713 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 08 09:35:28 crc kubenswrapper[4776]: I1208 09:35:28.553572 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tm845" Dec 08 09:35:28 crc kubenswrapper[4776]: I1208 09:35:28.553748 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 08 09:35:28 crc kubenswrapper[4776]: I1208 09:35:28.553913 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 08 09:35:28 crc kubenswrapper[4776]: I1208 09:35:28.561154 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-tknt7"] Dec 08 09:35:28 crc kubenswrapper[4776]: I1208 09:35:28.642113 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f67c7d60-bc4d-4712-a8d9-acb48e097264-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tknt7\" (UID: \"f67c7d60-bc4d-4712-a8d9-acb48e097264\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tknt7" Dec 08 09:35:28 crc kubenswrapper[4776]: I1208 09:35:28.642188 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f67c7d60-bc4d-4712-a8d9-acb48e097264-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tknt7\" (UID: \"f67c7d60-bc4d-4712-a8d9-acb48e097264\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tknt7" Dec 08 09:35:28 crc kubenswrapper[4776]: I1208 09:35:28.642266 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh8wx\" (UniqueName: \"kubernetes.io/projected/f67c7d60-bc4d-4712-a8d9-acb48e097264-kube-api-access-zh8wx\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tknt7\" (UID: \"f67c7d60-bc4d-4712-a8d9-acb48e097264\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tknt7" Dec 08 09:35:28 crc kubenswrapper[4776]: I1208 09:35:28.744736 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh8wx\" (UniqueName: \"kubernetes.io/projected/f67c7d60-bc4d-4712-a8d9-acb48e097264-kube-api-access-zh8wx\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tknt7\" (UID: \"f67c7d60-bc4d-4712-a8d9-acb48e097264\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tknt7" Dec 08 09:35:28 crc kubenswrapper[4776]: I1208 09:35:28.745385 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f67c7d60-bc4d-4712-a8d9-acb48e097264-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tknt7\" (UID: \"f67c7d60-bc4d-4712-a8d9-acb48e097264\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tknt7" Dec 08 09:35:28 crc kubenswrapper[4776]: I1208 09:35:28.745545 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f67c7d60-bc4d-4712-a8d9-acb48e097264-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tknt7\" (UID: \"f67c7d60-bc4d-4712-a8d9-acb48e097264\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tknt7" Dec 08 09:35:28 crc kubenswrapper[4776]: I1208 09:35:28.749619 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f67c7d60-bc4d-4712-a8d9-acb48e097264-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tknt7\" (UID: 
\"f67c7d60-bc4d-4712-a8d9-acb48e097264\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tknt7" Dec 08 09:35:28 crc kubenswrapper[4776]: I1208 09:35:28.752805 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f67c7d60-bc4d-4712-a8d9-acb48e097264-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tknt7\" (UID: \"f67c7d60-bc4d-4712-a8d9-acb48e097264\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tknt7" Dec 08 09:35:28 crc kubenswrapper[4776]: I1208 09:35:28.761652 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh8wx\" (UniqueName: \"kubernetes.io/projected/f67c7d60-bc4d-4712-a8d9-acb48e097264-kube-api-access-zh8wx\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tknt7\" (UID: \"f67c7d60-bc4d-4712-a8d9-acb48e097264\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tknt7" Dec 08 09:35:28 crc kubenswrapper[4776]: I1208 09:35:28.870340 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tknt7" Dec 08 09:35:29 crc kubenswrapper[4776]: I1208 09:35:29.448786 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-tknt7"] Dec 08 09:35:29 crc kubenswrapper[4776]: W1208 09:35:29.452572 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf67c7d60_bc4d_4712_a8d9_acb48e097264.slice/crio-26a8b99ed65ae41b63ff011fc929643968a236ef5d73dcc6725eff5fd481381f WatchSource:0}: Error finding container 26a8b99ed65ae41b63ff011fc929643968a236ef5d73dcc6725eff5fd481381f: Status 404 returned error can't find the container with id 26a8b99ed65ae41b63ff011fc929643968a236ef5d73dcc6725eff5fd481381f Dec 08 09:35:29 crc kubenswrapper[4776]: I1208 09:35:29.481084 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tknt7" event={"ID":"f67c7d60-bc4d-4712-a8d9-acb48e097264","Type":"ContainerStarted","Data":"26a8b99ed65ae41b63ff011fc929643968a236ef5d73dcc6725eff5fd481381f"} Dec 08 09:35:30 crc kubenswrapper[4776]: I1208 09:35:30.490992 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tknt7" event={"ID":"f67c7d60-bc4d-4712-a8d9-acb48e097264","Type":"ContainerStarted","Data":"61ed4e39dfe61e1e1a610ebdfc8c50875b5be60ffbf117b798d5ac2dc3075a3d"} Dec 08 09:35:30 crc kubenswrapper[4776]: I1208 09:35:30.521222 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tknt7" podStartSLOduration=2.085648869 podStartE2EDuration="2.521199948s" podCreationTimestamp="2025-12-08 09:35:28 +0000 UTC" firstStartedPulling="2025-12-08 09:35:29.45477193 +0000 UTC m=+2205.717996952" lastFinishedPulling="2025-12-08 09:35:29.890323009 +0000 UTC m=+2206.153548031" 
observedRunningTime="2025-12-08 09:35:30.509908284 +0000 UTC m=+2206.773133336" watchObservedRunningTime="2025-12-08 09:35:30.521199948 +0000 UTC m=+2206.784424970" Dec 08 09:35:41 crc kubenswrapper[4776]: I1208 09:35:41.399141 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 09:35:41 crc kubenswrapper[4776]: I1208 09:35:41.399723 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 09:35:51 crc kubenswrapper[4776]: I1208 09:35:51.122815 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xnpck"] Dec 08 09:35:51 crc kubenswrapper[4776]: I1208 09:35:51.125931 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xnpck" Dec 08 09:35:51 crc kubenswrapper[4776]: I1208 09:35:51.150443 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xnpck"] Dec 08 09:35:51 crc kubenswrapper[4776]: I1208 09:35:51.308976 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdc3850e-c90d-4cd6-bc5a-016a973ae9f9-catalog-content\") pod \"certified-operators-xnpck\" (UID: \"cdc3850e-c90d-4cd6-bc5a-016a973ae9f9\") " pod="openshift-marketplace/certified-operators-xnpck" Dec 08 09:35:51 crc kubenswrapper[4776]: I1208 09:35:51.309108 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdc3850e-c90d-4cd6-bc5a-016a973ae9f9-utilities\") pod \"certified-operators-xnpck\" (UID: \"cdc3850e-c90d-4cd6-bc5a-016a973ae9f9\") " pod="openshift-marketplace/certified-operators-xnpck" Dec 08 09:35:51 crc kubenswrapper[4776]: I1208 09:35:51.309194 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh4fx\" (UniqueName: \"kubernetes.io/projected/cdc3850e-c90d-4cd6-bc5a-016a973ae9f9-kube-api-access-dh4fx\") pod \"certified-operators-xnpck\" (UID: \"cdc3850e-c90d-4cd6-bc5a-016a973ae9f9\") " pod="openshift-marketplace/certified-operators-xnpck" Dec 08 09:35:51 crc kubenswrapper[4776]: I1208 09:35:51.412433 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdc3850e-c90d-4cd6-bc5a-016a973ae9f9-utilities\") pod \"certified-operators-xnpck\" (UID: \"cdc3850e-c90d-4cd6-bc5a-016a973ae9f9\") " pod="openshift-marketplace/certified-operators-xnpck" Dec 08 09:35:51 crc kubenswrapper[4776]: I1208 09:35:51.412489 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dh4fx\" (UniqueName: \"kubernetes.io/projected/cdc3850e-c90d-4cd6-bc5a-016a973ae9f9-kube-api-access-dh4fx\") pod \"certified-operators-xnpck\" (UID: \"cdc3850e-c90d-4cd6-bc5a-016a973ae9f9\") " pod="openshift-marketplace/certified-operators-xnpck" Dec 08 09:35:51 crc kubenswrapper[4776]: I1208 09:35:51.412721 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdc3850e-c90d-4cd6-bc5a-016a973ae9f9-catalog-content\") pod \"certified-operators-xnpck\" (UID: \"cdc3850e-c90d-4cd6-bc5a-016a973ae9f9\") " pod="openshift-marketplace/certified-operators-xnpck" Dec 08 09:35:51 crc kubenswrapper[4776]: I1208 09:35:51.413025 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdc3850e-c90d-4cd6-bc5a-016a973ae9f9-utilities\") pod \"certified-operators-xnpck\" (UID: \"cdc3850e-c90d-4cd6-bc5a-016a973ae9f9\") " pod="openshift-marketplace/certified-operators-xnpck" Dec 08 09:35:51 crc kubenswrapper[4776]: I1208 09:35:51.413040 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdc3850e-c90d-4cd6-bc5a-016a973ae9f9-catalog-content\") pod \"certified-operators-xnpck\" (UID: \"cdc3850e-c90d-4cd6-bc5a-016a973ae9f9\") " pod="openshift-marketplace/certified-operators-xnpck" Dec 08 09:35:51 crc kubenswrapper[4776]: I1208 09:35:51.443084 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dh4fx\" (UniqueName: \"kubernetes.io/projected/cdc3850e-c90d-4cd6-bc5a-016a973ae9f9-kube-api-access-dh4fx\") pod \"certified-operators-xnpck\" (UID: \"cdc3850e-c90d-4cd6-bc5a-016a973ae9f9\") " pod="openshift-marketplace/certified-operators-xnpck" Dec 08 09:35:51 crc kubenswrapper[4776]: I1208 09:35:51.451987 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xnpck" Dec 08 09:35:51 crc kubenswrapper[4776]: I1208 09:35:51.979730 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xnpck"] Dec 08 09:35:52 crc kubenswrapper[4776]: I1208 09:35:52.736495 4776 generic.go:334] "Generic (PLEG): container finished" podID="cdc3850e-c90d-4cd6-bc5a-016a973ae9f9" containerID="a214922854a942876b11723e34654cf3fc8cf788c3ff0e843aaab69a6d419988" exitCode=0 Dec 08 09:35:52 crc kubenswrapper[4776]: I1208 09:35:52.736597 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xnpck" event={"ID":"cdc3850e-c90d-4cd6-bc5a-016a973ae9f9","Type":"ContainerDied","Data":"a214922854a942876b11723e34654cf3fc8cf788c3ff0e843aaab69a6d419988"} Dec 08 09:35:52 crc kubenswrapper[4776]: I1208 09:35:52.736818 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xnpck" event={"ID":"cdc3850e-c90d-4cd6-bc5a-016a973ae9f9","Type":"ContainerStarted","Data":"bd5a5bc564ef17dc8232aa746e065f09b279a4d74a15798f3d48be4d19844393"} Dec 08 09:35:53 crc kubenswrapper[4776]: I1208 09:35:53.750205 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xnpck" event={"ID":"cdc3850e-c90d-4cd6-bc5a-016a973ae9f9","Type":"ContainerStarted","Data":"346c579220d47e53b9247fbb19945d3028c0125930478c6a620d8dccc3fc045f"} Dec 08 09:35:55 crc kubenswrapper[4776]: I1208 09:35:55.774259 4776 generic.go:334] "Generic (PLEG): container finished" podID="cdc3850e-c90d-4cd6-bc5a-016a973ae9f9" containerID="346c579220d47e53b9247fbb19945d3028c0125930478c6a620d8dccc3fc045f" exitCode=0 Dec 08 09:35:55 crc kubenswrapper[4776]: I1208 09:35:55.774313 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xnpck" 
event={"ID":"cdc3850e-c90d-4cd6-bc5a-016a973ae9f9","Type":"ContainerDied","Data":"346c579220d47e53b9247fbb19945d3028c0125930478c6a620d8dccc3fc045f"} Dec 08 09:35:56 crc kubenswrapper[4776]: I1208 09:35:56.786598 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xnpck" event={"ID":"cdc3850e-c90d-4cd6-bc5a-016a973ae9f9","Type":"ContainerStarted","Data":"10918e393a04d73ca2245e0c314d2142fde28091b321d7f2df993a8a94d1df10"} Dec 08 09:35:56 crc kubenswrapper[4776]: I1208 09:35:56.810931 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xnpck" podStartSLOduration=2.344498687 podStartE2EDuration="5.810908545s" podCreationTimestamp="2025-12-08 09:35:51 +0000 UTC" firstStartedPulling="2025-12-08 09:35:52.740030488 +0000 UTC m=+2229.003255520" lastFinishedPulling="2025-12-08 09:35:56.206440356 +0000 UTC m=+2232.469665378" observedRunningTime="2025-12-08 09:35:56.804589444 +0000 UTC m=+2233.067814476" watchObservedRunningTime="2025-12-08 09:35:56.810908545 +0000 UTC m=+2233.074133567" Dec 08 09:36:01 crc kubenswrapper[4776]: I1208 09:36:01.452754 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xnpck" Dec 08 09:36:01 crc kubenswrapper[4776]: I1208 09:36:01.453115 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xnpck" Dec 08 09:36:01 crc kubenswrapper[4776]: I1208 09:36:01.513813 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xnpck" Dec 08 09:36:01 crc kubenswrapper[4776]: I1208 09:36:01.907079 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xnpck" Dec 08 09:36:01 crc kubenswrapper[4776]: I1208 09:36:01.964000 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-xnpck"] Dec 08 09:36:02 crc kubenswrapper[4776]: I1208 09:36:02.039221 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-kdpqs"] Dec 08 09:36:02 crc kubenswrapper[4776]: I1208 09:36:02.050627 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-kdpqs"] Dec 08 09:36:02 crc kubenswrapper[4776]: I1208 09:36:02.364314 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc988dd0-b7f8-4793-8922-238ec7c3081b" path="/var/lib/kubelet/pods/bc988dd0-b7f8-4793-8922-238ec7c3081b/volumes" Dec 08 09:36:03 crc kubenswrapper[4776]: I1208 09:36:03.876135 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xnpck" podUID="cdc3850e-c90d-4cd6-bc5a-016a973ae9f9" containerName="registry-server" containerID="cri-o://10918e393a04d73ca2245e0c314d2142fde28091b321d7f2df993a8a94d1df10" gracePeriod=2 Dec 08 09:36:04 crc kubenswrapper[4776]: I1208 09:36:04.451440 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xnpck" Dec 08 09:36:04 crc kubenswrapper[4776]: I1208 09:36:04.634567 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdc3850e-c90d-4cd6-bc5a-016a973ae9f9-catalog-content\") pod \"cdc3850e-c90d-4cd6-bc5a-016a973ae9f9\" (UID: \"cdc3850e-c90d-4cd6-bc5a-016a973ae9f9\") " Dec 08 09:36:04 crc kubenswrapper[4776]: I1208 09:36:04.634878 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdc3850e-c90d-4cd6-bc5a-016a973ae9f9-utilities\") pod \"cdc3850e-c90d-4cd6-bc5a-016a973ae9f9\" (UID: \"cdc3850e-c90d-4cd6-bc5a-016a973ae9f9\") " Dec 08 09:36:04 crc kubenswrapper[4776]: I1208 09:36:04.634917 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dh4fx\" (UniqueName: \"kubernetes.io/projected/cdc3850e-c90d-4cd6-bc5a-016a973ae9f9-kube-api-access-dh4fx\") pod \"cdc3850e-c90d-4cd6-bc5a-016a973ae9f9\" (UID: \"cdc3850e-c90d-4cd6-bc5a-016a973ae9f9\") " Dec 08 09:36:04 crc kubenswrapper[4776]: I1208 09:36:04.636053 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdc3850e-c90d-4cd6-bc5a-016a973ae9f9-utilities" (OuterVolumeSpecName: "utilities") pod "cdc3850e-c90d-4cd6-bc5a-016a973ae9f9" (UID: "cdc3850e-c90d-4cd6-bc5a-016a973ae9f9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:36:04 crc kubenswrapper[4776]: I1208 09:36:04.645256 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdc3850e-c90d-4cd6-bc5a-016a973ae9f9-kube-api-access-dh4fx" (OuterVolumeSpecName: "kube-api-access-dh4fx") pod "cdc3850e-c90d-4cd6-bc5a-016a973ae9f9" (UID: "cdc3850e-c90d-4cd6-bc5a-016a973ae9f9"). InnerVolumeSpecName "kube-api-access-dh4fx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:36:04 crc kubenswrapper[4776]: I1208 09:36:04.676216 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdc3850e-c90d-4cd6-bc5a-016a973ae9f9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cdc3850e-c90d-4cd6-bc5a-016a973ae9f9" (UID: "cdc3850e-c90d-4cd6-bc5a-016a973ae9f9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:36:04 crc kubenswrapper[4776]: I1208 09:36:04.738212 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdc3850e-c90d-4cd6-bc5a-016a973ae9f9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 09:36:04 crc kubenswrapper[4776]: I1208 09:36:04.738538 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdc3850e-c90d-4cd6-bc5a-016a973ae9f9-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 09:36:04 crc kubenswrapper[4776]: I1208 09:36:04.738660 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dh4fx\" (UniqueName: \"kubernetes.io/projected/cdc3850e-c90d-4cd6-bc5a-016a973ae9f9-kube-api-access-dh4fx\") on node \"crc\" DevicePath \"\"" Dec 08 09:36:04 crc kubenswrapper[4776]: I1208 09:36:04.889257 4776 generic.go:334] "Generic (PLEG): container finished" podID="cdc3850e-c90d-4cd6-bc5a-016a973ae9f9" containerID="10918e393a04d73ca2245e0c314d2142fde28091b321d7f2df993a8a94d1df10" exitCode=0 Dec 08 09:36:04 crc kubenswrapper[4776]: I1208 09:36:04.889318 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xnpck" event={"ID":"cdc3850e-c90d-4cd6-bc5a-016a973ae9f9","Type":"ContainerDied","Data":"10918e393a04d73ca2245e0c314d2142fde28091b321d7f2df993a8a94d1df10"} Dec 08 09:36:04 crc kubenswrapper[4776]: I1208 09:36:04.889352 4776 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-xnpck" event={"ID":"cdc3850e-c90d-4cd6-bc5a-016a973ae9f9","Type":"ContainerDied","Data":"bd5a5bc564ef17dc8232aa746e065f09b279a4d74a15798f3d48be4d19844393"} Dec 08 09:36:04 crc kubenswrapper[4776]: I1208 09:36:04.889385 4776 scope.go:117] "RemoveContainer" containerID="10918e393a04d73ca2245e0c314d2142fde28091b321d7f2df993a8a94d1df10" Dec 08 09:36:04 crc kubenswrapper[4776]: I1208 09:36:04.890266 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xnpck" Dec 08 09:36:04 crc kubenswrapper[4776]: I1208 09:36:04.914444 4776 scope.go:117] "RemoveContainer" containerID="346c579220d47e53b9247fbb19945d3028c0125930478c6a620d8dccc3fc045f" Dec 08 09:36:04 crc kubenswrapper[4776]: I1208 09:36:04.925403 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xnpck"] Dec 08 09:36:04 crc kubenswrapper[4776]: I1208 09:36:04.937361 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xnpck"] Dec 08 09:36:04 crc kubenswrapper[4776]: I1208 09:36:04.951855 4776 scope.go:117] "RemoveContainer" containerID="a214922854a942876b11723e34654cf3fc8cf788c3ff0e843aaab69a6d419988" Dec 08 09:36:04 crc kubenswrapper[4776]: I1208 09:36:04.990120 4776 scope.go:117] "RemoveContainer" containerID="10918e393a04d73ca2245e0c314d2142fde28091b321d7f2df993a8a94d1df10" Dec 08 09:36:04 crc kubenswrapper[4776]: E1208 09:36:04.990574 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10918e393a04d73ca2245e0c314d2142fde28091b321d7f2df993a8a94d1df10\": container with ID starting with 10918e393a04d73ca2245e0c314d2142fde28091b321d7f2df993a8a94d1df10 not found: ID does not exist" containerID="10918e393a04d73ca2245e0c314d2142fde28091b321d7f2df993a8a94d1df10" Dec 08 09:36:04 crc kubenswrapper[4776]: I1208 
09:36:04.990612 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10918e393a04d73ca2245e0c314d2142fde28091b321d7f2df993a8a94d1df10"} err="failed to get container status \"10918e393a04d73ca2245e0c314d2142fde28091b321d7f2df993a8a94d1df10\": rpc error: code = NotFound desc = could not find container \"10918e393a04d73ca2245e0c314d2142fde28091b321d7f2df993a8a94d1df10\": container with ID starting with 10918e393a04d73ca2245e0c314d2142fde28091b321d7f2df993a8a94d1df10 not found: ID does not exist" Dec 08 09:36:04 crc kubenswrapper[4776]: I1208 09:36:04.990634 4776 scope.go:117] "RemoveContainer" containerID="346c579220d47e53b9247fbb19945d3028c0125930478c6a620d8dccc3fc045f" Dec 08 09:36:04 crc kubenswrapper[4776]: E1208 09:36:04.990917 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"346c579220d47e53b9247fbb19945d3028c0125930478c6a620d8dccc3fc045f\": container with ID starting with 346c579220d47e53b9247fbb19945d3028c0125930478c6a620d8dccc3fc045f not found: ID does not exist" containerID="346c579220d47e53b9247fbb19945d3028c0125930478c6a620d8dccc3fc045f" Dec 08 09:36:04 crc kubenswrapper[4776]: I1208 09:36:04.990947 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"346c579220d47e53b9247fbb19945d3028c0125930478c6a620d8dccc3fc045f"} err="failed to get container status \"346c579220d47e53b9247fbb19945d3028c0125930478c6a620d8dccc3fc045f\": rpc error: code = NotFound desc = could not find container \"346c579220d47e53b9247fbb19945d3028c0125930478c6a620d8dccc3fc045f\": container with ID starting with 346c579220d47e53b9247fbb19945d3028c0125930478c6a620d8dccc3fc045f not found: ID does not exist" Dec 08 09:36:04 crc kubenswrapper[4776]: I1208 09:36:04.990971 4776 scope.go:117] "RemoveContainer" containerID="a214922854a942876b11723e34654cf3fc8cf788c3ff0e843aaab69a6d419988" Dec 08 09:36:04 crc 
kubenswrapper[4776]: E1208 09:36:04.991323 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a214922854a942876b11723e34654cf3fc8cf788c3ff0e843aaab69a6d419988\": container with ID starting with a214922854a942876b11723e34654cf3fc8cf788c3ff0e843aaab69a6d419988 not found: ID does not exist" containerID="a214922854a942876b11723e34654cf3fc8cf788c3ff0e843aaab69a6d419988" Dec 08 09:36:04 crc kubenswrapper[4776]: I1208 09:36:04.991345 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a214922854a942876b11723e34654cf3fc8cf788c3ff0e843aaab69a6d419988"} err="failed to get container status \"a214922854a942876b11723e34654cf3fc8cf788c3ff0e843aaab69a6d419988\": rpc error: code = NotFound desc = could not find container \"a214922854a942876b11723e34654cf3fc8cf788c3ff0e843aaab69a6d419988\": container with ID starting with a214922854a942876b11723e34654cf3fc8cf788c3ff0e843aaab69a6d419988 not found: ID does not exist" Dec 08 09:36:06 crc kubenswrapper[4776]: I1208 09:36:06.356955 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdc3850e-c90d-4cd6-bc5a-016a973ae9f9" path="/var/lib/kubelet/pods/cdc3850e-c90d-4cd6-bc5a-016a973ae9f9/volumes" Dec 08 09:36:06 crc kubenswrapper[4776]: I1208 09:36:06.916280 4776 generic.go:334] "Generic (PLEG): container finished" podID="f67c7d60-bc4d-4712-a8d9-acb48e097264" containerID="61ed4e39dfe61e1e1a610ebdfc8c50875b5be60ffbf117b798d5ac2dc3075a3d" exitCode=0 Dec 08 09:36:06 crc kubenswrapper[4776]: I1208 09:36:06.916324 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tknt7" event={"ID":"f67c7d60-bc4d-4712-a8d9-acb48e097264","Type":"ContainerDied","Data":"61ed4e39dfe61e1e1a610ebdfc8c50875b5be60ffbf117b798d5ac2dc3075a3d"} Dec 08 09:36:08 crc kubenswrapper[4776]: I1208 09:36:08.380392 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tknt7" Dec 08 09:36:08 crc kubenswrapper[4776]: I1208 09:36:08.519351 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f67c7d60-bc4d-4712-a8d9-acb48e097264-inventory\") pod \"f67c7d60-bc4d-4712-a8d9-acb48e097264\" (UID: \"f67c7d60-bc4d-4712-a8d9-acb48e097264\") " Dec 08 09:36:08 crc kubenswrapper[4776]: I1208 09:36:08.519402 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zh8wx\" (UniqueName: \"kubernetes.io/projected/f67c7d60-bc4d-4712-a8d9-acb48e097264-kube-api-access-zh8wx\") pod \"f67c7d60-bc4d-4712-a8d9-acb48e097264\" (UID: \"f67c7d60-bc4d-4712-a8d9-acb48e097264\") " Dec 08 09:36:08 crc kubenswrapper[4776]: I1208 09:36:08.520580 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f67c7d60-bc4d-4712-a8d9-acb48e097264-ssh-key\") pod \"f67c7d60-bc4d-4712-a8d9-acb48e097264\" (UID: \"f67c7d60-bc4d-4712-a8d9-acb48e097264\") " Dec 08 09:36:08 crc kubenswrapper[4776]: I1208 09:36:08.524919 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f67c7d60-bc4d-4712-a8d9-acb48e097264-kube-api-access-zh8wx" (OuterVolumeSpecName: "kube-api-access-zh8wx") pod "f67c7d60-bc4d-4712-a8d9-acb48e097264" (UID: "f67c7d60-bc4d-4712-a8d9-acb48e097264"). InnerVolumeSpecName "kube-api-access-zh8wx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:36:08 crc kubenswrapper[4776]: I1208 09:36:08.554158 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f67c7d60-bc4d-4712-a8d9-acb48e097264-inventory" (OuterVolumeSpecName: "inventory") pod "f67c7d60-bc4d-4712-a8d9-acb48e097264" (UID: "f67c7d60-bc4d-4712-a8d9-acb48e097264"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:36:08 crc kubenswrapper[4776]: I1208 09:36:08.557220 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f67c7d60-bc4d-4712-a8d9-acb48e097264-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f67c7d60-bc4d-4712-a8d9-acb48e097264" (UID: "f67c7d60-bc4d-4712-a8d9-acb48e097264"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:36:08 crc kubenswrapper[4776]: I1208 09:36:08.624673 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f67c7d60-bc4d-4712-a8d9-acb48e097264-inventory\") on node \"crc\" DevicePath \"\"" Dec 08 09:36:08 crc kubenswrapper[4776]: I1208 09:36:08.624717 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zh8wx\" (UniqueName: \"kubernetes.io/projected/f67c7d60-bc4d-4712-a8d9-acb48e097264-kube-api-access-zh8wx\") on node \"crc\" DevicePath \"\"" Dec 08 09:36:08 crc kubenswrapper[4776]: I1208 09:36:08.624731 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f67c7d60-bc4d-4712-a8d9-acb48e097264-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 08 09:36:08 crc kubenswrapper[4776]: I1208 09:36:08.937591 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tknt7" event={"ID":"f67c7d60-bc4d-4712-a8d9-acb48e097264","Type":"ContainerDied","Data":"26a8b99ed65ae41b63ff011fc929643968a236ef5d73dcc6725eff5fd481381f"} Dec 08 09:36:08 crc kubenswrapper[4776]: I1208 09:36:08.937634 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26a8b99ed65ae41b63ff011fc929643968a236ef5d73dcc6725eff5fd481381f" Dec 08 09:36:08 crc kubenswrapper[4776]: I1208 09:36:08.937652 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tknt7" Dec 08 09:36:09 crc kubenswrapper[4776]: I1208 09:36:09.022865 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dhvs7"] Dec 08 09:36:09 crc kubenswrapper[4776]: E1208 09:36:09.023581 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdc3850e-c90d-4cd6-bc5a-016a973ae9f9" containerName="registry-server" Dec 08 09:36:09 crc kubenswrapper[4776]: I1208 09:36:09.023656 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdc3850e-c90d-4cd6-bc5a-016a973ae9f9" containerName="registry-server" Dec 08 09:36:09 crc kubenswrapper[4776]: E1208 09:36:09.023715 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdc3850e-c90d-4cd6-bc5a-016a973ae9f9" containerName="extract-utilities" Dec 08 09:36:09 crc kubenswrapper[4776]: I1208 09:36:09.023765 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdc3850e-c90d-4cd6-bc5a-016a973ae9f9" containerName="extract-utilities" Dec 08 09:36:09 crc kubenswrapper[4776]: E1208 09:36:09.023845 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdc3850e-c90d-4cd6-bc5a-016a973ae9f9" containerName="extract-content" Dec 08 09:36:09 crc kubenswrapper[4776]: I1208 09:36:09.023899 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdc3850e-c90d-4cd6-bc5a-016a973ae9f9" containerName="extract-content" Dec 08 09:36:09 crc kubenswrapper[4776]: E1208 09:36:09.023974 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f67c7d60-bc4d-4712-a8d9-acb48e097264" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 08 09:36:09 crc kubenswrapper[4776]: I1208 09:36:09.024027 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f67c7d60-bc4d-4712-a8d9-acb48e097264" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 08 09:36:09 crc kubenswrapper[4776]: I1208 09:36:09.024363 
4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="f67c7d60-bc4d-4712-a8d9-acb48e097264" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 08 09:36:09 crc kubenswrapper[4776]: I1208 09:36:09.024468 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdc3850e-c90d-4cd6-bc5a-016a973ae9f9" containerName="registry-server" Dec 08 09:36:09 crc kubenswrapper[4776]: I1208 09:36:09.025389 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dhvs7" Dec 08 09:36:09 crc kubenswrapper[4776]: I1208 09:36:09.028457 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 08 09:36:09 crc kubenswrapper[4776]: I1208 09:36:09.028989 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tm845" Dec 08 09:36:09 crc kubenswrapper[4776]: I1208 09:36:09.029166 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 08 09:36:09 crc kubenswrapper[4776]: I1208 09:36:09.029349 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 08 09:36:09 crc kubenswrapper[4776]: I1208 09:36:09.034208 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dhvs7"] Dec 08 09:36:09 crc kubenswrapper[4776]: I1208 09:36:09.135708 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c92123e3-056d-4e4f-83b1-3cf335342a70-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dhvs7\" (UID: \"c92123e3-056d-4e4f-83b1-3cf335342a70\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dhvs7" Dec 08 09:36:09 crc kubenswrapper[4776]: I1208 
09:36:09.136086 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shndg\" (UniqueName: \"kubernetes.io/projected/c92123e3-056d-4e4f-83b1-3cf335342a70-kube-api-access-shndg\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dhvs7\" (UID: \"c92123e3-056d-4e4f-83b1-3cf335342a70\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dhvs7" Dec 08 09:36:09 crc kubenswrapper[4776]: I1208 09:36:09.136146 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c92123e3-056d-4e4f-83b1-3cf335342a70-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dhvs7\" (UID: \"c92123e3-056d-4e4f-83b1-3cf335342a70\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dhvs7" Dec 08 09:36:09 crc kubenswrapper[4776]: I1208 09:36:09.238652 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c92123e3-056d-4e4f-83b1-3cf335342a70-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dhvs7\" (UID: \"c92123e3-056d-4e4f-83b1-3cf335342a70\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dhvs7" Dec 08 09:36:09 crc kubenswrapper[4776]: I1208 09:36:09.238758 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shndg\" (UniqueName: \"kubernetes.io/projected/c92123e3-056d-4e4f-83b1-3cf335342a70-kube-api-access-shndg\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dhvs7\" (UID: \"c92123e3-056d-4e4f-83b1-3cf335342a70\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dhvs7" Dec 08 09:36:09 crc kubenswrapper[4776]: I1208 09:36:09.238818 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c92123e3-056d-4e4f-83b1-3cf335342a70-ssh-key\") pod 
\"configure-os-edpm-deployment-openstack-edpm-ipam-dhvs7\" (UID: \"c92123e3-056d-4e4f-83b1-3cf335342a70\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dhvs7" Dec 08 09:36:09 crc kubenswrapper[4776]: I1208 09:36:09.244507 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c92123e3-056d-4e4f-83b1-3cf335342a70-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dhvs7\" (UID: \"c92123e3-056d-4e4f-83b1-3cf335342a70\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dhvs7" Dec 08 09:36:09 crc kubenswrapper[4776]: I1208 09:36:09.244714 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c92123e3-056d-4e4f-83b1-3cf335342a70-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dhvs7\" (UID: \"c92123e3-056d-4e4f-83b1-3cf335342a70\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dhvs7" Dec 08 09:36:09 crc kubenswrapper[4776]: I1208 09:36:09.254845 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shndg\" (UniqueName: \"kubernetes.io/projected/c92123e3-056d-4e4f-83b1-3cf335342a70-kube-api-access-shndg\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dhvs7\" (UID: \"c92123e3-056d-4e4f-83b1-3cf335342a70\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dhvs7" Dec 08 09:36:09 crc kubenswrapper[4776]: I1208 09:36:09.361121 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dhvs7" Dec 08 09:36:09 crc kubenswrapper[4776]: I1208 09:36:09.896053 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dhvs7"] Dec 08 09:36:09 crc kubenswrapper[4776]: W1208 09:36:09.904426 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc92123e3_056d_4e4f_83b1_3cf335342a70.slice/crio-74350d73644bbfd319c2b4e66328f878a70d1a5237f57c2e73d83e6d2d772b32 WatchSource:0}: Error finding container 74350d73644bbfd319c2b4e66328f878a70d1a5237f57c2e73d83e6d2d772b32: Status 404 returned error can't find the container with id 74350d73644bbfd319c2b4e66328f878a70d1a5237f57c2e73d83e6d2d772b32 Dec 08 09:36:09 crc kubenswrapper[4776]: I1208 09:36:09.950062 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dhvs7" event={"ID":"c92123e3-056d-4e4f-83b1-3cf335342a70","Type":"ContainerStarted","Data":"74350d73644bbfd319c2b4e66328f878a70d1a5237f57c2e73d83e6d2d772b32"} Dec 08 09:36:10 crc kubenswrapper[4776]: I1208 09:36:10.965366 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dhvs7" event={"ID":"c92123e3-056d-4e4f-83b1-3cf335342a70","Type":"ContainerStarted","Data":"acb27357cfdb72a63057ecbbe687770179a335a20d286cfe70c22c2ced304a2a"} Dec 08 09:36:10 crc kubenswrapper[4776]: I1208 09:36:10.993427 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dhvs7" podStartSLOduration=2.556533753 podStartE2EDuration="2.993391538s" podCreationTimestamp="2025-12-08 09:36:08 +0000 UTC" firstStartedPulling="2025-12-08 09:36:09.906622372 +0000 UTC m=+2246.169847394" lastFinishedPulling="2025-12-08 09:36:10.343480157 +0000 UTC m=+2246.606705179" 
observedRunningTime="2025-12-08 09:36:10.987452419 +0000 UTC m=+2247.250677501" watchObservedRunningTime="2025-12-08 09:36:10.993391538 +0000 UTC m=+2247.256616590" Dec 08 09:36:11 crc kubenswrapper[4776]: I1208 09:36:11.399568 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 09:36:11 crc kubenswrapper[4776]: I1208 09:36:11.400254 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 09:36:17 crc kubenswrapper[4776]: I1208 09:36:17.599857 4776 scope.go:117] "RemoveContainer" containerID="aa98e5947586fa94673e30b99b6340d6bed0a6f212e05e6655444f8f45cbe1af" Dec 08 09:36:17 crc kubenswrapper[4776]: I1208 09:36:17.657710 4776 scope.go:117] "RemoveContainer" containerID="bf7d40f85562b59617eff8f372ce1b5dc118aa06a87cd416ec83aa2c3a74f745" Dec 08 09:36:17 crc kubenswrapper[4776]: I1208 09:36:17.695161 4776 scope.go:117] "RemoveContainer" containerID="254e6b53d20f5019a28759d11c91d73a6eaf8a3751091df3d4e8bb0cb0912d0c" Dec 08 09:36:41 crc kubenswrapper[4776]: I1208 09:36:41.289838 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zzm8z"] Dec 08 09:36:41 crc kubenswrapper[4776]: I1208 09:36:41.294630 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zzm8z" Dec 08 09:36:41 crc kubenswrapper[4776]: I1208 09:36:41.299949 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zzm8z"] Dec 08 09:36:41 crc kubenswrapper[4776]: I1208 09:36:41.398786 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 09:36:41 crc kubenswrapper[4776]: I1208 09:36:41.399352 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 09:36:41 crc kubenswrapper[4776]: I1208 09:36:41.399457 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" Dec 08 09:36:41 crc kubenswrapper[4776]: I1208 09:36:41.400403 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"eb1b0e925baf44c975c69a0034d44e251745d7626b9fb33d2f97e1c293dbd296"} pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 08 09:36:41 crc kubenswrapper[4776]: I1208 09:36:41.400548 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" 
containerID="cri-o://eb1b0e925baf44c975c69a0034d44e251745d7626b9fb33d2f97e1c293dbd296" gracePeriod=600 Dec 08 09:36:41 crc kubenswrapper[4776]: I1208 09:36:41.449978 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04e721a7-201d-4837-9521-07a19b1c2f77-utilities\") pod \"community-operators-zzm8z\" (UID: \"04e721a7-201d-4837-9521-07a19b1c2f77\") " pod="openshift-marketplace/community-operators-zzm8z" Dec 08 09:36:41 crc kubenswrapper[4776]: I1208 09:36:41.450081 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fng7j\" (UniqueName: \"kubernetes.io/projected/04e721a7-201d-4837-9521-07a19b1c2f77-kube-api-access-fng7j\") pod \"community-operators-zzm8z\" (UID: \"04e721a7-201d-4837-9521-07a19b1c2f77\") " pod="openshift-marketplace/community-operators-zzm8z" Dec 08 09:36:41 crc kubenswrapper[4776]: I1208 09:36:41.450162 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04e721a7-201d-4837-9521-07a19b1c2f77-catalog-content\") pod \"community-operators-zzm8z\" (UID: \"04e721a7-201d-4837-9521-07a19b1c2f77\") " pod="openshift-marketplace/community-operators-zzm8z" Dec 08 09:36:41 crc kubenswrapper[4776]: E1208 09:36:41.520549 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:36:41 crc kubenswrapper[4776]: I1208 09:36:41.552580 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04e721a7-201d-4837-9521-07a19b1c2f77-utilities\") pod \"community-operators-zzm8z\" (UID: \"04e721a7-201d-4837-9521-07a19b1c2f77\") " pod="openshift-marketplace/community-operators-zzm8z" Dec 08 09:36:41 crc kubenswrapper[4776]: I1208 09:36:41.552668 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fng7j\" (UniqueName: \"kubernetes.io/projected/04e721a7-201d-4837-9521-07a19b1c2f77-kube-api-access-fng7j\") pod \"community-operators-zzm8z\" (UID: \"04e721a7-201d-4837-9521-07a19b1c2f77\") " pod="openshift-marketplace/community-operators-zzm8z" Dec 08 09:36:41 crc kubenswrapper[4776]: I1208 09:36:41.552744 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04e721a7-201d-4837-9521-07a19b1c2f77-catalog-content\") pod \"community-operators-zzm8z\" (UID: \"04e721a7-201d-4837-9521-07a19b1c2f77\") " pod="openshift-marketplace/community-operators-zzm8z" Dec 08 09:36:41 crc kubenswrapper[4776]: I1208 09:36:41.553250 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04e721a7-201d-4837-9521-07a19b1c2f77-catalog-content\") pod \"community-operators-zzm8z\" (UID: \"04e721a7-201d-4837-9521-07a19b1c2f77\") " pod="openshift-marketplace/community-operators-zzm8z" Dec 08 09:36:41 crc kubenswrapper[4776]: I1208 09:36:41.553703 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04e721a7-201d-4837-9521-07a19b1c2f77-utilities\") pod \"community-operators-zzm8z\" (UID: \"04e721a7-201d-4837-9521-07a19b1c2f77\") " pod="openshift-marketplace/community-operators-zzm8z" Dec 08 09:36:41 crc kubenswrapper[4776]: I1208 09:36:41.572521 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fng7j\" (UniqueName: 
\"kubernetes.io/projected/04e721a7-201d-4837-9521-07a19b1c2f77-kube-api-access-fng7j\") pod \"community-operators-zzm8z\" (UID: \"04e721a7-201d-4837-9521-07a19b1c2f77\") " pod="openshift-marketplace/community-operators-zzm8z" Dec 08 09:36:41 crc kubenswrapper[4776]: I1208 09:36:41.652738 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zzm8z" Dec 08 09:36:42 crc kubenswrapper[4776]: I1208 09:36:42.178557 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zzm8z"] Dec 08 09:36:42 crc kubenswrapper[4776]: I1208 09:36:42.408696 4776 generic.go:334] "Generic (PLEG): container finished" podID="c9788ab1-1031-4103-a769-a4b3177c7268" containerID="eb1b0e925baf44c975c69a0034d44e251745d7626b9fb33d2f97e1c293dbd296" exitCode=0 Dec 08 09:36:42 crc kubenswrapper[4776]: I1208 09:36:42.409123 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" event={"ID":"c9788ab1-1031-4103-a769-a4b3177c7268","Type":"ContainerDied","Data":"eb1b0e925baf44c975c69a0034d44e251745d7626b9fb33d2f97e1c293dbd296"} Dec 08 09:36:42 crc kubenswrapper[4776]: I1208 09:36:42.409165 4776 scope.go:117] "RemoveContainer" containerID="7d8d2a2902024cbe40e4f1f7b98f5ca2ce52445ed3e9c1d7d0e196722965b4ad" Dec 08 09:36:42 crc kubenswrapper[4776]: I1208 09:36:42.410140 4776 scope.go:117] "RemoveContainer" containerID="eb1b0e925baf44c975c69a0034d44e251745d7626b9fb33d2f97e1c293dbd296" Dec 08 09:36:42 crc kubenswrapper[4776]: E1208 09:36:42.410588 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" 
podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:36:42 crc kubenswrapper[4776]: I1208 09:36:42.415684 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zzm8z" event={"ID":"04e721a7-201d-4837-9521-07a19b1c2f77","Type":"ContainerStarted","Data":"47e14eff7955e438075329d90e02636242713417056b35ff63dc0065adfe9dcc"} Dec 08 09:36:43 crc kubenswrapper[4776]: I1208 09:36:43.432091 4776 generic.go:334] "Generic (PLEG): container finished" podID="04e721a7-201d-4837-9521-07a19b1c2f77" containerID="3f4b28a46f7b9ea32d43fc601a3ad6bbfc0ba55acb23315810335e3f5ced8cf3" exitCode=0 Dec 08 09:36:43 crc kubenswrapper[4776]: I1208 09:36:43.432786 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zzm8z" event={"ID":"04e721a7-201d-4837-9521-07a19b1c2f77","Type":"ContainerDied","Data":"3f4b28a46f7b9ea32d43fc601a3ad6bbfc0ba55acb23315810335e3f5ced8cf3"} Dec 08 09:36:44 crc kubenswrapper[4776]: I1208 09:36:44.449879 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zzm8z" event={"ID":"04e721a7-201d-4837-9521-07a19b1c2f77","Type":"ContainerStarted","Data":"9ed50080efb9fa00e2b7fb7e4c1184a79d6bc2fad683c70cbbbc3121d37feb83"} Dec 08 09:36:45 crc kubenswrapper[4776]: I1208 09:36:45.460712 4776 generic.go:334] "Generic (PLEG): container finished" podID="04e721a7-201d-4837-9521-07a19b1c2f77" containerID="9ed50080efb9fa00e2b7fb7e4c1184a79d6bc2fad683c70cbbbc3121d37feb83" exitCode=0 Dec 08 09:36:45 crc kubenswrapper[4776]: I1208 09:36:45.460755 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zzm8z" event={"ID":"04e721a7-201d-4837-9521-07a19b1c2f77","Type":"ContainerDied","Data":"9ed50080efb9fa00e2b7fb7e4c1184a79d6bc2fad683c70cbbbc3121d37feb83"} Dec 08 09:36:46 crc kubenswrapper[4776]: I1208 09:36:46.473165 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-zzm8z" event={"ID":"04e721a7-201d-4837-9521-07a19b1c2f77","Type":"ContainerStarted","Data":"cf5c8442d3fe3ac6f818e97dd7c3c13c912da0548abd44b57498ad9124797b3b"} Dec 08 09:36:46 crc kubenswrapper[4776]: I1208 09:36:46.503054 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zzm8z" podStartSLOduration=3.06615934 podStartE2EDuration="5.503034118s" podCreationTimestamp="2025-12-08 09:36:41 +0000 UTC" firstStartedPulling="2025-12-08 09:36:43.435401609 +0000 UTC m=+2279.698626631" lastFinishedPulling="2025-12-08 09:36:45.872276387 +0000 UTC m=+2282.135501409" observedRunningTime="2025-12-08 09:36:46.491161459 +0000 UTC m=+2282.754386481" watchObservedRunningTime="2025-12-08 09:36:46.503034118 +0000 UTC m=+2282.766259130" Dec 08 09:36:51 crc kubenswrapper[4776]: I1208 09:36:51.668505 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zzm8z" Dec 08 09:36:51 crc kubenswrapper[4776]: I1208 09:36:51.673039 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zzm8z" Dec 08 09:36:51 crc kubenswrapper[4776]: I1208 09:36:51.746974 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zzm8z" Dec 08 09:36:52 crc kubenswrapper[4776]: I1208 09:36:52.765759 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zzm8z" Dec 08 09:36:52 crc kubenswrapper[4776]: I1208 09:36:52.820762 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zzm8z"] Dec 08 09:36:54 crc kubenswrapper[4776]: I1208 09:36:54.722025 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zzm8z" 
podUID="04e721a7-201d-4837-9521-07a19b1c2f77" containerName="registry-server" containerID="cri-o://cf5c8442d3fe3ac6f818e97dd7c3c13c912da0548abd44b57498ad9124797b3b" gracePeriod=2 Dec 08 09:36:55 crc kubenswrapper[4776]: I1208 09:36:55.717576 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zzm8z" Dec 08 09:36:55 crc kubenswrapper[4776]: I1208 09:36:55.733967 4776 generic.go:334] "Generic (PLEG): container finished" podID="04e721a7-201d-4837-9521-07a19b1c2f77" containerID="cf5c8442d3fe3ac6f818e97dd7c3c13c912da0548abd44b57498ad9124797b3b" exitCode=0 Dec 08 09:36:55 crc kubenswrapper[4776]: I1208 09:36:55.734021 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zzm8z" event={"ID":"04e721a7-201d-4837-9521-07a19b1c2f77","Type":"ContainerDied","Data":"cf5c8442d3fe3ac6f818e97dd7c3c13c912da0548abd44b57498ad9124797b3b"} Dec 08 09:36:55 crc kubenswrapper[4776]: I1208 09:36:55.734061 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zzm8z" event={"ID":"04e721a7-201d-4837-9521-07a19b1c2f77","Type":"ContainerDied","Data":"47e14eff7955e438075329d90e02636242713417056b35ff63dc0065adfe9dcc"} Dec 08 09:36:55 crc kubenswrapper[4776]: I1208 09:36:55.734083 4776 scope.go:117] "RemoveContainer" containerID="cf5c8442d3fe3ac6f818e97dd7c3c13c912da0548abd44b57498ad9124797b3b" Dec 08 09:36:55 crc kubenswrapper[4776]: I1208 09:36:55.734085 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zzm8z" Dec 08 09:36:55 crc kubenswrapper[4776]: I1208 09:36:55.758848 4776 scope.go:117] "RemoveContainer" containerID="9ed50080efb9fa00e2b7fb7e4c1184a79d6bc2fad683c70cbbbc3121d37feb83" Dec 08 09:36:55 crc kubenswrapper[4776]: I1208 09:36:55.770911 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04e721a7-201d-4837-9521-07a19b1c2f77-utilities\") pod \"04e721a7-201d-4837-9521-07a19b1c2f77\" (UID: \"04e721a7-201d-4837-9521-07a19b1c2f77\") " Dec 08 09:36:55 crc kubenswrapper[4776]: I1208 09:36:55.770995 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fng7j\" (UniqueName: \"kubernetes.io/projected/04e721a7-201d-4837-9521-07a19b1c2f77-kube-api-access-fng7j\") pod \"04e721a7-201d-4837-9521-07a19b1c2f77\" (UID: \"04e721a7-201d-4837-9521-07a19b1c2f77\") " Dec 08 09:36:55 crc kubenswrapper[4776]: I1208 09:36:55.771087 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04e721a7-201d-4837-9521-07a19b1c2f77-catalog-content\") pod \"04e721a7-201d-4837-9521-07a19b1c2f77\" (UID: \"04e721a7-201d-4837-9521-07a19b1c2f77\") " Dec 08 09:36:55 crc kubenswrapper[4776]: I1208 09:36:55.772475 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04e721a7-201d-4837-9521-07a19b1c2f77-utilities" (OuterVolumeSpecName: "utilities") pod "04e721a7-201d-4837-9521-07a19b1c2f77" (UID: "04e721a7-201d-4837-9521-07a19b1c2f77"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:36:55 crc kubenswrapper[4776]: I1208 09:36:55.779537 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04e721a7-201d-4837-9521-07a19b1c2f77-kube-api-access-fng7j" (OuterVolumeSpecName: "kube-api-access-fng7j") pod "04e721a7-201d-4837-9521-07a19b1c2f77" (UID: "04e721a7-201d-4837-9521-07a19b1c2f77"). InnerVolumeSpecName "kube-api-access-fng7j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:36:55 crc kubenswrapper[4776]: I1208 09:36:55.791414 4776 scope.go:117] "RemoveContainer" containerID="3f4b28a46f7b9ea32d43fc601a3ad6bbfc0ba55acb23315810335e3f5ced8cf3" Dec 08 09:36:55 crc kubenswrapper[4776]: I1208 09:36:55.815899 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04e721a7-201d-4837-9521-07a19b1c2f77-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "04e721a7-201d-4837-9521-07a19b1c2f77" (UID: "04e721a7-201d-4837-9521-07a19b1c2f77"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:36:55 crc kubenswrapper[4776]: I1208 09:36:55.873658 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04e721a7-201d-4837-9521-07a19b1c2f77-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 09:36:55 crc kubenswrapper[4776]: I1208 09:36:55.873685 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fng7j\" (UniqueName: \"kubernetes.io/projected/04e721a7-201d-4837-9521-07a19b1c2f77-kube-api-access-fng7j\") on node \"crc\" DevicePath \"\"" Dec 08 09:36:55 crc kubenswrapper[4776]: I1208 09:36:55.873696 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04e721a7-201d-4837-9521-07a19b1c2f77-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 09:36:55 crc kubenswrapper[4776]: I1208 09:36:55.906089 4776 scope.go:117] "RemoveContainer" containerID="cf5c8442d3fe3ac6f818e97dd7c3c13c912da0548abd44b57498ad9124797b3b" Dec 08 09:36:55 crc kubenswrapper[4776]: E1208 09:36:55.906425 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf5c8442d3fe3ac6f818e97dd7c3c13c912da0548abd44b57498ad9124797b3b\": container with ID starting with cf5c8442d3fe3ac6f818e97dd7c3c13c912da0548abd44b57498ad9124797b3b not found: ID does not exist" containerID="cf5c8442d3fe3ac6f818e97dd7c3c13c912da0548abd44b57498ad9124797b3b" Dec 08 09:36:55 crc kubenswrapper[4776]: I1208 09:36:55.906454 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf5c8442d3fe3ac6f818e97dd7c3c13c912da0548abd44b57498ad9124797b3b"} err="failed to get container status \"cf5c8442d3fe3ac6f818e97dd7c3c13c912da0548abd44b57498ad9124797b3b\": rpc error: code = NotFound desc = could not find container \"cf5c8442d3fe3ac6f818e97dd7c3c13c912da0548abd44b57498ad9124797b3b\": container with ID 
starting with cf5c8442d3fe3ac6f818e97dd7c3c13c912da0548abd44b57498ad9124797b3b not found: ID does not exist" Dec 08 09:36:55 crc kubenswrapper[4776]: I1208 09:36:55.906473 4776 scope.go:117] "RemoveContainer" containerID="9ed50080efb9fa00e2b7fb7e4c1184a79d6bc2fad683c70cbbbc3121d37feb83" Dec 08 09:36:55 crc kubenswrapper[4776]: E1208 09:36:55.906655 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ed50080efb9fa00e2b7fb7e4c1184a79d6bc2fad683c70cbbbc3121d37feb83\": container with ID starting with 9ed50080efb9fa00e2b7fb7e4c1184a79d6bc2fad683c70cbbbc3121d37feb83 not found: ID does not exist" containerID="9ed50080efb9fa00e2b7fb7e4c1184a79d6bc2fad683c70cbbbc3121d37feb83" Dec 08 09:36:55 crc kubenswrapper[4776]: I1208 09:36:55.906671 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ed50080efb9fa00e2b7fb7e4c1184a79d6bc2fad683c70cbbbc3121d37feb83"} err="failed to get container status \"9ed50080efb9fa00e2b7fb7e4c1184a79d6bc2fad683c70cbbbc3121d37feb83\": rpc error: code = NotFound desc = could not find container \"9ed50080efb9fa00e2b7fb7e4c1184a79d6bc2fad683c70cbbbc3121d37feb83\": container with ID starting with 9ed50080efb9fa00e2b7fb7e4c1184a79d6bc2fad683c70cbbbc3121d37feb83 not found: ID does not exist" Dec 08 09:36:55 crc kubenswrapper[4776]: I1208 09:36:55.906683 4776 scope.go:117] "RemoveContainer" containerID="3f4b28a46f7b9ea32d43fc601a3ad6bbfc0ba55acb23315810335e3f5ced8cf3" Dec 08 09:36:55 crc kubenswrapper[4776]: E1208 09:36:55.906943 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f4b28a46f7b9ea32d43fc601a3ad6bbfc0ba55acb23315810335e3f5ced8cf3\": container with ID starting with 3f4b28a46f7b9ea32d43fc601a3ad6bbfc0ba55acb23315810335e3f5ced8cf3 not found: ID does not exist" containerID="3f4b28a46f7b9ea32d43fc601a3ad6bbfc0ba55acb23315810335e3f5ced8cf3" Dec 08 
09:36:55 crc kubenswrapper[4776]: I1208 09:36:55.906990 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f4b28a46f7b9ea32d43fc601a3ad6bbfc0ba55acb23315810335e3f5ced8cf3"} err="failed to get container status \"3f4b28a46f7b9ea32d43fc601a3ad6bbfc0ba55acb23315810335e3f5ced8cf3\": rpc error: code = NotFound desc = could not find container \"3f4b28a46f7b9ea32d43fc601a3ad6bbfc0ba55acb23315810335e3f5ced8cf3\": container with ID starting with 3f4b28a46f7b9ea32d43fc601a3ad6bbfc0ba55acb23315810335e3f5ced8cf3 not found: ID does not exist" Dec 08 09:36:56 crc kubenswrapper[4776]: I1208 09:36:56.073198 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zzm8z"] Dec 08 09:36:56 crc kubenswrapper[4776]: I1208 09:36:56.081718 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zzm8z"] Dec 08 09:36:56 crc kubenswrapper[4776]: I1208 09:36:56.344725 4776 scope.go:117] "RemoveContainer" containerID="eb1b0e925baf44c975c69a0034d44e251745d7626b9fb33d2f97e1c293dbd296" Dec 08 09:36:56 crc kubenswrapper[4776]: E1208 09:36:56.345669 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:36:56 crc kubenswrapper[4776]: I1208 09:36:56.356764 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04e721a7-201d-4837-9521-07a19b1c2f77" path="/var/lib/kubelet/pods/04e721a7-201d-4837-9521-07a19b1c2f77/volumes" Dec 08 09:36:59 crc kubenswrapper[4776]: I1208 09:36:59.774002 4776 generic.go:334] "Generic (PLEG): container finished" 
podID="c92123e3-056d-4e4f-83b1-3cf335342a70" containerID="acb27357cfdb72a63057ecbbe687770179a335a20d286cfe70c22c2ced304a2a" exitCode=0 Dec 08 09:36:59 crc kubenswrapper[4776]: I1208 09:36:59.774102 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dhvs7" event={"ID":"c92123e3-056d-4e4f-83b1-3cf335342a70","Type":"ContainerDied","Data":"acb27357cfdb72a63057ecbbe687770179a335a20d286cfe70c22c2ced304a2a"} Dec 08 09:37:01 crc kubenswrapper[4776]: I1208 09:37:01.236848 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dhvs7" Dec 08 09:37:01 crc kubenswrapper[4776]: I1208 09:37:01.289367 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c92123e3-056d-4e4f-83b1-3cf335342a70-inventory\") pod \"c92123e3-056d-4e4f-83b1-3cf335342a70\" (UID: \"c92123e3-056d-4e4f-83b1-3cf335342a70\") " Dec 08 09:37:01 crc kubenswrapper[4776]: I1208 09:37:01.289496 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shndg\" (UniqueName: \"kubernetes.io/projected/c92123e3-056d-4e4f-83b1-3cf335342a70-kube-api-access-shndg\") pod \"c92123e3-056d-4e4f-83b1-3cf335342a70\" (UID: \"c92123e3-056d-4e4f-83b1-3cf335342a70\") " Dec 08 09:37:01 crc kubenswrapper[4776]: I1208 09:37:01.289675 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c92123e3-056d-4e4f-83b1-3cf335342a70-ssh-key\") pod \"c92123e3-056d-4e4f-83b1-3cf335342a70\" (UID: \"c92123e3-056d-4e4f-83b1-3cf335342a70\") " Dec 08 09:37:01 crc kubenswrapper[4776]: I1208 09:37:01.295131 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c92123e3-056d-4e4f-83b1-3cf335342a70-kube-api-access-shndg" (OuterVolumeSpecName: 
"kube-api-access-shndg") pod "c92123e3-056d-4e4f-83b1-3cf335342a70" (UID: "c92123e3-056d-4e4f-83b1-3cf335342a70"). InnerVolumeSpecName "kube-api-access-shndg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:37:01 crc kubenswrapper[4776]: I1208 09:37:01.320052 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c92123e3-056d-4e4f-83b1-3cf335342a70-inventory" (OuterVolumeSpecName: "inventory") pod "c92123e3-056d-4e4f-83b1-3cf335342a70" (UID: "c92123e3-056d-4e4f-83b1-3cf335342a70"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:37:01 crc kubenswrapper[4776]: I1208 09:37:01.331598 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c92123e3-056d-4e4f-83b1-3cf335342a70-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c92123e3-056d-4e4f-83b1-3cf335342a70" (UID: "c92123e3-056d-4e4f-83b1-3cf335342a70"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:37:01 crc kubenswrapper[4776]: I1208 09:37:01.391560 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c92123e3-056d-4e4f-83b1-3cf335342a70-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 08 09:37:01 crc kubenswrapper[4776]: I1208 09:37:01.391762 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c92123e3-056d-4e4f-83b1-3cf335342a70-inventory\") on node \"crc\" DevicePath \"\"" Dec 08 09:37:01 crc kubenswrapper[4776]: I1208 09:37:01.391835 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shndg\" (UniqueName: \"kubernetes.io/projected/c92123e3-056d-4e4f-83b1-3cf335342a70-kube-api-access-shndg\") on node \"crc\" DevicePath \"\"" Dec 08 09:37:01 crc kubenswrapper[4776]: I1208 09:37:01.800476 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dhvs7" event={"ID":"c92123e3-056d-4e4f-83b1-3cf335342a70","Type":"ContainerDied","Data":"74350d73644bbfd319c2b4e66328f878a70d1a5237f57c2e73d83e6d2d772b32"} Dec 08 09:37:01 crc kubenswrapper[4776]: I1208 09:37:01.800526 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74350d73644bbfd319c2b4e66328f878a70d1a5237f57c2e73d83e6d2d772b32" Dec 08 09:37:01 crc kubenswrapper[4776]: I1208 09:37:01.800576 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dhvs7" Dec 08 09:37:01 crc kubenswrapper[4776]: I1208 09:37:01.959484 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-cxgqb"] Dec 08 09:37:01 crc kubenswrapper[4776]: E1208 09:37:01.960040 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04e721a7-201d-4837-9521-07a19b1c2f77" containerName="extract-utilities" Dec 08 09:37:01 crc kubenswrapper[4776]: I1208 09:37:01.960061 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="04e721a7-201d-4837-9521-07a19b1c2f77" containerName="extract-utilities" Dec 08 09:37:01 crc kubenswrapper[4776]: E1208 09:37:01.960108 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04e721a7-201d-4837-9521-07a19b1c2f77" containerName="registry-server" Dec 08 09:37:01 crc kubenswrapper[4776]: I1208 09:37:01.960116 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="04e721a7-201d-4837-9521-07a19b1c2f77" containerName="registry-server" Dec 08 09:37:01 crc kubenswrapper[4776]: E1208 09:37:01.960133 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04e721a7-201d-4837-9521-07a19b1c2f77" containerName="extract-content" Dec 08 09:37:01 crc kubenswrapper[4776]: I1208 09:37:01.960139 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="04e721a7-201d-4837-9521-07a19b1c2f77" 
containerName="extract-content" Dec 08 09:37:01 crc kubenswrapper[4776]: E1208 09:37:01.960158 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c92123e3-056d-4e4f-83b1-3cf335342a70" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 08 09:37:01 crc kubenswrapper[4776]: I1208 09:37:01.960165 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="c92123e3-056d-4e4f-83b1-3cf335342a70" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 08 09:37:01 crc kubenswrapper[4776]: I1208 09:37:01.960400 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="04e721a7-201d-4837-9521-07a19b1c2f77" containerName="registry-server" Dec 08 09:37:01 crc kubenswrapper[4776]: I1208 09:37:01.960418 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="c92123e3-056d-4e4f-83b1-3cf335342a70" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 08 09:37:01 crc kubenswrapper[4776]: I1208 09:37:01.961293 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-cxgqb" Dec 08 09:37:01 crc kubenswrapper[4776]: I1208 09:37:01.963443 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 08 09:37:01 crc kubenswrapper[4776]: I1208 09:37:01.963809 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tm845" Dec 08 09:37:01 crc kubenswrapper[4776]: I1208 09:37:01.968574 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 08 09:37:01 crc kubenswrapper[4776]: I1208 09:37:01.968658 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 08 09:37:01 crc kubenswrapper[4776]: I1208 09:37:01.974487 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-cxgqb"] Dec 08 09:37:02 crc kubenswrapper[4776]: I1208 09:37:02.006635 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/60899add-1d95-4fad-8cee-852951046a90-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-cxgqb\" (UID: \"60899add-1d95-4fad-8cee-852951046a90\") " pod="openstack/ssh-known-hosts-edpm-deployment-cxgqb" Dec 08 09:37:02 crc kubenswrapper[4776]: I1208 09:37:02.006878 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/60899add-1d95-4fad-8cee-852951046a90-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-cxgqb\" (UID: \"60899add-1d95-4fad-8cee-852951046a90\") " pod="openstack/ssh-known-hosts-edpm-deployment-cxgqb" Dec 08 09:37:02 crc kubenswrapper[4776]: I1208 09:37:02.006925 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-szbkq\" (UniqueName: \"kubernetes.io/projected/60899add-1d95-4fad-8cee-852951046a90-kube-api-access-szbkq\") pod \"ssh-known-hosts-edpm-deployment-cxgqb\" (UID: \"60899add-1d95-4fad-8cee-852951046a90\") " pod="openstack/ssh-known-hosts-edpm-deployment-cxgqb" Dec 08 09:37:02 crc kubenswrapper[4776]: I1208 09:37:02.108729 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/60899add-1d95-4fad-8cee-852951046a90-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-cxgqb\" (UID: \"60899add-1d95-4fad-8cee-852951046a90\") " pod="openstack/ssh-known-hosts-edpm-deployment-cxgqb" Dec 08 09:37:02 crc kubenswrapper[4776]: I1208 09:37:02.108775 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szbkq\" (UniqueName: \"kubernetes.io/projected/60899add-1d95-4fad-8cee-852951046a90-kube-api-access-szbkq\") pod \"ssh-known-hosts-edpm-deployment-cxgqb\" (UID: \"60899add-1d95-4fad-8cee-852951046a90\") " pod="openstack/ssh-known-hosts-edpm-deployment-cxgqb" Dec 08 09:37:02 crc kubenswrapper[4776]: I1208 09:37:02.108892 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/60899add-1d95-4fad-8cee-852951046a90-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-cxgqb\" (UID: \"60899add-1d95-4fad-8cee-852951046a90\") " pod="openstack/ssh-known-hosts-edpm-deployment-cxgqb" Dec 08 09:37:02 crc kubenswrapper[4776]: I1208 09:37:02.113893 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/60899add-1d95-4fad-8cee-852951046a90-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-cxgqb\" (UID: \"60899add-1d95-4fad-8cee-852951046a90\") " pod="openstack/ssh-known-hosts-edpm-deployment-cxgqb" Dec 08 09:37:02 crc kubenswrapper[4776]: I1208 
09:37:02.114744 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/60899add-1d95-4fad-8cee-852951046a90-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-cxgqb\" (UID: \"60899add-1d95-4fad-8cee-852951046a90\") " pod="openstack/ssh-known-hosts-edpm-deployment-cxgqb" Dec 08 09:37:02 crc kubenswrapper[4776]: I1208 09:37:02.132934 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szbkq\" (UniqueName: \"kubernetes.io/projected/60899add-1d95-4fad-8cee-852951046a90-kube-api-access-szbkq\") pod \"ssh-known-hosts-edpm-deployment-cxgqb\" (UID: \"60899add-1d95-4fad-8cee-852951046a90\") " pod="openstack/ssh-known-hosts-edpm-deployment-cxgqb" Dec 08 09:37:02 crc kubenswrapper[4776]: I1208 09:37:02.295102 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-cxgqb" Dec 08 09:37:02 crc kubenswrapper[4776]: W1208 09:37:02.884714 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60899add_1d95_4fad_8cee_852951046a90.slice/crio-e0ccebdcd00f5b4f88c054f5f1a351514d047e5825673866129f85423cc62674 WatchSource:0}: Error finding container e0ccebdcd00f5b4f88c054f5f1a351514d047e5825673866129f85423cc62674: Status 404 returned error can't find the container with id e0ccebdcd00f5b4f88c054f5f1a351514d047e5825673866129f85423cc62674 Dec 08 09:37:02 crc kubenswrapper[4776]: I1208 09:37:02.891569 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-cxgqb"] Dec 08 09:37:03 crc kubenswrapper[4776]: I1208 09:37:03.821034 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-cxgqb" event={"ID":"60899add-1d95-4fad-8cee-852951046a90","Type":"ContainerStarted","Data":"e0ccebdcd00f5b4f88c054f5f1a351514d047e5825673866129f85423cc62674"} Dec 08 
09:37:04 crc kubenswrapper[4776]: I1208 09:37:04.834277 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-cxgqb" event={"ID":"60899add-1d95-4fad-8cee-852951046a90","Type":"ContainerStarted","Data":"bfaa5ab91941044c80e08410b6ef25c05ce57c4656bd42059c62fd5d3bb58ddf"} Dec 08 09:37:04 crc kubenswrapper[4776]: I1208 09:37:04.863969 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-cxgqb" podStartSLOduration=2.901613699 podStartE2EDuration="3.863947481s" podCreationTimestamp="2025-12-08 09:37:01 +0000 UTC" firstStartedPulling="2025-12-08 09:37:02.887123303 +0000 UTC m=+2299.150348325" lastFinishedPulling="2025-12-08 09:37:03.849457085 +0000 UTC m=+2300.112682107" observedRunningTime="2025-12-08 09:37:04.848763113 +0000 UTC m=+2301.111988175" watchObservedRunningTime="2025-12-08 09:37:04.863947481 +0000 UTC m=+2301.127172513" Dec 08 09:37:09 crc kubenswrapper[4776]: I1208 09:37:09.344153 4776 scope.go:117] "RemoveContainer" containerID="eb1b0e925baf44c975c69a0034d44e251745d7626b9fb33d2f97e1c293dbd296" Dec 08 09:37:09 crc kubenswrapper[4776]: E1208 09:37:09.345755 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:37:10 crc kubenswrapper[4776]: I1208 09:37:10.900677 4776 generic.go:334] "Generic (PLEG): container finished" podID="60899add-1d95-4fad-8cee-852951046a90" containerID="bfaa5ab91941044c80e08410b6ef25c05ce57c4656bd42059c62fd5d3bb58ddf" exitCode=0 Dec 08 09:37:10 crc kubenswrapper[4776]: I1208 09:37:10.900776 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ssh-known-hosts-edpm-deployment-cxgqb" event={"ID":"60899add-1d95-4fad-8cee-852951046a90","Type":"ContainerDied","Data":"bfaa5ab91941044c80e08410b6ef25c05ce57c4656bd42059c62fd5d3bb58ddf"} Dec 08 09:37:12 crc kubenswrapper[4776]: I1208 09:37:12.383206 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-cxgqb" Dec 08 09:37:12 crc kubenswrapper[4776]: I1208 09:37:12.493346 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szbkq\" (UniqueName: \"kubernetes.io/projected/60899add-1d95-4fad-8cee-852951046a90-kube-api-access-szbkq\") pod \"60899add-1d95-4fad-8cee-852951046a90\" (UID: \"60899add-1d95-4fad-8cee-852951046a90\") " Dec 08 09:37:12 crc kubenswrapper[4776]: I1208 09:37:12.493591 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/60899add-1d95-4fad-8cee-852951046a90-inventory-0\") pod \"60899add-1d95-4fad-8cee-852951046a90\" (UID: \"60899add-1d95-4fad-8cee-852951046a90\") " Dec 08 09:37:12 crc kubenswrapper[4776]: I1208 09:37:12.493623 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/60899add-1d95-4fad-8cee-852951046a90-ssh-key-openstack-edpm-ipam\") pod \"60899add-1d95-4fad-8cee-852951046a90\" (UID: \"60899add-1d95-4fad-8cee-852951046a90\") " Dec 08 09:37:12 crc kubenswrapper[4776]: I1208 09:37:12.507391 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60899add-1d95-4fad-8cee-852951046a90-kube-api-access-szbkq" (OuterVolumeSpecName: "kube-api-access-szbkq") pod "60899add-1d95-4fad-8cee-852951046a90" (UID: "60899add-1d95-4fad-8cee-852951046a90"). InnerVolumeSpecName "kube-api-access-szbkq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:37:12 crc kubenswrapper[4776]: I1208 09:37:12.528202 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60899add-1d95-4fad-8cee-852951046a90-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "60899add-1d95-4fad-8cee-852951046a90" (UID: "60899add-1d95-4fad-8cee-852951046a90"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:37:12 crc kubenswrapper[4776]: I1208 09:37:12.530339 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60899add-1d95-4fad-8cee-852951046a90-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "60899add-1d95-4fad-8cee-852951046a90" (UID: "60899add-1d95-4fad-8cee-852951046a90"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:37:12 crc kubenswrapper[4776]: I1208 09:37:12.597499 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szbkq\" (UniqueName: \"kubernetes.io/projected/60899add-1d95-4fad-8cee-852951046a90-kube-api-access-szbkq\") on node \"crc\" DevicePath \"\"" Dec 08 09:37:12 crc kubenswrapper[4776]: I1208 09:37:12.597828 4776 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/60899add-1d95-4fad-8cee-852951046a90-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 08 09:37:12 crc kubenswrapper[4776]: I1208 09:37:12.597839 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/60899add-1d95-4fad-8cee-852951046a90-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 08 09:37:12 crc kubenswrapper[4776]: I1208 09:37:12.922508 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-cxgqb" 
event={"ID":"60899add-1d95-4fad-8cee-852951046a90","Type":"ContainerDied","Data":"e0ccebdcd00f5b4f88c054f5f1a351514d047e5825673866129f85423cc62674"} Dec 08 09:37:12 crc kubenswrapper[4776]: I1208 09:37:12.922553 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-cxgqb" Dec 08 09:37:12 crc kubenswrapper[4776]: I1208 09:37:12.922560 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0ccebdcd00f5b4f88c054f5f1a351514d047e5825673866129f85423cc62674" Dec 08 09:37:13 crc kubenswrapper[4776]: I1208 09:37:13.009721 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-97xln"] Dec 08 09:37:13 crc kubenswrapper[4776]: E1208 09:37:13.010345 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60899add-1d95-4fad-8cee-852951046a90" containerName="ssh-known-hosts-edpm-deployment" Dec 08 09:37:13 crc kubenswrapper[4776]: I1208 09:37:13.010364 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="60899add-1d95-4fad-8cee-852951046a90" containerName="ssh-known-hosts-edpm-deployment" Dec 08 09:37:13 crc kubenswrapper[4776]: I1208 09:37:13.010679 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="60899add-1d95-4fad-8cee-852951046a90" containerName="ssh-known-hosts-edpm-deployment" Dec 08 09:37:13 crc kubenswrapper[4776]: I1208 09:37:13.011686 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-97xln" Dec 08 09:37:13 crc kubenswrapper[4776]: I1208 09:37:13.017353 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 08 09:37:13 crc kubenswrapper[4776]: I1208 09:37:13.017531 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 08 09:37:13 crc kubenswrapper[4776]: I1208 09:37:13.017675 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tm845" Dec 08 09:37:13 crc kubenswrapper[4776]: I1208 09:37:13.017895 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 08 09:37:13 crc kubenswrapper[4776]: I1208 09:37:13.027209 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-97xln"] Dec 08 09:37:13 crc kubenswrapper[4776]: I1208 09:37:13.109477 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgx9g\" (UniqueName: \"kubernetes.io/projected/31f822a4-fa31-4cae-b24f-a1c1395caf05-kube-api-access-xgx9g\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-97xln\" (UID: \"31f822a4-fa31-4cae-b24f-a1c1395caf05\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-97xln" Dec 08 09:37:13 crc kubenswrapper[4776]: I1208 09:37:13.109659 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/31f822a4-fa31-4cae-b24f-a1c1395caf05-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-97xln\" (UID: \"31f822a4-fa31-4cae-b24f-a1c1395caf05\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-97xln" Dec 08 09:37:13 crc kubenswrapper[4776]: I1208 09:37:13.109693 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31f822a4-fa31-4cae-b24f-a1c1395caf05-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-97xln\" (UID: \"31f822a4-fa31-4cae-b24f-a1c1395caf05\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-97xln" Dec 08 09:37:13 crc kubenswrapper[4776]: I1208 09:37:13.211320 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgx9g\" (UniqueName: \"kubernetes.io/projected/31f822a4-fa31-4cae-b24f-a1c1395caf05-kube-api-access-xgx9g\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-97xln\" (UID: \"31f822a4-fa31-4cae-b24f-a1c1395caf05\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-97xln" Dec 08 09:37:13 crc kubenswrapper[4776]: I1208 09:37:13.211493 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/31f822a4-fa31-4cae-b24f-a1c1395caf05-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-97xln\" (UID: \"31f822a4-fa31-4cae-b24f-a1c1395caf05\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-97xln" Dec 08 09:37:13 crc kubenswrapper[4776]: I1208 09:37:13.211524 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31f822a4-fa31-4cae-b24f-a1c1395caf05-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-97xln\" (UID: \"31f822a4-fa31-4cae-b24f-a1c1395caf05\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-97xln" Dec 08 09:37:13 crc kubenswrapper[4776]: I1208 09:37:13.215703 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/31f822a4-fa31-4cae-b24f-a1c1395caf05-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-97xln\" (UID: \"31f822a4-fa31-4cae-b24f-a1c1395caf05\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-97xln" Dec 08 09:37:13 crc kubenswrapper[4776]: I1208 09:37:13.216455 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31f822a4-fa31-4cae-b24f-a1c1395caf05-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-97xln\" (UID: \"31f822a4-fa31-4cae-b24f-a1c1395caf05\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-97xln" Dec 08 09:37:13 crc kubenswrapper[4776]: I1208 09:37:13.227574 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgx9g\" (UniqueName: \"kubernetes.io/projected/31f822a4-fa31-4cae-b24f-a1c1395caf05-kube-api-access-xgx9g\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-97xln\" (UID: \"31f822a4-fa31-4cae-b24f-a1c1395caf05\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-97xln" Dec 08 09:37:13 crc kubenswrapper[4776]: I1208 09:37:13.342456 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-97xln" Dec 08 09:37:13 crc kubenswrapper[4776]: I1208 09:37:13.942352 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-97xln"] Dec 08 09:37:13 crc kubenswrapper[4776]: I1208 09:37:13.969429 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-97xln" event={"ID":"31f822a4-fa31-4cae-b24f-a1c1395caf05","Type":"ContainerStarted","Data":"ce955c6ecd1a128c4a544d286ddd71c37ff26a1609f2d9eefb4956099105ae5b"} Dec 08 09:37:14 crc kubenswrapper[4776]: I1208 09:37:14.981047 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-97xln" event={"ID":"31f822a4-fa31-4cae-b24f-a1c1395caf05","Type":"ContainerStarted","Data":"190f3772e04c568569be7f4d9c96cd6b4f12f6ca34eebca3d42d86ee170c4274"} Dec 08 09:37:15 crc kubenswrapper[4776]: I1208 09:37:15.012197 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-97xln" podStartSLOduration=2.6135797480000003 podStartE2EDuration="3.012158202s" podCreationTimestamp="2025-12-08 09:37:12 +0000 UTC" firstStartedPulling="2025-12-08 09:37:13.922537945 +0000 UTC m=+2310.185762977" lastFinishedPulling="2025-12-08 09:37:14.321116409 +0000 UTC m=+2310.584341431" observedRunningTime="2025-12-08 09:37:15.002088731 +0000 UTC m=+2311.265313773" watchObservedRunningTime="2025-12-08 09:37:15.012158202 +0000 UTC m=+2311.275383224" Dec 08 09:37:23 crc kubenswrapper[4776]: I1208 09:37:23.066092 4776 generic.go:334] "Generic (PLEG): container finished" podID="31f822a4-fa31-4cae-b24f-a1c1395caf05" containerID="190f3772e04c568569be7f4d9c96cd6b4f12f6ca34eebca3d42d86ee170c4274" exitCode=0 Dec 08 09:37:23 crc kubenswrapper[4776]: I1208 09:37:23.066181 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-97xln" event={"ID":"31f822a4-fa31-4cae-b24f-a1c1395caf05","Type":"ContainerDied","Data":"190f3772e04c568569be7f4d9c96cd6b4f12f6ca34eebca3d42d86ee170c4274"} Dec 08 09:37:24 crc kubenswrapper[4776]: I1208 09:37:24.350984 4776 scope.go:117] "RemoveContainer" containerID="eb1b0e925baf44c975c69a0034d44e251745d7626b9fb33d2f97e1c293dbd296" Dec 08 09:37:24 crc kubenswrapper[4776]: E1208 09:37:24.352311 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:37:25 crc kubenswrapper[4776]: I1208 09:37:25.408702 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-97xln" Dec 08 09:37:25 crc kubenswrapper[4776]: I1208 09:37:25.498796 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/31f822a4-fa31-4cae-b24f-a1c1395caf05-ssh-key\") pod \"31f822a4-fa31-4cae-b24f-a1c1395caf05\" (UID: \"31f822a4-fa31-4cae-b24f-a1c1395caf05\") " Dec 08 09:37:25 crc kubenswrapper[4776]: I1208 09:37:25.498984 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgx9g\" (UniqueName: \"kubernetes.io/projected/31f822a4-fa31-4cae-b24f-a1c1395caf05-kube-api-access-xgx9g\") pod \"31f822a4-fa31-4cae-b24f-a1c1395caf05\" (UID: \"31f822a4-fa31-4cae-b24f-a1c1395caf05\") " Dec 08 09:37:25 crc kubenswrapper[4776]: I1208 09:37:25.499149 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31f822a4-fa31-4cae-b24f-a1c1395caf05-inventory\") pod \"31f822a4-fa31-4cae-b24f-a1c1395caf05\" (UID: \"31f822a4-fa31-4cae-b24f-a1c1395caf05\") " Dec 08 09:37:25 crc kubenswrapper[4776]: I1208 09:37:25.512213 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31f822a4-fa31-4cae-b24f-a1c1395caf05-kube-api-access-xgx9g" (OuterVolumeSpecName: "kube-api-access-xgx9g") pod "31f822a4-fa31-4cae-b24f-a1c1395caf05" (UID: "31f822a4-fa31-4cae-b24f-a1c1395caf05"). InnerVolumeSpecName "kube-api-access-xgx9g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:37:25 crc kubenswrapper[4776]: I1208 09:37:25.535922 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31f822a4-fa31-4cae-b24f-a1c1395caf05-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "31f822a4-fa31-4cae-b24f-a1c1395caf05" (UID: "31f822a4-fa31-4cae-b24f-a1c1395caf05"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:37:25 crc kubenswrapper[4776]: I1208 09:37:25.536469 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31f822a4-fa31-4cae-b24f-a1c1395caf05-inventory" (OuterVolumeSpecName: "inventory") pod "31f822a4-fa31-4cae-b24f-a1c1395caf05" (UID: "31f822a4-fa31-4cae-b24f-a1c1395caf05"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:37:25 crc kubenswrapper[4776]: I1208 09:37:25.601960 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31f822a4-fa31-4cae-b24f-a1c1395caf05-inventory\") on node \"crc\" DevicePath \"\"" Dec 08 09:37:25 crc kubenswrapper[4776]: I1208 09:37:25.602003 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/31f822a4-fa31-4cae-b24f-a1c1395caf05-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 08 09:37:25 crc kubenswrapper[4776]: I1208 09:37:25.602018 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgx9g\" (UniqueName: \"kubernetes.io/projected/31f822a4-fa31-4cae-b24f-a1c1395caf05-kube-api-access-xgx9g\") on node \"crc\" DevicePath \"\"" Dec 08 09:37:26 crc kubenswrapper[4776]: I1208 09:37:26.094636 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-97xln" event={"ID":"31f822a4-fa31-4cae-b24f-a1c1395caf05","Type":"ContainerDied","Data":"ce955c6ecd1a128c4a544d286ddd71c37ff26a1609f2d9eefb4956099105ae5b"} Dec 08 09:37:26 crc kubenswrapper[4776]: I1208 09:37:26.094675 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce955c6ecd1a128c4a544d286ddd71c37ff26a1609f2d9eefb4956099105ae5b" Dec 08 09:37:26 crc kubenswrapper[4776]: I1208 09:37:26.094727 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-97xln" Dec 08 09:37:26 crc kubenswrapper[4776]: I1208 09:37:26.509994 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkk2n"] Dec 08 09:37:26 crc kubenswrapper[4776]: E1208 09:37:26.510771 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31f822a4-fa31-4cae-b24f-a1c1395caf05" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 08 09:37:26 crc kubenswrapper[4776]: I1208 09:37:26.510789 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="31f822a4-fa31-4cae-b24f-a1c1395caf05" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 08 09:37:26 crc kubenswrapper[4776]: I1208 09:37:26.511326 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="31f822a4-fa31-4cae-b24f-a1c1395caf05" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 08 09:37:26 crc kubenswrapper[4776]: I1208 09:37:26.512240 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkk2n" Dec 08 09:37:26 crc kubenswrapper[4776]: I1208 09:37:26.514497 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 08 09:37:26 crc kubenswrapper[4776]: I1208 09:37:26.514749 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 08 09:37:26 crc kubenswrapper[4776]: I1208 09:37:26.515543 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tm845" Dec 08 09:37:26 crc kubenswrapper[4776]: I1208 09:37:26.515777 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 08 09:37:26 crc kubenswrapper[4776]: I1208 09:37:26.530508 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkk2n"] Dec 08 09:37:26 crc kubenswrapper[4776]: I1208 09:37:26.630505 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7g5g\" (UniqueName: \"kubernetes.io/projected/26d6a987-fa87-4870-97f8-30aa5b38b753-kube-api-access-w7g5g\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mkk2n\" (UID: \"26d6a987-fa87-4870-97f8-30aa5b38b753\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkk2n" Dec 08 09:37:26 crc kubenswrapper[4776]: I1208 09:37:26.630559 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/26d6a987-fa87-4870-97f8-30aa5b38b753-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mkk2n\" (UID: \"26d6a987-fa87-4870-97f8-30aa5b38b753\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkk2n" Dec 08 09:37:26 crc kubenswrapper[4776]: I1208 09:37:26.631162 4776 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/26d6a987-fa87-4870-97f8-30aa5b38b753-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mkk2n\" (UID: \"26d6a987-fa87-4870-97f8-30aa5b38b753\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkk2n" Dec 08 09:37:26 crc kubenswrapper[4776]: I1208 09:37:26.734862 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/26d6a987-fa87-4870-97f8-30aa5b38b753-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mkk2n\" (UID: \"26d6a987-fa87-4870-97f8-30aa5b38b753\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkk2n" Dec 08 09:37:26 crc kubenswrapper[4776]: I1208 09:37:26.736344 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7g5g\" (UniqueName: \"kubernetes.io/projected/26d6a987-fa87-4870-97f8-30aa5b38b753-kube-api-access-w7g5g\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mkk2n\" (UID: \"26d6a987-fa87-4870-97f8-30aa5b38b753\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkk2n" Dec 08 09:37:26 crc kubenswrapper[4776]: I1208 09:37:26.736448 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/26d6a987-fa87-4870-97f8-30aa5b38b753-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mkk2n\" (UID: \"26d6a987-fa87-4870-97f8-30aa5b38b753\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkk2n" Dec 08 09:37:26 crc kubenswrapper[4776]: I1208 09:37:26.743679 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/26d6a987-fa87-4870-97f8-30aa5b38b753-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mkk2n\" (UID: 
\"26d6a987-fa87-4870-97f8-30aa5b38b753\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkk2n" Dec 08 09:37:26 crc kubenswrapper[4776]: I1208 09:37:26.755003 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/26d6a987-fa87-4870-97f8-30aa5b38b753-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mkk2n\" (UID: \"26d6a987-fa87-4870-97f8-30aa5b38b753\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkk2n" Dec 08 09:37:26 crc kubenswrapper[4776]: I1208 09:37:26.759194 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7g5g\" (UniqueName: \"kubernetes.io/projected/26d6a987-fa87-4870-97f8-30aa5b38b753-kube-api-access-w7g5g\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mkk2n\" (UID: \"26d6a987-fa87-4870-97f8-30aa5b38b753\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkk2n" Dec 08 09:37:26 crc kubenswrapper[4776]: I1208 09:37:26.847360 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkk2n" Dec 08 09:37:27 crc kubenswrapper[4776]: I1208 09:37:27.479366 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkk2n"] Dec 08 09:37:27 crc kubenswrapper[4776]: I1208 09:37:27.482847 4776 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 08 09:37:28 crc kubenswrapper[4776]: I1208 09:37:28.136338 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkk2n" event={"ID":"26d6a987-fa87-4870-97f8-30aa5b38b753","Type":"ContainerStarted","Data":"4f438485898ecd07b23668d07b0a7109dda703071764886371356b3bd498dd68"} Dec 08 09:37:29 crc kubenswrapper[4776]: I1208 09:37:29.147841 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkk2n" event={"ID":"26d6a987-fa87-4870-97f8-30aa5b38b753","Type":"ContainerStarted","Data":"be9b8dea444f2279d00725c05d93d5f4164bda4cad0daf32ac26f7514ef1a8be"} Dec 08 09:37:29 crc kubenswrapper[4776]: I1208 09:37:29.169757 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkk2n" podStartSLOduration=2.731054587 podStartE2EDuration="3.16973906s" podCreationTimestamp="2025-12-08 09:37:26 +0000 UTC" firstStartedPulling="2025-12-08 09:37:27.482662477 +0000 UTC m=+2323.745887489" lastFinishedPulling="2025-12-08 09:37:27.92134694 +0000 UTC m=+2324.184571962" observedRunningTime="2025-12-08 09:37:29.162962347 +0000 UTC m=+2325.426187389" watchObservedRunningTime="2025-12-08 09:37:29.16973906 +0000 UTC m=+2325.432964082" Dec 08 09:37:32 crc kubenswrapper[4776]: I1208 09:37:32.050074 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-l6w2n"] Dec 08 09:37:32 crc kubenswrapper[4776]: I1208 09:37:32.081982 4776 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-l6w2n"] Dec 08 09:37:32 crc kubenswrapper[4776]: I1208 09:37:32.355633 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd51a3a4-205b-4844-81db-439c7e1f0624" path="/var/lib/kubelet/pods/cd51a3a4-205b-4844-81db-439c7e1f0624/volumes" Dec 08 09:37:37 crc kubenswrapper[4776]: I1208 09:37:37.344739 4776 scope.go:117] "RemoveContainer" containerID="eb1b0e925baf44c975c69a0034d44e251745d7626b9fb33d2f97e1c293dbd296" Dec 08 09:37:37 crc kubenswrapper[4776]: E1208 09:37:37.345426 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:37:38 crc kubenswrapper[4776]: I1208 09:37:38.256916 4776 generic.go:334] "Generic (PLEG): container finished" podID="26d6a987-fa87-4870-97f8-30aa5b38b753" containerID="be9b8dea444f2279d00725c05d93d5f4164bda4cad0daf32ac26f7514ef1a8be" exitCode=0 Dec 08 09:37:38 crc kubenswrapper[4776]: I1208 09:37:38.256998 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkk2n" event={"ID":"26d6a987-fa87-4870-97f8-30aa5b38b753","Type":"ContainerDied","Data":"be9b8dea444f2279d00725c05d93d5f4164bda4cad0daf32ac26f7514ef1a8be"} Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:39.775348 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkk2n" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:39.871261 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7g5g\" (UniqueName: \"kubernetes.io/projected/26d6a987-fa87-4870-97f8-30aa5b38b753-kube-api-access-w7g5g\") pod \"26d6a987-fa87-4870-97f8-30aa5b38b753\" (UID: \"26d6a987-fa87-4870-97f8-30aa5b38b753\") " Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:39.871635 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/26d6a987-fa87-4870-97f8-30aa5b38b753-ssh-key\") pod \"26d6a987-fa87-4870-97f8-30aa5b38b753\" (UID: \"26d6a987-fa87-4870-97f8-30aa5b38b753\") " Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:39.871793 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/26d6a987-fa87-4870-97f8-30aa5b38b753-inventory\") pod \"26d6a987-fa87-4870-97f8-30aa5b38b753\" (UID: \"26d6a987-fa87-4870-97f8-30aa5b38b753\") " Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:39.877279 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26d6a987-fa87-4870-97f8-30aa5b38b753-kube-api-access-w7g5g" (OuterVolumeSpecName: "kube-api-access-w7g5g") pod "26d6a987-fa87-4870-97f8-30aa5b38b753" (UID: "26d6a987-fa87-4870-97f8-30aa5b38b753"). InnerVolumeSpecName "kube-api-access-w7g5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:39.913356 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26d6a987-fa87-4870-97f8-30aa5b38b753-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "26d6a987-fa87-4870-97f8-30aa5b38b753" (UID: "26d6a987-fa87-4870-97f8-30aa5b38b753"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:39.917977 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26d6a987-fa87-4870-97f8-30aa5b38b753-inventory" (OuterVolumeSpecName: "inventory") pod "26d6a987-fa87-4870-97f8-30aa5b38b753" (UID: "26d6a987-fa87-4870-97f8-30aa5b38b753"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:39.976919 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/26d6a987-fa87-4870-97f8-30aa5b38b753-inventory\") on node \"crc\" DevicePath \"\"" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:39.976961 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7g5g\" (UniqueName: \"kubernetes.io/projected/26d6a987-fa87-4870-97f8-30aa5b38b753-kube-api-access-w7g5g\") on node \"crc\" DevicePath \"\"" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:39.976976 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/26d6a987-fa87-4870-97f8-30aa5b38b753-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.279825 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkk2n" event={"ID":"26d6a987-fa87-4870-97f8-30aa5b38b753","Type":"ContainerDied","Data":"4f438485898ecd07b23668d07b0a7109dda703071764886371356b3bd498dd68"} Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.279860 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f438485898ecd07b23668d07b0a7109dda703071764886371356b3bd498dd68" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.279927 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkk2n" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.362675 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml"] Dec 08 09:37:40 crc kubenswrapper[4776]: E1208 09:37:40.363294 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26d6a987-fa87-4870-97f8-30aa5b38b753" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.363316 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="26d6a987-fa87-4870-97f8-30aa5b38b753" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.363642 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="26d6a987-fa87-4870-97f8-30aa5b38b753" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.364691 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.369567 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.369649 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.369756 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.369773 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.369793 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.369810 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.369975 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tm845" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.370130 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.370267 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.382028 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml"] Dec 08 09:37:40 crc 
kubenswrapper[4776]: I1208 09:37:40.511428 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.511493 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.511560 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.511589 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.511641 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.511786 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.511919 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7527bd54-54ba-42e5-9ec0-7037536864b9-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.512119 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7527bd54-54ba-42e5-9ec0-7037536864b9-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.512184 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7527bd54-54ba-42e5-9ec0-7037536864b9-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.512213 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7527bd54-54ba-42e5-9ec0-7037536864b9-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.512256 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7527bd54-54ba-42e5-9ec0-7037536864b9-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.512306 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" Dec 08 09:37:40 crc 
kubenswrapper[4776]: I1208 09:37:40.512329 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw6b9\" (UniqueName: \"kubernetes.io/projected/7527bd54-54ba-42e5-9ec0-7037536864b9-kube-api-access-pw6b9\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.512464 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.512509 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.512546 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" Dec 08 09:37:40 crc 
kubenswrapper[4776]: I1208 09:37:40.614452 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.614528 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.614564 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7527bd54-54ba-42e5-9ec0-7037536864b9-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.614639 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7527bd54-54ba-42e5-9ec0-7037536864b9-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.614662 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7527bd54-54ba-42e5-9ec0-7037536864b9-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.614679 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7527bd54-54ba-42e5-9ec0-7037536864b9-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.614699 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7527bd54-54ba-42e5-9ec0-7037536864b9-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.614722 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.614738 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-pw6b9\" (UniqueName: \"kubernetes.io/projected/7527bd54-54ba-42e5-9ec0-7037536864b9-kube-api-access-pw6b9\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.614788 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.614808 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.614826 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.614871 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.614891 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.614928 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.614945 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.627812 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-telemetry-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.627929 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7527bd54-54ba-42e5-9ec0-7037536864b9-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.628301 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7527bd54-54ba-42e5-9ec0-7037536864b9-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.628508 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.637431 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-neutron-metadata-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.638146 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7527bd54-54ba-42e5-9ec0-7037536864b9-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.640021 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.644894 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7527bd54-54ba-42e5-9ec0-7037536864b9-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.645322 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.646016 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7527bd54-54ba-42e5-9ec0-7037536864b9-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.649773 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.649916 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.650279 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 
09:37:40.650562 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.664007 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.664643 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw6b9\" (UniqueName: \"kubernetes.io/projected/7527bd54-54ba-42e5-9ec0-7037536864b9-kube-api-access-pw6b9\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" Dec 08 09:37:40 crc kubenswrapper[4776]: I1208 09:37:40.684700 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" Dec 08 09:37:41 crc kubenswrapper[4776]: I1208 09:37:41.229987 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml"] Dec 08 09:37:41 crc kubenswrapper[4776]: I1208 09:37:41.293245 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" event={"ID":"7527bd54-54ba-42e5-9ec0-7037536864b9","Type":"ContainerStarted","Data":"96bc34dd039e33113c54664c7d64061e3055f4fd60b60510df0711f1e9e43b44"} Dec 08 09:37:42 crc kubenswrapper[4776]: I1208 09:37:42.308854 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" event={"ID":"7527bd54-54ba-42e5-9ec0-7037536864b9","Type":"ContainerStarted","Data":"ea81c111cf3428fafc1a2ba8406fa6f8c52f5936e32420a1c6b962db3143b311"} Dec 08 09:37:42 crc kubenswrapper[4776]: I1208 09:37:42.341059 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" podStartSLOduration=1.936928846 podStartE2EDuration="2.3410413s" podCreationTimestamp="2025-12-08 09:37:40 +0000 UTC" firstStartedPulling="2025-12-08 09:37:41.231803815 +0000 UTC m=+2337.495028837" lastFinishedPulling="2025-12-08 09:37:41.635916259 +0000 UTC m=+2337.899141291" observedRunningTime="2025-12-08 09:37:42.3320768 +0000 UTC m=+2338.595301822" watchObservedRunningTime="2025-12-08 09:37:42.3410413 +0000 UTC m=+2338.604266322" Dec 08 09:37:49 crc kubenswrapper[4776]: I1208 09:37:49.343403 4776 scope.go:117] "RemoveContainer" containerID="eb1b0e925baf44c975c69a0034d44e251745d7626b9fb33d2f97e1c293dbd296" Dec 08 09:37:49 crc kubenswrapper[4776]: E1208 09:37:49.344241 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:38:04 crc kubenswrapper[4776]: I1208 09:38:04.351568 4776 scope.go:117] "RemoveContainer" containerID="eb1b0e925baf44c975c69a0034d44e251745d7626b9fb33d2f97e1c293dbd296" Dec 08 09:38:04 crc kubenswrapper[4776]: E1208 09:38:04.352312 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:38:08 crc kubenswrapper[4776]: I1208 09:38:08.040647 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-x7p68"] Dec 08 09:38:08 crc kubenswrapper[4776]: I1208 09:38:08.050949 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-x7p68"] Dec 08 09:38:08 crc kubenswrapper[4776]: I1208 09:38:08.360688 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec813763-b43e-4a53-a048-615c313d130a" path="/var/lib/kubelet/pods/ec813763-b43e-4a53-a048-615c313d130a/volumes" Dec 08 09:38:15 crc kubenswrapper[4776]: I1208 09:38:15.343768 4776 scope.go:117] "RemoveContainer" containerID="eb1b0e925baf44c975c69a0034d44e251745d7626b9fb33d2f97e1c293dbd296" Dec 08 09:38:15 crc kubenswrapper[4776]: E1208 09:38:15.344544 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:38:17 crc kubenswrapper[4776]: I1208 09:38:17.901285 4776 scope.go:117] "RemoveContainer" containerID="d85a5b1b9612ad13bce283313fe73d9e689f5e6f3d170da35fef9e8755b654bf" Dec 08 09:38:17 crc kubenswrapper[4776]: I1208 09:38:17.934301 4776 scope.go:117] "RemoveContainer" containerID="5b68070f1ed673b2e4199f75c4fb22993b7055298e67af89b496a5083846f009" Dec 08 09:38:24 crc kubenswrapper[4776]: I1208 09:38:24.756824 4776 generic.go:334] "Generic (PLEG): container finished" podID="7527bd54-54ba-42e5-9ec0-7037536864b9" containerID="ea81c111cf3428fafc1a2ba8406fa6f8c52f5936e32420a1c6b962db3143b311" exitCode=0 Dec 08 09:38:24 crc kubenswrapper[4776]: I1208 09:38:24.756913 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" event={"ID":"7527bd54-54ba-42e5-9ec0-7037536864b9","Type":"ContainerDied","Data":"ea81c111cf3428fafc1a2ba8406fa6f8c52f5936e32420a1c6b962db3143b311"} Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.195292 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.356026 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-libvirt-combined-ca-bundle\") pod \"7527bd54-54ba-42e5-9ec0-7037536864b9\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.356403 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-telemetry-power-monitoring-combined-ca-bundle\") pod \"7527bd54-54ba-42e5-9ec0-7037536864b9\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.356568 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-repo-setup-combined-ca-bundle\") pod \"7527bd54-54ba-42e5-9ec0-7037536864b9\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.356687 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-nova-combined-ca-bundle\") pod \"7527bd54-54ba-42e5-9ec0-7037536864b9\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.356791 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7527bd54-54ba-42e5-9ec0-7037536864b9-openstack-edpm-ipam-ovn-default-certs-0\") pod \"7527bd54-54ba-42e5-9ec0-7037536864b9\" (UID: 
\"7527bd54-54ba-42e5-9ec0-7037536864b9\") " Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.356911 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-neutron-metadata-combined-ca-bundle\") pod \"7527bd54-54ba-42e5-9ec0-7037536864b9\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.356982 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-inventory\") pod \"7527bd54-54ba-42e5-9ec0-7037536864b9\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.357042 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-ssh-key\") pod \"7527bd54-54ba-42e5-9ec0-7037536864b9\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.357127 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-telemetry-combined-ca-bundle\") pod \"7527bd54-54ba-42e5-9ec0-7037536864b9\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.357246 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-ovn-combined-ca-bundle\") pod \"7527bd54-54ba-42e5-9ec0-7037536864b9\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.357323 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7527bd54-54ba-42e5-9ec0-7037536864b9-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"7527bd54-54ba-42e5-9ec0-7037536864b9\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.357420 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-bootstrap-combined-ca-bundle\") pod \"7527bd54-54ba-42e5-9ec0-7037536864b9\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.357529 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7527bd54-54ba-42e5-9ec0-7037536864b9-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"7527bd54-54ba-42e5-9ec0-7037536864b9\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.357646 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pw6b9\" (UniqueName: \"kubernetes.io/projected/7527bd54-54ba-42e5-9ec0-7037536864b9-kube-api-access-pw6b9\") pod \"7527bd54-54ba-42e5-9ec0-7037536864b9\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.357757 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7527bd54-54ba-42e5-9ec0-7037536864b9-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"7527bd54-54ba-42e5-9ec0-7037536864b9\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.357833 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7527bd54-54ba-42e5-9ec0-7037536864b9-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"7527bd54-54ba-42e5-9ec0-7037536864b9\" (UID: \"7527bd54-54ba-42e5-9ec0-7037536864b9\") " Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.363992 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7527bd54-54ba-42e5-9ec0-7037536864b9-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "7527bd54-54ba-42e5-9ec0-7037536864b9" (UID: "7527bd54-54ba-42e5-9ec0-7037536864b9"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.364327 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "7527bd54-54ba-42e5-9ec0-7037536864b9" (UID: "7527bd54-54ba-42e5-9ec0-7037536864b9"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.364509 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "7527bd54-54ba-42e5-9ec0-7037536864b9" (UID: "7527bd54-54ba-42e5-9ec0-7037536864b9"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.364889 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "7527bd54-54ba-42e5-9ec0-7037536864b9" (UID: "7527bd54-54ba-42e5-9ec0-7037536864b9"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.365212 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "7527bd54-54ba-42e5-9ec0-7037536864b9" (UID: "7527bd54-54ba-42e5-9ec0-7037536864b9"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.366019 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "7527bd54-54ba-42e5-9ec0-7037536864b9" (UID: "7527bd54-54ba-42e5-9ec0-7037536864b9"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.366046 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "7527bd54-54ba-42e5-9ec0-7037536864b9" (UID: "7527bd54-54ba-42e5-9ec0-7037536864b9"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.366426 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7527bd54-54ba-42e5-9ec0-7037536864b9-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0") pod "7527bd54-54ba-42e5-9ec0-7037536864b9" (UID: "7527bd54-54ba-42e5-9ec0-7037536864b9"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.367468 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7527bd54-54ba-42e5-9ec0-7037536864b9-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "7527bd54-54ba-42e5-9ec0-7037536864b9" (UID: "7527bd54-54ba-42e5-9ec0-7037536864b9"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.367872 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-telemetry-power-monitoring-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "7527bd54-54ba-42e5-9ec0-7037536864b9" (UID: "7527bd54-54ba-42e5-9ec0-7037536864b9"). InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.367986 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7527bd54-54ba-42e5-9ec0-7037536864b9-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "7527bd54-54ba-42e5-9ec0-7037536864b9" (UID: "7527bd54-54ba-42e5-9ec0-7037536864b9"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.368747 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7527bd54-54ba-42e5-9ec0-7037536864b9-kube-api-access-pw6b9" (OuterVolumeSpecName: "kube-api-access-pw6b9") pod "7527bd54-54ba-42e5-9ec0-7037536864b9" (UID: "7527bd54-54ba-42e5-9ec0-7037536864b9"). InnerVolumeSpecName "kube-api-access-pw6b9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.369145 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7527bd54-54ba-42e5-9ec0-7037536864b9-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "7527bd54-54ba-42e5-9ec0-7037536864b9" (UID: "7527bd54-54ba-42e5-9ec0-7037536864b9"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.369510 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "7527bd54-54ba-42e5-9ec0-7037536864b9" (UID: "7527bd54-54ba-42e5-9ec0-7037536864b9"). 
InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.402116 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-inventory" (OuterVolumeSpecName: "inventory") pod "7527bd54-54ba-42e5-9ec0-7037536864b9" (UID: "7527bd54-54ba-42e5-9ec0-7037536864b9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.405025 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7527bd54-54ba-42e5-9ec0-7037536864b9" (UID: "7527bd54-54ba-42e5-9ec0-7037536864b9"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.460398 4776 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7527bd54-54ba-42e5-9ec0-7037536864b9-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.460428 4776 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.460439 4776 reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.460449 4776 
reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.460458 4776 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.460467 4776 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7527bd54-54ba-42e5-9ec0-7037536864b9-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.460475 4776 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.460484 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-inventory\") on node \"crc\" DevicePath \"\"" Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.460491 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.460499 4776 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.460509 
4776 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.460517 4776 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7527bd54-54ba-42e5-9ec0-7037536864b9-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.460526 4776 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7527bd54-54ba-42e5-9ec0-7037536864b9-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.460535 4776 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7527bd54-54ba-42e5-9ec0-7037536864b9-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.460546 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pw6b9\" (UniqueName: \"kubernetes.io/projected/7527bd54-54ba-42e5-9ec0-7037536864b9-kube-api-access-pw6b9\") on node \"crc\" DevicePath \"\"" Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.460554 4776 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7527bd54-54ba-42e5-9ec0-7037536864b9-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.777128 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" 
event={"ID":"7527bd54-54ba-42e5-9ec0-7037536864b9","Type":"ContainerDied","Data":"96bc34dd039e33113c54664c7d64061e3055f4fd60b60510df0711f1e9e43b44"} Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.777193 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96bc34dd039e33113c54664c7d64061e3055f4fd60b60510df0711f1e9e43b44" Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.777238 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml" Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.879634 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-k4c4r"] Dec 08 09:38:26 crc kubenswrapper[4776]: E1208 09:38:26.880370 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7527bd54-54ba-42e5-9ec0-7037536864b9" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.880393 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="7527bd54-54ba-42e5-9ec0-7037536864b9" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.880674 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="7527bd54-54ba-42e5-9ec0-7037536864b9" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.881553 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-k4c4r" Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.883710 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.883854 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tm845" Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.884024 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.884376 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.891158 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 08 09:38:26 crc kubenswrapper[4776]: I1208 09:38:26.894485 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-k4c4r"] Dec 08 09:38:27 crc kubenswrapper[4776]: I1208 09:38:27.074395 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abe6fd93-f916-47f2-854e-fa4d908fa9ad-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-k4c4r\" (UID: \"abe6fd93-f916-47f2-854e-fa4d908fa9ad\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-k4c4r" Dec 08 09:38:27 crc kubenswrapper[4776]: I1208 09:38:27.074488 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/abe6fd93-f916-47f2-854e-fa4d908fa9ad-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-k4c4r\" (UID: 
\"abe6fd93-f916-47f2-854e-fa4d908fa9ad\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-k4c4r" Dec 08 09:38:27 crc kubenswrapper[4776]: I1208 09:38:27.074541 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86v24\" (UniqueName: \"kubernetes.io/projected/abe6fd93-f916-47f2-854e-fa4d908fa9ad-kube-api-access-86v24\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-k4c4r\" (UID: \"abe6fd93-f916-47f2-854e-fa4d908fa9ad\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-k4c4r" Dec 08 09:38:27 crc kubenswrapper[4776]: I1208 09:38:27.074620 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abe6fd93-f916-47f2-854e-fa4d908fa9ad-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-k4c4r\" (UID: \"abe6fd93-f916-47f2-854e-fa4d908fa9ad\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-k4c4r" Dec 08 09:38:27 crc kubenswrapper[4776]: I1208 09:38:27.074748 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/abe6fd93-f916-47f2-854e-fa4d908fa9ad-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-k4c4r\" (UID: \"abe6fd93-f916-47f2-854e-fa4d908fa9ad\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-k4c4r" Dec 08 09:38:27 crc kubenswrapper[4776]: I1208 09:38:27.177116 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/abe6fd93-f916-47f2-854e-fa4d908fa9ad-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-k4c4r\" (UID: \"abe6fd93-f916-47f2-854e-fa4d908fa9ad\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-k4c4r" Dec 08 09:38:27 crc kubenswrapper[4776]: I1208 09:38:27.177424 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-86v24\" (UniqueName: \"kubernetes.io/projected/abe6fd93-f916-47f2-854e-fa4d908fa9ad-kube-api-access-86v24\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-k4c4r\" (UID: \"abe6fd93-f916-47f2-854e-fa4d908fa9ad\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-k4c4r" Dec 08 09:38:27 crc kubenswrapper[4776]: I1208 09:38:27.177590 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abe6fd93-f916-47f2-854e-fa4d908fa9ad-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-k4c4r\" (UID: \"abe6fd93-f916-47f2-854e-fa4d908fa9ad\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-k4c4r" Dec 08 09:38:27 crc kubenswrapper[4776]: I1208 09:38:27.177717 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/abe6fd93-f916-47f2-854e-fa4d908fa9ad-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-k4c4r\" (UID: \"abe6fd93-f916-47f2-854e-fa4d908fa9ad\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-k4c4r" Dec 08 09:38:27 crc kubenswrapper[4776]: I1208 09:38:27.177780 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abe6fd93-f916-47f2-854e-fa4d908fa9ad-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-k4c4r\" (UID: \"abe6fd93-f916-47f2-854e-fa4d908fa9ad\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-k4c4r" Dec 08 09:38:27 crc kubenswrapper[4776]: I1208 09:38:27.178192 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/abe6fd93-f916-47f2-854e-fa4d908fa9ad-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-k4c4r\" (UID: \"abe6fd93-f916-47f2-854e-fa4d908fa9ad\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-k4c4r" Dec 08 09:38:27 crc 
kubenswrapper[4776]: I1208 09:38:27.182008 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/abe6fd93-f916-47f2-854e-fa4d908fa9ad-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-k4c4r\" (UID: \"abe6fd93-f916-47f2-854e-fa4d908fa9ad\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-k4c4r" Dec 08 09:38:27 crc kubenswrapper[4776]: I1208 09:38:27.182163 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abe6fd93-f916-47f2-854e-fa4d908fa9ad-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-k4c4r\" (UID: \"abe6fd93-f916-47f2-854e-fa4d908fa9ad\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-k4c4r" Dec 08 09:38:27 crc kubenswrapper[4776]: I1208 09:38:27.182620 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abe6fd93-f916-47f2-854e-fa4d908fa9ad-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-k4c4r\" (UID: \"abe6fd93-f916-47f2-854e-fa4d908fa9ad\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-k4c4r" Dec 08 09:38:27 crc kubenswrapper[4776]: I1208 09:38:27.194985 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86v24\" (UniqueName: \"kubernetes.io/projected/abe6fd93-f916-47f2-854e-fa4d908fa9ad-kube-api-access-86v24\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-k4c4r\" (UID: \"abe6fd93-f916-47f2-854e-fa4d908fa9ad\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-k4c4r" Dec 08 09:38:27 crc kubenswrapper[4776]: I1208 09:38:27.199141 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-k4c4r" Dec 08 09:38:27 crc kubenswrapper[4776]: I1208 09:38:27.899091 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-k4c4r"] Dec 08 09:38:28 crc kubenswrapper[4776]: I1208 09:38:28.801679 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-k4c4r" event={"ID":"abe6fd93-f916-47f2-854e-fa4d908fa9ad","Type":"ContainerStarted","Data":"c9b6fe3b61bf58d75db33fc83334602c65f0dd91aa96f3d873d77e975bcc654d"} Dec 08 09:38:28 crc kubenswrapper[4776]: I1208 09:38:28.803462 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-k4c4r" event={"ID":"abe6fd93-f916-47f2-854e-fa4d908fa9ad","Type":"ContainerStarted","Data":"70feb44c96f7a6cecde0864a595bb96f500a0292527fa293fa80f578b7591571"} Dec 08 09:38:28 crc kubenswrapper[4776]: I1208 09:38:28.832421 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-k4c4r" podStartSLOduration=2.29350091 podStartE2EDuration="2.83240266s" podCreationTimestamp="2025-12-08 09:38:26 +0000 UTC" firstStartedPulling="2025-12-08 09:38:27.89840269 +0000 UTC m=+2384.161627712" lastFinishedPulling="2025-12-08 09:38:28.43730444 +0000 UTC m=+2384.700529462" observedRunningTime="2025-12-08 09:38:28.826685736 +0000 UTC m=+2385.089910758" watchObservedRunningTime="2025-12-08 09:38:28.83240266 +0000 UTC m=+2385.095627682" Dec 08 09:38:30 crc kubenswrapper[4776]: I1208 09:38:30.344454 4776 scope.go:117] "RemoveContainer" containerID="eb1b0e925baf44c975c69a0034d44e251745d7626b9fb33d2f97e1c293dbd296" Dec 08 09:38:30 crc kubenswrapper[4776]: E1208 09:38:30.345053 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:38:42 crc kubenswrapper[4776]: I1208 09:38:42.344955 4776 scope.go:117] "RemoveContainer" containerID="eb1b0e925baf44c975c69a0034d44e251745d7626b9fb33d2f97e1c293dbd296" Dec 08 09:38:42 crc kubenswrapper[4776]: E1208 09:38:42.345601 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:38:53 crc kubenswrapper[4776]: I1208 09:38:53.344476 4776 scope.go:117] "RemoveContainer" containerID="eb1b0e925baf44c975c69a0034d44e251745d7626b9fb33d2f97e1c293dbd296" Dec 08 09:38:53 crc kubenswrapper[4776]: E1208 09:38:53.345320 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:39:07 crc kubenswrapper[4776]: I1208 09:39:07.344816 4776 scope.go:117] "RemoveContainer" containerID="eb1b0e925baf44c975c69a0034d44e251745d7626b9fb33d2f97e1c293dbd296" Dec 08 09:39:07 crc kubenswrapper[4776]: E1208 09:39:07.347001 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:39:22 crc kubenswrapper[4776]: I1208 09:39:22.345419 4776 scope.go:117] "RemoveContainer" containerID="eb1b0e925baf44c975c69a0034d44e251745d7626b9fb33d2f97e1c293dbd296" Dec 08 09:39:22 crc kubenswrapper[4776]: E1208 09:39:22.346561 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:39:29 crc kubenswrapper[4776]: I1208 09:39:29.441423 4776 generic.go:334] "Generic (PLEG): container finished" podID="abe6fd93-f916-47f2-854e-fa4d908fa9ad" containerID="c9b6fe3b61bf58d75db33fc83334602c65f0dd91aa96f3d873d77e975bcc654d" exitCode=0 Dec 08 09:39:29 crc kubenswrapper[4776]: I1208 09:39:29.441542 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-k4c4r" event={"ID":"abe6fd93-f916-47f2-854e-fa4d908fa9ad","Type":"ContainerDied","Data":"c9b6fe3b61bf58d75db33fc83334602c65f0dd91aa96f3d873d77e975bcc654d"} Dec 08 09:39:30 crc kubenswrapper[4776]: I1208 09:39:30.945960 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-k4c4r" Dec 08 09:39:31 crc kubenswrapper[4776]: I1208 09:39:31.084188 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/abe6fd93-f916-47f2-854e-fa4d908fa9ad-ovncontroller-config-0\") pod \"abe6fd93-f916-47f2-854e-fa4d908fa9ad\" (UID: \"abe6fd93-f916-47f2-854e-fa4d908fa9ad\") " Dec 08 09:39:31 crc kubenswrapper[4776]: I1208 09:39:31.084320 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86v24\" (UniqueName: \"kubernetes.io/projected/abe6fd93-f916-47f2-854e-fa4d908fa9ad-kube-api-access-86v24\") pod \"abe6fd93-f916-47f2-854e-fa4d908fa9ad\" (UID: \"abe6fd93-f916-47f2-854e-fa4d908fa9ad\") " Dec 08 09:39:31 crc kubenswrapper[4776]: I1208 09:39:31.084444 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abe6fd93-f916-47f2-854e-fa4d908fa9ad-inventory\") pod \"abe6fd93-f916-47f2-854e-fa4d908fa9ad\" (UID: \"abe6fd93-f916-47f2-854e-fa4d908fa9ad\") " Dec 08 09:39:31 crc kubenswrapper[4776]: I1208 09:39:31.084601 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/abe6fd93-f916-47f2-854e-fa4d908fa9ad-ssh-key\") pod \"abe6fd93-f916-47f2-854e-fa4d908fa9ad\" (UID: \"abe6fd93-f916-47f2-854e-fa4d908fa9ad\") " Dec 08 09:39:31 crc kubenswrapper[4776]: I1208 09:39:31.084675 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abe6fd93-f916-47f2-854e-fa4d908fa9ad-ovn-combined-ca-bundle\") pod \"abe6fd93-f916-47f2-854e-fa4d908fa9ad\" (UID: \"abe6fd93-f916-47f2-854e-fa4d908fa9ad\") " Dec 08 09:39:31 crc kubenswrapper[4776]: I1208 09:39:31.090039 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/projected/abe6fd93-f916-47f2-854e-fa4d908fa9ad-kube-api-access-86v24" (OuterVolumeSpecName: "kube-api-access-86v24") pod "abe6fd93-f916-47f2-854e-fa4d908fa9ad" (UID: "abe6fd93-f916-47f2-854e-fa4d908fa9ad"). InnerVolumeSpecName "kube-api-access-86v24". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:39:31 crc kubenswrapper[4776]: I1208 09:39:31.090689 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abe6fd93-f916-47f2-854e-fa4d908fa9ad-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "abe6fd93-f916-47f2-854e-fa4d908fa9ad" (UID: "abe6fd93-f916-47f2-854e-fa4d908fa9ad"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:39:31 crc kubenswrapper[4776]: I1208 09:39:31.115670 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abe6fd93-f916-47f2-854e-fa4d908fa9ad-inventory" (OuterVolumeSpecName: "inventory") pod "abe6fd93-f916-47f2-854e-fa4d908fa9ad" (UID: "abe6fd93-f916-47f2-854e-fa4d908fa9ad"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:39:31 crc kubenswrapper[4776]: I1208 09:39:31.122721 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abe6fd93-f916-47f2-854e-fa4d908fa9ad-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "abe6fd93-f916-47f2-854e-fa4d908fa9ad" (UID: "abe6fd93-f916-47f2-854e-fa4d908fa9ad"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:39:31 crc kubenswrapper[4776]: I1208 09:39:31.125241 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abe6fd93-f916-47f2-854e-fa4d908fa9ad-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "abe6fd93-f916-47f2-854e-fa4d908fa9ad" (UID: "abe6fd93-f916-47f2-854e-fa4d908fa9ad"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:39:31 crc kubenswrapper[4776]: I1208 09:39:31.188544 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abe6fd93-f916-47f2-854e-fa4d908fa9ad-inventory\") on node \"crc\" DevicePath \"\"" Dec 08 09:39:31 crc kubenswrapper[4776]: I1208 09:39:31.188907 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/abe6fd93-f916-47f2-854e-fa4d908fa9ad-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 08 09:39:31 crc kubenswrapper[4776]: I1208 09:39:31.188920 4776 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abe6fd93-f916-47f2-854e-fa4d908fa9ad-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:39:31 crc kubenswrapper[4776]: I1208 09:39:31.188940 4776 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/abe6fd93-f916-47f2-854e-fa4d908fa9ad-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 08 09:39:31 crc kubenswrapper[4776]: I1208 09:39:31.188951 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86v24\" (UniqueName: \"kubernetes.io/projected/abe6fd93-f916-47f2-854e-fa4d908fa9ad-kube-api-access-86v24\") on node \"crc\" DevicePath \"\"" Dec 08 09:39:31 crc kubenswrapper[4776]: I1208 09:39:31.462485 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-k4c4r" event={"ID":"abe6fd93-f916-47f2-854e-fa4d908fa9ad","Type":"ContainerDied","Data":"70feb44c96f7a6cecde0864a595bb96f500a0292527fa293fa80f578b7591571"} Dec 08 09:39:31 crc kubenswrapper[4776]: I1208 09:39:31.462528 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70feb44c96f7a6cecde0864a595bb96f500a0292527fa293fa80f578b7591571" Dec 08 09:39:31 crc kubenswrapper[4776]: I1208 09:39:31.462595 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-k4c4r" Dec 08 09:39:31 crc kubenswrapper[4776]: I1208 09:39:31.557260 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d54mt"] Dec 08 09:39:31 crc kubenswrapper[4776]: E1208 09:39:31.557873 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abe6fd93-f916-47f2-854e-fa4d908fa9ad" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 08 09:39:31 crc kubenswrapper[4776]: I1208 09:39:31.557899 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="abe6fd93-f916-47f2-854e-fa4d908fa9ad" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 08 09:39:31 crc kubenswrapper[4776]: I1208 09:39:31.558281 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="abe6fd93-f916-47f2-854e-fa4d908fa9ad" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 08 09:39:31 crc kubenswrapper[4776]: I1208 09:39:31.559380 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d54mt" Dec 08 09:39:31 crc kubenswrapper[4776]: I1208 09:39:31.562811 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 08 09:39:31 crc kubenswrapper[4776]: I1208 09:39:31.563124 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 08 09:39:31 crc kubenswrapper[4776]: I1208 09:39:31.563325 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 08 09:39:31 crc kubenswrapper[4776]: I1208 09:39:31.563633 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 08 09:39:31 crc kubenswrapper[4776]: I1208 09:39:31.563783 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 08 09:39:31 crc kubenswrapper[4776]: I1208 09:39:31.564540 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tm845" Dec 08 09:39:31 crc kubenswrapper[4776]: I1208 09:39:31.575756 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d54mt"] Dec 08 09:39:31 crc kubenswrapper[4776]: I1208 09:39:31.699352 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/30f7ff02-8887-44e7-a223-335cd93255ef-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d54mt\" (UID: \"30f7ff02-8887-44e7-a223-335cd93255ef\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d54mt" Dec 08 09:39:31 crc kubenswrapper[4776]: I1208 09:39:31.699409 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f7ff02-8887-44e7-a223-335cd93255ef-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d54mt\" (UID: \"30f7ff02-8887-44e7-a223-335cd93255ef\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d54mt" Dec 08 09:39:31 crc kubenswrapper[4776]: I1208 09:39:31.699439 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/30f7ff02-8887-44e7-a223-335cd93255ef-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d54mt\" (UID: \"30f7ff02-8887-44e7-a223-335cd93255ef\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d54mt" Dec 08 09:39:31 crc kubenswrapper[4776]: I1208 09:39:31.699465 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/30f7ff02-8887-44e7-a223-335cd93255ef-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d54mt\" (UID: \"30f7ff02-8887-44e7-a223-335cd93255ef\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d54mt" Dec 08 09:39:31 crc kubenswrapper[4776]: I1208 09:39:31.699484 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30f7ff02-8887-44e7-a223-335cd93255ef-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d54mt\" (UID: \"30f7ff02-8887-44e7-a223-335cd93255ef\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d54mt" Dec 08 09:39:31 crc kubenswrapper[4776]: I1208 09:39:31.699524 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-dnzbh\" (UniqueName: \"kubernetes.io/projected/30f7ff02-8887-44e7-a223-335cd93255ef-kube-api-access-dnzbh\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d54mt\" (UID: \"30f7ff02-8887-44e7-a223-335cd93255ef\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d54mt" Dec 08 09:39:31 crc kubenswrapper[4776]: I1208 09:39:31.801438 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/30f7ff02-8887-44e7-a223-335cd93255ef-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d54mt\" (UID: \"30f7ff02-8887-44e7-a223-335cd93255ef\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d54mt" Dec 08 09:39:31 crc kubenswrapper[4776]: I1208 09:39:31.801693 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f7ff02-8887-44e7-a223-335cd93255ef-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d54mt\" (UID: \"30f7ff02-8887-44e7-a223-335cd93255ef\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d54mt" Dec 08 09:39:31 crc kubenswrapper[4776]: I1208 09:39:31.801810 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/30f7ff02-8887-44e7-a223-335cd93255ef-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d54mt\" (UID: \"30f7ff02-8887-44e7-a223-335cd93255ef\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d54mt" Dec 08 09:39:31 crc kubenswrapper[4776]: I1208 09:39:31.801983 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/30f7ff02-8887-44e7-a223-335cd93255ef-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d54mt\" (UID: \"30f7ff02-8887-44e7-a223-335cd93255ef\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d54mt" Dec 08 09:39:31 crc kubenswrapper[4776]: I1208 09:39:31.802459 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30f7ff02-8887-44e7-a223-335cd93255ef-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d54mt\" (UID: \"30f7ff02-8887-44e7-a223-335cd93255ef\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d54mt" Dec 08 09:39:31 crc kubenswrapper[4776]: I1208 09:39:31.802875 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnzbh\" (UniqueName: \"kubernetes.io/projected/30f7ff02-8887-44e7-a223-335cd93255ef-kube-api-access-dnzbh\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d54mt\" (UID: \"30f7ff02-8887-44e7-a223-335cd93255ef\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d54mt" Dec 08 09:39:31 crc kubenswrapper[4776]: I1208 09:39:31.806031 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/30f7ff02-8887-44e7-a223-335cd93255ef-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d54mt\" (UID: \"30f7ff02-8887-44e7-a223-335cd93255ef\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d54mt" Dec 08 09:39:31 crc kubenswrapper[4776]: I1208 09:39:31.806043 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/30f7ff02-8887-44e7-a223-335cd93255ef-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d54mt\" (UID: 
\"30f7ff02-8887-44e7-a223-335cd93255ef\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d54mt" Dec 08 09:39:31 crc kubenswrapper[4776]: I1208 09:39:31.809667 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30f7ff02-8887-44e7-a223-335cd93255ef-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d54mt\" (UID: \"30f7ff02-8887-44e7-a223-335cd93255ef\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d54mt" Dec 08 09:39:31 crc kubenswrapper[4776]: I1208 09:39:31.818842 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f7ff02-8887-44e7-a223-335cd93255ef-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d54mt\" (UID: \"30f7ff02-8887-44e7-a223-335cd93255ef\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d54mt" Dec 08 09:39:31 crc kubenswrapper[4776]: I1208 09:39:31.819740 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnzbh\" (UniqueName: \"kubernetes.io/projected/30f7ff02-8887-44e7-a223-335cd93255ef-kube-api-access-dnzbh\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d54mt\" (UID: \"30f7ff02-8887-44e7-a223-335cd93255ef\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d54mt" Dec 08 09:39:31 crc kubenswrapper[4776]: I1208 09:39:31.821217 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/30f7ff02-8887-44e7-a223-335cd93255ef-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d54mt\" (UID: \"30f7ff02-8887-44e7-a223-335cd93255ef\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d54mt" Dec 08 09:39:31 crc 
kubenswrapper[4776]: I1208 09:39:31.885617 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d54mt" Dec 08 09:39:32 crc kubenswrapper[4776]: I1208 09:39:32.435325 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d54mt"] Dec 08 09:39:32 crc kubenswrapper[4776]: I1208 09:39:32.483331 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d54mt" event={"ID":"30f7ff02-8887-44e7-a223-335cd93255ef","Type":"ContainerStarted","Data":"69405b206c502d42e7bca2be9a9f1d927a4189f26df1fdfc8f8973b61f16d1ab"} Dec 08 09:39:33 crc kubenswrapper[4776]: I1208 09:39:33.500800 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d54mt" event={"ID":"30f7ff02-8887-44e7-a223-335cd93255ef","Type":"ContainerStarted","Data":"e3fe3c98ff150a638ef47b54e579412e4cf4f4fff4f24e747719ded02b9f6eb7"} Dec 08 09:39:33 crc kubenswrapper[4776]: I1208 09:39:33.538448 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d54mt" podStartSLOduration=2.037493315 podStartE2EDuration="2.538428382s" podCreationTimestamp="2025-12-08 09:39:31 +0000 UTC" firstStartedPulling="2025-12-08 09:39:32.438736544 +0000 UTC m=+2448.701961566" lastFinishedPulling="2025-12-08 09:39:32.939671611 +0000 UTC m=+2449.202896633" observedRunningTime="2025-12-08 09:39:33.521718052 +0000 UTC m=+2449.784943104" watchObservedRunningTime="2025-12-08 09:39:33.538428382 +0000 UTC m=+2449.801653404" Dec 08 09:39:36 crc kubenswrapper[4776]: I1208 09:39:36.343391 4776 scope.go:117] "RemoveContainer" containerID="eb1b0e925baf44c975c69a0034d44e251745d7626b9fb33d2f97e1c293dbd296" Dec 08 09:39:36 crc kubenswrapper[4776]: E1208 09:39:36.343933 4776 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:39:47 crc kubenswrapper[4776]: I1208 09:39:47.344498 4776 scope.go:117] "RemoveContainer" containerID="eb1b0e925baf44c975c69a0034d44e251745d7626b9fb33d2f97e1c293dbd296" Dec 08 09:39:47 crc kubenswrapper[4776]: E1208 09:39:47.345290 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:39:59 crc kubenswrapper[4776]: I1208 09:39:59.344338 4776 scope.go:117] "RemoveContainer" containerID="eb1b0e925baf44c975c69a0034d44e251745d7626b9fb33d2f97e1c293dbd296" Dec 08 09:39:59 crc kubenswrapper[4776]: E1208 09:39:59.345498 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:40:12 crc kubenswrapper[4776]: I1208 09:40:12.343807 4776 scope.go:117] "RemoveContainer" containerID="eb1b0e925baf44c975c69a0034d44e251745d7626b9fb33d2f97e1c293dbd296" Dec 08 09:40:12 crc kubenswrapper[4776]: E1208 
09:40:12.344711 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:40:18 crc kubenswrapper[4776]: I1208 09:40:18.955730 4776 generic.go:334] "Generic (PLEG): container finished" podID="30f7ff02-8887-44e7-a223-335cd93255ef" containerID="e3fe3c98ff150a638ef47b54e579412e4cf4f4fff4f24e747719ded02b9f6eb7" exitCode=0 Dec 08 09:40:18 crc kubenswrapper[4776]: I1208 09:40:18.955816 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d54mt" event={"ID":"30f7ff02-8887-44e7-a223-335cd93255ef","Type":"ContainerDied","Data":"e3fe3c98ff150a638ef47b54e579412e4cf4f4fff4f24e747719ded02b9f6eb7"} Dec 08 09:40:20 crc kubenswrapper[4776]: I1208 09:40:20.457807 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d54mt" Dec 08 09:40:20 crc kubenswrapper[4776]: I1208 09:40:20.545938 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30f7ff02-8887-44e7-a223-335cd93255ef-inventory\") pod \"30f7ff02-8887-44e7-a223-335cd93255ef\" (UID: \"30f7ff02-8887-44e7-a223-335cd93255ef\") " Dec 08 09:40:20 crc kubenswrapper[4776]: I1208 09:40:20.546024 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnzbh\" (UniqueName: \"kubernetes.io/projected/30f7ff02-8887-44e7-a223-335cd93255ef-kube-api-access-dnzbh\") pod \"30f7ff02-8887-44e7-a223-335cd93255ef\" (UID: \"30f7ff02-8887-44e7-a223-335cd93255ef\") " Dec 08 09:40:20 crc kubenswrapper[4776]: I1208 09:40:20.546154 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/30f7ff02-8887-44e7-a223-335cd93255ef-nova-metadata-neutron-config-0\") pod \"30f7ff02-8887-44e7-a223-335cd93255ef\" (UID: \"30f7ff02-8887-44e7-a223-335cd93255ef\") " Dec 08 09:40:20 crc kubenswrapper[4776]: I1208 09:40:20.546249 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/30f7ff02-8887-44e7-a223-335cd93255ef-ssh-key\") pod \"30f7ff02-8887-44e7-a223-335cd93255ef\" (UID: \"30f7ff02-8887-44e7-a223-335cd93255ef\") " Dec 08 09:40:20 crc kubenswrapper[4776]: I1208 09:40:20.546280 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/30f7ff02-8887-44e7-a223-335cd93255ef-neutron-ovn-metadata-agent-neutron-config-0\") pod \"30f7ff02-8887-44e7-a223-335cd93255ef\" (UID: \"30f7ff02-8887-44e7-a223-335cd93255ef\") " Dec 08 09:40:20 crc kubenswrapper[4776]: I1208 
09:40:20.546320 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f7ff02-8887-44e7-a223-335cd93255ef-neutron-metadata-combined-ca-bundle\") pod \"30f7ff02-8887-44e7-a223-335cd93255ef\" (UID: \"30f7ff02-8887-44e7-a223-335cd93255ef\") " Dec 08 09:40:20 crc kubenswrapper[4776]: I1208 09:40:20.553337 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30f7ff02-8887-44e7-a223-335cd93255ef-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "30f7ff02-8887-44e7-a223-335cd93255ef" (UID: "30f7ff02-8887-44e7-a223-335cd93255ef"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:40:20 crc kubenswrapper[4776]: I1208 09:40:20.553650 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30f7ff02-8887-44e7-a223-335cd93255ef-kube-api-access-dnzbh" (OuterVolumeSpecName: "kube-api-access-dnzbh") pod "30f7ff02-8887-44e7-a223-335cd93255ef" (UID: "30f7ff02-8887-44e7-a223-335cd93255ef"). InnerVolumeSpecName "kube-api-access-dnzbh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:40:20 crc kubenswrapper[4776]: I1208 09:40:20.585053 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30f7ff02-8887-44e7-a223-335cd93255ef-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "30f7ff02-8887-44e7-a223-335cd93255ef" (UID: "30f7ff02-8887-44e7-a223-335cd93255ef"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:40:20 crc kubenswrapper[4776]: I1208 09:40:20.586659 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30f7ff02-8887-44e7-a223-335cd93255ef-inventory" (OuterVolumeSpecName: "inventory") pod "30f7ff02-8887-44e7-a223-335cd93255ef" (UID: "30f7ff02-8887-44e7-a223-335cd93255ef"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:40:20 crc kubenswrapper[4776]: I1208 09:40:20.601729 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30f7ff02-8887-44e7-a223-335cd93255ef-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "30f7ff02-8887-44e7-a223-335cd93255ef" (UID: "30f7ff02-8887-44e7-a223-335cd93255ef"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:40:20 crc kubenswrapper[4776]: I1208 09:40:20.603244 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30f7ff02-8887-44e7-a223-335cd93255ef-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "30f7ff02-8887-44e7-a223-335cd93255ef" (UID: "30f7ff02-8887-44e7-a223-335cd93255ef"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:40:20 crc kubenswrapper[4776]: I1208 09:40:20.649286 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30f7ff02-8887-44e7-a223-335cd93255ef-inventory\") on node \"crc\" DevicePath \"\"" Dec 08 09:40:20 crc kubenswrapper[4776]: I1208 09:40:20.649321 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnzbh\" (UniqueName: \"kubernetes.io/projected/30f7ff02-8887-44e7-a223-335cd93255ef-kube-api-access-dnzbh\") on node \"crc\" DevicePath \"\"" Dec 08 09:40:20 crc kubenswrapper[4776]: I1208 09:40:20.649332 4776 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/30f7ff02-8887-44e7-a223-335cd93255ef-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 08 09:40:20 crc kubenswrapper[4776]: I1208 09:40:20.649340 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/30f7ff02-8887-44e7-a223-335cd93255ef-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 08 09:40:20 crc kubenswrapper[4776]: I1208 09:40:20.649349 4776 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/30f7ff02-8887-44e7-a223-335cd93255ef-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 08 09:40:20 crc kubenswrapper[4776]: I1208 09:40:20.649358 4776 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f7ff02-8887-44e7-a223-335cd93255ef-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:40:20 crc kubenswrapper[4776]: I1208 09:40:20.978814 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d54mt" 
event={"ID":"30f7ff02-8887-44e7-a223-335cd93255ef","Type":"ContainerDied","Data":"69405b206c502d42e7bca2be9a9f1d927a4189f26df1fdfc8f8973b61f16d1ab"} Dec 08 09:40:20 crc kubenswrapper[4776]: I1208 09:40:20.979189 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69405b206c502d42e7bca2be9a9f1d927a4189f26df1fdfc8f8973b61f16d1ab" Dec 08 09:40:20 crc kubenswrapper[4776]: I1208 09:40:20.978859 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d54mt" Dec 08 09:40:21 crc kubenswrapper[4776]: I1208 09:40:21.075751 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-28blw"] Dec 08 09:40:21 crc kubenswrapper[4776]: E1208 09:40:21.076301 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30f7ff02-8887-44e7-a223-335cd93255ef" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 08 09:40:21 crc kubenswrapper[4776]: I1208 09:40:21.076319 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="30f7ff02-8887-44e7-a223-335cd93255ef" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 08 09:40:21 crc kubenswrapper[4776]: I1208 09:40:21.076660 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="30f7ff02-8887-44e7-a223-335cd93255ef" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 08 09:40:21 crc kubenswrapper[4776]: I1208 09:40:21.077503 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-28blw"] Dec 08 09:40:21 crc kubenswrapper[4776]: I1208 09:40:21.077585 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-28blw" Dec 08 09:40:21 crc kubenswrapper[4776]: I1208 09:40:21.114809 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 08 09:40:21 crc kubenswrapper[4776]: I1208 09:40:21.114994 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 08 09:40:21 crc kubenswrapper[4776]: I1208 09:40:21.115154 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 08 09:40:21 crc kubenswrapper[4776]: I1208 09:40:21.115357 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tm845" Dec 08 09:40:21 crc kubenswrapper[4776]: I1208 09:40:21.118194 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 08 09:40:21 crc kubenswrapper[4776]: I1208 09:40:21.159865 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/3933dc31-4df5-46ec-8fe0-62b9771c5515-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-28blw\" (UID: \"3933dc31-4df5-46ec-8fe0-62b9771c5515\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-28blw" Dec 08 09:40:21 crc kubenswrapper[4776]: I1208 09:40:21.159962 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngdq9\" (UniqueName: \"kubernetes.io/projected/3933dc31-4df5-46ec-8fe0-62b9771c5515-kube-api-access-ngdq9\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-28blw\" (UID: \"3933dc31-4df5-46ec-8fe0-62b9771c5515\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-28blw" Dec 08 09:40:21 crc kubenswrapper[4776]: I1208 09:40:21.160010 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3933dc31-4df5-46ec-8fe0-62b9771c5515-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-28blw\" (UID: \"3933dc31-4df5-46ec-8fe0-62b9771c5515\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-28blw" Dec 08 09:40:21 crc kubenswrapper[4776]: I1208 09:40:21.160121 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3933dc31-4df5-46ec-8fe0-62b9771c5515-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-28blw\" (UID: \"3933dc31-4df5-46ec-8fe0-62b9771c5515\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-28blw" Dec 08 09:40:21 crc kubenswrapper[4776]: I1208 09:40:21.160212 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3933dc31-4df5-46ec-8fe0-62b9771c5515-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-28blw\" (UID: \"3933dc31-4df5-46ec-8fe0-62b9771c5515\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-28blw" Dec 08 09:40:21 crc kubenswrapper[4776]: I1208 09:40:21.262156 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/3933dc31-4df5-46ec-8fe0-62b9771c5515-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-28blw\" (UID: \"3933dc31-4df5-46ec-8fe0-62b9771c5515\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-28blw" Dec 08 09:40:21 crc kubenswrapper[4776]: I1208 09:40:21.262544 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngdq9\" (UniqueName: \"kubernetes.io/projected/3933dc31-4df5-46ec-8fe0-62b9771c5515-kube-api-access-ngdq9\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-28blw\" (UID: \"3933dc31-4df5-46ec-8fe0-62b9771c5515\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-28blw" Dec 08 09:40:21 crc kubenswrapper[4776]: I1208 09:40:21.262659 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3933dc31-4df5-46ec-8fe0-62b9771c5515-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-28blw\" (UID: \"3933dc31-4df5-46ec-8fe0-62b9771c5515\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-28blw" Dec 08 09:40:21 crc kubenswrapper[4776]: I1208 09:40:21.262813 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3933dc31-4df5-46ec-8fe0-62b9771c5515-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-28blw\" (UID: \"3933dc31-4df5-46ec-8fe0-62b9771c5515\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-28blw" Dec 08 09:40:21 crc kubenswrapper[4776]: I1208 09:40:21.262950 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3933dc31-4df5-46ec-8fe0-62b9771c5515-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-28blw\" (UID: \"3933dc31-4df5-46ec-8fe0-62b9771c5515\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-28blw" Dec 08 09:40:21 crc kubenswrapper[4776]: I1208 09:40:21.267097 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3933dc31-4df5-46ec-8fe0-62b9771c5515-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-28blw\" (UID: \"3933dc31-4df5-46ec-8fe0-62b9771c5515\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-28blw" Dec 08 09:40:21 crc kubenswrapper[4776]: I1208 09:40:21.269229 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ssh-key\" (UniqueName: \"kubernetes.io/secret/3933dc31-4df5-46ec-8fe0-62b9771c5515-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-28blw\" (UID: \"3933dc31-4df5-46ec-8fe0-62b9771c5515\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-28blw" Dec 08 09:40:21 crc kubenswrapper[4776]: I1208 09:40:21.269440 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/3933dc31-4df5-46ec-8fe0-62b9771c5515-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-28blw\" (UID: \"3933dc31-4df5-46ec-8fe0-62b9771c5515\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-28blw" Dec 08 09:40:21 crc kubenswrapper[4776]: I1208 09:40:21.274898 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3933dc31-4df5-46ec-8fe0-62b9771c5515-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-28blw\" (UID: \"3933dc31-4df5-46ec-8fe0-62b9771c5515\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-28blw" Dec 08 09:40:21 crc kubenswrapper[4776]: I1208 09:40:21.285785 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngdq9\" (UniqueName: \"kubernetes.io/projected/3933dc31-4df5-46ec-8fe0-62b9771c5515-kube-api-access-ngdq9\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-28blw\" (UID: \"3933dc31-4df5-46ec-8fe0-62b9771c5515\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-28blw" Dec 08 09:40:21 crc kubenswrapper[4776]: I1208 09:40:21.439616 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-28blw" Dec 08 09:40:22 crc kubenswrapper[4776]: I1208 09:40:22.097259 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-28blw"] Dec 08 09:40:23 crc kubenswrapper[4776]: I1208 09:40:23.027614 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-28blw" event={"ID":"3933dc31-4df5-46ec-8fe0-62b9771c5515","Type":"ContainerStarted","Data":"eeeaf1adcf2c8d7bc5cf22a9827814e0b32d1b38b04d8393f69f3898441c7c39"} Dec 08 09:40:24 crc kubenswrapper[4776]: I1208 09:40:24.038380 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-28blw" event={"ID":"3933dc31-4df5-46ec-8fe0-62b9771c5515","Type":"ContainerStarted","Data":"60fcda787c5fffe85397aa6da9616e13a4c957677ebe65248f38cc8a8d5f0c29"} Dec 08 09:40:24 crc kubenswrapper[4776]: I1208 09:40:24.056544 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-28blw" podStartSLOduration=2.383202729 podStartE2EDuration="3.056522626s" podCreationTimestamp="2025-12-08 09:40:21 +0000 UTC" firstStartedPulling="2025-12-08 09:40:22.101300058 +0000 UTC m=+2498.364525080" lastFinishedPulling="2025-12-08 09:40:22.774619955 +0000 UTC m=+2499.037844977" observedRunningTime="2025-12-08 09:40:24.052854897 +0000 UTC m=+2500.316079919" watchObservedRunningTime="2025-12-08 09:40:24.056522626 +0000 UTC m=+2500.319747648" Dec 08 09:40:25 crc kubenswrapper[4776]: I1208 09:40:25.343949 4776 scope.go:117] "RemoveContainer" containerID="eb1b0e925baf44c975c69a0034d44e251745d7626b9fb33d2f97e1c293dbd296" Dec 08 09:40:25 crc kubenswrapper[4776]: E1208 09:40:25.344316 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:40:36 crc kubenswrapper[4776]: I1208 09:40:36.344102 4776 scope.go:117] "RemoveContainer" containerID="eb1b0e925baf44c975c69a0034d44e251745d7626b9fb33d2f97e1c293dbd296" Dec 08 09:40:36 crc kubenswrapper[4776]: E1208 09:40:36.344901 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:40:51 crc kubenswrapper[4776]: I1208 09:40:51.343819 4776 scope.go:117] "RemoveContainer" containerID="eb1b0e925baf44c975c69a0034d44e251745d7626b9fb33d2f97e1c293dbd296" Dec 08 09:40:51 crc kubenswrapper[4776]: E1208 09:40:51.344719 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:41:04 crc kubenswrapper[4776]: I1208 09:41:04.353011 4776 scope.go:117] "RemoveContainer" containerID="eb1b0e925baf44c975c69a0034d44e251745d7626b9fb33d2f97e1c293dbd296" Dec 08 09:41:04 crc kubenswrapper[4776]: E1208 09:41:04.353738 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:41:18 crc kubenswrapper[4776]: I1208 09:41:18.343809 4776 scope.go:117] "RemoveContainer" containerID="eb1b0e925baf44c975c69a0034d44e251745d7626b9fb33d2f97e1c293dbd296" Dec 08 09:41:18 crc kubenswrapper[4776]: E1208 09:41:18.344669 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:41:29 crc kubenswrapper[4776]: I1208 09:41:29.344392 4776 scope.go:117] "RemoveContainer" containerID="eb1b0e925baf44c975c69a0034d44e251745d7626b9fb33d2f97e1c293dbd296" Dec 08 09:41:29 crc kubenswrapper[4776]: E1208 09:41:29.345420 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:41:41 crc kubenswrapper[4776]: I1208 09:41:41.344507 4776 scope.go:117] "RemoveContainer" containerID="eb1b0e925baf44c975c69a0034d44e251745d7626b9fb33d2f97e1c293dbd296" Dec 08 09:41:41 crc kubenswrapper[4776]: E1208 09:41:41.346390 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:41:55 crc kubenswrapper[4776]: I1208 09:41:55.343802 4776 scope.go:117] "RemoveContainer" containerID="eb1b0e925baf44c975c69a0034d44e251745d7626b9fb33d2f97e1c293dbd296" Dec 08 09:41:55 crc kubenswrapper[4776]: I1208 09:41:55.963478 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" event={"ID":"c9788ab1-1031-4103-a769-a4b3177c7268","Type":"ContainerStarted","Data":"51d9d9755c704c9d9d8e04dd484f9ff425e26c39d52bf197718b780d0db4e339"} Dec 08 09:44:11 crc kubenswrapper[4776]: I1208 09:44:11.399561 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 09:44:11 crc kubenswrapper[4776]: I1208 09:44:11.401116 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 09:44:34 crc kubenswrapper[4776]: I1208 09:44:34.931745 4776 generic.go:334] "Generic (PLEG): container finished" podID="3933dc31-4df5-46ec-8fe0-62b9771c5515" containerID="60fcda787c5fffe85397aa6da9616e13a4c957677ebe65248f38cc8a8d5f0c29" exitCode=0 Dec 08 09:44:34 crc kubenswrapper[4776]: I1208 09:44:34.931893 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-28blw" event={"ID":"3933dc31-4df5-46ec-8fe0-62b9771c5515","Type":"ContainerDied","Data":"60fcda787c5fffe85397aa6da9616e13a4c957677ebe65248f38cc8a8d5f0c29"} Dec 08 09:44:36 crc kubenswrapper[4776]: I1208 09:44:36.423797 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-28blw" Dec 08 09:44:36 crc kubenswrapper[4776]: I1208 09:44:36.589734 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngdq9\" (UniqueName: \"kubernetes.io/projected/3933dc31-4df5-46ec-8fe0-62b9771c5515-kube-api-access-ngdq9\") pod \"3933dc31-4df5-46ec-8fe0-62b9771c5515\" (UID: \"3933dc31-4df5-46ec-8fe0-62b9771c5515\") " Dec 08 09:44:36 crc kubenswrapper[4776]: I1208 09:44:36.589906 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/3933dc31-4df5-46ec-8fe0-62b9771c5515-libvirt-secret-0\") pod \"3933dc31-4df5-46ec-8fe0-62b9771c5515\" (UID: \"3933dc31-4df5-46ec-8fe0-62b9771c5515\") " Dec 08 09:44:36 crc kubenswrapper[4776]: I1208 09:44:36.589954 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3933dc31-4df5-46ec-8fe0-62b9771c5515-libvirt-combined-ca-bundle\") pod \"3933dc31-4df5-46ec-8fe0-62b9771c5515\" (UID: \"3933dc31-4df5-46ec-8fe0-62b9771c5515\") " Dec 08 09:44:36 crc kubenswrapper[4776]: I1208 09:44:36.590079 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3933dc31-4df5-46ec-8fe0-62b9771c5515-inventory\") pod \"3933dc31-4df5-46ec-8fe0-62b9771c5515\" (UID: \"3933dc31-4df5-46ec-8fe0-62b9771c5515\") " Dec 08 09:44:36 crc kubenswrapper[4776]: I1208 09:44:36.590125 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3933dc31-4df5-46ec-8fe0-62b9771c5515-ssh-key\") pod \"3933dc31-4df5-46ec-8fe0-62b9771c5515\" (UID: \"3933dc31-4df5-46ec-8fe0-62b9771c5515\") " Dec 08 09:44:36 crc kubenswrapper[4776]: I1208 09:44:36.595863 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3933dc31-4df5-46ec-8fe0-62b9771c5515-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "3933dc31-4df5-46ec-8fe0-62b9771c5515" (UID: "3933dc31-4df5-46ec-8fe0-62b9771c5515"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:44:36 crc kubenswrapper[4776]: I1208 09:44:36.599146 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3933dc31-4df5-46ec-8fe0-62b9771c5515-kube-api-access-ngdq9" (OuterVolumeSpecName: "kube-api-access-ngdq9") pod "3933dc31-4df5-46ec-8fe0-62b9771c5515" (UID: "3933dc31-4df5-46ec-8fe0-62b9771c5515"). InnerVolumeSpecName "kube-api-access-ngdq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:44:36 crc kubenswrapper[4776]: I1208 09:44:36.623455 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3933dc31-4df5-46ec-8fe0-62b9771c5515-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "3933dc31-4df5-46ec-8fe0-62b9771c5515" (UID: "3933dc31-4df5-46ec-8fe0-62b9771c5515"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:44:36 crc kubenswrapper[4776]: I1208 09:44:36.624665 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3933dc31-4df5-46ec-8fe0-62b9771c5515-inventory" (OuterVolumeSpecName: "inventory") pod "3933dc31-4df5-46ec-8fe0-62b9771c5515" (UID: "3933dc31-4df5-46ec-8fe0-62b9771c5515"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:44:36 crc kubenswrapper[4776]: I1208 09:44:36.627909 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3933dc31-4df5-46ec-8fe0-62b9771c5515-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3933dc31-4df5-46ec-8fe0-62b9771c5515" (UID: "3933dc31-4df5-46ec-8fe0-62b9771c5515"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:44:36 crc kubenswrapper[4776]: I1208 09:44:36.693018 4776 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3933dc31-4df5-46ec-8fe0-62b9771c5515-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:44:36 crc kubenswrapper[4776]: I1208 09:44:36.693052 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3933dc31-4df5-46ec-8fe0-62b9771c5515-inventory\") on node \"crc\" DevicePath \"\"" Dec 08 09:44:36 crc kubenswrapper[4776]: I1208 09:44:36.693060 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3933dc31-4df5-46ec-8fe0-62b9771c5515-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 08 09:44:36 crc kubenswrapper[4776]: I1208 09:44:36.693069 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngdq9\" (UniqueName: \"kubernetes.io/projected/3933dc31-4df5-46ec-8fe0-62b9771c5515-kube-api-access-ngdq9\") on node \"crc\" DevicePath \"\"" Dec 08 09:44:36 crc kubenswrapper[4776]: I1208 09:44:36.693078 4776 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/3933dc31-4df5-46ec-8fe0-62b9771c5515-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 08 09:44:36 crc kubenswrapper[4776]: I1208 09:44:36.953352 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-28blw" 
event={"ID":"3933dc31-4df5-46ec-8fe0-62b9771c5515","Type":"ContainerDied","Data":"eeeaf1adcf2c8d7bc5cf22a9827814e0b32d1b38b04d8393f69f3898441c7c39"} Dec 08 09:44:36 crc kubenswrapper[4776]: I1208 09:44:36.953396 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eeeaf1adcf2c8d7bc5cf22a9827814e0b32d1b38b04d8393f69f3898441c7c39" Dec 08 09:44:36 crc kubenswrapper[4776]: I1208 09:44:36.953412 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-28blw" Dec 08 09:44:37 crc kubenswrapper[4776]: I1208 09:44:37.052856 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-tnlnp"] Dec 08 09:44:37 crc kubenswrapper[4776]: E1208 09:44:37.053453 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3933dc31-4df5-46ec-8fe0-62b9771c5515" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 08 09:44:37 crc kubenswrapper[4776]: I1208 09:44:37.053474 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="3933dc31-4df5-46ec-8fe0-62b9771c5515" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 08 09:44:37 crc kubenswrapper[4776]: I1208 09:44:37.053741 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="3933dc31-4df5-46ec-8fe0-62b9771c5515" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 08 09:44:37 crc kubenswrapper[4776]: I1208 09:44:37.054796 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tnlnp" Dec 08 09:44:37 crc kubenswrapper[4776]: I1208 09:44:37.057558 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 08 09:44:37 crc kubenswrapper[4776]: I1208 09:44:37.057704 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 08 09:44:37 crc kubenswrapper[4776]: I1208 09:44:37.057849 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 08 09:44:37 crc kubenswrapper[4776]: I1208 09:44:37.058237 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tm845" Dec 08 09:44:37 crc kubenswrapper[4776]: I1208 09:44:37.058249 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Dec 08 09:44:37 crc kubenswrapper[4776]: I1208 09:44:37.058552 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 08 09:44:37 crc kubenswrapper[4776]: I1208 09:44:37.058694 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 08 09:44:37 crc kubenswrapper[4776]: I1208 09:44:37.066494 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-tnlnp"] Dec 08 09:44:37 crc kubenswrapper[4776]: I1208 09:44:37.203364 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9cd841cc-611f-406b-b9d5-8c242c1321ba-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tnlnp\" (UID: \"9cd841cc-611f-406b-b9d5-8c242c1321ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tnlnp" Dec 08 09:44:37 crc kubenswrapper[4776]: 
I1208 09:44:37.203472 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cd841cc-611f-406b-b9d5-8c242c1321ba-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tnlnp\" (UID: \"9cd841cc-611f-406b-b9d5-8c242c1321ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tnlnp" Dec 08 09:44:37 crc kubenswrapper[4776]: I1208 09:44:37.203586 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9cd841cc-611f-406b-b9d5-8c242c1321ba-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tnlnp\" (UID: \"9cd841cc-611f-406b-b9d5-8c242c1321ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tnlnp" Dec 08 09:44:37 crc kubenswrapper[4776]: I1208 09:44:37.203617 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z82s2\" (UniqueName: \"kubernetes.io/projected/9cd841cc-611f-406b-b9d5-8c242c1321ba-kube-api-access-z82s2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tnlnp\" (UID: \"9cd841cc-611f-406b-b9d5-8c242c1321ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tnlnp" Dec 08 09:44:37 crc kubenswrapper[4776]: I1208 09:44:37.203704 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9cd841cc-611f-406b-b9d5-8c242c1321ba-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tnlnp\" (UID: \"9cd841cc-611f-406b-b9d5-8c242c1321ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tnlnp" Dec 08 09:44:37 crc kubenswrapper[4776]: I1208 09:44:37.203722 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/9cd841cc-611f-406b-b9d5-8c242c1321ba-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tnlnp\" (UID: \"9cd841cc-611f-406b-b9d5-8c242c1321ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tnlnp" Dec 08 09:44:37 crc kubenswrapper[4776]: I1208 09:44:37.203804 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9cd841cc-611f-406b-b9d5-8c242c1321ba-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tnlnp\" (UID: \"9cd841cc-611f-406b-b9d5-8c242c1321ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tnlnp" Dec 08 09:44:37 crc kubenswrapper[4776]: I1208 09:44:37.203944 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9cd841cc-611f-406b-b9d5-8c242c1321ba-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tnlnp\" (UID: \"9cd841cc-611f-406b-b9d5-8c242c1321ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tnlnp" Dec 08 09:44:37 crc kubenswrapper[4776]: I1208 09:44:37.204038 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9cd841cc-611f-406b-b9d5-8c242c1321ba-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tnlnp\" (UID: \"9cd841cc-611f-406b-b9d5-8c242c1321ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tnlnp" Dec 08 09:44:37 crc kubenswrapper[4776]: I1208 09:44:37.306021 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9cd841cc-611f-406b-b9d5-8c242c1321ba-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tnlnp\" (UID: \"9cd841cc-611f-406b-b9d5-8c242c1321ba\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tnlnp" Dec 08 09:44:37 crc kubenswrapper[4776]: I1208 09:44:37.306072 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9cd841cc-611f-406b-b9d5-8c242c1321ba-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tnlnp\" (UID: \"9cd841cc-611f-406b-b9d5-8c242c1321ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tnlnp" Dec 08 09:44:37 crc kubenswrapper[4776]: I1208 09:44:37.306157 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9cd841cc-611f-406b-b9d5-8c242c1321ba-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tnlnp\" (UID: \"9cd841cc-611f-406b-b9d5-8c242c1321ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tnlnp" Dec 08 09:44:37 crc kubenswrapper[4776]: I1208 09:44:37.306243 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9cd841cc-611f-406b-b9d5-8c242c1321ba-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tnlnp\" (UID: \"9cd841cc-611f-406b-b9d5-8c242c1321ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tnlnp" Dec 08 09:44:37 crc kubenswrapper[4776]: I1208 09:44:37.306289 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9cd841cc-611f-406b-b9d5-8c242c1321ba-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tnlnp\" (UID: \"9cd841cc-611f-406b-b9d5-8c242c1321ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tnlnp" Dec 08 09:44:37 crc kubenswrapper[4776]: I1208 09:44:37.306323 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: 
\"kubernetes.io/secret/9cd841cc-611f-406b-b9d5-8c242c1321ba-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tnlnp\" (UID: \"9cd841cc-611f-406b-b9d5-8c242c1321ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tnlnp" Dec 08 09:44:37 crc kubenswrapper[4776]: I1208 09:44:37.306351 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cd841cc-611f-406b-b9d5-8c242c1321ba-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tnlnp\" (UID: \"9cd841cc-611f-406b-b9d5-8c242c1321ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tnlnp" Dec 08 09:44:37 crc kubenswrapper[4776]: I1208 09:44:37.306419 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9cd841cc-611f-406b-b9d5-8c242c1321ba-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tnlnp\" (UID: \"9cd841cc-611f-406b-b9d5-8c242c1321ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tnlnp" Dec 08 09:44:37 crc kubenswrapper[4776]: I1208 09:44:37.306444 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z82s2\" (UniqueName: \"kubernetes.io/projected/9cd841cc-611f-406b-b9d5-8c242c1321ba-kube-api-access-z82s2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tnlnp\" (UID: \"9cd841cc-611f-406b-b9d5-8c242c1321ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tnlnp" Dec 08 09:44:37 crc kubenswrapper[4776]: I1208 09:44:37.308095 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9cd841cc-611f-406b-b9d5-8c242c1321ba-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tnlnp\" (UID: \"9cd841cc-611f-406b-b9d5-8c242c1321ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tnlnp" 
Dec 08 09:44:37 crc kubenswrapper[4776]: I1208 09:44:37.311300 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9cd841cc-611f-406b-b9d5-8c242c1321ba-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tnlnp\" (UID: \"9cd841cc-611f-406b-b9d5-8c242c1321ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tnlnp" Dec 08 09:44:37 crc kubenswrapper[4776]: I1208 09:44:37.311795 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9cd841cc-611f-406b-b9d5-8c242c1321ba-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tnlnp\" (UID: \"9cd841cc-611f-406b-b9d5-8c242c1321ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tnlnp" Dec 08 09:44:37 crc kubenswrapper[4776]: I1208 09:44:37.311818 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9cd841cc-611f-406b-b9d5-8c242c1321ba-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tnlnp\" (UID: \"9cd841cc-611f-406b-b9d5-8c242c1321ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tnlnp" Dec 08 09:44:37 crc kubenswrapper[4776]: I1208 09:44:37.312268 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cd841cc-611f-406b-b9d5-8c242c1321ba-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tnlnp\" (UID: \"9cd841cc-611f-406b-b9d5-8c242c1321ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tnlnp" Dec 08 09:44:37 crc kubenswrapper[4776]: I1208 09:44:37.312618 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9cd841cc-611f-406b-b9d5-8c242c1321ba-nova-cell1-compute-config-1\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-tnlnp\" (UID: \"9cd841cc-611f-406b-b9d5-8c242c1321ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tnlnp" Dec 08 09:44:37 crc kubenswrapper[4776]: I1208 09:44:37.312795 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9cd841cc-611f-406b-b9d5-8c242c1321ba-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tnlnp\" (UID: \"9cd841cc-611f-406b-b9d5-8c242c1321ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tnlnp" Dec 08 09:44:37 crc kubenswrapper[4776]: I1208 09:44:37.313357 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9cd841cc-611f-406b-b9d5-8c242c1321ba-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tnlnp\" (UID: \"9cd841cc-611f-406b-b9d5-8c242c1321ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tnlnp" Dec 08 09:44:37 crc kubenswrapper[4776]: I1208 09:44:37.324929 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z82s2\" (UniqueName: \"kubernetes.io/projected/9cd841cc-611f-406b-b9d5-8c242c1321ba-kube-api-access-z82s2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tnlnp\" (UID: \"9cd841cc-611f-406b-b9d5-8c242c1321ba\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tnlnp" Dec 08 09:44:37 crc kubenswrapper[4776]: I1208 09:44:37.407417 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tnlnp" Dec 08 09:44:37 crc kubenswrapper[4776]: I1208 09:44:37.962008 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-tnlnp"] Dec 08 09:44:37 crc kubenswrapper[4776]: I1208 09:44:37.967917 4776 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 08 09:44:38 crc kubenswrapper[4776]: I1208 09:44:38.981027 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tnlnp" event={"ID":"9cd841cc-611f-406b-b9d5-8c242c1321ba","Type":"ContainerStarted","Data":"1f9dfa4aec860bc6ebbc9451504213653443249096a55cbb253caf98eb110f8f"} Dec 08 09:44:38 crc kubenswrapper[4776]: I1208 09:44:38.981382 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tnlnp" event={"ID":"9cd841cc-611f-406b-b9d5-8c242c1321ba","Type":"ContainerStarted","Data":"289c8ac893b0ccb0717b004ca3a7b9e32084d9423ed90bbce073f3072998ba5b"} Dec 08 09:44:39 crc kubenswrapper[4776]: I1208 09:44:39.000044 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tnlnp" podStartSLOduration=1.5673751710000001 podStartE2EDuration="2.000022322s" podCreationTimestamp="2025-12-08 09:44:37 +0000 UTC" firstStartedPulling="2025-12-08 09:44:37.967621384 +0000 UTC m=+2754.230846406" lastFinishedPulling="2025-12-08 09:44:38.400268535 +0000 UTC m=+2754.663493557" observedRunningTime="2025-12-08 09:44:38.999811827 +0000 UTC m=+2755.263036869" watchObservedRunningTime="2025-12-08 09:44:39.000022322 +0000 UTC m=+2755.263247354" Dec 08 09:44:41 crc kubenswrapper[4776]: I1208 09:44:41.398885 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 09:44:41 crc kubenswrapper[4776]: I1208 09:44:41.399133 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 09:45:00 crc kubenswrapper[4776]: I1208 09:45:00.167091 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29419785-tvxx2"] Dec 08 09:45:00 crc kubenswrapper[4776]: I1208 09:45:00.182752 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29419785-tvxx2" Dec 08 09:45:00 crc kubenswrapper[4776]: I1208 09:45:00.189243 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 08 09:45:00 crc kubenswrapper[4776]: I1208 09:45:00.190232 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 08 09:45:00 crc kubenswrapper[4776]: I1208 09:45:00.197396 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29419785-tvxx2"] Dec 08 09:45:00 crc kubenswrapper[4776]: I1208 09:45:00.296133 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/48ff7820-516e-4d30-9d51-e9a9c7582c81-secret-volume\") pod \"collect-profiles-29419785-tvxx2\" (UID: \"48ff7820-516e-4d30-9d51-e9a9c7582c81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419785-tvxx2" Dec 08 09:45:00 crc kubenswrapper[4776]: I1208 
09:45:00.296223 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/48ff7820-516e-4d30-9d51-e9a9c7582c81-config-volume\") pod \"collect-profiles-29419785-tvxx2\" (UID: \"48ff7820-516e-4d30-9d51-e9a9c7582c81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419785-tvxx2" Dec 08 09:45:00 crc kubenswrapper[4776]: I1208 09:45:00.296248 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpzsf\" (UniqueName: \"kubernetes.io/projected/48ff7820-516e-4d30-9d51-e9a9c7582c81-kube-api-access-vpzsf\") pod \"collect-profiles-29419785-tvxx2\" (UID: \"48ff7820-516e-4d30-9d51-e9a9c7582c81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419785-tvxx2" Dec 08 09:45:00 crc kubenswrapper[4776]: I1208 09:45:00.397770 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/48ff7820-516e-4d30-9d51-e9a9c7582c81-config-volume\") pod \"collect-profiles-29419785-tvxx2\" (UID: \"48ff7820-516e-4d30-9d51-e9a9c7582c81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419785-tvxx2" Dec 08 09:45:00 crc kubenswrapper[4776]: I1208 09:45:00.397821 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpzsf\" (UniqueName: \"kubernetes.io/projected/48ff7820-516e-4d30-9d51-e9a9c7582c81-kube-api-access-vpzsf\") pod \"collect-profiles-29419785-tvxx2\" (UID: \"48ff7820-516e-4d30-9d51-e9a9c7582c81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419785-tvxx2" Dec 08 09:45:00 crc kubenswrapper[4776]: I1208 09:45:00.398045 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/48ff7820-516e-4d30-9d51-e9a9c7582c81-secret-volume\") pod \"collect-profiles-29419785-tvxx2\" 
(UID: \"48ff7820-516e-4d30-9d51-e9a9c7582c81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419785-tvxx2" Dec 08 09:45:00 crc kubenswrapper[4776]: I1208 09:45:00.398842 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/48ff7820-516e-4d30-9d51-e9a9c7582c81-config-volume\") pod \"collect-profiles-29419785-tvxx2\" (UID: \"48ff7820-516e-4d30-9d51-e9a9c7582c81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419785-tvxx2" Dec 08 09:45:00 crc kubenswrapper[4776]: I1208 09:45:00.405434 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/48ff7820-516e-4d30-9d51-e9a9c7582c81-secret-volume\") pod \"collect-profiles-29419785-tvxx2\" (UID: \"48ff7820-516e-4d30-9d51-e9a9c7582c81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419785-tvxx2" Dec 08 09:45:00 crc kubenswrapper[4776]: I1208 09:45:00.414435 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpzsf\" (UniqueName: \"kubernetes.io/projected/48ff7820-516e-4d30-9d51-e9a9c7582c81-kube-api-access-vpzsf\") pod \"collect-profiles-29419785-tvxx2\" (UID: \"48ff7820-516e-4d30-9d51-e9a9c7582c81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419785-tvxx2" Dec 08 09:45:00 crc kubenswrapper[4776]: I1208 09:45:00.520699 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29419785-tvxx2" Dec 08 09:45:01 crc kubenswrapper[4776]: I1208 09:45:01.100268 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29419785-tvxx2"] Dec 08 09:45:01 crc kubenswrapper[4776]: W1208 09:45:01.101901 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48ff7820_516e_4d30_9d51_e9a9c7582c81.slice/crio-3583648eb6d82627fb2babf6c72e650e93e9e37d83bbb4c36ff0b00c27b3db6e WatchSource:0}: Error finding container 3583648eb6d82627fb2babf6c72e650e93e9e37d83bbb4c36ff0b00c27b3db6e: Status 404 returned error can't find the container with id 3583648eb6d82627fb2babf6c72e650e93e9e37d83bbb4c36ff0b00c27b3db6e Dec 08 09:45:01 crc kubenswrapper[4776]: I1208 09:45:01.248718 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29419785-tvxx2" event={"ID":"48ff7820-516e-4d30-9d51-e9a9c7582c81","Type":"ContainerStarted","Data":"3583648eb6d82627fb2babf6c72e650e93e9e37d83bbb4c36ff0b00c27b3db6e"} Dec 08 09:45:01 crc kubenswrapper[4776]: I1208 09:45:01.521878 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hhrr2"] Dec 08 09:45:01 crc kubenswrapper[4776]: I1208 09:45:01.525058 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hhrr2" Dec 08 09:45:01 crc kubenswrapper[4776]: I1208 09:45:01.546980 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hhrr2"] Dec 08 09:45:01 crc kubenswrapper[4776]: I1208 09:45:01.641575 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5fdz\" (UniqueName: \"kubernetes.io/projected/8c64d647-91c0-468c-b1ce-d4dd37f59612-kube-api-access-d5fdz\") pod \"redhat-marketplace-hhrr2\" (UID: \"8c64d647-91c0-468c-b1ce-d4dd37f59612\") " pod="openshift-marketplace/redhat-marketplace-hhrr2" Dec 08 09:45:01 crc kubenswrapper[4776]: I1208 09:45:01.641789 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c64d647-91c0-468c-b1ce-d4dd37f59612-catalog-content\") pod \"redhat-marketplace-hhrr2\" (UID: \"8c64d647-91c0-468c-b1ce-d4dd37f59612\") " pod="openshift-marketplace/redhat-marketplace-hhrr2" Dec 08 09:45:01 crc kubenswrapper[4776]: I1208 09:45:01.641835 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c64d647-91c0-468c-b1ce-d4dd37f59612-utilities\") pod \"redhat-marketplace-hhrr2\" (UID: \"8c64d647-91c0-468c-b1ce-d4dd37f59612\") " pod="openshift-marketplace/redhat-marketplace-hhrr2" Dec 08 09:45:01 crc kubenswrapper[4776]: I1208 09:45:01.743819 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c64d647-91c0-468c-b1ce-d4dd37f59612-utilities\") pod \"redhat-marketplace-hhrr2\" (UID: \"8c64d647-91c0-468c-b1ce-d4dd37f59612\") " pod="openshift-marketplace/redhat-marketplace-hhrr2" Dec 08 09:45:01 crc kubenswrapper[4776]: I1208 09:45:01.743885 4776 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-d5fdz\" (UniqueName: \"kubernetes.io/projected/8c64d647-91c0-468c-b1ce-d4dd37f59612-kube-api-access-d5fdz\") pod \"redhat-marketplace-hhrr2\" (UID: \"8c64d647-91c0-468c-b1ce-d4dd37f59612\") " pod="openshift-marketplace/redhat-marketplace-hhrr2" Dec 08 09:45:01 crc kubenswrapper[4776]: I1208 09:45:01.744079 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c64d647-91c0-468c-b1ce-d4dd37f59612-catalog-content\") pod \"redhat-marketplace-hhrr2\" (UID: \"8c64d647-91c0-468c-b1ce-d4dd37f59612\") " pod="openshift-marketplace/redhat-marketplace-hhrr2" Dec 08 09:45:01 crc kubenswrapper[4776]: I1208 09:45:01.744427 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c64d647-91c0-468c-b1ce-d4dd37f59612-utilities\") pod \"redhat-marketplace-hhrr2\" (UID: \"8c64d647-91c0-468c-b1ce-d4dd37f59612\") " pod="openshift-marketplace/redhat-marketplace-hhrr2" Dec 08 09:45:01 crc kubenswrapper[4776]: I1208 09:45:01.744481 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c64d647-91c0-468c-b1ce-d4dd37f59612-catalog-content\") pod \"redhat-marketplace-hhrr2\" (UID: \"8c64d647-91c0-468c-b1ce-d4dd37f59612\") " pod="openshift-marketplace/redhat-marketplace-hhrr2" Dec 08 09:45:01 crc kubenswrapper[4776]: I1208 09:45:01.771974 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5fdz\" (UniqueName: \"kubernetes.io/projected/8c64d647-91c0-468c-b1ce-d4dd37f59612-kube-api-access-d5fdz\") pod \"redhat-marketplace-hhrr2\" (UID: \"8c64d647-91c0-468c-b1ce-d4dd37f59612\") " pod="openshift-marketplace/redhat-marketplace-hhrr2" Dec 08 09:45:01 crc kubenswrapper[4776]: I1208 09:45:01.898975 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hhrr2" Dec 08 09:45:02 crc kubenswrapper[4776]: I1208 09:45:02.274057 4776 generic.go:334] "Generic (PLEG): container finished" podID="48ff7820-516e-4d30-9d51-e9a9c7582c81" containerID="2e8034c8fe50566431108811532369c91f213b857be245374ec4126f41f80494" exitCode=0 Dec 08 09:45:02 crc kubenswrapper[4776]: I1208 09:45:02.274575 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29419785-tvxx2" event={"ID":"48ff7820-516e-4d30-9d51-e9a9c7582c81","Type":"ContainerDied","Data":"2e8034c8fe50566431108811532369c91f213b857be245374ec4126f41f80494"} Dec 08 09:45:02 crc kubenswrapper[4776]: I1208 09:45:02.478689 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hhrr2"] Dec 08 09:45:03 crc kubenswrapper[4776]: I1208 09:45:03.289531 4776 generic.go:334] "Generic (PLEG): container finished" podID="8c64d647-91c0-468c-b1ce-d4dd37f59612" containerID="a2e272ed8bf86ca0d5e4c5a297cf6b84a5a18837d2dc6085f44b67beb40a0dd7" exitCode=0 Dec 08 09:45:03 crc kubenswrapper[4776]: I1208 09:45:03.291670 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hhrr2" event={"ID":"8c64d647-91c0-468c-b1ce-d4dd37f59612","Type":"ContainerDied","Data":"a2e272ed8bf86ca0d5e4c5a297cf6b84a5a18837d2dc6085f44b67beb40a0dd7"} Dec 08 09:45:03 crc kubenswrapper[4776]: I1208 09:45:03.291824 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hhrr2" event={"ID":"8c64d647-91c0-468c-b1ce-d4dd37f59612","Type":"ContainerStarted","Data":"3374442ba08463c315fa3ae7d53d07f6d48d6a4b1ad9821353bee459985d8a2d"} Dec 08 09:45:03 crc kubenswrapper[4776]: I1208 09:45:03.757481 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29419785-tvxx2" Dec 08 09:45:03 crc kubenswrapper[4776]: I1208 09:45:03.805706 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpzsf\" (UniqueName: \"kubernetes.io/projected/48ff7820-516e-4d30-9d51-e9a9c7582c81-kube-api-access-vpzsf\") pod \"48ff7820-516e-4d30-9d51-e9a9c7582c81\" (UID: \"48ff7820-516e-4d30-9d51-e9a9c7582c81\") " Dec 08 09:45:03 crc kubenswrapper[4776]: I1208 09:45:03.807217 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/48ff7820-516e-4d30-9d51-e9a9c7582c81-config-volume\") pod \"48ff7820-516e-4d30-9d51-e9a9c7582c81\" (UID: \"48ff7820-516e-4d30-9d51-e9a9c7582c81\") " Dec 08 09:45:03 crc kubenswrapper[4776]: I1208 09:45:03.807455 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/48ff7820-516e-4d30-9d51-e9a9c7582c81-secret-volume\") pod \"48ff7820-516e-4d30-9d51-e9a9c7582c81\" (UID: \"48ff7820-516e-4d30-9d51-e9a9c7582c81\") " Dec 08 09:45:03 crc kubenswrapper[4776]: I1208 09:45:03.810065 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48ff7820-516e-4d30-9d51-e9a9c7582c81-config-volume" (OuterVolumeSpecName: "config-volume") pod "48ff7820-516e-4d30-9d51-e9a9c7582c81" (UID: "48ff7820-516e-4d30-9d51-e9a9c7582c81"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:45:03 crc kubenswrapper[4776]: I1208 09:45:03.820819 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48ff7820-516e-4d30-9d51-e9a9c7582c81-kube-api-access-vpzsf" (OuterVolumeSpecName: "kube-api-access-vpzsf") pod "48ff7820-516e-4d30-9d51-e9a9c7582c81" (UID: "48ff7820-516e-4d30-9d51-e9a9c7582c81"). 
InnerVolumeSpecName "kube-api-access-vpzsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:45:03 crc kubenswrapper[4776]: I1208 09:45:03.820968 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48ff7820-516e-4d30-9d51-e9a9c7582c81-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "48ff7820-516e-4d30-9d51-e9a9c7582c81" (UID: "48ff7820-516e-4d30-9d51-e9a9c7582c81"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:45:03 crc kubenswrapper[4776]: I1208 09:45:03.913456 4776 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/48ff7820-516e-4d30-9d51-e9a9c7582c81-config-volume\") on node \"crc\" DevicePath \"\"" Dec 08 09:45:03 crc kubenswrapper[4776]: I1208 09:45:03.913496 4776 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/48ff7820-516e-4d30-9d51-e9a9c7582c81-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 08 09:45:03 crc kubenswrapper[4776]: I1208 09:45:03.913507 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpzsf\" (UniqueName: \"kubernetes.io/projected/48ff7820-516e-4d30-9d51-e9a9c7582c81-kube-api-access-vpzsf\") on node \"crc\" DevicePath \"\"" Dec 08 09:45:04 crc kubenswrapper[4776]: I1208 09:45:04.302886 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29419785-tvxx2" event={"ID":"48ff7820-516e-4d30-9d51-e9a9c7582c81","Type":"ContainerDied","Data":"3583648eb6d82627fb2babf6c72e650e93e9e37d83bbb4c36ff0b00c27b3db6e"} Dec 08 09:45:04 crc kubenswrapper[4776]: I1208 09:45:04.303259 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3583648eb6d82627fb2babf6c72e650e93e9e37d83bbb4c36ff0b00c27b3db6e" Dec 08 09:45:04 crc kubenswrapper[4776]: I1208 09:45:04.302922 4776 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29419785-tvxx2" Dec 08 09:45:04 crc kubenswrapper[4776]: I1208 09:45:04.305672 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hhrr2" event={"ID":"8c64d647-91c0-468c-b1ce-d4dd37f59612","Type":"ContainerStarted","Data":"4abd8cfb129f382dd8827a8ac014d01e116bcdba0f51ae5374bea9a7372334e6"} Dec 08 09:45:05 crc kubenswrapper[4776]: I1208 09:45:05.023767 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29419740-xnvv4"] Dec 08 09:45:05 crc kubenswrapper[4776]: I1208 09:45:05.035133 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29419740-xnvv4"] Dec 08 09:45:05 crc kubenswrapper[4776]: I1208 09:45:05.319596 4776 generic.go:334] "Generic (PLEG): container finished" podID="8c64d647-91c0-468c-b1ce-d4dd37f59612" containerID="4abd8cfb129f382dd8827a8ac014d01e116bcdba0f51ae5374bea9a7372334e6" exitCode=0 Dec 08 09:45:05 crc kubenswrapper[4776]: I1208 09:45:05.319820 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hhrr2" event={"ID":"8c64d647-91c0-468c-b1ce-d4dd37f59612","Type":"ContainerDied","Data":"4abd8cfb129f382dd8827a8ac014d01e116bcdba0f51ae5374bea9a7372334e6"} Dec 08 09:45:06 crc kubenswrapper[4776]: I1208 09:45:06.338487 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hhrr2" event={"ID":"8c64d647-91c0-468c-b1ce-d4dd37f59612","Type":"ContainerStarted","Data":"dcf16e832088abc14c1b22a2f6ecdc7fcf79f4396e8f57091f441938aa707f79"} Dec 08 09:45:06 crc kubenswrapper[4776]: I1208 09:45:06.364214 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06cf2358-4cba-4d69-81d1-dc02434fe460" path="/var/lib/kubelet/pods/06cf2358-4cba-4d69-81d1-dc02434fe460/volumes" Dec 08 
09:45:06 crc kubenswrapper[4776]: I1208 09:45:06.368607 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hhrr2" podStartSLOduration=2.957501454 podStartE2EDuration="5.368585827s" podCreationTimestamp="2025-12-08 09:45:01 +0000 UTC" firstStartedPulling="2025-12-08 09:45:03.293079647 +0000 UTC m=+2779.556304669" lastFinishedPulling="2025-12-08 09:45:05.70416402 +0000 UTC m=+2781.967389042" observedRunningTime="2025-12-08 09:45:06.357393006 +0000 UTC m=+2782.620618018" watchObservedRunningTime="2025-12-08 09:45:06.368585827 +0000 UTC m=+2782.631810849" Dec 08 09:45:11 crc kubenswrapper[4776]: I1208 09:45:11.398644 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 09:45:11 crc kubenswrapper[4776]: I1208 09:45:11.399216 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 09:45:11 crc kubenswrapper[4776]: I1208 09:45:11.399255 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" Dec 08 09:45:11 crc kubenswrapper[4776]: I1208 09:45:11.399920 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"51d9d9755c704c9d9d8e04dd484f9ff425e26c39d52bf197718b780d0db4e339"} pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" containerMessage="Container machine-config-daemon failed liveness probe, will 
be restarted" Dec 08 09:45:11 crc kubenswrapper[4776]: I1208 09:45:11.399991 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" containerID="cri-o://51d9d9755c704c9d9d8e04dd484f9ff425e26c39d52bf197718b780d0db4e339" gracePeriod=600 Dec 08 09:45:11 crc kubenswrapper[4776]: I1208 09:45:11.899656 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hhrr2" Dec 08 09:45:11 crc kubenswrapper[4776]: I1208 09:45:11.900040 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hhrr2" Dec 08 09:45:11 crc kubenswrapper[4776]: I1208 09:45:11.987277 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hhrr2" Dec 08 09:45:12 crc kubenswrapper[4776]: I1208 09:45:12.404658 4776 generic.go:334] "Generic (PLEG): container finished" podID="c9788ab1-1031-4103-a769-a4b3177c7268" containerID="51d9d9755c704c9d9d8e04dd484f9ff425e26c39d52bf197718b780d0db4e339" exitCode=0 Dec 08 09:45:12 crc kubenswrapper[4776]: I1208 09:45:12.404740 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" event={"ID":"c9788ab1-1031-4103-a769-a4b3177c7268","Type":"ContainerDied","Data":"51d9d9755c704c9d9d8e04dd484f9ff425e26c39d52bf197718b780d0db4e339"} Dec 08 09:45:12 crc kubenswrapper[4776]: I1208 09:45:12.405064 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" event={"ID":"c9788ab1-1031-4103-a769-a4b3177c7268","Type":"ContainerStarted","Data":"f4e7e4bb5f3b8f333b5df1fd5cf02fbeb8dd9fd2b8374d4f1f4027b6e97c653b"} Dec 08 09:45:12 crc kubenswrapper[4776]: I1208 09:45:12.405083 4776 scope.go:117] 
"RemoveContainer" containerID="eb1b0e925baf44c975c69a0034d44e251745d7626b9fb33d2f97e1c293dbd296" Dec 08 09:45:12 crc kubenswrapper[4776]: I1208 09:45:12.462432 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hhrr2" Dec 08 09:45:12 crc kubenswrapper[4776]: I1208 09:45:12.508612 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hhrr2"] Dec 08 09:45:14 crc kubenswrapper[4776]: I1208 09:45:14.429267 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hhrr2" podUID="8c64d647-91c0-468c-b1ce-d4dd37f59612" containerName="registry-server" containerID="cri-o://dcf16e832088abc14c1b22a2f6ecdc7fcf79f4396e8f57091f441938aa707f79" gracePeriod=2 Dec 08 09:45:15 crc kubenswrapper[4776]: I1208 09:45:15.014340 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hhrr2" Dec 08 09:45:15 crc kubenswrapper[4776]: I1208 09:45:15.190401 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5fdz\" (UniqueName: \"kubernetes.io/projected/8c64d647-91c0-468c-b1ce-d4dd37f59612-kube-api-access-d5fdz\") pod \"8c64d647-91c0-468c-b1ce-d4dd37f59612\" (UID: \"8c64d647-91c0-468c-b1ce-d4dd37f59612\") " Dec 08 09:45:15 crc kubenswrapper[4776]: I1208 09:45:15.190746 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c64d647-91c0-468c-b1ce-d4dd37f59612-utilities\") pod \"8c64d647-91c0-468c-b1ce-d4dd37f59612\" (UID: \"8c64d647-91c0-468c-b1ce-d4dd37f59612\") " Dec 08 09:45:15 crc kubenswrapper[4776]: I1208 09:45:15.190827 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c64d647-91c0-468c-b1ce-d4dd37f59612-catalog-content\") 
pod \"8c64d647-91c0-468c-b1ce-d4dd37f59612\" (UID: \"8c64d647-91c0-468c-b1ce-d4dd37f59612\") " Dec 08 09:45:15 crc kubenswrapper[4776]: I1208 09:45:15.192400 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c64d647-91c0-468c-b1ce-d4dd37f59612-utilities" (OuterVolumeSpecName: "utilities") pod "8c64d647-91c0-468c-b1ce-d4dd37f59612" (UID: "8c64d647-91c0-468c-b1ce-d4dd37f59612"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:45:15 crc kubenswrapper[4776]: I1208 09:45:15.197559 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c64d647-91c0-468c-b1ce-d4dd37f59612-kube-api-access-d5fdz" (OuterVolumeSpecName: "kube-api-access-d5fdz") pod "8c64d647-91c0-468c-b1ce-d4dd37f59612" (UID: "8c64d647-91c0-468c-b1ce-d4dd37f59612"). InnerVolumeSpecName "kube-api-access-d5fdz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:45:15 crc kubenswrapper[4776]: I1208 09:45:15.208599 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c64d647-91c0-468c-b1ce-d4dd37f59612-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8c64d647-91c0-468c-b1ce-d4dd37f59612" (UID: "8c64d647-91c0-468c-b1ce-d4dd37f59612"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:45:15 crc kubenswrapper[4776]: I1208 09:45:15.294491 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c64d647-91c0-468c-b1ce-d4dd37f59612-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 09:45:15 crc kubenswrapper[4776]: I1208 09:45:15.294532 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5fdz\" (UniqueName: \"kubernetes.io/projected/8c64d647-91c0-468c-b1ce-d4dd37f59612-kube-api-access-d5fdz\") on node \"crc\" DevicePath \"\"" Dec 08 09:45:15 crc kubenswrapper[4776]: I1208 09:45:15.294546 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c64d647-91c0-468c-b1ce-d4dd37f59612-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 09:45:15 crc kubenswrapper[4776]: I1208 09:45:15.440611 4776 generic.go:334] "Generic (PLEG): container finished" podID="8c64d647-91c0-468c-b1ce-d4dd37f59612" containerID="dcf16e832088abc14c1b22a2f6ecdc7fcf79f4396e8f57091f441938aa707f79" exitCode=0 Dec 08 09:45:15 crc kubenswrapper[4776]: I1208 09:45:15.440659 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hhrr2" event={"ID":"8c64d647-91c0-468c-b1ce-d4dd37f59612","Type":"ContainerDied","Data":"dcf16e832088abc14c1b22a2f6ecdc7fcf79f4396e8f57091f441938aa707f79"} Dec 08 09:45:15 crc kubenswrapper[4776]: I1208 09:45:15.440710 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hhrr2" event={"ID":"8c64d647-91c0-468c-b1ce-d4dd37f59612","Type":"ContainerDied","Data":"3374442ba08463c315fa3ae7d53d07f6d48d6a4b1ad9821353bee459985d8a2d"} Dec 08 09:45:15 crc kubenswrapper[4776]: I1208 09:45:15.440719 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hhrr2" Dec 08 09:45:15 crc kubenswrapper[4776]: I1208 09:45:15.440735 4776 scope.go:117] "RemoveContainer" containerID="dcf16e832088abc14c1b22a2f6ecdc7fcf79f4396e8f57091f441938aa707f79" Dec 08 09:45:15 crc kubenswrapper[4776]: I1208 09:45:15.472126 4776 scope.go:117] "RemoveContainer" containerID="4abd8cfb129f382dd8827a8ac014d01e116bcdba0f51ae5374bea9a7372334e6" Dec 08 09:45:15 crc kubenswrapper[4776]: I1208 09:45:15.485203 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hhrr2"] Dec 08 09:45:15 crc kubenswrapper[4776]: I1208 09:45:15.494306 4776 scope.go:117] "RemoveContainer" containerID="a2e272ed8bf86ca0d5e4c5a297cf6b84a5a18837d2dc6085f44b67beb40a0dd7" Dec 08 09:45:15 crc kubenswrapper[4776]: I1208 09:45:15.504751 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hhrr2"] Dec 08 09:45:15 crc kubenswrapper[4776]: I1208 09:45:15.563267 4776 scope.go:117] "RemoveContainer" containerID="dcf16e832088abc14c1b22a2f6ecdc7fcf79f4396e8f57091f441938aa707f79" Dec 08 09:45:15 crc kubenswrapper[4776]: E1208 09:45:15.563724 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcf16e832088abc14c1b22a2f6ecdc7fcf79f4396e8f57091f441938aa707f79\": container with ID starting with dcf16e832088abc14c1b22a2f6ecdc7fcf79f4396e8f57091f441938aa707f79 not found: ID does not exist" containerID="dcf16e832088abc14c1b22a2f6ecdc7fcf79f4396e8f57091f441938aa707f79" Dec 08 09:45:15 crc kubenswrapper[4776]: I1208 09:45:15.563767 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcf16e832088abc14c1b22a2f6ecdc7fcf79f4396e8f57091f441938aa707f79"} err="failed to get container status \"dcf16e832088abc14c1b22a2f6ecdc7fcf79f4396e8f57091f441938aa707f79\": rpc error: code = NotFound desc = could not find container 
\"dcf16e832088abc14c1b22a2f6ecdc7fcf79f4396e8f57091f441938aa707f79\": container with ID starting with dcf16e832088abc14c1b22a2f6ecdc7fcf79f4396e8f57091f441938aa707f79 not found: ID does not exist" Dec 08 09:45:15 crc kubenswrapper[4776]: I1208 09:45:15.563791 4776 scope.go:117] "RemoveContainer" containerID="4abd8cfb129f382dd8827a8ac014d01e116bcdba0f51ae5374bea9a7372334e6" Dec 08 09:45:15 crc kubenswrapper[4776]: E1208 09:45:15.564142 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4abd8cfb129f382dd8827a8ac014d01e116bcdba0f51ae5374bea9a7372334e6\": container with ID starting with 4abd8cfb129f382dd8827a8ac014d01e116bcdba0f51ae5374bea9a7372334e6 not found: ID does not exist" containerID="4abd8cfb129f382dd8827a8ac014d01e116bcdba0f51ae5374bea9a7372334e6" Dec 08 09:45:15 crc kubenswrapper[4776]: I1208 09:45:15.564186 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4abd8cfb129f382dd8827a8ac014d01e116bcdba0f51ae5374bea9a7372334e6"} err="failed to get container status \"4abd8cfb129f382dd8827a8ac014d01e116bcdba0f51ae5374bea9a7372334e6\": rpc error: code = NotFound desc = could not find container \"4abd8cfb129f382dd8827a8ac014d01e116bcdba0f51ae5374bea9a7372334e6\": container with ID starting with 4abd8cfb129f382dd8827a8ac014d01e116bcdba0f51ae5374bea9a7372334e6 not found: ID does not exist" Dec 08 09:45:15 crc kubenswrapper[4776]: I1208 09:45:15.564214 4776 scope.go:117] "RemoveContainer" containerID="a2e272ed8bf86ca0d5e4c5a297cf6b84a5a18837d2dc6085f44b67beb40a0dd7" Dec 08 09:45:15 crc kubenswrapper[4776]: E1208 09:45:15.564569 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2e272ed8bf86ca0d5e4c5a297cf6b84a5a18837d2dc6085f44b67beb40a0dd7\": container with ID starting with a2e272ed8bf86ca0d5e4c5a297cf6b84a5a18837d2dc6085f44b67beb40a0dd7 not found: ID does not exist" 
containerID="a2e272ed8bf86ca0d5e4c5a297cf6b84a5a18837d2dc6085f44b67beb40a0dd7" Dec 08 09:45:15 crc kubenswrapper[4776]: I1208 09:45:15.564597 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2e272ed8bf86ca0d5e4c5a297cf6b84a5a18837d2dc6085f44b67beb40a0dd7"} err="failed to get container status \"a2e272ed8bf86ca0d5e4c5a297cf6b84a5a18837d2dc6085f44b67beb40a0dd7\": rpc error: code = NotFound desc = could not find container \"a2e272ed8bf86ca0d5e4c5a297cf6b84a5a18837d2dc6085f44b67beb40a0dd7\": container with ID starting with a2e272ed8bf86ca0d5e4c5a297cf6b84a5a18837d2dc6085f44b67beb40a0dd7 not found: ID does not exist" Dec 08 09:45:16 crc kubenswrapper[4776]: I1208 09:45:16.356164 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c64d647-91c0-468c-b1ce-d4dd37f59612" path="/var/lib/kubelet/pods/8c64d647-91c0-468c-b1ce-d4dd37f59612/volumes" Dec 08 09:45:18 crc kubenswrapper[4776]: I1208 09:45:18.241020 4776 scope.go:117] "RemoveContainer" containerID="14510ab17084a5d159b27749b9a50aff2bde80a03fc6bf1ac4cb8f9804f42f25" Dec 08 09:47:03 crc kubenswrapper[4776]: I1208 09:47:03.495726 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-889xr"] Dec 08 09:47:03 crc kubenswrapper[4776]: E1208 09:47:03.496759 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c64d647-91c0-468c-b1ce-d4dd37f59612" containerName="registry-server" Dec 08 09:47:03 crc kubenswrapper[4776]: I1208 09:47:03.496775 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c64d647-91c0-468c-b1ce-d4dd37f59612" containerName="registry-server" Dec 08 09:47:03 crc kubenswrapper[4776]: E1208 09:47:03.496808 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c64d647-91c0-468c-b1ce-d4dd37f59612" containerName="extract-utilities" Dec 08 09:47:03 crc kubenswrapper[4776]: I1208 09:47:03.496816 4776 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8c64d647-91c0-468c-b1ce-d4dd37f59612" containerName="extract-utilities" Dec 08 09:47:03 crc kubenswrapper[4776]: E1208 09:47:03.496837 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c64d647-91c0-468c-b1ce-d4dd37f59612" containerName="extract-content" Dec 08 09:47:03 crc kubenswrapper[4776]: I1208 09:47:03.496845 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c64d647-91c0-468c-b1ce-d4dd37f59612" containerName="extract-content" Dec 08 09:47:03 crc kubenswrapper[4776]: E1208 09:47:03.496857 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48ff7820-516e-4d30-9d51-e9a9c7582c81" containerName="collect-profiles" Dec 08 09:47:03 crc kubenswrapper[4776]: I1208 09:47:03.496866 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="48ff7820-516e-4d30-9d51-e9a9c7582c81" containerName="collect-profiles" Dec 08 09:47:03 crc kubenswrapper[4776]: I1208 09:47:03.497131 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="48ff7820-516e-4d30-9d51-e9a9c7582c81" containerName="collect-profiles" Dec 08 09:47:03 crc kubenswrapper[4776]: I1208 09:47:03.497194 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c64d647-91c0-468c-b1ce-d4dd37f59612" containerName="registry-server" Dec 08 09:47:03 crc kubenswrapper[4776]: I1208 09:47:03.499631 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-889xr" Dec 08 09:47:03 crc kubenswrapper[4776]: I1208 09:47:03.516080 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-889xr"] Dec 08 09:47:03 crc kubenswrapper[4776]: I1208 09:47:03.591880 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd629dff-88dd-4d07-ba43-75dff0ca1e98-catalog-content\") pod \"community-operators-889xr\" (UID: \"fd629dff-88dd-4d07-ba43-75dff0ca1e98\") " pod="openshift-marketplace/community-operators-889xr" Dec 08 09:47:03 crc kubenswrapper[4776]: I1208 09:47:03.592014 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd629dff-88dd-4d07-ba43-75dff0ca1e98-utilities\") pod \"community-operators-889xr\" (UID: \"fd629dff-88dd-4d07-ba43-75dff0ca1e98\") " pod="openshift-marketplace/community-operators-889xr" Dec 08 09:47:03 crc kubenswrapper[4776]: I1208 09:47:03.592036 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gl9t\" (UniqueName: \"kubernetes.io/projected/fd629dff-88dd-4d07-ba43-75dff0ca1e98-kube-api-access-9gl9t\") pod \"community-operators-889xr\" (UID: \"fd629dff-88dd-4d07-ba43-75dff0ca1e98\") " pod="openshift-marketplace/community-operators-889xr" Dec 08 09:47:03 crc kubenswrapper[4776]: I1208 09:47:03.695283 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd629dff-88dd-4d07-ba43-75dff0ca1e98-utilities\") pod \"community-operators-889xr\" (UID: \"fd629dff-88dd-4d07-ba43-75dff0ca1e98\") " pod="openshift-marketplace/community-operators-889xr" Dec 08 09:47:03 crc kubenswrapper[4776]: I1208 09:47:03.695363 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9gl9t\" (UniqueName: \"kubernetes.io/projected/fd629dff-88dd-4d07-ba43-75dff0ca1e98-kube-api-access-9gl9t\") pod \"community-operators-889xr\" (UID: \"fd629dff-88dd-4d07-ba43-75dff0ca1e98\") " pod="openshift-marketplace/community-operators-889xr" Dec 08 09:47:03 crc kubenswrapper[4776]: I1208 09:47:03.695687 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd629dff-88dd-4d07-ba43-75dff0ca1e98-catalog-content\") pod \"community-operators-889xr\" (UID: \"fd629dff-88dd-4d07-ba43-75dff0ca1e98\") " pod="openshift-marketplace/community-operators-889xr" Dec 08 09:47:03 crc kubenswrapper[4776]: I1208 09:47:03.696216 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd629dff-88dd-4d07-ba43-75dff0ca1e98-utilities\") pod \"community-operators-889xr\" (UID: \"fd629dff-88dd-4d07-ba43-75dff0ca1e98\") " pod="openshift-marketplace/community-operators-889xr" Dec 08 09:47:03 crc kubenswrapper[4776]: I1208 09:47:03.696364 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd629dff-88dd-4d07-ba43-75dff0ca1e98-catalog-content\") pod \"community-operators-889xr\" (UID: \"fd629dff-88dd-4d07-ba43-75dff0ca1e98\") " pod="openshift-marketplace/community-operators-889xr" Dec 08 09:47:03 crc kubenswrapper[4776]: I1208 09:47:03.716698 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gl9t\" (UniqueName: \"kubernetes.io/projected/fd629dff-88dd-4d07-ba43-75dff0ca1e98-kube-api-access-9gl9t\") pod \"community-operators-889xr\" (UID: \"fd629dff-88dd-4d07-ba43-75dff0ca1e98\") " pod="openshift-marketplace/community-operators-889xr" Dec 08 09:47:03 crc kubenswrapper[4776]: I1208 09:47:03.822396 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-889xr" Dec 08 09:47:04 crc kubenswrapper[4776]: I1208 09:47:04.398112 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-889xr"] Dec 08 09:47:04 crc kubenswrapper[4776]: I1208 09:47:04.584811 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-889xr" event={"ID":"fd629dff-88dd-4d07-ba43-75dff0ca1e98","Type":"ContainerStarted","Data":"345aaa2de20838e33128e365b749d5f5e3cf52e4618ab7a69dc73d63c1ea7c00"} Dec 08 09:47:05 crc kubenswrapper[4776]: I1208 09:47:05.597306 4776 generic.go:334] "Generic (PLEG): container finished" podID="fd629dff-88dd-4d07-ba43-75dff0ca1e98" containerID="dcebb34e283aa21725044bc8282bb6bee3bfb07ce4f4cd017cba23a54e0a8a02" exitCode=0 Dec 08 09:47:05 crc kubenswrapper[4776]: I1208 09:47:05.597416 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-889xr" event={"ID":"fd629dff-88dd-4d07-ba43-75dff0ca1e98","Type":"ContainerDied","Data":"dcebb34e283aa21725044bc8282bb6bee3bfb07ce4f4cd017cba23a54e0a8a02"} Dec 08 09:47:06 crc kubenswrapper[4776]: I1208 09:47:06.614147 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-889xr" event={"ID":"fd629dff-88dd-4d07-ba43-75dff0ca1e98","Type":"ContainerStarted","Data":"913a80c0672bc5519f10559e0e59370f2fc0eaa3f9a7fff8a52bd92de70634d8"} Dec 08 09:47:07 crc kubenswrapper[4776]: I1208 09:47:07.627504 4776 generic.go:334] "Generic (PLEG): container finished" podID="fd629dff-88dd-4d07-ba43-75dff0ca1e98" containerID="913a80c0672bc5519f10559e0e59370f2fc0eaa3f9a7fff8a52bd92de70634d8" exitCode=0 Dec 08 09:47:07 crc kubenswrapper[4776]: I1208 09:47:07.628023 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-889xr" 
event={"ID":"fd629dff-88dd-4d07-ba43-75dff0ca1e98","Type":"ContainerDied","Data":"913a80c0672bc5519f10559e0e59370f2fc0eaa3f9a7fff8a52bd92de70634d8"} Dec 08 09:47:08 crc kubenswrapper[4776]: I1208 09:47:08.651439 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-889xr" event={"ID":"fd629dff-88dd-4d07-ba43-75dff0ca1e98","Type":"ContainerStarted","Data":"6c381bad19775f7661a82fbc4d58b29e5649c0dd3a4bd783c713156c55c36754"} Dec 08 09:47:08 crc kubenswrapper[4776]: I1208 09:47:08.672587 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-889xr" podStartSLOduration=3.191378839 podStartE2EDuration="5.672568008s" podCreationTimestamp="2025-12-08 09:47:03 +0000 UTC" firstStartedPulling="2025-12-08 09:47:05.599825371 +0000 UTC m=+2901.863050393" lastFinishedPulling="2025-12-08 09:47:08.08101454 +0000 UTC m=+2904.344239562" observedRunningTime="2025-12-08 09:47:08.668640731 +0000 UTC m=+2904.931865763" watchObservedRunningTime="2025-12-08 09:47:08.672568008 +0000 UTC m=+2904.935793020" Dec 08 09:47:11 crc kubenswrapper[4776]: I1208 09:47:11.399547 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 09:47:11 crc kubenswrapper[4776]: I1208 09:47:11.400112 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 09:47:11 crc kubenswrapper[4776]: I1208 09:47:11.684152 4776 generic.go:334] "Generic (PLEG): container finished" 
podID="9cd841cc-611f-406b-b9d5-8c242c1321ba" containerID="1f9dfa4aec860bc6ebbc9451504213653443249096a55cbb253caf98eb110f8f" exitCode=0 Dec 08 09:47:11 crc kubenswrapper[4776]: I1208 09:47:11.684231 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tnlnp" event={"ID":"9cd841cc-611f-406b-b9d5-8c242c1321ba","Type":"ContainerDied","Data":"1f9dfa4aec860bc6ebbc9451504213653443249096a55cbb253caf98eb110f8f"} Dec 08 09:47:13 crc kubenswrapper[4776]: I1208 09:47:13.305651 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tnlnp" Dec 08 09:47:13 crc kubenswrapper[4776]: I1208 09:47:13.425787 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cd841cc-611f-406b-b9d5-8c242c1321ba-nova-combined-ca-bundle\") pod \"9cd841cc-611f-406b-b9d5-8c242c1321ba\" (UID: \"9cd841cc-611f-406b-b9d5-8c242c1321ba\") " Dec 08 09:47:13 crc kubenswrapper[4776]: I1208 09:47:13.425876 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9cd841cc-611f-406b-b9d5-8c242c1321ba-nova-migration-ssh-key-1\") pod \"9cd841cc-611f-406b-b9d5-8c242c1321ba\" (UID: \"9cd841cc-611f-406b-b9d5-8c242c1321ba\") " Dec 08 09:47:13 crc kubenswrapper[4776]: I1208 09:47:13.425905 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9cd841cc-611f-406b-b9d5-8c242c1321ba-nova-cell1-compute-config-0\") pod \"9cd841cc-611f-406b-b9d5-8c242c1321ba\" (UID: \"9cd841cc-611f-406b-b9d5-8c242c1321ba\") " Dec 08 09:47:13 crc kubenswrapper[4776]: I1208 09:47:13.425942 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/9cd841cc-611f-406b-b9d5-8c242c1321ba-nova-migration-ssh-key-0\") pod \"9cd841cc-611f-406b-b9d5-8c242c1321ba\" (UID: \"9cd841cc-611f-406b-b9d5-8c242c1321ba\") " Dec 08 09:47:13 crc kubenswrapper[4776]: I1208 09:47:13.426110 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9cd841cc-611f-406b-b9d5-8c242c1321ba-ssh-key\") pod \"9cd841cc-611f-406b-b9d5-8c242c1321ba\" (UID: \"9cd841cc-611f-406b-b9d5-8c242c1321ba\") " Dec 08 09:47:13 crc kubenswrapper[4776]: I1208 09:47:13.426150 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z82s2\" (UniqueName: \"kubernetes.io/projected/9cd841cc-611f-406b-b9d5-8c242c1321ba-kube-api-access-z82s2\") pod \"9cd841cc-611f-406b-b9d5-8c242c1321ba\" (UID: \"9cd841cc-611f-406b-b9d5-8c242c1321ba\") " Dec 08 09:47:13 crc kubenswrapper[4776]: I1208 09:47:13.426220 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9cd841cc-611f-406b-b9d5-8c242c1321ba-nova-extra-config-0\") pod \"9cd841cc-611f-406b-b9d5-8c242c1321ba\" (UID: \"9cd841cc-611f-406b-b9d5-8c242c1321ba\") " Dec 08 09:47:13 crc kubenswrapper[4776]: I1208 09:47:13.426364 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9cd841cc-611f-406b-b9d5-8c242c1321ba-nova-cell1-compute-config-1\") pod \"9cd841cc-611f-406b-b9d5-8c242c1321ba\" (UID: \"9cd841cc-611f-406b-b9d5-8c242c1321ba\") " Dec 08 09:47:13 crc kubenswrapper[4776]: I1208 09:47:13.426405 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9cd841cc-611f-406b-b9d5-8c242c1321ba-inventory\") pod \"9cd841cc-611f-406b-b9d5-8c242c1321ba\" (UID: \"9cd841cc-611f-406b-b9d5-8c242c1321ba\") " Dec 08 09:47:13 crc 
kubenswrapper[4776]: I1208 09:47:13.431759 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cd841cc-611f-406b-b9d5-8c242c1321ba-kube-api-access-z82s2" (OuterVolumeSpecName: "kube-api-access-z82s2") pod "9cd841cc-611f-406b-b9d5-8c242c1321ba" (UID: "9cd841cc-611f-406b-b9d5-8c242c1321ba"). InnerVolumeSpecName "kube-api-access-z82s2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:47:13 crc kubenswrapper[4776]: I1208 09:47:13.462163 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cd841cc-611f-406b-b9d5-8c242c1321ba-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9cd841cc-611f-406b-b9d5-8c242c1321ba" (UID: "9cd841cc-611f-406b-b9d5-8c242c1321ba"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:47:13 crc kubenswrapper[4776]: I1208 09:47:13.463680 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cd841cc-611f-406b-b9d5-8c242c1321ba-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "9cd841cc-611f-406b-b9d5-8c242c1321ba" (UID: "9cd841cc-611f-406b-b9d5-8c242c1321ba"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:47:13 crc kubenswrapper[4776]: I1208 09:47:13.468148 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cd841cc-611f-406b-b9d5-8c242c1321ba-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "9cd841cc-611f-406b-b9d5-8c242c1321ba" (UID: "9cd841cc-611f-406b-b9d5-8c242c1321ba"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:47:13 crc kubenswrapper[4776]: I1208 09:47:13.470692 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cd841cc-611f-406b-b9d5-8c242c1321ba-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "9cd841cc-611f-406b-b9d5-8c242c1321ba" (UID: "9cd841cc-611f-406b-b9d5-8c242c1321ba"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:47:13 crc kubenswrapper[4776]: I1208 09:47:13.472963 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cd841cc-611f-406b-b9d5-8c242c1321ba-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "9cd841cc-611f-406b-b9d5-8c242c1321ba" (UID: "9cd841cc-611f-406b-b9d5-8c242c1321ba"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:47:13 crc kubenswrapper[4776]: I1208 09:47:13.480358 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cd841cc-611f-406b-b9d5-8c242c1321ba-inventory" (OuterVolumeSpecName: "inventory") pod "9cd841cc-611f-406b-b9d5-8c242c1321ba" (UID: "9cd841cc-611f-406b-b9d5-8c242c1321ba"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:47:13 crc kubenswrapper[4776]: I1208 09:47:13.486903 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cd841cc-611f-406b-b9d5-8c242c1321ba-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "9cd841cc-611f-406b-b9d5-8c242c1321ba" (UID: "9cd841cc-611f-406b-b9d5-8c242c1321ba"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 09:47:13 crc kubenswrapper[4776]: I1208 09:47:13.499232 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cd841cc-611f-406b-b9d5-8c242c1321ba-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "9cd841cc-611f-406b-b9d5-8c242c1321ba" (UID: "9cd841cc-611f-406b-b9d5-8c242c1321ba"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:47:13 crc kubenswrapper[4776]: I1208 09:47:13.529382 4776 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9cd841cc-611f-406b-b9d5-8c242c1321ba-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 08 09:47:13 crc kubenswrapper[4776]: I1208 09:47:13.529428 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9cd841cc-611f-406b-b9d5-8c242c1321ba-inventory\") on node \"crc\" DevicePath \"\"" Dec 08 09:47:13 crc kubenswrapper[4776]: I1208 09:47:13.529442 4776 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cd841cc-611f-406b-b9d5-8c242c1321ba-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:47:13 crc kubenswrapper[4776]: I1208 09:47:13.529489 4776 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9cd841cc-611f-406b-b9d5-8c242c1321ba-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 08 09:47:13 crc kubenswrapper[4776]: I1208 09:47:13.529522 4776 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9cd841cc-611f-406b-b9d5-8c242c1321ba-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 08 09:47:13 crc kubenswrapper[4776]: I1208 09:47:13.529539 4776 
reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9cd841cc-611f-406b-b9d5-8c242c1321ba-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 08 09:47:13 crc kubenswrapper[4776]: I1208 09:47:13.529554 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9cd841cc-611f-406b-b9d5-8c242c1321ba-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 08 09:47:13 crc kubenswrapper[4776]: I1208 09:47:13.529570 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z82s2\" (UniqueName: \"kubernetes.io/projected/9cd841cc-611f-406b-b9d5-8c242c1321ba-kube-api-access-z82s2\") on node \"crc\" DevicePath \"\"" Dec 08 09:47:13 crc kubenswrapper[4776]: I1208 09:47:13.529584 4776 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9cd841cc-611f-406b-b9d5-8c242c1321ba-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Dec 08 09:47:13 crc kubenswrapper[4776]: I1208 09:47:13.711130 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tnlnp" event={"ID":"9cd841cc-611f-406b-b9d5-8c242c1321ba","Type":"ContainerDied","Data":"289c8ac893b0ccb0717b004ca3a7b9e32084d9423ed90bbce073f3072998ba5b"} Dec 08 09:47:13 crc kubenswrapper[4776]: I1208 09:47:13.711472 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="289c8ac893b0ccb0717b004ca3a7b9e32084d9423ed90bbce073f3072998ba5b" Dec 08 09:47:13 crc kubenswrapper[4776]: I1208 09:47:13.711240 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tnlnp" Dec 08 09:47:13 crc kubenswrapper[4776]: I1208 09:47:13.822488 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-889xr" Dec 08 09:47:13 crc kubenswrapper[4776]: I1208 09:47:13.822545 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-889xr" Dec 08 09:47:13 crc kubenswrapper[4776]: I1208 09:47:13.824004 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-89ghm"] Dec 08 09:47:13 crc kubenswrapper[4776]: E1208 09:47:13.824623 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cd841cc-611f-406b-b9d5-8c242c1321ba" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 08 09:47:13 crc kubenswrapper[4776]: I1208 09:47:13.824654 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cd841cc-611f-406b-b9d5-8c242c1321ba" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 08 09:47:13 crc kubenswrapper[4776]: I1208 09:47:13.824975 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cd841cc-611f-406b-b9d5-8c242c1321ba" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 08 09:47:13 crc kubenswrapper[4776]: I1208 09:47:13.826366 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-89ghm" Dec 08 09:47:13 crc kubenswrapper[4776]: I1208 09:47:13.829429 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Dec 08 09:47:13 crc kubenswrapper[4776]: I1208 09:47:13.830002 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 08 09:47:13 crc kubenswrapper[4776]: I1208 09:47:13.830050 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 08 09:47:13 crc kubenswrapper[4776]: I1208 09:47:13.830079 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tm845" Dec 08 09:47:13 crc kubenswrapper[4776]: I1208 09:47:13.830251 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 08 09:47:13 crc kubenswrapper[4776]: I1208 09:47:13.836479 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-89ghm"] Dec 08 09:47:13 crc kubenswrapper[4776]: I1208 09:47:13.893024 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-889xr" Dec 08 09:47:13 crc kubenswrapper[4776]: I1208 09:47:13.948279 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-89ghm\" (UID: \"b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-89ghm" Dec 08 09:47:13 crc kubenswrapper[4776]: I1208 09:47:13.948377 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-89ghm\" (UID: \"b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-89ghm" Dec 08 09:47:13 crc kubenswrapper[4776]: I1208 09:47:13.948409 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-89ghm\" (UID: \"b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-89ghm" Dec 08 09:47:13 crc kubenswrapper[4776]: I1208 09:47:13.948435 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kds7n\" (UniqueName: \"kubernetes.io/projected/b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e-kube-api-access-kds7n\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-89ghm\" (UID: \"b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-89ghm" Dec 08 09:47:13 crc kubenswrapper[4776]: I1208 09:47:13.948813 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-89ghm\" (UID: \"b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-89ghm" Dec 08 09:47:13 crc kubenswrapper[4776]: I1208 09:47:13.948884 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: 
\"kubernetes.io/secret/b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-89ghm\" (UID: \"b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-89ghm" Dec 08 09:47:13 crc kubenswrapper[4776]: I1208 09:47:13.949046 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-89ghm\" (UID: \"b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-89ghm" Dec 08 09:47:14 crc kubenswrapper[4776]: I1208 09:47:14.051586 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-89ghm\" (UID: \"b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-89ghm" Dec 08 09:47:14 crc kubenswrapper[4776]: I1208 09:47:14.051689 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-89ghm\" (UID: \"b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-89ghm" Dec 08 09:47:14 crc kubenswrapper[4776]: I1208 09:47:14.051751 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-89ghm\" (UID: 
\"b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-89ghm" Dec 08 09:47:14 crc kubenswrapper[4776]: I1208 09:47:14.051770 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-89ghm\" (UID: \"b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-89ghm" Dec 08 09:47:14 crc kubenswrapper[4776]: I1208 09:47:14.051790 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kds7n\" (UniqueName: \"kubernetes.io/projected/b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e-kube-api-access-kds7n\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-89ghm\" (UID: \"b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-89ghm" Dec 08 09:47:14 crc kubenswrapper[4776]: I1208 09:47:14.051843 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-89ghm\" (UID: \"b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-89ghm" Dec 08 09:47:14 crc kubenswrapper[4776]: I1208 09:47:14.051882 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-89ghm\" (UID: \"b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-89ghm" Dec 08 09:47:14 crc kubenswrapper[4776]: I1208 09:47:14.056104 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-89ghm\" (UID: \"b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-89ghm" Dec 08 09:47:14 crc kubenswrapper[4776]: I1208 09:47:14.057030 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-89ghm\" (UID: \"b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-89ghm" Dec 08 09:47:14 crc kubenswrapper[4776]: I1208 09:47:14.057921 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-89ghm\" (UID: \"b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-89ghm" Dec 08 09:47:14 crc kubenswrapper[4776]: I1208 09:47:14.058654 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-89ghm\" (UID: \"b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-89ghm" Dec 08 09:47:14 crc kubenswrapper[4776]: I1208 09:47:14.059783 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e-ceilometer-compute-config-data-0\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-89ghm\" (UID: \"b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-89ghm" Dec 08 09:47:14 crc kubenswrapper[4776]: I1208 09:47:14.066944 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-89ghm\" (UID: \"b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-89ghm" Dec 08 09:47:14 crc kubenswrapper[4776]: I1208 09:47:14.070034 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kds7n\" (UniqueName: \"kubernetes.io/projected/b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e-kube-api-access-kds7n\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-89ghm\" (UID: \"b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-89ghm" Dec 08 09:47:14 crc kubenswrapper[4776]: I1208 09:47:14.174372 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-89ghm" Dec 08 09:47:14 crc kubenswrapper[4776]: W1208 09:47:14.745145 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0b1960a_6fc8_4fd1_adb6_9e7b5fe42f0e.slice/crio-cf583dfedb7445f2cc913dc95561dfac1bf0a973576a9666fc20045a66f5f416 WatchSource:0}: Error finding container cf583dfedb7445f2cc913dc95561dfac1bf0a973576a9666fc20045a66f5f416: Status 404 returned error can't find the container with id cf583dfedb7445f2cc913dc95561dfac1bf0a973576a9666fc20045a66f5f416 Dec 08 09:47:14 crc kubenswrapper[4776]: I1208 09:47:14.747575 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-89ghm"] Dec 08 09:47:14 crc kubenswrapper[4776]: I1208 09:47:14.778262 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-889xr" Dec 08 09:47:14 crc kubenswrapper[4776]: I1208 09:47:14.838281 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-889xr"] Dec 08 09:47:15 crc kubenswrapper[4776]: I1208 09:47:15.731718 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-89ghm" event={"ID":"b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e","Type":"ContainerStarted","Data":"d62beab68b5bdbc0fde2b4a810b472ce682b470a107adefb332479bd0bda28c7"} Dec 08 09:47:15 crc kubenswrapper[4776]: I1208 09:47:15.732130 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-89ghm" event={"ID":"b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e","Type":"ContainerStarted","Data":"cf583dfedb7445f2cc913dc95561dfac1bf0a973576a9666fc20045a66f5f416"} Dec 08 09:47:15 crc kubenswrapper[4776]: I1208 09:47:15.757746 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-89ghm" podStartSLOduration=2.404762938 podStartE2EDuration="2.757722267s" podCreationTimestamp="2025-12-08 09:47:13 +0000 UTC" firstStartedPulling="2025-12-08 09:47:14.750453749 +0000 UTC m=+2911.013678771" lastFinishedPulling="2025-12-08 09:47:15.103413078 +0000 UTC m=+2911.366638100" observedRunningTime="2025-12-08 09:47:15.750449141 +0000 UTC m=+2912.013674213" watchObservedRunningTime="2025-12-08 09:47:15.757722267 +0000 UTC m=+2912.020947309" Dec 08 09:47:16 crc kubenswrapper[4776]: I1208 09:47:16.754879 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-889xr" podUID="fd629dff-88dd-4d07-ba43-75dff0ca1e98" containerName="registry-server" containerID="cri-o://6c381bad19775f7661a82fbc4d58b29e5649c0dd3a4bd783c713156c55c36754" gracePeriod=2 Dec 08 09:47:17 crc kubenswrapper[4776]: I1208 09:47:17.256189 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-889xr" Dec 08 09:47:17 crc kubenswrapper[4776]: I1208 09:47:17.431642 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd629dff-88dd-4d07-ba43-75dff0ca1e98-utilities\") pod \"fd629dff-88dd-4d07-ba43-75dff0ca1e98\" (UID: \"fd629dff-88dd-4d07-ba43-75dff0ca1e98\") " Dec 08 09:47:17 crc kubenswrapper[4776]: I1208 09:47:17.432019 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd629dff-88dd-4d07-ba43-75dff0ca1e98-catalog-content\") pod \"fd629dff-88dd-4d07-ba43-75dff0ca1e98\" (UID: \"fd629dff-88dd-4d07-ba43-75dff0ca1e98\") " Dec 08 09:47:17 crc kubenswrapper[4776]: I1208 09:47:17.432196 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gl9t\" (UniqueName: \"kubernetes.io/projected/fd629dff-88dd-4d07-ba43-75dff0ca1e98-kube-api-access-9gl9t\") pod \"fd629dff-88dd-4d07-ba43-75dff0ca1e98\" (UID: \"fd629dff-88dd-4d07-ba43-75dff0ca1e98\") " Dec 08 09:47:17 crc kubenswrapper[4776]: I1208 09:47:17.432940 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd629dff-88dd-4d07-ba43-75dff0ca1e98-utilities" (OuterVolumeSpecName: "utilities") pod "fd629dff-88dd-4d07-ba43-75dff0ca1e98" (UID: "fd629dff-88dd-4d07-ba43-75dff0ca1e98"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:47:17 crc kubenswrapper[4776]: I1208 09:47:17.441486 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd629dff-88dd-4d07-ba43-75dff0ca1e98-kube-api-access-9gl9t" (OuterVolumeSpecName: "kube-api-access-9gl9t") pod "fd629dff-88dd-4d07-ba43-75dff0ca1e98" (UID: "fd629dff-88dd-4d07-ba43-75dff0ca1e98"). InnerVolumeSpecName "kube-api-access-9gl9t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:47:17 crc kubenswrapper[4776]: I1208 09:47:17.485337 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd629dff-88dd-4d07-ba43-75dff0ca1e98-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fd629dff-88dd-4d07-ba43-75dff0ca1e98" (UID: "fd629dff-88dd-4d07-ba43-75dff0ca1e98"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:47:17 crc kubenswrapper[4776]: I1208 09:47:17.534644 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gl9t\" (UniqueName: \"kubernetes.io/projected/fd629dff-88dd-4d07-ba43-75dff0ca1e98-kube-api-access-9gl9t\") on node \"crc\" DevicePath \"\"" Dec 08 09:47:17 crc kubenswrapper[4776]: I1208 09:47:17.534684 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd629dff-88dd-4d07-ba43-75dff0ca1e98-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 09:47:17 crc kubenswrapper[4776]: I1208 09:47:17.534694 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd629dff-88dd-4d07-ba43-75dff0ca1e98-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 09:47:17 crc kubenswrapper[4776]: I1208 09:47:17.765873 4776 generic.go:334] "Generic (PLEG): container finished" podID="fd629dff-88dd-4d07-ba43-75dff0ca1e98" containerID="6c381bad19775f7661a82fbc4d58b29e5649c0dd3a4bd783c713156c55c36754" exitCode=0 Dec 08 09:47:17 crc kubenswrapper[4776]: I1208 09:47:17.765914 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-889xr" event={"ID":"fd629dff-88dd-4d07-ba43-75dff0ca1e98","Type":"ContainerDied","Data":"6c381bad19775f7661a82fbc4d58b29e5649c0dd3a4bd783c713156c55c36754"} Dec 08 09:47:17 crc kubenswrapper[4776]: I1208 09:47:17.765946 4776 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-889xr" event={"ID":"fd629dff-88dd-4d07-ba43-75dff0ca1e98","Type":"ContainerDied","Data":"345aaa2de20838e33128e365b749d5f5e3cf52e4618ab7a69dc73d63c1ea7c00"} Dec 08 09:47:17 crc kubenswrapper[4776]: I1208 09:47:17.765949 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-889xr" Dec 08 09:47:17 crc kubenswrapper[4776]: I1208 09:47:17.765966 4776 scope.go:117] "RemoveContainer" containerID="6c381bad19775f7661a82fbc4d58b29e5649c0dd3a4bd783c713156c55c36754" Dec 08 09:47:17 crc kubenswrapper[4776]: I1208 09:47:17.800432 4776 scope.go:117] "RemoveContainer" containerID="913a80c0672bc5519f10559e0e59370f2fc0eaa3f9a7fff8a52bd92de70634d8" Dec 08 09:47:17 crc kubenswrapper[4776]: I1208 09:47:17.801881 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-889xr"] Dec 08 09:47:17 crc kubenswrapper[4776]: I1208 09:47:17.812682 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-889xr"] Dec 08 09:47:17 crc kubenswrapper[4776]: I1208 09:47:17.823959 4776 scope.go:117] "RemoveContainer" containerID="dcebb34e283aa21725044bc8282bb6bee3bfb07ce4f4cd017cba23a54e0a8a02" Dec 08 09:47:17 crc kubenswrapper[4776]: I1208 09:47:17.880239 4776 scope.go:117] "RemoveContainer" containerID="6c381bad19775f7661a82fbc4d58b29e5649c0dd3a4bd783c713156c55c36754" Dec 08 09:47:17 crc kubenswrapper[4776]: E1208 09:47:17.880752 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c381bad19775f7661a82fbc4d58b29e5649c0dd3a4bd783c713156c55c36754\": container with ID starting with 6c381bad19775f7661a82fbc4d58b29e5649c0dd3a4bd783c713156c55c36754 not found: ID does not exist" containerID="6c381bad19775f7661a82fbc4d58b29e5649c0dd3a4bd783c713156c55c36754" Dec 08 09:47:17 crc kubenswrapper[4776]: I1208 
09:47:17.880793 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c381bad19775f7661a82fbc4d58b29e5649c0dd3a4bd783c713156c55c36754"} err="failed to get container status \"6c381bad19775f7661a82fbc4d58b29e5649c0dd3a4bd783c713156c55c36754\": rpc error: code = NotFound desc = could not find container \"6c381bad19775f7661a82fbc4d58b29e5649c0dd3a4bd783c713156c55c36754\": container with ID starting with 6c381bad19775f7661a82fbc4d58b29e5649c0dd3a4bd783c713156c55c36754 not found: ID does not exist" Dec 08 09:47:17 crc kubenswrapper[4776]: I1208 09:47:17.880818 4776 scope.go:117] "RemoveContainer" containerID="913a80c0672bc5519f10559e0e59370f2fc0eaa3f9a7fff8a52bd92de70634d8" Dec 08 09:47:17 crc kubenswrapper[4776]: E1208 09:47:17.881126 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"913a80c0672bc5519f10559e0e59370f2fc0eaa3f9a7fff8a52bd92de70634d8\": container with ID starting with 913a80c0672bc5519f10559e0e59370f2fc0eaa3f9a7fff8a52bd92de70634d8 not found: ID does not exist" containerID="913a80c0672bc5519f10559e0e59370f2fc0eaa3f9a7fff8a52bd92de70634d8" Dec 08 09:47:17 crc kubenswrapper[4776]: I1208 09:47:17.881153 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"913a80c0672bc5519f10559e0e59370f2fc0eaa3f9a7fff8a52bd92de70634d8"} err="failed to get container status \"913a80c0672bc5519f10559e0e59370f2fc0eaa3f9a7fff8a52bd92de70634d8\": rpc error: code = NotFound desc = could not find container \"913a80c0672bc5519f10559e0e59370f2fc0eaa3f9a7fff8a52bd92de70634d8\": container with ID starting with 913a80c0672bc5519f10559e0e59370f2fc0eaa3f9a7fff8a52bd92de70634d8 not found: ID does not exist" Dec 08 09:47:17 crc kubenswrapper[4776]: I1208 09:47:17.881166 4776 scope.go:117] "RemoveContainer" containerID="dcebb34e283aa21725044bc8282bb6bee3bfb07ce4f4cd017cba23a54e0a8a02" Dec 08 09:47:17 crc 
kubenswrapper[4776]: E1208 09:47:17.881490 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcebb34e283aa21725044bc8282bb6bee3bfb07ce4f4cd017cba23a54e0a8a02\": container with ID starting with dcebb34e283aa21725044bc8282bb6bee3bfb07ce4f4cd017cba23a54e0a8a02 not found: ID does not exist" containerID="dcebb34e283aa21725044bc8282bb6bee3bfb07ce4f4cd017cba23a54e0a8a02" Dec 08 09:47:17 crc kubenswrapper[4776]: I1208 09:47:17.881539 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcebb34e283aa21725044bc8282bb6bee3bfb07ce4f4cd017cba23a54e0a8a02"} err="failed to get container status \"dcebb34e283aa21725044bc8282bb6bee3bfb07ce4f4cd017cba23a54e0a8a02\": rpc error: code = NotFound desc = could not find container \"dcebb34e283aa21725044bc8282bb6bee3bfb07ce4f4cd017cba23a54e0a8a02\": container with ID starting with dcebb34e283aa21725044bc8282bb6bee3bfb07ce4f4cd017cba23a54e0a8a02 not found: ID does not exist" Dec 08 09:47:18 crc kubenswrapper[4776]: I1208 09:47:18.357161 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd629dff-88dd-4d07-ba43-75dff0ca1e98" path="/var/lib/kubelet/pods/fd629dff-88dd-4d07-ba43-75dff0ca1e98/volumes" Dec 08 09:47:41 crc kubenswrapper[4776]: I1208 09:47:41.398767 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 09:47:41 crc kubenswrapper[4776]: I1208 09:47:41.399305 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Dec 08 09:47:55 crc kubenswrapper[4776]: I1208 09:47:55.345771 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xdvtz"] Dec 08 09:47:55 crc kubenswrapper[4776]: E1208 09:47:55.346777 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd629dff-88dd-4d07-ba43-75dff0ca1e98" containerName="registry-server" Dec 08 09:47:55 crc kubenswrapper[4776]: I1208 09:47:55.346792 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd629dff-88dd-4d07-ba43-75dff0ca1e98" containerName="registry-server" Dec 08 09:47:55 crc kubenswrapper[4776]: E1208 09:47:55.346838 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd629dff-88dd-4d07-ba43-75dff0ca1e98" containerName="extract-utilities" Dec 08 09:47:55 crc kubenswrapper[4776]: I1208 09:47:55.346844 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd629dff-88dd-4d07-ba43-75dff0ca1e98" containerName="extract-utilities" Dec 08 09:47:55 crc kubenswrapper[4776]: E1208 09:47:55.346856 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd629dff-88dd-4d07-ba43-75dff0ca1e98" containerName="extract-content" Dec 08 09:47:55 crc kubenswrapper[4776]: I1208 09:47:55.346863 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd629dff-88dd-4d07-ba43-75dff0ca1e98" containerName="extract-content" Dec 08 09:47:55 crc kubenswrapper[4776]: I1208 09:47:55.347108 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd629dff-88dd-4d07-ba43-75dff0ca1e98" containerName="registry-server" Dec 08 09:47:55 crc kubenswrapper[4776]: I1208 09:47:55.348702 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xdvtz" Dec 08 09:47:55 crc kubenswrapper[4776]: I1208 09:47:55.362760 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xdvtz"] Dec 08 09:47:55 crc kubenswrapper[4776]: I1208 09:47:55.451682 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b163c8b2-3bbf-40e4-af9a-a9d706dde833-utilities\") pod \"certified-operators-xdvtz\" (UID: \"b163c8b2-3bbf-40e4-af9a-a9d706dde833\") " pod="openshift-marketplace/certified-operators-xdvtz" Dec 08 09:47:55 crc kubenswrapper[4776]: I1208 09:47:55.451743 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f454r\" (UniqueName: \"kubernetes.io/projected/b163c8b2-3bbf-40e4-af9a-a9d706dde833-kube-api-access-f454r\") pod \"certified-operators-xdvtz\" (UID: \"b163c8b2-3bbf-40e4-af9a-a9d706dde833\") " pod="openshift-marketplace/certified-operators-xdvtz" Dec 08 09:47:55 crc kubenswrapper[4776]: I1208 09:47:55.452142 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b163c8b2-3bbf-40e4-af9a-a9d706dde833-catalog-content\") pod \"certified-operators-xdvtz\" (UID: \"b163c8b2-3bbf-40e4-af9a-a9d706dde833\") " pod="openshift-marketplace/certified-operators-xdvtz" Dec 08 09:47:55 crc kubenswrapper[4776]: I1208 09:47:55.549604 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-w8kll"] Dec 08 09:47:55 crc kubenswrapper[4776]: I1208 09:47:55.552080 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w8kll" Dec 08 09:47:55 crc kubenswrapper[4776]: I1208 09:47:55.554479 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b163c8b2-3bbf-40e4-af9a-a9d706dde833-catalog-content\") pod \"certified-operators-xdvtz\" (UID: \"b163c8b2-3bbf-40e4-af9a-a9d706dde833\") " pod="openshift-marketplace/certified-operators-xdvtz" Dec 08 09:47:55 crc kubenswrapper[4776]: I1208 09:47:55.554820 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b163c8b2-3bbf-40e4-af9a-a9d706dde833-utilities\") pod \"certified-operators-xdvtz\" (UID: \"b163c8b2-3bbf-40e4-af9a-a9d706dde833\") " pod="openshift-marketplace/certified-operators-xdvtz" Dec 08 09:47:55 crc kubenswrapper[4776]: I1208 09:47:55.554852 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f454r\" (UniqueName: \"kubernetes.io/projected/b163c8b2-3bbf-40e4-af9a-a9d706dde833-kube-api-access-f454r\") pod \"certified-operators-xdvtz\" (UID: \"b163c8b2-3bbf-40e4-af9a-a9d706dde833\") " pod="openshift-marketplace/certified-operators-xdvtz" Dec 08 09:47:55 crc kubenswrapper[4776]: I1208 09:47:55.555496 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b163c8b2-3bbf-40e4-af9a-a9d706dde833-catalog-content\") pod \"certified-operators-xdvtz\" (UID: \"b163c8b2-3bbf-40e4-af9a-a9d706dde833\") " pod="openshift-marketplace/certified-operators-xdvtz" Dec 08 09:47:55 crc kubenswrapper[4776]: I1208 09:47:55.555598 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b163c8b2-3bbf-40e4-af9a-a9d706dde833-utilities\") pod \"certified-operators-xdvtz\" (UID: \"b163c8b2-3bbf-40e4-af9a-a9d706dde833\") " 
pod="openshift-marketplace/certified-operators-xdvtz" Dec 08 09:47:55 crc kubenswrapper[4776]: I1208 09:47:55.572435 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w8kll"] Dec 08 09:47:55 crc kubenswrapper[4776]: I1208 09:47:55.599189 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f454r\" (UniqueName: \"kubernetes.io/projected/b163c8b2-3bbf-40e4-af9a-a9d706dde833-kube-api-access-f454r\") pod \"certified-operators-xdvtz\" (UID: \"b163c8b2-3bbf-40e4-af9a-a9d706dde833\") " pod="openshift-marketplace/certified-operators-xdvtz" Dec 08 09:47:55 crc kubenswrapper[4776]: I1208 09:47:55.656969 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e11a91f-79d9-4a60-a644-3581ab9ce0f2-catalog-content\") pod \"redhat-operators-w8kll\" (UID: \"4e11a91f-79d9-4a60-a644-3581ab9ce0f2\") " pod="openshift-marketplace/redhat-operators-w8kll" Dec 08 09:47:55 crc kubenswrapper[4776]: I1208 09:47:55.657235 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w52gr\" (UniqueName: \"kubernetes.io/projected/4e11a91f-79d9-4a60-a644-3581ab9ce0f2-kube-api-access-w52gr\") pod \"redhat-operators-w8kll\" (UID: \"4e11a91f-79d9-4a60-a644-3581ab9ce0f2\") " pod="openshift-marketplace/redhat-operators-w8kll" Dec 08 09:47:55 crc kubenswrapper[4776]: I1208 09:47:55.657314 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e11a91f-79d9-4a60-a644-3581ab9ce0f2-utilities\") pod \"redhat-operators-w8kll\" (UID: \"4e11a91f-79d9-4a60-a644-3581ab9ce0f2\") " pod="openshift-marketplace/redhat-operators-w8kll" Dec 08 09:47:55 crc kubenswrapper[4776]: I1208 09:47:55.674814 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xdvtz" Dec 08 09:47:55 crc kubenswrapper[4776]: I1208 09:47:55.760119 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w52gr\" (UniqueName: \"kubernetes.io/projected/4e11a91f-79d9-4a60-a644-3581ab9ce0f2-kube-api-access-w52gr\") pod \"redhat-operators-w8kll\" (UID: \"4e11a91f-79d9-4a60-a644-3581ab9ce0f2\") " pod="openshift-marketplace/redhat-operators-w8kll" Dec 08 09:47:55 crc kubenswrapper[4776]: I1208 09:47:55.760481 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e11a91f-79d9-4a60-a644-3581ab9ce0f2-utilities\") pod \"redhat-operators-w8kll\" (UID: \"4e11a91f-79d9-4a60-a644-3581ab9ce0f2\") " pod="openshift-marketplace/redhat-operators-w8kll" Dec 08 09:47:55 crc kubenswrapper[4776]: I1208 09:47:55.760984 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e11a91f-79d9-4a60-a644-3581ab9ce0f2-utilities\") pod \"redhat-operators-w8kll\" (UID: \"4e11a91f-79d9-4a60-a644-3581ab9ce0f2\") " pod="openshift-marketplace/redhat-operators-w8kll" Dec 08 09:47:55 crc kubenswrapper[4776]: I1208 09:47:55.761323 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e11a91f-79d9-4a60-a644-3581ab9ce0f2-catalog-content\") pod \"redhat-operators-w8kll\" (UID: \"4e11a91f-79d9-4a60-a644-3581ab9ce0f2\") " pod="openshift-marketplace/redhat-operators-w8kll" Dec 08 09:47:55 crc kubenswrapper[4776]: I1208 09:47:55.761649 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e11a91f-79d9-4a60-a644-3581ab9ce0f2-catalog-content\") pod \"redhat-operators-w8kll\" (UID: \"4e11a91f-79d9-4a60-a644-3581ab9ce0f2\") " 
pod="openshift-marketplace/redhat-operators-w8kll" Dec 08 09:47:55 crc kubenswrapper[4776]: I1208 09:47:55.782193 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w52gr\" (UniqueName: \"kubernetes.io/projected/4e11a91f-79d9-4a60-a644-3581ab9ce0f2-kube-api-access-w52gr\") pod \"redhat-operators-w8kll\" (UID: \"4e11a91f-79d9-4a60-a644-3581ab9ce0f2\") " pod="openshift-marketplace/redhat-operators-w8kll" Dec 08 09:47:55 crc kubenswrapper[4776]: I1208 09:47:55.874966 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w8kll" Dec 08 09:47:56 crc kubenswrapper[4776]: I1208 09:47:56.276161 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xdvtz"] Dec 08 09:47:56 crc kubenswrapper[4776]: I1208 09:47:56.522951 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w8kll"] Dec 08 09:47:57 crc kubenswrapper[4776]: I1208 09:47:57.164768 4776 generic.go:334] "Generic (PLEG): container finished" podID="b163c8b2-3bbf-40e4-af9a-a9d706dde833" containerID="606cfd633dcff90b3427cf0e53fda7e49f56715a3eba1233ea88556b11fe9542" exitCode=0 Dec 08 09:47:57 crc kubenswrapper[4776]: I1208 09:47:57.164811 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xdvtz" event={"ID":"b163c8b2-3bbf-40e4-af9a-a9d706dde833","Type":"ContainerDied","Data":"606cfd633dcff90b3427cf0e53fda7e49f56715a3eba1233ea88556b11fe9542"} Dec 08 09:47:57 crc kubenswrapper[4776]: I1208 09:47:57.165137 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xdvtz" event={"ID":"b163c8b2-3bbf-40e4-af9a-a9d706dde833","Type":"ContainerStarted","Data":"b3427a7d79f0b28f7b1abe749a44a7c8266de547cb7f9a45f88a049b18556801"} Dec 08 09:47:57 crc kubenswrapper[4776]: I1208 09:47:57.167297 4776 generic.go:334] "Generic (PLEG): container 
finished" podID="4e11a91f-79d9-4a60-a644-3581ab9ce0f2" containerID="ee6dcc1844ccb75d1c023aa77879e7271c2ac8c33809482df47993f11d6bc11a" exitCode=0 Dec 08 09:47:57 crc kubenswrapper[4776]: I1208 09:47:57.167332 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w8kll" event={"ID":"4e11a91f-79d9-4a60-a644-3581ab9ce0f2","Type":"ContainerDied","Data":"ee6dcc1844ccb75d1c023aa77879e7271c2ac8c33809482df47993f11d6bc11a"} Dec 08 09:47:57 crc kubenswrapper[4776]: I1208 09:47:57.167355 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w8kll" event={"ID":"4e11a91f-79d9-4a60-a644-3581ab9ce0f2","Type":"ContainerStarted","Data":"f0a331f589f03dfb581a28d744536c714ddfbde82fa4a0db203de7d612639da9"} Dec 08 09:47:58 crc kubenswrapper[4776]: I1208 09:47:58.181921 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w8kll" event={"ID":"4e11a91f-79d9-4a60-a644-3581ab9ce0f2","Type":"ContainerStarted","Data":"4c662c3468eb6bb8c8459e7017dbc76b3287e0ff462cd25c6ca3f14d4a3488a7"} Dec 08 09:47:59 crc kubenswrapper[4776]: I1208 09:47:59.196619 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xdvtz" event={"ID":"b163c8b2-3bbf-40e4-af9a-a9d706dde833","Type":"ContainerStarted","Data":"b425e284790eb07fa150f7d84bad0128f2d85c19531e06f5ab7918a40d1f8eae"} Dec 08 09:48:01 crc kubenswrapper[4776]: I1208 09:48:01.248392 4776 generic.go:334] "Generic (PLEG): container finished" podID="b163c8b2-3bbf-40e4-af9a-a9d706dde833" containerID="b425e284790eb07fa150f7d84bad0128f2d85c19531e06f5ab7918a40d1f8eae" exitCode=0 Dec 08 09:48:01 crc kubenswrapper[4776]: I1208 09:48:01.249977 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xdvtz" 
event={"ID":"b163c8b2-3bbf-40e4-af9a-a9d706dde833","Type":"ContainerDied","Data":"b425e284790eb07fa150f7d84bad0128f2d85c19531e06f5ab7918a40d1f8eae"} Dec 08 09:48:02 crc kubenswrapper[4776]: I1208 09:48:02.267267 4776 generic.go:334] "Generic (PLEG): container finished" podID="4e11a91f-79d9-4a60-a644-3581ab9ce0f2" containerID="4c662c3468eb6bb8c8459e7017dbc76b3287e0ff462cd25c6ca3f14d4a3488a7" exitCode=0 Dec 08 09:48:02 crc kubenswrapper[4776]: I1208 09:48:02.267588 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w8kll" event={"ID":"4e11a91f-79d9-4a60-a644-3581ab9ce0f2","Type":"ContainerDied","Data":"4c662c3468eb6bb8c8459e7017dbc76b3287e0ff462cd25c6ca3f14d4a3488a7"} Dec 08 09:48:03 crc kubenswrapper[4776]: I1208 09:48:03.281476 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w8kll" event={"ID":"4e11a91f-79d9-4a60-a644-3581ab9ce0f2","Type":"ContainerStarted","Data":"ee9916d1d4ac1dbe3acce747e9f2521e031f4714c5777e78902f73c89c17ed3b"} Dec 08 09:48:03 crc kubenswrapper[4776]: I1208 09:48:03.284515 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xdvtz" event={"ID":"b163c8b2-3bbf-40e4-af9a-a9d706dde833","Type":"ContainerStarted","Data":"4f3319f799275c94d24eeb04e473663b2211db2e217947e3f42e3ec37e4d1f61"} Dec 08 09:48:03 crc kubenswrapper[4776]: I1208 09:48:03.327897 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-w8kll" podStartSLOduration=2.691454705 podStartE2EDuration="8.327878122s" podCreationTimestamp="2025-12-08 09:47:55 +0000 UTC" firstStartedPulling="2025-12-08 09:47:57.169109731 +0000 UTC m=+2953.432334753" lastFinishedPulling="2025-12-08 09:48:02.805533148 +0000 UTC m=+2959.068758170" observedRunningTime="2025-12-08 09:48:03.301795099 +0000 UTC m=+2959.565020141" watchObservedRunningTime="2025-12-08 09:48:03.327878122 +0000 UTC m=+2959.591103144" 
Dec 08 09:48:03 crc kubenswrapper[4776]: I1208 09:48:03.346033 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xdvtz" podStartSLOduration=2.821744194 podStartE2EDuration="8.34601672s" podCreationTimestamp="2025-12-08 09:47:55 +0000 UTC" firstStartedPulling="2025-12-08 09:47:57.168584437 +0000 UTC m=+2953.431809459" lastFinishedPulling="2025-12-08 09:48:02.692856963 +0000 UTC m=+2958.956081985" observedRunningTime="2025-12-08 09:48:03.32446597 +0000 UTC m=+2959.587691002" watchObservedRunningTime="2025-12-08 09:48:03.34601672 +0000 UTC m=+2959.609241742" Dec 08 09:48:05 crc kubenswrapper[4776]: I1208 09:48:05.675055 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xdvtz" Dec 08 09:48:05 crc kubenswrapper[4776]: I1208 09:48:05.676076 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xdvtz" Dec 08 09:48:05 crc kubenswrapper[4776]: I1208 09:48:05.725549 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xdvtz" Dec 08 09:48:05 crc kubenswrapper[4776]: I1208 09:48:05.876482 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-w8kll" Dec 08 09:48:05 crc kubenswrapper[4776]: I1208 09:48:05.876749 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-w8kll" Dec 08 09:48:06 crc kubenswrapper[4776]: I1208 09:48:06.922136 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-w8kll" podUID="4e11a91f-79d9-4a60-a644-3581ab9ce0f2" containerName="registry-server" probeResult="failure" output=< Dec 08 09:48:06 crc kubenswrapper[4776]: timeout: failed to connect service ":50051" within 1s Dec 08 09:48:06 crc kubenswrapper[4776]: > Dec 08 
09:48:11 crc kubenswrapper[4776]: I1208 09:48:11.399210 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 09:48:11 crc kubenswrapper[4776]: I1208 09:48:11.399846 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 09:48:11 crc kubenswrapper[4776]: I1208 09:48:11.399898 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" Dec 08 09:48:11 crc kubenswrapper[4776]: I1208 09:48:11.400877 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f4e7e4bb5f3b8f333b5df1fd5cf02fbeb8dd9fd2b8374d4f1f4027b6e97c653b"} pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 08 09:48:11 crc kubenswrapper[4776]: I1208 09:48:11.400943 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" containerID="cri-o://f4e7e4bb5f3b8f333b5df1fd5cf02fbeb8dd9fd2b8374d4f1f4027b6e97c653b" gracePeriod=600 Dec 08 09:48:11 crc kubenswrapper[4776]: E1208 09:48:11.527035 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:48:12 crc kubenswrapper[4776]: I1208 09:48:12.394001 4776 generic.go:334] "Generic (PLEG): container finished" podID="c9788ab1-1031-4103-a769-a4b3177c7268" containerID="f4e7e4bb5f3b8f333b5df1fd5cf02fbeb8dd9fd2b8374d4f1f4027b6e97c653b" exitCode=0 Dec 08 09:48:12 crc kubenswrapper[4776]: I1208 09:48:12.394043 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" event={"ID":"c9788ab1-1031-4103-a769-a4b3177c7268","Type":"ContainerDied","Data":"f4e7e4bb5f3b8f333b5df1fd5cf02fbeb8dd9fd2b8374d4f1f4027b6e97c653b"} Dec 08 09:48:12 crc kubenswrapper[4776]: I1208 09:48:12.394077 4776 scope.go:117] "RemoveContainer" containerID="51d9d9755c704c9d9d8e04dd484f9ff425e26c39d52bf197718b780d0db4e339" Dec 08 09:48:12 crc kubenswrapper[4776]: I1208 09:48:12.395025 4776 scope.go:117] "RemoveContainer" containerID="f4e7e4bb5f3b8f333b5df1fd5cf02fbeb8dd9fd2b8374d4f1f4027b6e97c653b" Dec 08 09:48:12 crc kubenswrapper[4776]: E1208 09:48:12.395389 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:48:15 crc kubenswrapper[4776]: I1208 09:48:15.726118 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xdvtz" Dec 08 09:48:15 crc kubenswrapper[4776]: I1208 09:48:15.781971 4776 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-marketplace/certified-operators-xdvtz"] Dec 08 09:48:15 crc kubenswrapper[4776]: I1208 09:48:15.933165 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-w8kll" Dec 08 09:48:15 crc kubenswrapper[4776]: I1208 09:48:15.984629 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-w8kll" Dec 08 09:48:16 crc kubenswrapper[4776]: I1208 09:48:16.437978 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xdvtz" podUID="b163c8b2-3bbf-40e4-af9a-a9d706dde833" containerName="registry-server" containerID="cri-o://4f3319f799275c94d24eeb04e473663b2211db2e217947e3f42e3ec37e4d1f61" gracePeriod=2 Dec 08 09:48:16 crc kubenswrapper[4776]: I1208 09:48:16.960639 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xdvtz" Dec 08 09:48:17 crc kubenswrapper[4776]: I1208 09:48:17.081309 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f454r\" (UniqueName: \"kubernetes.io/projected/b163c8b2-3bbf-40e4-af9a-a9d706dde833-kube-api-access-f454r\") pod \"b163c8b2-3bbf-40e4-af9a-a9d706dde833\" (UID: \"b163c8b2-3bbf-40e4-af9a-a9d706dde833\") " Dec 08 09:48:17 crc kubenswrapper[4776]: I1208 09:48:17.081835 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b163c8b2-3bbf-40e4-af9a-a9d706dde833-catalog-content\") pod \"b163c8b2-3bbf-40e4-af9a-a9d706dde833\" (UID: \"b163c8b2-3bbf-40e4-af9a-a9d706dde833\") " Dec 08 09:48:17 crc kubenswrapper[4776]: I1208 09:48:17.081905 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b163c8b2-3bbf-40e4-af9a-a9d706dde833-utilities\") 
pod \"b163c8b2-3bbf-40e4-af9a-a9d706dde833\" (UID: \"b163c8b2-3bbf-40e4-af9a-a9d706dde833\") " Dec 08 09:48:17 crc kubenswrapper[4776]: I1208 09:48:17.082944 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b163c8b2-3bbf-40e4-af9a-a9d706dde833-utilities" (OuterVolumeSpecName: "utilities") pod "b163c8b2-3bbf-40e4-af9a-a9d706dde833" (UID: "b163c8b2-3bbf-40e4-af9a-a9d706dde833"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:48:17 crc kubenswrapper[4776]: I1208 09:48:17.083839 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b163c8b2-3bbf-40e4-af9a-a9d706dde833-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 09:48:17 crc kubenswrapper[4776]: I1208 09:48:17.095518 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b163c8b2-3bbf-40e4-af9a-a9d706dde833-kube-api-access-f454r" (OuterVolumeSpecName: "kube-api-access-f454r") pod "b163c8b2-3bbf-40e4-af9a-a9d706dde833" (UID: "b163c8b2-3bbf-40e4-af9a-a9d706dde833"). InnerVolumeSpecName "kube-api-access-f454r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:48:17 crc kubenswrapper[4776]: I1208 09:48:17.123251 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b163c8b2-3bbf-40e4-af9a-a9d706dde833-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b163c8b2-3bbf-40e4-af9a-a9d706dde833" (UID: "b163c8b2-3bbf-40e4-af9a-a9d706dde833"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:48:17 crc kubenswrapper[4776]: I1208 09:48:17.185760 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f454r\" (UniqueName: \"kubernetes.io/projected/b163c8b2-3bbf-40e4-af9a-a9d706dde833-kube-api-access-f454r\") on node \"crc\" DevicePath \"\"" Dec 08 09:48:17 crc kubenswrapper[4776]: I1208 09:48:17.185791 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b163c8b2-3bbf-40e4-af9a-a9d706dde833-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 09:48:17 crc kubenswrapper[4776]: I1208 09:48:17.452345 4776 generic.go:334] "Generic (PLEG): container finished" podID="b163c8b2-3bbf-40e4-af9a-a9d706dde833" containerID="4f3319f799275c94d24eeb04e473663b2211db2e217947e3f42e3ec37e4d1f61" exitCode=0 Dec 08 09:48:17 crc kubenswrapper[4776]: I1208 09:48:17.452401 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xdvtz" event={"ID":"b163c8b2-3bbf-40e4-af9a-a9d706dde833","Type":"ContainerDied","Data":"4f3319f799275c94d24eeb04e473663b2211db2e217947e3f42e3ec37e4d1f61"} Dec 08 09:48:17 crc kubenswrapper[4776]: I1208 09:48:17.452432 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xdvtz" event={"ID":"b163c8b2-3bbf-40e4-af9a-a9d706dde833","Type":"ContainerDied","Data":"b3427a7d79f0b28f7b1abe749a44a7c8266de547cb7f9a45f88a049b18556801"} Dec 08 09:48:17 crc kubenswrapper[4776]: I1208 09:48:17.452449 4776 scope.go:117] "RemoveContainer" containerID="4f3319f799275c94d24eeb04e473663b2211db2e217947e3f42e3ec37e4d1f61" Dec 08 09:48:17 crc kubenswrapper[4776]: I1208 09:48:17.452479 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xdvtz" Dec 08 09:48:17 crc kubenswrapper[4776]: I1208 09:48:17.482475 4776 scope.go:117] "RemoveContainer" containerID="b425e284790eb07fa150f7d84bad0128f2d85c19531e06f5ab7918a40d1f8eae" Dec 08 09:48:17 crc kubenswrapper[4776]: I1208 09:48:17.503822 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xdvtz"] Dec 08 09:48:17 crc kubenswrapper[4776]: I1208 09:48:17.525402 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xdvtz"] Dec 08 09:48:17 crc kubenswrapper[4776]: I1208 09:48:17.532692 4776 scope.go:117] "RemoveContainer" containerID="606cfd633dcff90b3427cf0e53fda7e49f56715a3eba1233ea88556b11fe9542" Dec 08 09:48:17 crc kubenswrapper[4776]: I1208 09:48:17.565136 4776 scope.go:117] "RemoveContainer" containerID="4f3319f799275c94d24eeb04e473663b2211db2e217947e3f42e3ec37e4d1f61" Dec 08 09:48:17 crc kubenswrapper[4776]: I1208 09:48:17.565540 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w8kll"] Dec 08 09:48:17 crc kubenswrapper[4776]: E1208 09:48:17.565633 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f3319f799275c94d24eeb04e473663b2211db2e217947e3f42e3ec37e4d1f61\": container with ID starting with 4f3319f799275c94d24eeb04e473663b2211db2e217947e3f42e3ec37e4d1f61 not found: ID does not exist" containerID="4f3319f799275c94d24eeb04e473663b2211db2e217947e3f42e3ec37e4d1f61" Dec 08 09:48:17 crc kubenswrapper[4776]: I1208 09:48:17.565676 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f3319f799275c94d24eeb04e473663b2211db2e217947e3f42e3ec37e4d1f61"} err="failed to get container status \"4f3319f799275c94d24eeb04e473663b2211db2e217947e3f42e3ec37e4d1f61\": rpc error: code = NotFound desc = could not find container 
\"4f3319f799275c94d24eeb04e473663b2211db2e217947e3f42e3ec37e4d1f61\": container with ID starting with 4f3319f799275c94d24eeb04e473663b2211db2e217947e3f42e3ec37e4d1f61 not found: ID does not exist" Dec 08 09:48:17 crc kubenswrapper[4776]: I1208 09:48:17.565702 4776 scope.go:117] "RemoveContainer" containerID="b425e284790eb07fa150f7d84bad0128f2d85c19531e06f5ab7918a40d1f8eae" Dec 08 09:48:17 crc kubenswrapper[4776]: I1208 09:48:17.565769 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-w8kll" podUID="4e11a91f-79d9-4a60-a644-3581ab9ce0f2" containerName="registry-server" containerID="cri-o://ee9916d1d4ac1dbe3acce747e9f2521e031f4714c5777e78902f73c89c17ed3b" gracePeriod=2 Dec 08 09:48:17 crc kubenswrapper[4776]: E1208 09:48:17.565929 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b425e284790eb07fa150f7d84bad0128f2d85c19531e06f5ab7918a40d1f8eae\": container with ID starting with b425e284790eb07fa150f7d84bad0128f2d85c19531e06f5ab7918a40d1f8eae not found: ID does not exist" containerID="b425e284790eb07fa150f7d84bad0128f2d85c19531e06f5ab7918a40d1f8eae" Dec 08 09:48:17 crc kubenswrapper[4776]: I1208 09:48:17.565955 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b425e284790eb07fa150f7d84bad0128f2d85c19531e06f5ab7918a40d1f8eae"} err="failed to get container status \"b425e284790eb07fa150f7d84bad0128f2d85c19531e06f5ab7918a40d1f8eae\": rpc error: code = NotFound desc = could not find container \"b425e284790eb07fa150f7d84bad0128f2d85c19531e06f5ab7918a40d1f8eae\": container with ID starting with b425e284790eb07fa150f7d84bad0128f2d85c19531e06f5ab7918a40d1f8eae not found: ID does not exist" Dec 08 09:48:17 crc kubenswrapper[4776]: I1208 09:48:17.565973 4776 scope.go:117] "RemoveContainer" containerID="606cfd633dcff90b3427cf0e53fda7e49f56715a3eba1233ea88556b11fe9542" Dec 08 09:48:17 
crc kubenswrapper[4776]: E1208 09:48:17.566156 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"606cfd633dcff90b3427cf0e53fda7e49f56715a3eba1233ea88556b11fe9542\": container with ID starting with 606cfd633dcff90b3427cf0e53fda7e49f56715a3eba1233ea88556b11fe9542 not found: ID does not exist" containerID="606cfd633dcff90b3427cf0e53fda7e49f56715a3eba1233ea88556b11fe9542" Dec 08 09:48:17 crc kubenswrapper[4776]: I1208 09:48:17.566285 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"606cfd633dcff90b3427cf0e53fda7e49f56715a3eba1233ea88556b11fe9542"} err="failed to get container status \"606cfd633dcff90b3427cf0e53fda7e49f56715a3eba1233ea88556b11fe9542\": rpc error: code = NotFound desc = could not find container \"606cfd633dcff90b3427cf0e53fda7e49f56715a3eba1233ea88556b11fe9542\": container with ID starting with 606cfd633dcff90b3427cf0e53fda7e49f56715a3eba1233ea88556b11fe9542 not found: ID does not exist" Dec 08 09:48:18 crc kubenswrapper[4776]: I1208 09:48:18.081056 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w8kll" Dec 08 09:48:18 crc kubenswrapper[4776]: I1208 09:48:18.211774 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e11a91f-79d9-4a60-a644-3581ab9ce0f2-catalog-content\") pod \"4e11a91f-79d9-4a60-a644-3581ab9ce0f2\" (UID: \"4e11a91f-79d9-4a60-a644-3581ab9ce0f2\") " Dec 08 09:48:18 crc kubenswrapper[4776]: I1208 09:48:18.211859 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e11a91f-79d9-4a60-a644-3581ab9ce0f2-utilities\") pod \"4e11a91f-79d9-4a60-a644-3581ab9ce0f2\" (UID: \"4e11a91f-79d9-4a60-a644-3581ab9ce0f2\") " Dec 08 09:48:18 crc kubenswrapper[4776]: I1208 09:48:18.211959 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w52gr\" (UniqueName: \"kubernetes.io/projected/4e11a91f-79d9-4a60-a644-3581ab9ce0f2-kube-api-access-w52gr\") pod \"4e11a91f-79d9-4a60-a644-3581ab9ce0f2\" (UID: \"4e11a91f-79d9-4a60-a644-3581ab9ce0f2\") " Dec 08 09:48:18 crc kubenswrapper[4776]: I1208 09:48:18.212911 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e11a91f-79d9-4a60-a644-3581ab9ce0f2-utilities" (OuterVolumeSpecName: "utilities") pod "4e11a91f-79d9-4a60-a644-3581ab9ce0f2" (UID: "4e11a91f-79d9-4a60-a644-3581ab9ce0f2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:48:18 crc kubenswrapper[4776]: I1208 09:48:18.220941 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e11a91f-79d9-4a60-a644-3581ab9ce0f2-kube-api-access-w52gr" (OuterVolumeSpecName: "kube-api-access-w52gr") pod "4e11a91f-79d9-4a60-a644-3581ab9ce0f2" (UID: "4e11a91f-79d9-4a60-a644-3581ab9ce0f2"). InnerVolumeSpecName "kube-api-access-w52gr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:48:18 crc kubenswrapper[4776]: I1208 09:48:18.314638 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e11a91f-79d9-4a60-a644-3581ab9ce0f2-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 09:48:18 crc kubenswrapper[4776]: I1208 09:48:18.314717 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w52gr\" (UniqueName: \"kubernetes.io/projected/4e11a91f-79d9-4a60-a644-3581ab9ce0f2-kube-api-access-w52gr\") on node \"crc\" DevicePath \"\"" Dec 08 09:48:18 crc kubenswrapper[4776]: I1208 09:48:18.330666 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e11a91f-79d9-4a60-a644-3581ab9ce0f2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4e11a91f-79d9-4a60-a644-3581ab9ce0f2" (UID: "4e11a91f-79d9-4a60-a644-3581ab9ce0f2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:48:18 crc kubenswrapper[4776]: I1208 09:48:18.359351 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b163c8b2-3bbf-40e4-af9a-a9d706dde833" path="/var/lib/kubelet/pods/b163c8b2-3bbf-40e4-af9a-a9d706dde833/volumes" Dec 08 09:48:18 crc kubenswrapper[4776]: I1208 09:48:18.416833 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e11a91f-79d9-4a60-a644-3581ab9ce0f2-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 09:48:18 crc kubenswrapper[4776]: I1208 09:48:18.469399 4776 generic.go:334] "Generic (PLEG): container finished" podID="4e11a91f-79d9-4a60-a644-3581ab9ce0f2" containerID="ee9916d1d4ac1dbe3acce747e9f2521e031f4714c5777e78902f73c89c17ed3b" exitCode=0 Dec 08 09:48:18 crc kubenswrapper[4776]: I1208 09:48:18.469467 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w8kll" 
event={"ID":"4e11a91f-79d9-4a60-a644-3581ab9ce0f2","Type":"ContainerDied","Data":"ee9916d1d4ac1dbe3acce747e9f2521e031f4714c5777e78902f73c89c17ed3b"} Dec 08 09:48:18 crc kubenswrapper[4776]: I1208 09:48:18.469508 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w8kll" event={"ID":"4e11a91f-79d9-4a60-a644-3581ab9ce0f2","Type":"ContainerDied","Data":"f0a331f589f03dfb581a28d744536c714ddfbde82fa4a0db203de7d612639da9"} Dec 08 09:48:18 crc kubenswrapper[4776]: I1208 09:48:18.469526 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w8kll" Dec 08 09:48:18 crc kubenswrapper[4776]: I1208 09:48:18.469538 4776 scope.go:117] "RemoveContainer" containerID="ee9916d1d4ac1dbe3acce747e9f2521e031f4714c5777e78902f73c89c17ed3b" Dec 08 09:48:18 crc kubenswrapper[4776]: E1208 09:48:18.476749 4776 kuberuntime_gc.go:389] "Failed to remove container log dead symlink" err="remove /var/log/containers/redhat-operators-w8kll_openshift-marketplace_registry-server-ee9916d1d4ac1dbe3acce747e9f2521e031f4714c5777e78902f73c89c17ed3b.log: no such file or directory" path="/var/log/containers/redhat-operators-w8kll_openshift-marketplace_registry-server-ee9916d1d4ac1dbe3acce747e9f2521e031f4714c5777e78902f73c89c17ed3b.log" Dec 08 09:48:18 crc kubenswrapper[4776]: I1208 09:48:18.502955 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w8kll"] Dec 08 09:48:18 crc kubenswrapper[4776]: I1208 09:48:18.511444 4776 scope.go:117] "RemoveContainer" containerID="4c662c3468eb6bb8c8459e7017dbc76b3287e0ff462cd25c6ca3f14d4a3488a7" Dec 08 09:48:18 crc kubenswrapper[4776]: I1208 09:48:18.519513 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-w8kll"] Dec 08 09:48:18 crc kubenswrapper[4776]: I1208 09:48:18.537590 4776 scope.go:117] "RemoveContainer" 
containerID="ee6dcc1844ccb75d1c023aa77879e7271c2ac8c33809482df47993f11d6bc11a" Dec 08 09:48:18 crc kubenswrapper[4776]: I1208 09:48:18.560226 4776 scope.go:117] "RemoveContainer" containerID="ee9916d1d4ac1dbe3acce747e9f2521e031f4714c5777e78902f73c89c17ed3b" Dec 08 09:48:18 crc kubenswrapper[4776]: E1208 09:48:18.560621 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee9916d1d4ac1dbe3acce747e9f2521e031f4714c5777e78902f73c89c17ed3b\": container with ID starting with ee9916d1d4ac1dbe3acce747e9f2521e031f4714c5777e78902f73c89c17ed3b not found: ID does not exist" containerID="ee9916d1d4ac1dbe3acce747e9f2521e031f4714c5777e78902f73c89c17ed3b" Dec 08 09:48:18 crc kubenswrapper[4776]: I1208 09:48:18.560651 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee9916d1d4ac1dbe3acce747e9f2521e031f4714c5777e78902f73c89c17ed3b"} err="failed to get container status \"ee9916d1d4ac1dbe3acce747e9f2521e031f4714c5777e78902f73c89c17ed3b\": rpc error: code = NotFound desc = could not find container \"ee9916d1d4ac1dbe3acce747e9f2521e031f4714c5777e78902f73c89c17ed3b\": container with ID starting with ee9916d1d4ac1dbe3acce747e9f2521e031f4714c5777e78902f73c89c17ed3b not found: ID does not exist" Dec 08 09:48:18 crc kubenswrapper[4776]: I1208 09:48:18.560673 4776 scope.go:117] "RemoveContainer" containerID="4c662c3468eb6bb8c8459e7017dbc76b3287e0ff462cd25c6ca3f14d4a3488a7" Dec 08 09:48:18 crc kubenswrapper[4776]: E1208 09:48:18.560973 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c662c3468eb6bb8c8459e7017dbc76b3287e0ff462cd25c6ca3f14d4a3488a7\": container with ID starting with 4c662c3468eb6bb8c8459e7017dbc76b3287e0ff462cd25c6ca3f14d4a3488a7 not found: ID does not exist" containerID="4c662c3468eb6bb8c8459e7017dbc76b3287e0ff462cd25c6ca3f14d4a3488a7" Dec 08 09:48:18 crc 
kubenswrapper[4776]: I1208 09:48:18.561006 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c662c3468eb6bb8c8459e7017dbc76b3287e0ff462cd25c6ca3f14d4a3488a7"} err="failed to get container status \"4c662c3468eb6bb8c8459e7017dbc76b3287e0ff462cd25c6ca3f14d4a3488a7\": rpc error: code = NotFound desc = could not find container \"4c662c3468eb6bb8c8459e7017dbc76b3287e0ff462cd25c6ca3f14d4a3488a7\": container with ID starting with 4c662c3468eb6bb8c8459e7017dbc76b3287e0ff462cd25c6ca3f14d4a3488a7 not found: ID does not exist" Dec 08 09:48:18 crc kubenswrapper[4776]: I1208 09:48:18.561019 4776 scope.go:117] "RemoveContainer" containerID="ee6dcc1844ccb75d1c023aa77879e7271c2ac8c33809482df47993f11d6bc11a" Dec 08 09:48:18 crc kubenswrapper[4776]: E1208 09:48:18.561234 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee6dcc1844ccb75d1c023aa77879e7271c2ac8c33809482df47993f11d6bc11a\": container with ID starting with ee6dcc1844ccb75d1c023aa77879e7271c2ac8c33809482df47993f11d6bc11a not found: ID does not exist" containerID="ee6dcc1844ccb75d1c023aa77879e7271c2ac8c33809482df47993f11d6bc11a" Dec 08 09:48:18 crc kubenswrapper[4776]: I1208 09:48:18.561249 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee6dcc1844ccb75d1c023aa77879e7271c2ac8c33809482df47993f11d6bc11a"} err="failed to get container status \"ee6dcc1844ccb75d1c023aa77879e7271c2ac8c33809482df47993f11d6bc11a\": rpc error: code = NotFound desc = could not find container \"ee6dcc1844ccb75d1c023aa77879e7271c2ac8c33809482df47993f11d6bc11a\": container with ID starting with ee6dcc1844ccb75d1c023aa77879e7271c2ac8c33809482df47993f11d6bc11a not found: ID does not exist" Dec 08 09:48:20 crc kubenswrapper[4776]: I1208 09:48:20.357525 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e11a91f-79d9-4a60-a644-3581ab9ce0f2" 
path="/var/lib/kubelet/pods/4e11a91f-79d9-4a60-a644-3581ab9ce0f2/volumes" Dec 08 09:48:23 crc kubenswrapper[4776]: I1208 09:48:23.343645 4776 scope.go:117] "RemoveContainer" containerID="f4e7e4bb5f3b8f333b5df1fd5cf02fbeb8dd9fd2b8374d4f1f4027b6e97c653b" Dec 08 09:48:23 crc kubenswrapper[4776]: E1208 09:48:23.344496 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:48:37 crc kubenswrapper[4776]: I1208 09:48:37.345232 4776 scope.go:117] "RemoveContainer" containerID="f4e7e4bb5f3b8f333b5df1fd5cf02fbeb8dd9fd2b8374d4f1f4027b6e97c653b" Dec 08 09:48:37 crc kubenswrapper[4776]: E1208 09:48:37.346191 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:48:48 crc kubenswrapper[4776]: I1208 09:48:48.344790 4776 scope.go:117] "RemoveContainer" containerID="f4e7e4bb5f3b8f333b5df1fd5cf02fbeb8dd9fd2b8374d4f1f4027b6e97c653b" Dec 08 09:48:48 crc kubenswrapper[4776]: E1208 09:48:48.346593 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:48:59 crc kubenswrapper[4776]: I1208 09:48:59.344361 4776 scope.go:117] "RemoveContainer" containerID="f4e7e4bb5f3b8f333b5df1fd5cf02fbeb8dd9fd2b8374d4f1f4027b6e97c653b" Dec 08 09:48:59 crc kubenswrapper[4776]: E1208 09:48:59.345135 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:49:13 crc kubenswrapper[4776]: I1208 09:49:13.344411 4776 scope.go:117] "RemoveContainer" containerID="f4e7e4bb5f3b8f333b5df1fd5cf02fbeb8dd9fd2b8374d4f1f4027b6e97c653b" Dec 08 09:49:13 crc kubenswrapper[4776]: E1208 09:49:13.345423 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:49:24 crc kubenswrapper[4776]: I1208 09:49:24.351549 4776 scope.go:117] "RemoveContainer" containerID="f4e7e4bb5f3b8f333b5df1fd5cf02fbeb8dd9fd2b8374d4f1f4027b6e97c653b" Dec 08 09:49:24 crc kubenswrapper[4776]: E1208 09:49:24.352283 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:49:25 crc kubenswrapper[4776]: I1208 09:49:25.148645 4776 generic.go:334] "Generic (PLEG): container finished" podID="b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e" containerID="d62beab68b5bdbc0fde2b4a810b472ce682b470a107adefb332479bd0bda28c7" exitCode=0 Dec 08 09:49:25 crc kubenswrapper[4776]: I1208 09:49:25.148844 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-89ghm" event={"ID":"b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e","Type":"ContainerDied","Data":"d62beab68b5bdbc0fde2b4a810b472ce682b470a107adefb332479bd0bda28c7"} Dec 08 09:49:26 crc kubenswrapper[4776]: I1208 09:49:26.636343 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-89ghm" Dec 08 09:49:26 crc kubenswrapper[4776]: I1208 09:49:26.824835 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e-ceilometer-compute-config-data-2\") pod \"b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e\" (UID: \"b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e\") " Dec 08 09:49:26 crc kubenswrapper[4776]: I1208 09:49:26.824938 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e-ssh-key\") pod \"b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e\" (UID: \"b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e\") " Dec 08 09:49:26 crc kubenswrapper[4776]: I1208 09:49:26.825076 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e-telemetry-combined-ca-bundle\") pod \"b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e\" (UID: \"b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e\") " Dec 08 09:49:26 crc kubenswrapper[4776]: I1208 09:49:26.825131 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e-inventory\") pod \"b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e\" (UID: \"b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e\") " Dec 08 09:49:26 crc kubenswrapper[4776]: I1208 09:49:26.825205 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e-ceilometer-compute-config-data-1\") pod \"b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e\" (UID: \"b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e\") " Dec 08 09:49:26 crc kubenswrapper[4776]: I1208 09:49:26.825245 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e-ceilometer-compute-config-data-0\") pod \"b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e\" (UID: \"b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e\") " Dec 08 09:49:26 crc kubenswrapper[4776]: I1208 09:49:26.825277 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kds7n\" (UniqueName: \"kubernetes.io/projected/b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e-kube-api-access-kds7n\") pod \"b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e\" (UID: \"b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e\") " Dec 08 09:49:26 crc kubenswrapper[4776]: I1208 09:49:26.830859 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e" 
(UID: "b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:49:26 crc kubenswrapper[4776]: I1208 09:49:26.844870 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e-kube-api-access-kds7n" (OuterVolumeSpecName: "kube-api-access-kds7n") pod "b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e" (UID: "b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e"). InnerVolumeSpecName "kube-api-access-kds7n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:49:26 crc kubenswrapper[4776]: I1208 09:49:26.863454 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e" (UID: "b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:49:26 crc kubenswrapper[4776]: I1208 09:49:26.863856 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e-inventory" (OuterVolumeSpecName: "inventory") pod "b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e" (UID: "b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:49:26 crc kubenswrapper[4776]: I1208 09:49:26.866024 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e" (UID: "b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e"). InnerVolumeSpecName "ceilometer-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:49:26 crc kubenswrapper[4776]: I1208 09:49:26.883910 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e" (UID: "b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:49:26 crc kubenswrapper[4776]: I1208 09:49:26.887456 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e" (UID: "b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:49:26 crc kubenswrapper[4776]: I1208 09:49:26.927872 4776 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 08 09:49:26 crc kubenswrapper[4776]: I1208 09:49:26.927903 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 08 09:49:26 crc kubenswrapper[4776]: I1208 09:49:26.927915 4776 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:49:26 crc kubenswrapper[4776]: I1208 09:49:26.927925 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e-inventory\") on node \"crc\" DevicePath \"\"" Dec 08 09:49:26 crc kubenswrapper[4776]: I1208 09:49:26.927934 4776 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 08 09:49:26 crc kubenswrapper[4776]: I1208 09:49:26.927942 4776 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 08 09:49:26 crc kubenswrapper[4776]: I1208 09:49:26.927950 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kds7n\" (UniqueName: \"kubernetes.io/projected/b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e-kube-api-access-kds7n\") on node \"crc\" DevicePath \"\"" Dec 08 09:49:27 crc kubenswrapper[4776]: I1208 09:49:27.171243 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-89ghm" event={"ID":"b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e","Type":"ContainerDied","Data":"cf583dfedb7445f2cc913dc95561dfac1bf0a973576a9666fc20045a66f5f416"} Dec 08 09:49:27 crc kubenswrapper[4776]: I1208 09:49:27.171299 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf583dfedb7445f2cc913dc95561dfac1bf0a973576a9666fc20045a66f5f416" Dec 08 09:49:27 crc kubenswrapper[4776]: I1208 09:49:27.171262 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-89ghm" Dec 08 09:49:27 crc kubenswrapper[4776]: I1208 09:49:27.253264 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp"] Dec 08 09:49:27 crc kubenswrapper[4776]: E1208 09:49:27.253879 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b163c8b2-3bbf-40e4-af9a-a9d706dde833" containerName="registry-server" Dec 08 09:49:27 crc kubenswrapper[4776]: I1208 09:49:27.253906 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="b163c8b2-3bbf-40e4-af9a-a9d706dde833" containerName="registry-server" Dec 08 09:49:27 crc kubenswrapper[4776]: E1208 09:49:27.253932 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b163c8b2-3bbf-40e4-af9a-a9d706dde833" containerName="extract-utilities" Dec 08 09:49:27 crc kubenswrapper[4776]: I1208 09:49:27.253942 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="b163c8b2-3bbf-40e4-af9a-a9d706dde833" containerName="extract-utilities" Dec 08 09:49:27 crc kubenswrapper[4776]: E1208 09:49:27.253957 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e11a91f-79d9-4a60-a644-3581ab9ce0f2" containerName="extract-utilities" Dec 08 09:49:27 crc kubenswrapper[4776]: I1208 09:49:27.253966 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e11a91f-79d9-4a60-a644-3581ab9ce0f2" containerName="extract-utilities" Dec 08 09:49:27 crc kubenswrapper[4776]: E1208 09:49:27.254002 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 08 09:49:27 crc kubenswrapper[4776]: I1208 09:49:27.254012 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 08 09:49:27 crc kubenswrapper[4776]: E1208 
09:49:27.254042 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b163c8b2-3bbf-40e4-af9a-a9d706dde833" containerName="extract-content" Dec 08 09:49:27 crc kubenswrapper[4776]: I1208 09:49:27.254051 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="b163c8b2-3bbf-40e4-af9a-a9d706dde833" containerName="extract-content" Dec 08 09:49:27 crc kubenswrapper[4776]: E1208 09:49:27.254066 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e11a91f-79d9-4a60-a644-3581ab9ce0f2" containerName="registry-server" Dec 08 09:49:27 crc kubenswrapper[4776]: I1208 09:49:27.254073 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e11a91f-79d9-4a60-a644-3581ab9ce0f2" containerName="registry-server" Dec 08 09:49:27 crc kubenswrapper[4776]: E1208 09:49:27.254087 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e11a91f-79d9-4a60-a644-3581ab9ce0f2" containerName="extract-content" Dec 08 09:49:27 crc kubenswrapper[4776]: I1208 09:49:27.254129 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e11a91f-79d9-4a60-a644-3581ab9ce0f2" containerName="extract-content" Dec 08 09:49:27 crc kubenswrapper[4776]: I1208 09:49:27.254418 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 08 09:49:27 crc kubenswrapper[4776]: I1208 09:49:27.254449 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e11a91f-79d9-4a60-a644-3581ab9ce0f2" containerName="registry-server" Dec 08 09:49:27 crc kubenswrapper[4776]: I1208 09:49:27.254470 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="b163c8b2-3bbf-40e4-af9a-a9d706dde833" containerName="registry-server" Dec 08 09:49:27 crc kubenswrapper[4776]: I1208 09:49:27.255382 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp" Dec 08 09:49:27 crc kubenswrapper[4776]: I1208 09:49:27.257850 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-ipmi-config-data" Dec 08 09:49:27 crc kubenswrapper[4776]: I1208 09:49:27.258204 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 08 09:49:27 crc kubenswrapper[4776]: I1208 09:49:27.259137 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 08 09:49:27 crc kubenswrapper[4776]: I1208 09:49:27.259672 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tm845" Dec 08 09:49:27 crc kubenswrapper[4776]: I1208 09:49:27.259930 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 08 09:49:27 crc kubenswrapper[4776]: I1208 09:49:27.270908 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp"] Dec 08 09:49:27 crc kubenswrapper[4776]: I1208 09:49:27.441968 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18a0027c-b2f9-4c57-9f94-30b31659d298-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp\" (UID: \"18a0027c-b2f9-4c57-9f94-30b31659d298\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp" Dec 08 09:49:27 crc kubenswrapper[4776]: I1208 09:49:27.442312 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: 
\"kubernetes.io/secret/18a0027c-b2f9-4c57-9f94-30b31659d298-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp\" (UID: \"18a0027c-b2f9-4c57-9f94-30b31659d298\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp" Dec 08 09:49:27 crc kubenswrapper[4776]: I1208 09:49:27.442362 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/18a0027c-b2f9-4c57-9f94-30b31659d298-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp\" (UID: \"18a0027c-b2f9-4c57-9f94-30b31659d298\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp" Dec 08 09:49:27 crc kubenswrapper[4776]: I1208 09:49:27.442462 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/18a0027c-b2f9-4c57-9f94-30b31659d298-ssh-key\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp\" (UID: \"18a0027c-b2f9-4c57-9f94-30b31659d298\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp" Dec 08 09:49:27 crc kubenswrapper[4776]: I1208 09:49:27.442545 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgqfd\" (UniqueName: \"kubernetes.io/projected/18a0027c-b2f9-4c57-9f94-30b31659d298-kube-api-access-vgqfd\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp\" (UID: \"18a0027c-b2f9-4c57-9f94-30b31659d298\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp" Dec 08 09:49:27 crc kubenswrapper[4776]: I1208 09:49:27.442617 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/18a0027c-b2f9-4c57-9f94-30b31659d298-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp\" (UID: \"18a0027c-b2f9-4c57-9f94-30b31659d298\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp" Dec 08 09:49:27 crc kubenswrapper[4776]: I1208 09:49:27.442640 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/18a0027c-b2f9-4c57-9f94-30b31659d298-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp\" (UID: \"18a0027c-b2f9-4c57-9f94-30b31659d298\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp" Dec 08 09:49:27 crc kubenswrapper[4776]: I1208 09:49:27.545382 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/18a0027c-b2f9-4c57-9f94-30b31659d298-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp\" (UID: \"18a0027c-b2f9-4c57-9f94-30b31659d298\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp" Dec 08 09:49:27 crc kubenswrapper[4776]: I1208 09:49:27.545452 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/18a0027c-b2f9-4c57-9f94-30b31659d298-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp\" (UID: \"18a0027c-b2f9-4c57-9f94-30b31659d298\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp" Dec 08 09:49:27 crc kubenswrapper[4776]: I1208 09:49:27.545506 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/18a0027c-b2f9-4c57-9f94-30b31659d298-ssh-key\") pod 
\"telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp\" (UID: \"18a0027c-b2f9-4c57-9f94-30b31659d298\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp" Dec 08 09:49:27 crc kubenswrapper[4776]: I1208 09:49:27.545638 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgqfd\" (UniqueName: \"kubernetes.io/projected/18a0027c-b2f9-4c57-9f94-30b31659d298-kube-api-access-vgqfd\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp\" (UID: \"18a0027c-b2f9-4c57-9f94-30b31659d298\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp" Dec 08 09:49:27 crc kubenswrapper[4776]: I1208 09:49:27.545817 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18a0027c-b2f9-4c57-9f94-30b31659d298-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp\" (UID: \"18a0027c-b2f9-4c57-9f94-30b31659d298\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp" Dec 08 09:49:27 crc kubenswrapper[4776]: I1208 09:49:27.545844 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/18a0027c-b2f9-4c57-9f94-30b31659d298-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp\" (UID: \"18a0027c-b2f9-4c57-9f94-30b31659d298\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp" Dec 08 09:49:27 crc kubenswrapper[4776]: I1208 09:49:27.545995 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18a0027c-b2f9-4c57-9f94-30b31659d298-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp\" (UID: 
\"18a0027c-b2f9-4c57-9f94-30b31659d298\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp" Dec 08 09:49:27 crc kubenswrapper[4776]: I1208 09:49:27.552091 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/18a0027c-b2f9-4c57-9f94-30b31659d298-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp\" (UID: \"18a0027c-b2f9-4c57-9f94-30b31659d298\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp" Dec 08 09:49:27 crc kubenswrapper[4776]: I1208 09:49:27.558078 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18a0027c-b2f9-4c57-9f94-30b31659d298-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp\" (UID: \"18a0027c-b2f9-4c57-9f94-30b31659d298\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp" Dec 08 09:49:27 crc kubenswrapper[4776]: I1208 09:49:27.561316 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/18a0027c-b2f9-4c57-9f94-30b31659d298-ssh-key\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp\" (UID: \"18a0027c-b2f9-4c57-9f94-30b31659d298\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp" Dec 08 09:49:27 crc kubenswrapper[4776]: I1208 09:49:27.565290 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18a0027c-b2f9-4c57-9f94-30b31659d298-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp\" (UID: \"18a0027c-b2f9-4c57-9f94-30b31659d298\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp" Dec 08 09:49:27 
crc kubenswrapper[4776]: I1208 09:49:27.565930 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/18a0027c-b2f9-4c57-9f94-30b31659d298-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp\" (UID: \"18a0027c-b2f9-4c57-9f94-30b31659d298\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp" Dec 08 09:49:27 crc kubenswrapper[4776]: I1208 09:49:27.568715 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/18a0027c-b2f9-4c57-9f94-30b31659d298-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp\" (UID: \"18a0027c-b2f9-4c57-9f94-30b31659d298\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp" Dec 08 09:49:27 crc kubenswrapper[4776]: I1208 09:49:27.594333 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgqfd\" (UniqueName: \"kubernetes.io/projected/18a0027c-b2f9-4c57-9f94-30b31659d298-kube-api-access-vgqfd\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp\" (UID: \"18a0027c-b2f9-4c57-9f94-30b31659d298\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp" Dec 08 09:49:27 crc kubenswrapper[4776]: I1208 09:49:27.876695 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp" Dec 08 09:49:28 crc kubenswrapper[4776]: I1208 09:49:28.447340 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp"] Dec 08 09:49:29 crc kubenswrapper[4776]: I1208 09:49:29.194353 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp" event={"ID":"18a0027c-b2f9-4c57-9f94-30b31659d298","Type":"ContainerStarted","Data":"099aff191f220c45dd2530a4547ba1e3b0c390e2bc6e64415665e851c1a51ed5"} Dec 08 09:49:30 crc kubenswrapper[4776]: I1208 09:49:30.204727 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp" event={"ID":"18a0027c-b2f9-4c57-9f94-30b31659d298","Type":"ContainerStarted","Data":"5b93e6354aaaf5d2468e3f62f0bef2ee940d78e7c227ebbeeff82e1ad404ff13"} Dec 08 09:49:30 crc kubenswrapper[4776]: I1208 09:49:30.230558 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp" podStartSLOduration=2.761874105 podStartE2EDuration="3.230536491s" podCreationTimestamp="2025-12-08 09:49:27 +0000 UTC" firstStartedPulling="2025-12-08 09:49:28.459616858 +0000 UTC m=+3044.722841880" lastFinishedPulling="2025-12-08 09:49:28.928279244 +0000 UTC m=+3045.191504266" observedRunningTime="2025-12-08 09:49:30.228379513 +0000 UTC m=+3046.491604535" watchObservedRunningTime="2025-12-08 09:49:30.230536491 +0000 UTC m=+3046.493761534" Dec 08 09:49:37 crc kubenswrapper[4776]: I1208 09:49:37.344256 4776 scope.go:117] "RemoveContainer" containerID="f4e7e4bb5f3b8f333b5df1fd5cf02fbeb8dd9fd2b8374d4f1f4027b6e97c653b" Dec 08 09:49:37 crc kubenswrapper[4776]: E1208 09:49:37.344977 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:49:49 crc kubenswrapper[4776]: I1208 09:49:49.344296 4776 scope.go:117] "RemoveContainer" containerID="f4e7e4bb5f3b8f333b5df1fd5cf02fbeb8dd9fd2b8374d4f1f4027b6e97c653b" Dec 08 09:49:49 crc kubenswrapper[4776]: E1208 09:49:49.344981 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:50:00 crc kubenswrapper[4776]: I1208 09:50:00.343356 4776 scope.go:117] "RemoveContainer" containerID="f4e7e4bb5f3b8f333b5df1fd5cf02fbeb8dd9fd2b8374d4f1f4027b6e97c653b" Dec 08 09:50:00 crc kubenswrapper[4776]: E1208 09:50:00.344114 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:50:12 crc kubenswrapper[4776]: I1208 09:50:12.345001 4776 scope.go:117] "RemoveContainer" containerID="f4e7e4bb5f3b8f333b5df1fd5cf02fbeb8dd9fd2b8374d4f1f4027b6e97c653b" Dec 08 09:50:12 crc kubenswrapper[4776]: E1208 09:50:12.345888 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:50:27 crc kubenswrapper[4776]: I1208 09:50:27.344276 4776 scope.go:117] "RemoveContainer" containerID="f4e7e4bb5f3b8f333b5df1fd5cf02fbeb8dd9fd2b8374d4f1f4027b6e97c653b" Dec 08 09:50:27 crc kubenswrapper[4776]: E1208 09:50:27.345226 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:50:42 crc kubenswrapper[4776]: I1208 09:50:42.344525 4776 scope.go:117] "RemoveContainer" containerID="f4e7e4bb5f3b8f333b5df1fd5cf02fbeb8dd9fd2b8374d4f1f4027b6e97c653b" Dec 08 09:50:42 crc kubenswrapper[4776]: E1208 09:50:42.345393 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:50:56 crc kubenswrapper[4776]: I1208 09:50:56.344209 4776 scope.go:117] "RemoveContainer" containerID="f4e7e4bb5f3b8f333b5df1fd5cf02fbeb8dd9fd2b8374d4f1f4027b6e97c653b" Dec 08 09:50:56 crc kubenswrapper[4776]: E1208 09:50:56.345147 4776 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:51:07 crc kubenswrapper[4776]: I1208 09:51:07.344254 4776 scope.go:117] "RemoveContainer" containerID="f4e7e4bb5f3b8f333b5df1fd5cf02fbeb8dd9fd2b8374d4f1f4027b6e97c653b" Dec 08 09:51:07 crc kubenswrapper[4776]: E1208 09:51:07.345077 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:51:18 crc kubenswrapper[4776]: I1208 09:51:18.345711 4776 scope.go:117] "RemoveContainer" containerID="f4e7e4bb5f3b8f333b5df1fd5cf02fbeb8dd9fd2b8374d4f1f4027b6e97c653b" Dec 08 09:51:18 crc kubenswrapper[4776]: E1208 09:51:18.347483 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:51:18 crc kubenswrapper[4776]: I1208 09:51:18.387756 4776 generic.go:334] "Generic (PLEG): container finished" podID="18a0027c-b2f9-4c57-9f94-30b31659d298" containerID="5b93e6354aaaf5d2468e3f62f0bef2ee940d78e7c227ebbeeff82e1ad404ff13" exitCode=0 Dec 08 09:51:18 
crc kubenswrapper[4776]: I1208 09:51:18.387788 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp" event={"ID":"18a0027c-b2f9-4c57-9f94-30b31659d298","Type":"ContainerDied","Data":"5b93e6354aaaf5d2468e3f62f0bef2ee940d78e7c227ebbeeff82e1ad404ff13"} Dec 08 09:51:19 crc kubenswrapper[4776]: I1208 09:51:19.906246 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp" Dec 08 09:51:20 crc kubenswrapper[4776]: I1208 09:51:20.035706 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/18a0027c-b2f9-4c57-9f94-30b31659d298-ceilometer-ipmi-config-data-2\") pod \"18a0027c-b2f9-4c57-9f94-30b31659d298\" (UID: \"18a0027c-b2f9-4c57-9f94-30b31659d298\") " Dec 08 09:51:20 crc kubenswrapper[4776]: I1208 09:51:20.035861 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18a0027c-b2f9-4c57-9f94-30b31659d298-inventory\") pod \"18a0027c-b2f9-4c57-9f94-30b31659d298\" (UID: \"18a0027c-b2f9-4c57-9f94-30b31659d298\") " Dec 08 09:51:20 crc kubenswrapper[4776]: I1208 09:51:20.036003 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/18a0027c-b2f9-4c57-9f94-30b31659d298-ceilometer-ipmi-config-data-0\") pod \"18a0027c-b2f9-4c57-9f94-30b31659d298\" (UID: \"18a0027c-b2f9-4c57-9f94-30b31659d298\") " Dec 08 09:51:20 crc kubenswrapper[4776]: I1208 09:51:20.036156 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/18a0027c-b2f9-4c57-9f94-30b31659d298-ceilometer-ipmi-config-data-1\") pod \"18a0027c-b2f9-4c57-9f94-30b31659d298\" (UID: 
\"18a0027c-b2f9-4c57-9f94-30b31659d298\") " Dec 08 09:51:20 crc kubenswrapper[4776]: I1208 09:51:20.036258 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/18a0027c-b2f9-4c57-9f94-30b31659d298-ssh-key\") pod \"18a0027c-b2f9-4c57-9f94-30b31659d298\" (UID: \"18a0027c-b2f9-4c57-9f94-30b31659d298\") " Dec 08 09:51:20 crc kubenswrapper[4776]: I1208 09:51:20.036469 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgqfd\" (UniqueName: \"kubernetes.io/projected/18a0027c-b2f9-4c57-9f94-30b31659d298-kube-api-access-vgqfd\") pod \"18a0027c-b2f9-4c57-9f94-30b31659d298\" (UID: \"18a0027c-b2f9-4c57-9f94-30b31659d298\") " Dec 08 09:51:20 crc kubenswrapper[4776]: I1208 09:51:20.036535 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18a0027c-b2f9-4c57-9f94-30b31659d298-telemetry-power-monitoring-combined-ca-bundle\") pod \"18a0027c-b2f9-4c57-9f94-30b31659d298\" (UID: \"18a0027c-b2f9-4c57-9f94-30b31659d298\") " Dec 08 09:51:20 crc kubenswrapper[4776]: I1208 09:51:20.047523 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18a0027c-b2f9-4c57-9f94-30b31659d298-telemetry-power-monitoring-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "18a0027c-b2f9-4c57-9f94-30b31659d298" (UID: "18a0027c-b2f9-4c57-9f94-30b31659d298"). InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:51:20 crc kubenswrapper[4776]: I1208 09:51:20.047642 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18a0027c-b2f9-4c57-9f94-30b31659d298-kube-api-access-vgqfd" (OuterVolumeSpecName: "kube-api-access-vgqfd") pod "18a0027c-b2f9-4c57-9f94-30b31659d298" (UID: "18a0027c-b2f9-4c57-9f94-30b31659d298"). InnerVolumeSpecName "kube-api-access-vgqfd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:51:20 crc kubenswrapper[4776]: I1208 09:51:20.070109 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18a0027c-b2f9-4c57-9f94-30b31659d298-inventory" (OuterVolumeSpecName: "inventory") pod "18a0027c-b2f9-4c57-9f94-30b31659d298" (UID: "18a0027c-b2f9-4c57-9f94-30b31659d298"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:51:20 crc kubenswrapper[4776]: I1208 09:51:20.071379 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18a0027c-b2f9-4c57-9f94-30b31659d298-ceilometer-ipmi-config-data-1" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-1") pod "18a0027c-b2f9-4c57-9f94-30b31659d298" (UID: "18a0027c-b2f9-4c57-9f94-30b31659d298"). InnerVolumeSpecName "ceilometer-ipmi-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:51:20 crc kubenswrapper[4776]: I1208 09:51:20.071707 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18a0027c-b2f9-4c57-9f94-30b31659d298-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "18a0027c-b2f9-4c57-9f94-30b31659d298" (UID: "18a0027c-b2f9-4c57-9f94-30b31659d298"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:51:20 crc kubenswrapper[4776]: I1208 09:51:20.072367 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18a0027c-b2f9-4c57-9f94-30b31659d298-ceilometer-ipmi-config-data-0" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-0") pod "18a0027c-b2f9-4c57-9f94-30b31659d298" (UID: "18a0027c-b2f9-4c57-9f94-30b31659d298"). InnerVolumeSpecName "ceilometer-ipmi-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:51:20 crc kubenswrapper[4776]: I1208 09:51:20.083279 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18a0027c-b2f9-4c57-9f94-30b31659d298-ceilometer-ipmi-config-data-2" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-2") pod "18a0027c-b2f9-4c57-9f94-30b31659d298" (UID: "18a0027c-b2f9-4c57-9f94-30b31659d298"). InnerVolumeSpecName "ceilometer-ipmi-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:51:20 crc kubenswrapper[4776]: I1208 09:51:20.141843 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgqfd\" (UniqueName: \"kubernetes.io/projected/18a0027c-b2f9-4c57-9f94-30b31659d298-kube-api-access-vgqfd\") on node \"crc\" DevicePath \"\"" Dec 08 09:51:20 crc kubenswrapper[4776]: I1208 09:51:20.141893 4776 reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18a0027c-b2f9-4c57-9f94-30b31659d298-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 09:51:20 crc kubenswrapper[4776]: I1208 09:51:20.141907 4776 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/18a0027c-b2f9-4c57-9f94-30b31659d298-ceilometer-ipmi-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 08 09:51:20 crc kubenswrapper[4776]: I1208 09:51:20.141923 4776 
reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18a0027c-b2f9-4c57-9f94-30b31659d298-inventory\") on node \"crc\" DevicePath \"\"" Dec 08 09:51:20 crc kubenswrapper[4776]: I1208 09:51:20.141936 4776 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/18a0027c-b2f9-4c57-9f94-30b31659d298-ceilometer-ipmi-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 08 09:51:20 crc kubenswrapper[4776]: I1208 09:51:20.141950 4776 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/18a0027c-b2f9-4c57-9f94-30b31659d298-ceilometer-ipmi-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 08 09:51:20 crc kubenswrapper[4776]: I1208 09:51:20.141962 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/18a0027c-b2f9-4c57-9f94-30b31659d298-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 08 09:51:20 crc kubenswrapper[4776]: I1208 09:51:20.411858 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp" event={"ID":"18a0027c-b2f9-4c57-9f94-30b31659d298","Type":"ContainerDied","Data":"099aff191f220c45dd2530a4547ba1e3b0c390e2bc6e64415665e851c1a51ed5"} Dec 08 09:51:20 crc kubenswrapper[4776]: I1208 09:51:20.412244 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="099aff191f220c45dd2530a4547ba1e3b0c390e2bc6e64415665e851c1a51ed5" Dec 08 09:51:20 crc kubenswrapper[4776]: I1208 09:51:20.411936 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp" Dec 08 09:51:20 crc kubenswrapper[4776]: I1208 09:51:20.518353 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-7kf4c"] Dec 08 09:51:20 crc kubenswrapper[4776]: E1208 09:51:20.519223 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18a0027c-b2f9-4c57-9f94-30b31659d298" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Dec 08 09:51:20 crc kubenswrapper[4776]: I1208 09:51:20.519254 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="18a0027c-b2f9-4c57-9f94-30b31659d298" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Dec 08 09:51:20 crc kubenswrapper[4776]: I1208 09:51:20.519604 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="18a0027c-b2f9-4c57-9f94-30b31659d298" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Dec 08 09:51:20 crc kubenswrapper[4776]: I1208 09:51:20.520952 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-7kf4c" Dec 08 09:51:20 crc kubenswrapper[4776]: I1208 09:51:20.526531 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"logging-compute-config-data" Dec 08 09:51:20 crc kubenswrapper[4776]: I1208 09:51:20.526562 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 08 09:51:20 crc kubenswrapper[4776]: I1208 09:51:20.526730 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 08 09:51:20 crc kubenswrapper[4776]: I1208 09:51:20.527201 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 08 09:51:20 crc kubenswrapper[4776]: I1208 09:51:20.527442 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tm845" Dec 08 09:51:20 crc kubenswrapper[4776]: I1208 09:51:20.532626 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-7kf4c"] Dec 08 09:51:20 crc kubenswrapper[4776]: I1208 09:51:20.665155 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e957285-89ac-4a08-a5f9-a3199e19b787-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-7kf4c\" (UID: \"4e957285-89ac-4a08-a5f9-a3199e19b787\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-7kf4c" Dec 08 09:51:20 crc kubenswrapper[4776]: I1208 09:51:20.665342 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/4e957285-89ac-4a08-a5f9-a3199e19b787-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-7kf4c\" (UID: 
\"4e957285-89ac-4a08-a5f9-a3199e19b787\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-7kf4c" Dec 08 09:51:20 crc kubenswrapper[4776]: I1208 09:51:20.665560 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e957285-89ac-4a08-a5f9-a3199e19b787-ssh-key\") pod \"logging-edpm-deployment-openstack-edpm-ipam-7kf4c\" (UID: \"4e957285-89ac-4a08-a5f9-a3199e19b787\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-7kf4c" Dec 08 09:51:20 crc kubenswrapper[4776]: I1208 09:51:20.665628 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kct8\" (UniqueName: \"kubernetes.io/projected/4e957285-89ac-4a08-a5f9-a3199e19b787-kube-api-access-4kct8\") pod \"logging-edpm-deployment-openstack-edpm-ipam-7kf4c\" (UID: \"4e957285-89ac-4a08-a5f9-a3199e19b787\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-7kf4c" Dec 08 09:51:20 crc kubenswrapper[4776]: I1208 09:51:20.665696 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/4e957285-89ac-4a08-a5f9-a3199e19b787-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-7kf4c\" (UID: \"4e957285-89ac-4a08-a5f9-a3199e19b787\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-7kf4c" Dec 08 09:51:20 crc kubenswrapper[4776]: I1208 09:51:20.767849 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e957285-89ac-4a08-a5f9-a3199e19b787-ssh-key\") pod \"logging-edpm-deployment-openstack-edpm-ipam-7kf4c\" (UID: \"4e957285-89ac-4a08-a5f9-a3199e19b787\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-7kf4c" Dec 08 09:51:20 crc kubenswrapper[4776]: I1208 09:51:20.768105 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4kct8\" (UniqueName: \"kubernetes.io/projected/4e957285-89ac-4a08-a5f9-a3199e19b787-kube-api-access-4kct8\") pod \"logging-edpm-deployment-openstack-edpm-ipam-7kf4c\" (UID: \"4e957285-89ac-4a08-a5f9-a3199e19b787\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-7kf4c" Dec 08 09:51:20 crc kubenswrapper[4776]: I1208 09:51:20.768237 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/4e957285-89ac-4a08-a5f9-a3199e19b787-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-7kf4c\" (UID: \"4e957285-89ac-4a08-a5f9-a3199e19b787\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-7kf4c" Dec 08 09:51:20 crc kubenswrapper[4776]: I1208 09:51:20.768428 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e957285-89ac-4a08-a5f9-a3199e19b787-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-7kf4c\" (UID: \"4e957285-89ac-4a08-a5f9-a3199e19b787\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-7kf4c" Dec 08 09:51:20 crc kubenswrapper[4776]: I1208 09:51:20.768522 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/4e957285-89ac-4a08-a5f9-a3199e19b787-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-7kf4c\" (UID: \"4e957285-89ac-4a08-a5f9-a3199e19b787\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-7kf4c" Dec 08 09:51:20 crc kubenswrapper[4776]: I1208 09:51:20.773202 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e957285-89ac-4a08-a5f9-a3199e19b787-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-7kf4c\" (UID: 
\"4e957285-89ac-4a08-a5f9-a3199e19b787\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-7kf4c" Dec 08 09:51:20 crc kubenswrapper[4776]: I1208 09:51:20.773219 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/4e957285-89ac-4a08-a5f9-a3199e19b787-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-7kf4c\" (UID: \"4e957285-89ac-4a08-a5f9-a3199e19b787\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-7kf4c" Dec 08 09:51:20 crc kubenswrapper[4776]: I1208 09:51:20.773661 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/4e957285-89ac-4a08-a5f9-a3199e19b787-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-7kf4c\" (UID: \"4e957285-89ac-4a08-a5f9-a3199e19b787\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-7kf4c" Dec 08 09:51:20 crc kubenswrapper[4776]: I1208 09:51:20.788416 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e957285-89ac-4a08-a5f9-a3199e19b787-ssh-key\") pod \"logging-edpm-deployment-openstack-edpm-ipam-7kf4c\" (UID: \"4e957285-89ac-4a08-a5f9-a3199e19b787\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-7kf4c" Dec 08 09:51:20 crc kubenswrapper[4776]: I1208 09:51:20.804044 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kct8\" (UniqueName: \"kubernetes.io/projected/4e957285-89ac-4a08-a5f9-a3199e19b787-kube-api-access-4kct8\") pod \"logging-edpm-deployment-openstack-edpm-ipam-7kf4c\" (UID: \"4e957285-89ac-4a08-a5f9-a3199e19b787\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-7kf4c" Dec 08 09:51:20 crc kubenswrapper[4776]: I1208 09:51:20.859038 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-7kf4c" Dec 08 09:51:21 crc kubenswrapper[4776]: I1208 09:51:21.373116 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-7kf4c"] Dec 08 09:51:21 crc kubenswrapper[4776]: I1208 09:51:21.387508 4776 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 08 09:51:21 crc kubenswrapper[4776]: I1208 09:51:21.423818 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-7kf4c" event={"ID":"4e957285-89ac-4a08-a5f9-a3199e19b787","Type":"ContainerStarted","Data":"ec0ecdf871d814fbadebcce76f21b65f5efb76a11e889f91d7e42d094cdb23f1"} Dec 08 09:51:22 crc kubenswrapper[4776]: I1208 09:51:22.450241 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-7kf4c" event={"ID":"4e957285-89ac-4a08-a5f9-a3199e19b787","Type":"ContainerStarted","Data":"49f3c1d84f0457186d48b0044368cf40c27f9b52e26dbcefd8b5db262d1d1eb8"} Dec 08 09:51:32 crc kubenswrapper[4776]: I1208 09:51:32.344673 4776 scope.go:117] "RemoveContainer" containerID="f4e7e4bb5f3b8f333b5df1fd5cf02fbeb8dd9fd2b8374d4f1f4027b6e97c653b" Dec 08 09:51:32 crc kubenswrapper[4776]: E1208 09:51:32.346985 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:51:35 crc kubenswrapper[4776]: I1208 09:51:35.595011 4776 generic.go:334] "Generic (PLEG): container finished" podID="4e957285-89ac-4a08-a5f9-a3199e19b787" 
containerID="49f3c1d84f0457186d48b0044368cf40c27f9b52e26dbcefd8b5db262d1d1eb8" exitCode=0 Dec 08 09:51:35 crc kubenswrapper[4776]: I1208 09:51:35.595072 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-7kf4c" event={"ID":"4e957285-89ac-4a08-a5f9-a3199e19b787","Type":"ContainerDied","Data":"49f3c1d84f0457186d48b0044368cf40c27f9b52e26dbcefd8b5db262d1d1eb8"} Dec 08 09:51:37 crc kubenswrapper[4776]: I1208 09:51:37.071133 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-7kf4c" Dec 08 09:51:37 crc kubenswrapper[4776]: I1208 09:51:37.161880 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/4e957285-89ac-4a08-a5f9-a3199e19b787-logging-compute-config-data-1\") pod \"4e957285-89ac-4a08-a5f9-a3199e19b787\" (UID: \"4e957285-89ac-4a08-a5f9-a3199e19b787\") " Dec 08 09:51:37 crc kubenswrapper[4776]: I1208 09:51:37.161932 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e957285-89ac-4a08-a5f9-a3199e19b787-ssh-key\") pod \"4e957285-89ac-4a08-a5f9-a3199e19b787\" (UID: \"4e957285-89ac-4a08-a5f9-a3199e19b787\") " Dec 08 09:51:37 crc kubenswrapper[4776]: I1208 09:51:37.162015 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/4e957285-89ac-4a08-a5f9-a3199e19b787-logging-compute-config-data-0\") pod \"4e957285-89ac-4a08-a5f9-a3199e19b787\" (UID: \"4e957285-89ac-4a08-a5f9-a3199e19b787\") " Dec 08 09:51:37 crc kubenswrapper[4776]: I1208 09:51:37.162037 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e957285-89ac-4a08-a5f9-a3199e19b787-inventory\") pod 
\"4e957285-89ac-4a08-a5f9-a3199e19b787\" (UID: \"4e957285-89ac-4a08-a5f9-a3199e19b787\") " Dec 08 09:51:37 crc kubenswrapper[4776]: I1208 09:51:37.162193 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kct8\" (UniqueName: \"kubernetes.io/projected/4e957285-89ac-4a08-a5f9-a3199e19b787-kube-api-access-4kct8\") pod \"4e957285-89ac-4a08-a5f9-a3199e19b787\" (UID: \"4e957285-89ac-4a08-a5f9-a3199e19b787\") " Dec 08 09:51:37 crc kubenswrapper[4776]: I1208 09:51:37.172638 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e957285-89ac-4a08-a5f9-a3199e19b787-kube-api-access-4kct8" (OuterVolumeSpecName: "kube-api-access-4kct8") pod "4e957285-89ac-4a08-a5f9-a3199e19b787" (UID: "4e957285-89ac-4a08-a5f9-a3199e19b787"). InnerVolumeSpecName "kube-api-access-4kct8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:51:37 crc kubenswrapper[4776]: I1208 09:51:37.195741 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e957285-89ac-4a08-a5f9-a3199e19b787-logging-compute-config-data-1" (OuterVolumeSpecName: "logging-compute-config-data-1") pod "4e957285-89ac-4a08-a5f9-a3199e19b787" (UID: "4e957285-89ac-4a08-a5f9-a3199e19b787"). InnerVolumeSpecName "logging-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:51:37 crc kubenswrapper[4776]: I1208 09:51:37.198760 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e957285-89ac-4a08-a5f9-a3199e19b787-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4e957285-89ac-4a08-a5f9-a3199e19b787" (UID: "4e957285-89ac-4a08-a5f9-a3199e19b787"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:51:37 crc kubenswrapper[4776]: I1208 09:51:37.201907 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e957285-89ac-4a08-a5f9-a3199e19b787-logging-compute-config-data-0" (OuterVolumeSpecName: "logging-compute-config-data-0") pod "4e957285-89ac-4a08-a5f9-a3199e19b787" (UID: "4e957285-89ac-4a08-a5f9-a3199e19b787"). InnerVolumeSpecName "logging-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:51:37 crc kubenswrapper[4776]: I1208 09:51:37.202292 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e957285-89ac-4a08-a5f9-a3199e19b787-inventory" (OuterVolumeSpecName: "inventory") pod "4e957285-89ac-4a08-a5f9-a3199e19b787" (UID: "4e957285-89ac-4a08-a5f9-a3199e19b787"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 09:51:37 crc kubenswrapper[4776]: I1208 09:51:37.264737 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4kct8\" (UniqueName: \"kubernetes.io/projected/4e957285-89ac-4a08-a5f9-a3199e19b787-kube-api-access-4kct8\") on node \"crc\" DevicePath \"\"" Dec 08 09:51:37 crc kubenswrapper[4776]: I1208 09:51:37.264772 4776 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/4e957285-89ac-4a08-a5f9-a3199e19b787-logging-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 08 09:51:37 crc kubenswrapper[4776]: I1208 09:51:37.264782 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e957285-89ac-4a08-a5f9-a3199e19b787-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 08 09:51:37 crc kubenswrapper[4776]: I1208 09:51:37.264792 4776 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-0\" (UniqueName: 
\"kubernetes.io/secret/4e957285-89ac-4a08-a5f9-a3199e19b787-logging-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 08 09:51:37 crc kubenswrapper[4776]: I1208 09:51:37.264802 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e957285-89ac-4a08-a5f9-a3199e19b787-inventory\") on node \"crc\" DevicePath \"\"" Dec 08 09:51:37 crc kubenswrapper[4776]: I1208 09:51:37.614499 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-7kf4c" event={"ID":"4e957285-89ac-4a08-a5f9-a3199e19b787","Type":"ContainerDied","Data":"ec0ecdf871d814fbadebcce76f21b65f5efb76a11e889f91d7e42d094cdb23f1"} Dec 08 09:51:37 crc kubenswrapper[4776]: I1208 09:51:37.614544 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec0ecdf871d814fbadebcce76f21b65f5efb76a11e889f91d7e42d094cdb23f1" Dec 08 09:51:37 crc kubenswrapper[4776]: I1208 09:51:37.614543 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-7kf4c" Dec 08 09:51:47 crc kubenswrapper[4776]: I1208 09:51:47.343328 4776 scope.go:117] "RemoveContainer" containerID="f4e7e4bb5f3b8f333b5df1fd5cf02fbeb8dd9fd2b8374d4f1f4027b6e97c653b" Dec 08 09:51:47 crc kubenswrapper[4776]: E1208 09:51:47.344290 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:52:00 crc kubenswrapper[4776]: I1208 09:52:00.343819 4776 scope.go:117] "RemoveContainer" containerID="f4e7e4bb5f3b8f333b5df1fd5cf02fbeb8dd9fd2b8374d4f1f4027b6e97c653b" Dec 08 09:52:00 crc kubenswrapper[4776]: E1208 09:52:00.344713 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:52:15 crc kubenswrapper[4776]: I1208 09:52:15.343786 4776 scope.go:117] "RemoveContainer" containerID="f4e7e4bb5f3b8f333b5df1fd5cf02fbeb8dd9fd2b8374d4f1f4027b6e97c653b" Dec 08 09:52:15 crc kubenswrapper[4776]: E1208 09:52:15.344617 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:52:30 crc kubenswrapper[4776]: I1208 09:52:30.344069 4776 scope.go:117] "RemoveContainer" containerID="f4e7e4bb5f3b8f333b5df1fd5cf02fbeb8dd9fd2b8374d4f1f4027b6e97c653b" Dec 08 09:52:30 crc kubenswrapper[4776]: E1208 09:52:30.344854 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:52:41 crc kubenswrapper[4776]: I1208 09:52:41.344079 4776 scope.go:117] "RemoveContainer" containerID="f4e7e4bb5f3b8f333b5df1fd5cf02fbeb8dd9fd2b8374d4f1f4027b6e97c653b" Dec 08 09:52:41 crc kubenswrapper[4776]: E1208 09:52:41.345014 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:52:54 crc kubenswrapper[4776]: I1208 09:52:54.352741 4776 scope.go:117] "RemoveContainer" containerID="f4e7e4bb5f3b8f333b5df1fd5cf02fbeb8dd9fd2b8374d4f1f4027b6e97c653b" Dec 08 09:52:54 crc kubenswrapper[4776]: E1208 09:52:54.353631 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:53:06 crc kubenswrapper[4776]: I1208 09:53:06.343725 4776 scope.go:117] "RemoveContainer" containerID="f4e7e4bb5f3b8f333b5df1fd5cf02fbeb8dd9fd2b8374d4f1f4027b6e97c653b" Dec 08 09:53:06 crc kubenswrapper[4776]: E1208 09:53:06.344838 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:53:18 crc kubenswrapper[4776]: I1208 09:53:18.343827 4776 scope.go:117] "RemoveContainer" containerID="f4e7e4bb5f3b8f333b5df1fd5cf02fbeb8dd9fd2b8374d4f1f4027b6e97c653b" Dec 08 09:53:18 crc kubenswrapper[4776]: I1208 09:53:18.679865 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" event={"ID":"c9788ab1-1031-4103-a769-a4b3177c7268","Type":"ContainerStarted","Data":"a1dd66adbd31bb54cded1863f87a95e445236b2f4953781aedd2c88ba5610dcb"} Dec 08 09:55:41 crc kubenswrapper[4776]: I1208 09:55:41.399528 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 09:55:41 crc kubenswrapper[4776]: I1208 09:55:41.400070 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 09:56:11 crc kubenswrapper[4776]: I1208 09:56:11.399611 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 09:56:11 crc kubenswrapper[4776]: I1208 09:56:11.400225 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 09:56:30 crc kubenswrapper[4776]: E1208 09:56:30.160938 4776 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.82:41490->38.102.83.82:46339: write tcp 38.102.83.82:41490->38.102.83.82:46339: write: broken pipe Dec 08 09:56:41 crc kubenswrapper[4776]: I1208 09:56:41.399473 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 09:56:41 crc kubenswrapper[4776]: I1208 09:56:41.400063 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 09:56:41 crc kubenswrapper[4776]: I1208 09:56:41.400104 4776 kubelet.go:2542] 
"SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" Dec 08 09:56:41 crc kubenswrapper[4776]: I1208 09:56:41.401082 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a1dd66adbd31bb54cded1863f87a95e445236b2f4953781aedd2c88ba5610dcb"} pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 08 09:56:41 crc kubenswrapper[4776]: I1208 09:56:41.401138 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" containerID="cri-o://a1dd66adbd31bb54cded1863f87a95e445236b2f4953781aedd2c88ba5610dcb" gracePeriod=600 Dec 08 09:56:41 crc kubenswrapper[4776]: I1208 09:56:41.972096 4776 generic.go:334] "Generic (PLEG): container finished" podID="c9788ab1-1031-4103-a769-a4b3177c7268" containerID="a1dd66adbd31bb54cded1863f87a95e445236b2f4953781aedd2c88ba5610dcb" exitCode=0 Dec 08 09:56:41 crc kubenswrapper[4776]: I1208 09:56:41.972165 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" event={"ID":"c9788ab1-1031-4103-a769-a4b3177c7268","Type":"ContainerDied","Data":"a1dd66adbd31bb54cded1863f87a95e445236b2f4953781aedd2c88ba5610dcb"} Dec 08 09:56:41 crc kubenswrapper[4776]: I1208 09:56:41.972660 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" event={"ID":"c9788ab1-1031-4103-a769-a4b3177c7268","Type":"ContainerStarted","Data":"d69e4e5ed0ae6d0feba94b4d10f96b17acd36d7e4cc7e3e2616a1e9f287da6d1"} Dec 08 09:56:41 crc kubenswrapper[4776]: I1208 09:56:41.972680 4776 scope.go:117] "RemoveContainer" 
containerID="f4e7e4bb5f3b8f333b5df1fd5cf02fbeb8dd9fd2b8374d4f1f4027b6e97c653b" Dec 08 09:57:23 crc kubenswrapper[4776]: I1208 09:57:23.949797 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hrfkn"] Dec 08 09:57:23 crc kubenswrapper[4776]: E1208 09:57:23.950973 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e957285-89ac-4a08-a5f9-a3199e19b787" containerName="logging-edpm-deployment-openstack-edpm-ipam" Dec 08 09:57:23 crc kubenswrapper[4776]: I1208 09:57:23.950995 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e957285-89ac-4a08-a5f9-a3199e19b787" containerName="logging-edpm-deployment-openstack-edpm-ipam" Dec 08 09:57:23 crc kubenswrapper[4776]: I1208 09:57:23.951362 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e957285-89ac-4a08-a5f9-a3199e19b787" containerName="logging-edpm-deployment-openstack-edpm-ipam" Dec 08 09:57:23 crc kubenswrapper[4776]: I1208 09:57:23.953117 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hrfkn" Dec 08 09:57:23 crc kubenswrapper[4776]: I1208 09:57:23.969399 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hrfkn"] Dec 08 09:57:24 crc kubenswrapper[4776]: I1208 09:57:24.046150 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbc43f17-4d64-4a4e-8234-51e2ef436ff0-catalog-content\") pod \"community-operators-hrfkn\" (UID: \"fbc43f17-4d64-4a4e-8234-51e2ef436ff0\") " pod="openshift-marketplace/community-operators-hrfkn" Dec 08 09:57:24 crc kubenswrapper[4776]: I1208 09:57:24.046330 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptm9g\" (UniqueName: \"kubernetes.io/projected/fbc43f17-4d64-4a4e-8234-51e2ef436ff0-kube-api-access-ptm9g\") pod \"community-operators-hrfkn\" (UID: \"fbc43f17-4d64-4a4e-8234-51e2ef436ff0\") " pod="openshift-marketplace/community-operators-hrfkn" Dec 08 09:57:24 crc kubenswrapper[4776]: I1208 09:57:24.046364 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbc43f17-4d64-4a4e-8234-51e2ef436ff0-utilities\") pod \"community-operators-hrfkn\" (UID: \"fbc43f17-4d64-4a4e-8234-51e2ef436ff0\") " pod="openshift-marketplace/community-operators-hrfkn" Dec 08 09:57:24 crc kubenswrapper[4776]: I1208 09:57:24.149401 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbc43f17-4d64-4a4e-8234-51e2ef436ff0-catalog-content\") pod \"community-operators-hrfkn\" (UID: \"fbc43f17-4d64-4a4e-8234-51e2ef436ff0\") " pod="openshift-marketplace/community-operators-hrfkn" Dec 08 09:57:24 crc kubenswrapper[4776]: I1208 09:57:24.149579 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ptm9g\" (UniqueName: \"kubernetes.io/projected/fbc43f17-4d64-4a4e-8234-51e2ef436ff0-kube-api-access-ptm9g\") pod \"community-operators-hrfkn\" (UID: \"fbc43f17-4d64-4a4e-8234-51e2ef436ff0\") " pod="openshift-marketplace/community-operators-hrfkn" Dec 08 09:57:24 crc kubenswrapper[4776]: I1208 09:57:24.149623 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbc43f17-4d64-4a4e-8234-51e2ef436ff0-utilities\") pod \"community-operators-hrfkn\" (UID: \"fbc43f17-4d64-4a4e-8234-51e2ef436ff0\") " pod="openshift-marketplace/community-operators-hrfkn" Dec 08 09:57:24 crc kubenswrapper[4776]: I1208 09:57:24.150073 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbc43f17-4d64-4a4e-8234-51e2ef436ff0-utilities\") pod \"community-operators-hrfkn\" (UID: \"fbc43f17-4d64-4a4e-8234-51e2ef436ff0\") " pod="openshift-marketplace/community-operators-hrfkn" Dec 08 09:57:24 crc kubenswrapper[4776]: I1208 09:57:24.150101 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbc43f17-4d64-4a4e-8234-51e2ef436ff0-catalog-content\") pod \"community-operators-hrfkn\" (UID: \"fbc43f17-4d64-4a4e-8234-51e2ef436ff0\") " pod="openshift-marketplace/community-operators-hrfkn" Dec 08 09:57:24 crc kubenswrapper[4776]: I1208 09:57:24.178116 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptm9g\" (UniqueName: \"kubernetes.io/projected/fbc43f17-4d64-4a4e-8234-51e2ef436ff0-kube-api-access-ptm9g\") pod \"community-operators-hrfkn\" (UID: \"fbc43f17-4d64-4a4e-8234-51e2ef436ff0\") " pod="openshift-marketplace/community-operators-hrfkn" Dec 08 09:57:24 crc kubenswrapper[4776]: I1208 09:57:24.276521 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hrfkn" Dec 08 09:57:24 crc kubenswrapper[4776]: I1208 09:57:24.849511 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hrfkn"] Dec 08 09:57:25 crc kubenswrapper[4776]: I1208 09:57:25.449789 4776 generic.go:334] "Generic (PLEG): container finished" podID="fbc43f17-4d64-4a4e-8234-51e2ef436ff0" containerID="046edcdd520f1208791ce900c9003fa35f56b12bcb18a4de2be51b4adb59b045" exitCode=0 Dec 08 09:57:25 crc kubenswrapper[4776]: I1208 09:57:25.449884 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hrfkn" event={"ID":"fbc43f17-4d64-4a4e-8234-51e2ef436ff0","Type":"ContainerDied","Data":"046edcdd520f1208791ce900c9003fa35f56b12bcb18a4de2be51b4adb59b045"} Dec 08 09:57:25 crc kubenswrapper[4776]: I1208 09:57:25.451240 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hrfkn" event={"ID":"fbc43f17-4d64-4a4e-8234-51e2ef436ff0","Type":"ContainerStarted","Data":"b6dec8e7b1f4c9f660f4af4af0e41891668c6724c4d79c623216ec4fdc8a063b"} Dec 08 09:57:25 crc kubenswrapper[4776]: I1208 09:57:25.454478 4776 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 08 09:57:26 crc kubenswrapper[4776]: I1208 09:57:26.462468 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hrfkn" event={"ID":"fbc43f17-4d64-4a4e-8234-51e2ef436ff0","Type":"ContainerStarted","Data":"367bc5e4221841ab5a89314382c10851c93ca08de0a663c758dbd5ccf7cd2a31"} Dec 08 09:57:27 crc kubenswrapper[4776]: I1208 09:57:27.477249 4776 generic.go:334] "Generic (PLEG): container finished" podID="fbc43f17-4d64-4a4e-8234-51e2ef436ff0" containerID="367bc5e4221841ab5a89314382c10851c93ca08de0a663c758dbd5ccf7cd2a31" exitCode=0 Dec 08 09:57:27 crc kubenswrapper[4776]: I1208 09:57:27.477370 4776 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-hrfkn" event={"ID":"fbc43f17-4d64-4a4e-8234-51e2ef436ff0","Type":"ContainerDied","Data":"367bc5e4221841ab5a89314382c10851c93ca08de0a663c758dbd5ccf7cd2a31"} Dec 08 09:57:28 crc kubenswrapper[4776]: I1208 09:57:28.491746 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hrfkn" event={"ID":"fbc43f17-4d64-4a4e-8234-51e2ef436ff0","Type":"ContainerStarted","Data":"447773395aca93d8c71cc2f772cc7b1885aa8978e6a8886b9177154b3d61164c"} Dec 08 09:57:28 crc kubenswrapper[4776]: I1208 09:57:28.520576 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hrfkn" podStartSLOduration=3.096926819 podStartE2EDuration="5.520556734s" podCreationTimestamp="2025-12-08 09:57:23 +0000 UTC" firstStartedPulling="2025-12-08 09:57:25.454076534 +0000 UTC m=+3521.717301576" lastFinishedPulling="2025-12-08 09:57:27.877706469 +0000 UTC m=+3524.140931491" observedRunningTime="2025-12-08 09:57:28.508691652 +0000 UTC m=+3524.771916684" watchObservedRunningTime="2025-12-08 09:57:28.520556734 +0000 UTC m=+3524.783781756" Dec 08 09:57:34 crc kubenswrapper[4776]: I1208 09:57:34.277687 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hrfkn" Dec 08 09:57:34 crc kubenswrapper[4776]: I1208 09:57:34.279276 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hrfkn" Dec 08 09:57:34 crc kubenswrapper[4776]: I1208 09:57:34.334969 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hrfkn" Dec 08 09:57:34 crc kubenswrapper[4776]: I1208 09:57:34.654885 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hrfkn" Dec 08 09:57:34 crc kubenswrapper[4776]: I1208 
09:57:34.714897 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hrfkn"] Dec 08 09:57:36 crc kubenswrapper[4776]: I1208 09:57:36.610460 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hrfkn" podUID="fbc43f17-4d64-4a4e-8234-51e2ef436ff0" containerName="registry-server" containerID="cri-o://447773395aca93d8c71cc2f772cc7b1885aa8978e6a8886b9177154b3d61164c" gracePeriod=2 Dec 08 09:57:37 crc kubenswrapper[4776]: I1208 09:57:37.147669 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hrfkn" Dec 08 09:57:37 crc kubenswrapper[4776]: I1208 09:57:37.273344 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptm9g\" (UniqueName: \"kubernetes.io/projected/fbc43f17-4d64-4a4e-8234-51e2ef436ff0-kube-api-access-ptm9g\") pod \"fbc43f17-4d64-4a4e-8234-51e2ef436ff0\" (UID: \"fbc43f17-4d64-4a4e-8234-51e2ef436ff0\") " Dec 08 09:57:37 crc kubenswrapper[4776]: I1208 09:57:37.273605 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbc43f17-4d64-4a4e-8234-51e2ef436ff0-utilities\") pod \"fbc43f17-4d64-4a4e-8234-51e2ef436ff0\" (UID: \"fbc43f17-4d64-4a4e-8234-51e2ef436ff0\") " Dec 08 09:57:37 crc kubenswrapper[4776]: I1208 09:57:37.273674 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbc43f17-4d64-4a4e-8234-51e2ef436ff0-catalog-content\") pod \"fbc43f17-4d64-4a4e-8234-51e2ef436ff0\" (UID: \"fbc43f17-4d64-4a4e-8234-51e2ef436ff0\") " Dec 08 09:57:37 crc kubenswrapper[4776]: I1208 09:57:37.274373 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbc43f17-4d64-4a4e-8234-51e2ef436ff0-utilities" (OuterVolumeSpecName: 
"utilities") pod "fbc43f17-4d64-4a4e-8234-51e2ef436ff0" (UID: "fbc43f17-4d64-4a4e-8234-51e2ef436ff0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:57:37 crc kubenswrapper[4776]: I1208 09:57:37.282415 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbc43f17-4d64-4a4e-8234-51e2ef436ff0-kube-api-access-ptm9g" (OuterVolumeSpecName: "kube-api-access-ptm9g") pod "fbc43f17-4d64-4a4e-8234-51e2ef436ff0" (UID: "fbc43f17-4d64-4a4e-8234-51e2ef436ff0"). InnerVolumeSpecName "kube-api-access-ptm9g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:57:37 crc kubenswrapper[4776]: I1208 09:57:37.337345 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbc43f17-4d64-4a4e-8234-51e2ef436ff0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fbc43f17-4d64-4a4e-8234-51e2ef436ff0" (UID: "fbc43f17-4d64-4a4e-8234-51e2ef436ff0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:57:37 crc kubenswrapper[4776]: I1208 09:57:37.377008 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbc43f17-4d64-4a4e-8234-51e2ef436ff0-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 09:57:37 crc kubenswrapper[4776]: I1208 09:57:37.377394 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbc43f17-4d64-4a4e-8234-51e2ef436ff0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 09:57:37 crc kubenswrapper[4776]: I1208 09:57:37.377511 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptm9g\" (UniqueName: \"kubernetes.io/projected/fbc43f17-4d64-4a4e-8234-51e2ef436ff0-kube-api-access-ptm9g\") on node \"crc\" DevicePath \"\"" Dec 08 09:57:37 crc kubenswrapper[4776]: I1208 09:57:37.620777 4776 generic.go:334] "Generic (PLEG): container finished" podID="fbc43f17-4d64-4a4e-8234-51e2ef436ff0" containerID="447773395aca93d8c71cc2f772cc7b1885aa8978e6a8886b9177154b3d61164c" exitCode=0 Dec 08 09:57:37 crc kubenswrapper[4776]: I1208 09:57:37.620817 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hrfkn" event={"ID":"fbc43f17-4d64-4a4e-8234-51e2ef436ff0","Type":"ContainerDied","Data":"447773395aca93d8c71cc2f772cc7b1885aa8978e6a8886b9177154b3d61164c"} Dec 08 09:57:37 crc kubenswrapper[4776]: I1208 09:57:37.620840 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hrfkn" event={"ID":"fbc43f17-4d64-4a4e-8234-51e2ef436ff0","Type":"ContainerDied","Data":"b6dec8e7b1f4c9f660f4af4af0e41891668c6724c4d79c623216ec4fdc8a063b"} Dec 08 09:57:37 crc kubenswrapper[4776]: I1208 09:57:37.620848 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hrfkn" Dec 08 09:57:37 crc kubenswrapper[4776]: I1208 09:57:37.620854 4776 scope.go:117] "RemoveContainer" containerID="447773395aca93d8c71cc2f772cc7b1885aa8978e6a8886b9177154b3d61164c" Dec 08 09:57:37 crc kubenswrapper[4776]: I1208 09:57:37.647160 4776 scope.go:117] "RemoveContainer" containerID="367bc5e4221841ab5a89314382c10851c93ca08de0a663c758dbd5ccf7cd2a31" Dec 08 09:57:37 crc kubenswrapper[4776]: I1208 09:57:37.672335 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hrfkn"] Dec 08 09:57:37 crc kubenswrapper[4776]: I1208 09:57:37.682017 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hrfkn"] Dec 08 09:57:37 crc kubenswrapper[4776]: I1208 09:57:37.684746 4776 scope.go:117] "RemoveContainer" containerID="046edcdd520f1208791ce900c9003fa35f56b12bcb18a4de2be51b4adb59b045" Dec 08 09:57:37 crc kubenswrapper[4776]: I1208 09:57:37.743283 4776 scope.go:117] "RemoveContainer" containerID="447773395aca93d8c71cc2f772cc7b1885aa8978e6a8886b9177154b3d61164c" Dec 08 09:57:37 crc kubenswrapper[4776]: E1208 09:57:37.743704 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"447773395aca93d8c71cc2f772cc7b1885aa8978e6a8886b9177154b3d61164c\": container with ID starting with 447773395aca93d8c71cc2f772cc7b1885aa8978e6a8886b9177154b3d61164c not found: ID does not exist" containerID="447773395aca93d8c71cc2f772cc7b1885aa8978e6a8886b9177154b3d61164c" Dec 08 09:57:37 crc kubenswrapper[4776]: I1208 09:57:37.743745 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"447773395aca93d8c71cc2f772cc7b1885aa8978e6a8886b9177154b3d61164c"} err="failed to get container status \"447773395aca93d8c71cc2f772cc7b1885aa8978e6a8886b9177154b3d61164c\": rpc error: code = NotFound desc = could not find 
container \"447773395aca93d8c71cc2f772cc7b1885aa8978e6a8886b9177154b3d61164c\": container with ID starting with 447773395aca93d8c71cc2f772cc7b1885aa8978e6a8886b9177154b3d61164c not found: ID does not exist" Dec 08 09:57:37 crc kubenswrapper[4776]: I1208 09:57:37.743771 4776 scope.go:117] "RemoveContainer" containerID="367bc5e4221841ab5a89314382c10851c93ca08de0a663c758dbd5ccf7cd2a31" Dec 08 09:57:37 crc kubenswrapper[4776]: E1208 09:57:37.744070 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"367bc5e4221841ab5a89314382c10851c93ca08de0a663c758dbd5ccf7cd2a31\": container with ID starting with 367bc5e4221841ab5a89314382c10851c93ca08de0a663c758dbd5ccf7cd2a31 not found: ID does not exist" containerID="367bc5e4221841ab5a89314382c10851c93ca08de0a663c758dbd5ccf7cd2a31" Dec 08 09:57:37 crc kubenswrapper[4776]: I1208 09:57:37.744115 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"367bc5e4221841ab5a89314382c10851c93ca08de0a663c758dbd5ccf7cd2a31"} err="failed to get container status \"367bc5e4221841ab5a89314382c10851c93ca08de0a663c758dbd5ccf7cd2a31\": rpc error: code = NotFound desc = could not find container \"367bc5e4221841ab5a89314382c10851c93ca08de0a663c758dbd5ccf7cd2a31\": container with ID starting with 367bc5e4221841ab5a89314382c10851c93ca08de0a663c758dbd5ccf7cd2a31 not found: ID does not exist" Dec 08 09:57:37 crc kubenswrapper[4776]: I1208 09:57:37.744144 4776 scope.go:117] "RemoveContainer" containerID="046edcdd520f1208791ce900c9003fa35f56b12bcb18a4de2be51b4adb59b045" Dec 08 09:57:37 crc kubenswrapper[4776]: E1208 09:57:37.744433 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"046edcdd520f1208791ce900c9003fa35f56b12bcb18a4de2be51b4adb59b045\": container with ID starting with 046edcdd520f1208791ce900c9003fa35f56b12bcb18a4de2be51b4adb59b045 not found: ID does 
not exist" containerID="046edcdd520f1208791ce900c9003fa35f56b12bcb18a4de2be51b4adb59b045" Dec 08 09:57:37 crc kubenswrapper[4776]: I1208 09:57:37.744478 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"046edcdd520f1208791ce900c9003fa35f56b12bcb18a4de2be51b4adb59b045"} err="failed to get container status \"046edcdd520f1208791ce900c9003fa35f56b12bcb18a4de2be51b4adb59b045\": rpc error: code = NotFound desc = could not find container \"046edcdd520f1208791ce900c9003fa35f56b12bcb18a4de2be51b4adb59b045\": container with ID starting with 046edcdd520f1208791ce900c9003fa35f56b12bcb18a4de2be51b4adb59b045 not found: ID does not exist" Dec 08 09:57:38 crc kubenswrapper[4776]: I1208 09:57:38.356021 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbc43f17-4d64-4a4e-8234-51e2ef436ff0" path="/var/lib/kubelet/pods/fbc43f17-4d64-4a4e-8234-51e2ef436ff0/volumes" Dec 08 09:58:08 crc kubenswrapper[4776]: I1208 09:58:08.801956 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-x6mx4"] Dec 08 09:58:08 crc kubenswrapper[4776]: E1208 09:58:08.803228 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbc43f17-4d64-4a4e-8234-51e2ef436ff0" containerName="extract-utilities" Dec 08 09:58:08 crc kubenswrapper[4776]: I1208 09:58:08.803246 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbc43f17-4d64-4a4e-8234-51e2ef436ff0" containerName="extract-utilities" Dec 08 09:58:08 crc kubenswrapper[4776]: E1208 09:58:08.803275 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbc43f17-4d64-4a4e-8234-51e2ef436ff0" containerName="registry-server" Dec 08 09:58:08 crc kubenswrapper[4776]: I1208 09:58:08.803283 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbc43f17-4d64-4a4e-8234-51e2ef436ff0" containerName="registry-server" Dec 08 09:58:08 crc kubenswrapper[4776]: E1208 09:58:08.803297 4776 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="fbc43f17-4d64-4a4e-8234-51e2ef436ff0" containerName="extract-content" Dec 08 09:58:08 crc kubenswrapper[4776]: I1208 09:58:08.803305 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbc43f17-4d64-4a4e-8234-51e2ef436ff0" containerName="extract-content" Dec 08 09:58:08 crc kubenswrapper[4776]: I1208 09:58:08.803700 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbc43f17-4d64-4a4e-8234-51e2ef436ff0" containerName="registry-server" Dec 08 09:58:08 crc kubenswrapper[4776]: I1208 09:58:08.806421 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x6mx4" Dec 08 09:58:08 crc kubenswrapper[4776]: I1208 09:58:08.823462 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x6mx4"] Dec 08 09:58:08 crc kubenswrapper[4776]: I1208 09:58:08.937417 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc63b4e7-df88-41f6-ac1e-bdf3be2668db-utilities\") pod \"certified-operators-x6mx4\" (UID: \"bc63b4e7-df88-41f6-ac1e-bdf3be2668db\") " pod="openshift-marketplace/certified-operators-x6mx4" Dec 08 09:58:08 crc kubenswrapper[4776]: I1208 09:58:08.938140 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc63b4e7-df88-41f6-ac1e-bdf3be2668db-catalog-content\") pod \"certified-operators-x6mx4\" (UID: \"bc63b4e7-df88-41f6-ac1e-bdf3be2668db\") " pod="openshift-marketplace/certified-operators-x6mx4" Dec 08 09:58:08 crc kubenswrapper[4776]: I1208 09:58:08.938326 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgnsv\" (UniqueName: \"kubernetes.io/projected/bc63b4e7-df88-41f6-ac1e-bdf3be2668db-kube-api-access-mgnsv\") pod 
\"certified-operators-x6mx4\" (UID: \"bc63b4e7-df88-41f6-ac1e-bdf3be2668db\") " pod="openshift-marketplace/certified-operators-x6mx4" Dec 08 09:58:09 crc kubenswrapper[4776]: I1208 09:58:09.040141 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgnsv\" (UniqueName: \"kubernetes.io/projected/bc63b4e7-df88-41f6-ac1e-bdf3be2668db-kube-api-access-mgnsv\") pod \"certified-operators-x6mx4\" (UID: \"bc63b4e7-df88-41f6-ac1e-bdf3be2668db\") " pod="openshift-marketplace/certified-operators-x6mx4" Dec 08 09:58:09 crc kubenswrapper[4776]: I1208 09:58:09.040326 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc63b4e7-df88-41f6-ac1e-bdf3be2668db-utilities\") pod \"certified-operators-x6mx4\" (UID: \"bc63b4e7-df88-41f6-ac1e-bdf3be2668db\") " pod="openshift-marketplace/certified-operators-x6mx4" Dec 08 09:58:09 crc kubenswrapper[4776]: I1208 09:58:09.040445 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc63b4e7-df88-41f6-ac1e-bdf3be2668db-catalog-content\") pod \"certified-operators-x6mx4\" (UID: \"bc63b4e7-df88-41f6-ac1e-bdf3be2668db\") " pod="openshift-marketplace/certified-operators-x6mx4" Dec 08 09:58:09 crc kubenswrapper[4776]: I1208 09:58:09.040854 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc63b4e7-df88-41f6-ac1e-bdf3be2668db-utilities\") pod \"certified-operators-x6mx4\" (UID: \"bc63b4e7-df88-41f6-ac1e-bdf3be2668db\") " pod="openshift-marketplace/certified-operators-x6mx4" Dec 08 09:58:09 crc kubenswrapper[4776]: I1208 09:58:09.040963 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc63b4e7-df88-41f6-ac1e-bdf3be2668db-catalog-content\") pod \"certified-operators-x6mx4\" (UID: 
\"bc63b4e7-df88-41f6-ac1e-bdf3be2668db\") " pod="openshift-marketplace/certified-operators-x6mx4" Dec 08 09:58:09 crc kubenswrapper[4776]: I1208 09:58:09.059504 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgnsv\" (UniqueName: \"kubernetes.io/projected/bc63b4e7-df88-41f6-ac1e-bdf3be2668db-kube-api-access-mgnsv\") pod \"certified-operators-x6mx4\" (UID: \"bc63b4e7-df88-41f6-ac1e-bdf3be2668db\") " pod="openshift-marketplace/certified-operators-x6mx4" Dec 08 09:58:09 crc kubenswrapper[4776]: I1208 09:58:09.127887 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x6mx4" Dec 08 09:58:09 crc kubenswrapper[4776]: I1208 09:58:09.768940 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x6mx4"] Dec 08 09:58:09 crc kubenswrapper[4776]: I1208 09:58:09.984777 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6mx4" event={"ID":"bc63b4e7-df88-41f6-ac1e-bdf3be2668db","Type":"ContainerStarted","Data":"55b1800be1c984f46d6b31ea6dc1b55922d5323db0dd9fa03caaab9bba814f62"} Dec 08 09:58:09 crc kubenswrapper[4776]: I1208 09:58:09.985076 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6mx4" event={"ID":"bc63b4e7-df88-41f6-ac1e-bdf3be2668db","Type":"ContainerStarted","Data":"fbf467f9854b4de50c51607a451a6341ee658b7f3d3648a4126fb4f7ea92e4bd"} Dec 08 09:58:10 crc kubenswrapper[4776]: I1208 09:58:10.994199 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ktpcd"] Dec 08 09:58:10 crc kubenswrapper[4776]: I1208 09:58:10.997619 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ktpcd" Dec 08 09:58:11 crc kubenswrapper[4776]: I1208 09:58:11.000296 4776 generic.go:334] "Generic (PLEG): container finished" podID="bc63b4e7-df88-41f6-ac1e-bdf3be2668db" containerID="55b1800be1c984f46d6b31ea6dc1b55922d5323db0dd9fa03caaab9bba814f62" exitCode=0 Dec 08 09:58:11 crc kubenswrapper[4776]: I1208 09:58:11.000367 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6mx4" event={"ID":"bc63b4e7-df88-41f6-ac1e-bdf3be2668db","Type":"ContainerDied","Data":"55b1800be1c984f46d6b31ea6dc1b55922d5323db0dd9fa03caaab9bba814f62"} Dec 08 09:58:11 crc kubenswrapper[4776]: I1208 09:58:11.012066 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ktpcd"] Dec 08 09:58:11 crc kubenswrapper[4776]: I1208 09:58:11.086390 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/550f668a-02b5-4f2c-be40-6e0fa7b78027-catalog-content\") pod \"redhat-marketplace-ktpcd\" (UID: \"550f668a-02b5-4f2c-be40-6e0fa7b78027\") " pod="openshift-marketplace/redhat-marketplace-ktpcd" Dec 08 09:58:11 crc kubenswrapper[4776]: I1208 09:58:11.086534 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/550f668a-02b5-4f2c-be40-6e0fa7b78027-utilities\") pod \"redhat-marketplace-ktpcd\" (UID: \"550f668a-02b5-4f2c-be40-6e0fa7b78027\") " pod="openshift-marketplace/redhat-marketplace-ktpcd" Dec 08 09:58:11 crc kubenswrapper[4776]: I1208 09:58:11.086568 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgtxp\" (UniqueName: \"kubernetes.io/projected/550f668a-02b5-4f2c-be40-6e0fa7b78027-kube-api-access-vgtxp\") pod \"redhat-marketplace-ktpcd\" (UID: 
\"550f668a-02b5-4f2c-be40-6e0fa7b78027\") " pod="openshift-marketplace/redhat-marketplace-ktpcd" Dec 08 09:58:11 crc kubenswrapper[4776]: I1208 09:58:11.187258 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgtxp\" (UniqueName: \"kubernetes.io/projected/550f668a-02b5-4f2c-be40-6e0fa7b78027-kube-api-access-vgtxp\") pod \"redhat-marketplace-ktpcd\" (UID: \"550f668a-02b5-4f2c-be40-6e0fa7b78027\") " pod="openshift-marketplace/redhat-marketplace-ktpcd" Dec 08 09:58:11 crc kubenswrapper[4776]: I1208 09:58:11.187396 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/550f668a-02b5-4f2c-be40-6e0fa7b78027-catalog-content\") pod \"redhat-marketplace-ktpcd\" (UID: \"550f668a-02b5-4f2c-be40-6e0fa7b78027\") " pod="openshift-marketplace/redhat-marketplace-ktpcd" Dec 08 09:58:11 crc kubenswrapper[4776]: I1208 09:58:11.187488 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/550f668a-02b5-4f2c-be40-6e0fa7b78027-utilities\") pod \"redhat-marketplace-ktpcd\" (UID: \"550f668a-02b5-4f2c-be40-6e0fa7b78027\") " pod="openshift-marketplace/redhat-marketplace-ktpcd" Dec 08 09:58:11 crc kubenswrapper[4776]: I1208 09:58:11.187880 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/550f668a-02b5-4f2c-be40-6e0fa7b78027-catalog-content\") pod \"redhat-marketplace-ktpcd\" (UID: \"550f668a-02b5-4f2c-be40-6e0fa7b78027\") " pod="openshift-marketplace/redhat-marketplace-ktpcd" Dec 08 09:58:11 crc kubenswrapper[4776]: I1208 09:58:11.187907 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/550f668a-02b5-4f2c-be40-6e0fa7b78027-utilities\") pod \"redhat-marketplace-ktpcd\" (UID: \"550f668a-02b5-4f2c-be40-6e0fa7b78027\") " 
pod="openshift-marketplace/redhat-marketplace-ktpcd" Dec 08 09:58:11 crc kubenswrapper[4776]: I1208 09:58:11.207344 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgtxp\" (UniqueName: \"kubernetes.io/projected/550f668a-02b5-4f2c-be40-6e0fa7b78027-kube-api-access-vgtxp\") pod \"redhat-marketplace-ktpcd\" (UID: \"550f668a-02b5-4f2c-be40-6e0fa7b78027\") " pod="openshift-marketplace/redhat-marketplace-ktpcd" Dec 08 09:58:11 crc kubenswrapper[4776]: I1208 09:58:11.326958 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ktpcd" Dec 08 09:58:11 crc kubenswrapper[4776]: I1208 09:58:11.869000 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ktpcd"] Dec 08 09:58:11 crc kubenswrapper[4776]: W1208 09:58:11.883994 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod550f668a_02b5_4f2c_be40_6e0fa7b78027.slice/crio-cd29e3d8a1844d583c188ef5c0ce3b6fdfa71b85bff79144e2ded1a167e7041d WatchSource:0}: Error finding container cd29e3d8a1844d583c188ef5c0ce3b6fdfa71b85bff79144e2ded1a167e7041d: Status 404 returned error can't find the container with id cd29e3d8a1844d583c188ef5c0ce3b6fdfa71b85bff79144e2ded1a167e7041d Dec 08 09:58:11 crc kubenswrapper[4776]: I1208 09:58:11.993877 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2cpj6"] Dec 08 09:58:11 crc kubenswrapper[4776]: I1208 09:58:11.996600 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2cpj6" Dec 08 09:58:12 crc kubenswrapper[4776]: I1208 09:58:12.004922 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59ac2f7c-b24e-40cc-908b-27d7282bc680-catalog-content\") pod \"redhat-operators-2cpj6\" (UID: \"59ac2f7c-b24e-40cc-908b-27d7282bc680\") " pod="openshift-marketplace/redhat-operators-2cpj6" Dec 08 09:58:12 crc kubenswrapper[4776]: I1208 09:58:12.005065 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5vmm\" (UniqueName: \"kubernetes.io/projected/59ac2f7c-b24e-40cc-908b-27d7282bc680-kube-api-access-s5vmm\") pod \"redhat-operators-2cpj6\" (UID: \"59ac2f7c-b24e-40cc-908b-27d7282bc680\") " pod="openshift-marketplace/redhat-operators-2cpj6" Dec 08 09:58:12 crc kubenswrapper[4776]: I1208 09:58:12.005087 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59ac2f7c-b24e-40cc-908b-27d7282bc680-utilities\") pod \"redhat-operators-2cpj6\" (UID: \"59ac2f7c-b24e-40cc-908b-27d7282bc680\") " pod="openshift-marketplace/redhat-operators-2cpj6" Dec 08 09:58:12 crc kubenswrapper[4776]: I1208 09:58:12.013269 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ktpcd" event={"ID":"550f668a-02b5-4f2c-be40-6e0fa7b78027","Type":"ContainerStarted","Data":"cd29e3d8a1844d583c188ef5c0ce3b6fdfa71b85bff79144e2ded1a167e7041d"} Dec 08 09:58:12 crc kubenswrapper[4776]: I1208 09:58:12.021471 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2cpj6"] Dec 08 09:58:12 crc kubenswrapper[4776]: I1208 09:58:12.107530 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/59ac2f7c-b24e-40cc-908b-27d7282bc680-catalog-content\") pod \"redhat-operators-2cpj6\" (UID: \"59ac2f7c-b24e-40cc-908b-27d7282bc680\") " pod="openshift-marketplace/redhat-operators-2cpj6" Dec 08 09:58:12 crc kubenswrapper[4776]: I1208 09:58:12.107678 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5vmm\" (UniqueName: \"kubernetes.io/projected/59ac2f7c-b24e-40cc-908b-27d7282bc680-kube-api-access-s5vmm\") pod \"redhat-operators-2cpj6\" (UID: \"59ac2f7c-b24e-40cc-908b-27d7282bc680\") " pod="openshift-marketplace/redhat-operators-2cpj6" Dec 08 09:58:12 crc kubenswrapper[4776]: I1208 09:58:12.107707 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59ac2f7c-b24e-40cc-908b-27d7282bc680-utilities\") pod \"redhat-operators-2cpj6\" (UID: \"59ac2f7c-b24e-40cc-908b-27d7282bc680\") " pod="openshift-marketplace/redhat-operators-2cpj6" Dec 08 09:58:12 crc kubenswrapper[4776]: I1208 09:58:12.108435 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59ac2f7c-b24e-40cc-908b-27d7282bc680-utilities\") pod \"redhat-operators-2cpj6\" (UID: \"59ac2f7c-b24e-40cc-908b-27d7282bc680\") " pod="openshift-marketplace/redhat-operators-2cpj6" Dec 08 09:58:12 crc kubenswrapper[4776]: I1208 09:58:12.108656 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59ac2f7c-b24e-40cc-908b-27d7282bc680-catalog-content\") pod \"redhat-operators-2cpj6\" (UID: \"59ac2f7c-b24e-40cc-908b-27d7282bc680\") " pod="openshift-marketplace/redhat-operators-2cpj6" Dec 08 09:58:12 crc kubenswrapper[4776]: I1208 09:58:12.127418 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5vmm\" (UniqueName: 
\"kubernetes.io/projected/59ac2f7c-b24e-40cc-908b-27d7282bc680-kube-api-access-s5vmm\") pod \"redhat-operators-2cpj6\" (UID: \"59ac2f7c-b24e-40cc-908b-27d7282bc680\") " pod="openshift-marketplace/redhat-operators-2cpj6" Dec 08 09:58:12 crc kubenswrapper[4776]: I1208 09:58:12.318251 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2cpj6" Dec 08 09:58:13 crc kubenswrapper[4776]: I1208 09:58:13.024507 4776 generic.go:334] "Generic (PLEG): container finished" podID="550f668a-02b5-4f2c-be40-6e0fa7b78027" containerID="cd0e1ef9d139e5cbbdc0b5513274b5144663c61e13d92eb3a0d39a5e703dc2c4" exitCode=0 Dec 08 09:58:13 crc kubenswrapper[4776]: I1208 09:58:13.024602 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ktpcd" event={"ID":"550f668a-02b5-4f2c-be40-6e0fa7b78027","Type":"ContainerDied","Data":"cd0e1ef9d139e5cbbdc0b5513274b5144663c61e13d92eb3a0d39a5e703dc2c4"} Dec 08 09:58:13 crc kubenswrapper[4776]: I1208 09:58:13.027901 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6mx4" event={"ID":"bc63b4e7-df88-41f6-ac1e-bdf3be2668db","Type":"ContainerStarted","Data":"7a27d01fdd124dad34f97fda4183d90cbd731126e969e19d54d9325fe3179ced"} Dec 08 09:58:13 crc kubenswrapper[4776]: I1208 09:58:13.043158 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2cpj6"] Dec 08 09:58:14 crc kubenswrapper[4776]: I1208 09:58:14.040538 4776 generic.go:334] "Generic (PLEG): container finished" podID="bc63b4e7-df88-41f6-ac1e-bdf3be2668db" containerID="7a27d01fdd124dad34f97fda4183d90cbd731126e969e19d54d9325fe3179ced" exitCode=0 Dec 08 09:58:14 crc kubenswrapper[4776]: I1208 09:58:14.040733 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6mx4" 
event={"ID":"bc63b4e7-df88-41f6-ac1e-bdf3be2668db","Type":"ContainerDied","Data":"7a27d01fdd124dad34f97fda4183d90cbd731126e969e19d54d9325fe3179ced"} Dec 08 09:58:14 crc kubenswrapper[4776]: I1208 09:58:14.045667 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ktpcd" event={"ID":"550f668a-02b5-4f2c-be40-6e0fa7b78027","Type":"ContainerStarted","Data":"6000227a934b4113ddbd65f490449d1b43bf8221c896f0ca834b8f9cf13b8612"} Dec 08 09:58:14 crc kubenswrapper[4776]: I1208 09:58:14.048636 4776 generic.go:334] "Generic (PLEG): container finished" podID="59ac2f7c-b24e-40cc-908b-27d7282bc680" containerID="a992dcdbf4e5d154ed609ef98c686bfe152131a87d227f9823f875f1b9b7e4dd" exitCode=0 Dec 08 09:58:14 crc kubenswrapper[4776]: I1208 09:58:14.048686 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2cpj6" event={"ID":"59ac2f7c-b24e-40cc-908b-27d7282bc680","Type":"ContainerDied","Data":"a992dcdbf4e5d154ed609ef98c686bfe152131a87d227f9823f875f1b9b7e4dd"} Dec 08 09:58:14 crc kubenswrapper[4776]: I1208 09:58:14.048711 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2cpj6" event={"ID":"59ac2f7c-b24e-40cc-908b-27d7282bc680","Type":"ContainerStarted","Data":"5c6c31db278c79a112f486c91011fd5b5ca908fa065fbc9f7a359b056d907ab8"} Dec 08 09:58:15 crc kubenswrapper[4776]: I1208 09:58:15.061503 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2cpj6" event={"ID":"59ac2f7c-b24e-40cc-908b-27d7282bc680","Type":"ContainerStarted","Data":"fda193c029bf51474e907b6cec131d11df7b39958874baf0d16eec554fb141ad"} Dec 08 09:58:15 crc kubenswrapper[4776]: I1208 09:58:15.066391 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6mx4" 
event={"ID":"bc63b4e7-df88-41f6-ac1e-bdf3be2668db","Type":"ContainerStarted","Data":"a5115baa596ebf0df4ec0903279de0ea6127ff6f9a70dd5cc58b1483d9a14523"} Dec 08 09:58:15 crc kubenswrapper[4776]: I1208 09:58:15.070014 4776 generic.go:334] "Generic (PLEG): container finished" podID="550f668a-02b5-4f2c-be40-6e0fa7b78027" containerID="6000227a934b4113ddbd65f490449d1b43bf8221c896f0ca834b8f9cf13b8612" exitCode=0 Dec 08 09:58:15 crc kubenswrapper[4776]: I1208 09:58:15.070053 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ktpcd" event={"ID":"550f668a-02b5-4f2c-be40-6e0fa7b78027","Type":"ContainerDied","Data":"6000227a934b4113ddbd65f490449d1b43bf8221c896f0ca834b8f9cf13b8612"} Dec 08 09:58:15 crc kubenswrapper[4776]: I1208 09:58:15.135968 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-x6mx4" podStartSLOduration=3.648942694 podStartE2EDuration="7.135942214s" podCreationTimestamp="2025-12-08 09:58:08 +0000 UTC" firstStartedPulling="2025-12-08 09:58:11.003413642 +0000 UTC m=+3567.266638694" lastFinishedPulling="2025-12-08 09:58:14.490413192 +0000 UTC m=+3570.753638214" observedRunningTime="2025-12-08 09:58:15.132560292 +0000 UTC m=+3571.395785334" watchObservedRunningTime="2025-12-08 09:58:15.135942214 +0000 UTC m=+3571.399167236" Dec 08 09:58:17 crc kubenswrapper[4776]: I1208 09:58:17.093814 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ktpcd" event={"ID":"550f668a-02b5-4f2c-be40-6e0fa7b78027","Type":"ContainerStarted","Data":"cbeec892c66206aa25adf23cf59fa995ed5c8644809613aaafc999952d84feb0"} Dec 08 09:58:17 crc kubenswrapper[4776]: I1208 09:58:17.116783 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ktpcd" podStartSLOduration=3.819429341 podStartE2EDuration="7.116762559s" podCreationTimestamp="2025-12-08 09:58:10 +0000 UTC" 
firstStartedPulling="2025-12-08 09:58:13.028129336 +0000 UTC m=+3569.291354358" lastFinishedPulling="2025-12-08 09:58:16.325462554 +0000 UTC m=+3572.588687576" observedRunningTime="2025-12-08 09:58:17.108557766 +0000 UTC m=+3573.371782788" watchObservedRunningTime="2025-12-08 09:58:17.116762559 +0000 UTC m=+3573.379987581" Dec 08 09:58:18 crc kubenswrapper[4776]: I1208 09:58:18.107129 4776 generic.go:334] "Generic (PLEG): container finished" podID="59ac2f7c-b24e-40cc-908b-27d7282bc680" containerID="fda193c029bf51474e907b6cec131d11df7b39958874baf0d16eec554fb141ad" exitCode=0 Dec 08 09:58:18 crc kubenswrapper[4776]: I1208 09:58:18.107209 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2cpj6" event={"ID":"59ac2f7c-b24e-40cc-908b-27d7282bc680","Type":"ContainerDied","Data":"fda193c029bf51474e907b6cec131d11df7b39958874baf0d16eec554fb141ad"} Dec 08 09:58:19 crc kubenswrapper[4776]: I1208 09:58:19.123145 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2cpj6" event={"ID":"59ac2f7c-b24e-40cc-908b-27d7282bc680","Type":"ContainerStarted","Data":"a5563014169a7bfd6b4fd9a977e1dd7809c3bed080ae14b5bb493a620cede999"} Dec 08 09:58:19 crc kubenswrapper[4776]: I1208 09:58:19.128019 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-x6mx4" Dec 08 09:58:19 crc kubenswrapper[4776]: I1208 09:58:19.128093 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-x6mx4" Dec 08 09:58:19 crc kubenswrapper[4776]: I1208 09:58:19.141222 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2cpj6" podStartSLOduration=3.715630399 podStartE2EDuration="8.141200456s" podCreationTimestamp="2025-12-08 09:58:11 +0000 UTC" firstStartedPulling="2025-12-08 09:58:14.051574325 +0000 UTC m=+3570.314799347" 
lastFinishedPulling="2025-12-08 09:58:18.477144382 +0000 UTC m=+3574.740369404" observedRunningTime="2025-12-08 09:58:19.139665645 +0000 UTC m=+3575.402890677" watchObservedRunningTime="2025-12-08 09:58:19.141200456 +0000 UTC m=+3575.404425498" Dec 08 09:58:20 crc kubenswrapper[4776]: I1208 09:58:20.182945 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-x6mx4" podUID="bc63b4e7-df88-41f6-ac1e-bdf3be2668db" containerName="registry-server" probeResult="failure" output=< Dec 08 09:58:20 crc kubenswrapper[4776]: timeout: failed to connect service ":50051" within 1s Dec 08 09:58:20 crc kubenswrapper[4776]: > Dec 08 09:58:21 crc kubenswrapper[4776]: I1208 09:58:21.327513 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ktpcd" Dec 08 09:58:21 crc kubenswrapper[4776]: I1208 09:58:21.327863 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ktpcd" Dec 08 09:58:21 crc kubenswrapper[4776]: I1208 09:58:21.384217 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ktpcd" Dec 08 09:58:22 crc kubenswrapper[4776]: I1208 09:58:22.210003 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ktpcd" Dec 08 09:58:22 crc kubenswrapper[4776]: I1208 09:58:22.319371 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2cpj6" Dec 08 09:58:22 crc kubenswrapper[4776]: I1208 09:58:22.319479 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2cpj6" Dec 08 09:58:22 crc kubenswrapper[4776]: I1208 09:58:22.786492 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ktpcd"] Dec 08 09:58:23 crc 
kubenswrapper[4776]: I1208 09:58:23.371387 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2cpj6" podUID="59ac2f7c-b24e-40cc-908b-27d7282bc680" containerName="registry-server" probeResult="failure" output=< Dec 08 09:58:23 crc kubenswrapper[4776]: timeout: failed to connect service ":50051" within 1s Dec 08 09:58:23 crc kubenswrapper[4776]: > Dec 08 09:58:24 crc kubenswrapper[4776]: I1208 09:58:24.169702 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ktpcd" podUID="550f668a-02b5-4f2c-be40-6e0fa7b78027" containerName="registry-server" containerID="cri-o://cbeec892c66206aa25adf23cf59fa995ed5c8644809613aaafc999952d84feb0" gracePeriod=2 Dec 08 09:58:24 crc kubenswrapper[4776]: I1208 09:58:24.677969 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ktpcd" Dec 08 09:58:24 crc kubenswrapper[4776]: I1208 09:58:24.808287 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/550f668a-02b5-4f2c-be40-6e0fa7b78027-utilities\") pod \"550f668a-02b5-4f2c-be40-6e0fa7b78027\" (UID: \"550f668a-02b5-4f2c-be40-6e0fa7b78027\") " Dec 08 09:58:24 crc kubenswrapper[4776]: I1208 09:58:24.808620 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/550f668a-02b5-4f2c-be40-6e0fa7b78027-catalog-content\") pod \"550f668a-02b5-4f2c-be40-6e0fa7b78027\" (UID: \"550f668a-02b5-4f2c-be40-6e0fa7b78027\") " Dec 08 09:58:24 crc kubenswrapper[4776]: I1208 09:58:24.808725 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgtxp\" (UniqueName: \"kubernetes.io/projected/550f668a-02b5-4f2c-be40-6e0fa7b78027-kube-api-access-vgtxp\") pod \"550f668a-02b5-4f2c-be40-6e0fa7b78027\" (UID: 
\"550f668a-02b5-4f2c-be40-6e0fa7b78027\") " Dec 08 09:58:24 crc kubenswrapper[4776]: I1208 09:58:24.809452 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/550f668a-02b5-4f2c-be40-6e0fa7b78027-utilities" (OuterVolumeSpecName: "utilities") pod "550f668a-02b5-4f2c-be40-6e0fa7b78027" (UID: "550f668a-02b5-4f2c-be40-6e0fa7b78027"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:58:24 crc kubenswrapper[4776]: I1208 09:58:24.816078 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/550f668a-02b5-4f2c-be40-6e0fa7b78027-kube-api-access-vgtxp" (OuterVolumeSpecName: "kube-api-access-vgtxp") pod "550f668a-02b5-4f2c-be40-6e0fa7b78027" (UID: "550f668a-02b5-4f2c-be40-6e0fa7b78027"). InnerVolumeSpecName "kube-api-access-vgtxp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:58:24 crc kubenswrapper[4776]: I1208 09:58:24.828670 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/550f668a-02b5-4f2c-be40-6e0fa7b78027-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "550f668a-02b5-4f2c-be40-6e0fa7b78027" (UID: "550f668a-02b5-4f2c-be40-6e0fa7b78027"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:58:24 crc kubenswrapper[4776]: I1208 09:58:24.912048 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/550f668a-02b5-4f2c-be40-6e0fa7b78027-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 09:58:24 crc kubenswrapper[4776]: I1208 09:58:24.912095 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/550f668a-02b5-4f2c-be40-6e0fa7b78027-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 09:58:24 crc kubenswrapper[4776]: I1208 09:58:24.912111 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgtxp\" (UniqueName: \"kubernetes.io/projected/550f668a-02b5-4f2c-be40-6e0fa7b78027-kube-api-access-vgtxp\") on node \"crc\" DevicePath \"\"" Dec 08 09:58:25 crc kubenswrapper[4776]: I1208 09:58:25.180526 4776 generic.go:334] "Generic (PLEG): container finished" podID="550f668a-02b5-4f2c-be40-6e0fa7b78027" containerID="cbeec892c66206aa25adf23cf59fa995ed5c8644809613aaafc999952d84feb0" exitCode=0 Dec 08 09:58:25 crc kubenswrapper[4776]: I1208 09:58:25.180564 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ktpcd" event={"ID":"550f668a-02b5-4f2c-be40-6e0fa7b78027","Type":"ContainerDied","Data":"cbeec892c66206aa25adf23cf59fa995ed5c8644809613aaafc999952d84feb0"} Dec 08 09:58:25 crc kubenswrapper[4776]: I1208 09:58:25.180573 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ktpcd" Dec 08 09:58:25 crc kubenswrapper[4776]: I1208 09:58:25.180592 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ktpcd" event={"ID":"550f668a-02b5-4f2c-be40-6e0fa7b78027","Type":"ContainerDied","Data":"cd29e3d8a1844d583c188ef5c0ce3b6fdfa71b85bff79144e2ded1a167e7041d"} Dec 08 09:58:25 crc kubenswrapper[4776]: I1208 09:58:25.180609 4776 scope.go:117] "RemoveContainer" containerID="cbeec892c66206aa25adf23cf59fa995ed5c8644809613aaafc999952d84feb0" Dec 08 09:58:25 crc kubenswrapper[4776]: I1208 09:58:25.210986 4776 scope.go:117] "RemoveContainer" containerID="6000227a934b4113ddbd65f490449d1b43bf8221c896f0ca834b8f9cf13b8612" Dec 08 09:58:25 crc kubenswrapper[4776]: I1208 09:58:25.214523 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ktpcd"] Dec 08 09:58:25 crc kubenswrapper[4776]: I1208 09:58:25.226666 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ktpcd"] Dec 08 09:58:25 crc kubenswrapper[4776]: I1208 09:58:25.249600 4776 scope.go:117] "RemoveContainer" containerID="cd0e1ef9d139e5cbbdc0b5513274b5144663c61e13d92eb3a0d39a5e703dc2c4" Dec 08 09:58:25 crc kubenswrapper[4776]: I1208 09:58:25.285330 4776 scope.go:117] "RemoveContainer" containerID="cbeec892c66206aa25adf23cf59fa995ed5c8644809613aaafc999952d84feb0" Dec 08 09:58:25 crc kubenswrapper[4776]: E1208 09:58:25.286297 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbeec892c66206aa25adf23cf59fa995ed5c8644809613aaafc999952d84feb0\": container with ID starting with cbeec892c66206aa25adf23cf59fa995ed5c8644809613aaafc999952d84feb0 not found: ID does not exist" containerID="cbeec892c66206aa25adf23cf59fa995ed5c8644809613aaafc999952d84feb0" Dec 08 09:58:25 crc kubenswrapper[4776]: I1208 09:58:25.286336 4776 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbeec892c66206aa25adf23cf59fa995ed5c8644809613aaafc999952d84feb0"} err="failed to get container status \"cbeec892c66206aa25adf23cf59fa995ed5c8644809613aaafc999952d84feb0\": rpc error: code = NotFound desc = could not find container \"cbeec892c66206aa25adf23cf59fa995ed5c8644809613aaafc999952d84feb0\": container with ID starting with cbeec892c66206aa25adf23cf59fa995ed5c8644809613aaafc999952d84feb0 not found: ID does not exist" Dec 08 09:58:25 crc kubenswrapper[4776]: I1208 09:58:25.286366 4776 scope.go:117] "RemoveContainer" containerID="6000227a934b4113ddbd65f490449d1b43bf8221c896f0ca834b8f9cf13b8612" Dec 08 09:58:25 crc kubenswrapper[4776]: E1208 09:58:25.286804 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6000227a934b4113ddbd65f490449d1b43bf8221c896f0ca834b8f9cf13b8612\": container with ID starting with 6000227a934b4113ddbd65f490449d1b43bf8221c896f0ca834b8f9cf13b8612 not found: ID does not exist" containerID="6000227a934b4113ddbd65f490449d1b43bf8221c896f0ca834b8f9cf13b8612" Dec 08 09:58:25 crc kubenswrapper[4776]: I1208 09:58:25.286842 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6000227a934b4113ddbd65f490449d1b43bf8221c896f0ca834b8f9cf13b8612"} err="failed to get container status \"6000227a934b4113ddbd65f490449d1b43bf8221c896f0ca834b8f9cf13b8612\": rpc error: code = NotFound desc = could not find container \"6000227a934b4113ddbd65f490449d1b43bf8221c896f0ca834b8f9cf13b8612\": container with ID starting with 6000227a934b4113ddbd65f490449d1b43bf8221c896f0ca834b8f9cf13b8612 not found: ID does not exist" Dec 08 09:58:25 crc kubenswrapper[4776]: I1208 09:58:25.286870 4776 scope.go:117] "RemoveContainer" containerID="cd0e1ef9d139e5cbbdc0b5513274b5144663c61e13d92eb3a0d39a5e703dc2c4" Dec 08 09:58:25 crc kubenswrapper[4776]: E1208 
09:58:25.287161 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd0e1ef9d139e5cbbdc0b5513274b5144663c61e13d92eb3a0d39a5e703dc2c4\": container with ID starting with cd0e1ef9d139e5cbbdc0b5513274b5144663c61e13d92eb3a0d39a5e703dc2c4 not found: ID does not exist" containerID="cd0e1ef9d139e5cbbdc0b5513274b5144663c61e13d92eb3a0d39a5e703dc2c4" Dec 08 09:58:25 crc kubenswrapper[4776]: I1208 09:58:25.287208 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd0e1ef9d139e5cbbdc0b5513274b5144663c61e13d92eb3a0d39a5e703dc2c4"} err="failed to get container status \"cd0e1ef9d139e5cbbdc0b5513274b5144663c61e13d92eb3a0d39a5e703dc2c4\": rpc error: code = NotFound desc = could not find container \"cd0e1ef9d139e5cbbdc0b5513274b5144663c61e13d92eb3a0d39a5e703dc2c4\": container with ID starting with cd0e1ef9d139e5cbbdc0b5513274b5144663c61e13d92eb3a0d39a5e703dc2c4 not found: ID does not exist" Dec 08 09:58:26 crc kubenswrapper[4776]: I1208 09:58:26.364761 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="550f668a-02b5-4f2c-be40-6e0fa7b78027" path="/var/lib/kubelet/pods/550f668a-02b5-4f2c-be40-6e0fa7b78027/volumes" Dec 08 09:58:29 crc kubenswrapper[4776]: I1208 09:58:29.193453 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-x6mx4" Dec 08 09:58:29 crc kubenswrapper[4776]: I1208 09:58:29.253095 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-x6mx4" Dec 08 09:58:29 crc kubenswrapper[4776]: I1208 09:58:29.431853 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x6mx4"] Dec 08 09:58:30 crc kubenswrapper[4776]: I1208 09:58:30.236547 4776 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-x6mx4" podUID="bc63b4e7-df88-41f6-ac1e-bdf3be2668db" containerName="registry-server" containerID="cri-o://a5115baa596ebf0df4ec0903279de0ea6127ff6f9a70dd5cc58b1483d9a14523" gracePeriod=2 Dec 08 09:58:30 crc kubenswrapper[4776]: I1208 09:58:30.726645 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x6mx4" Dec 08 09:58:30 crc kubenswrapper[4776]: I1208 09:58:30.856245 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc63b4e7-df88-41f6-ac1e-bdf3be2668db-catalog-content\") pod \"bc63b4e7-df88-41f6-ac1e-bdf3be2668db\" (UID: \"bc63b4e7-df88-41f6-ac1e-bdf3be2668db\") " Dec 08 09:58:30 crc kubenswrapper[4776]: I1208 09:58:30.856340 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc63b4e7-df88-41f6-ac1e-bdf3be2668db-utilities\") pod \"bc63b4e7-df88-41f6-ac1e-bdf3be2668db\" (UID: \"bc63b4e7-df88-41f6-ac1e-bdf3be2668db\") " Dec 08 09:58:30 crc kubenswrapper[4776]: I1208 09:58:30.856389 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgnsv\" (UniqueName: \"kubernetes.io/projected/bc63b4e7-df88-41f6-ac1e-bdf3be2668db-kube-api-access-mgnsv\") pod \"bc63b4e7-df88-41f6-ac1e-bdf3be2668db\" (UID: \"bc63b4e7-df88-41f6-ac1e-bdf3be2668db\") " Dec 08 09:58:30 crc kubenswrapper[4776]: I1208 09:58:30.857409 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc63b4e7-df88-41f6-ac1e-bdf3be2668db-utilities" (OuterVolumeSpecName: "utilities") pod "bc63b4e7-df88-41f6-ac1e-bdf3be2668db" (UID: "bc63b4e7-df88-41f6-ac1e-bdf3be2668db"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:58:30 crc kubenswrapper[4776]: I1208 09:58:30.862352 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc63b4e7-df88-41f6-ac1e-bdf3be2668db-kube-api-access-mgnsv" (OuterVolumeSpecName: "kube-api-access-mgnsv") pod "bc63b4e7-df88-41f6-ac1e-bdf3be2668db" (UID: "bc63b4e7-df88-41f6-ac1e-bdf3be2668db"). InnerVolumeSpecName "kube-api-access-mgnsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:58:30 crc kubenswrapper[4776]: I1208 09:58:30.900550 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc63b4e7-df88-41f6-ac1e-bdf3be2668db-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bc63b4e7-df88-41f6-ac1e-bdf3be2668db" (UID: "bc63b4e7-df88-41f6-ac1e-bdf3be2668db"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:58:30 crc kubenswrapper[4776]: I1208 09:58:30.959041 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc63b4e7-df88-41f6-ac1e-bdf3be2668db-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 09:58:30 crc kubenswrapper[4776]: I1208 09:58:30.959076 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc63b4e7-df88-41f6-ac1e-bdf3be2668db-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 09:58:30 crc kubenswrapper[4776]: I1208 09:58:30.959086 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgnsv\" (UniqueName: \"kubernetes.io/projected/bc63b4e7-df88-41f6-ac1e-bdf3be2668db-kube-api-access-mgnsv\") on node \"crc\" DevicePath \"\"" Dec 08 09:58:31 crc kubenswrapper[4776]: I1208 09:58:31.248012 4776 generic.go:334] "Generic (PLEG): container finished" podID="bc63b4e7-df88-41f6-ac1e-bdf3be2668db" 
containerID="a5115baa596ebf0df4ec0903279de0ea6127ff6f9a70dd5cc58b1483d9a14523" exitCode=0 Dec 08 09:58:31 crc kubenswrapper[4776]: I1208 09:58:31.248063 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6mx4" event={"ID":"bc63b4e7-df88-41f6-ac1e-bdf3be2668db","Type":"ContainerDied","Data":"a5115baa596ebf0df4ec0903279de0ea6127ff6f9a70dd5cc58b1483d9a14523"} Dec 08 09:58:31 crc kubenswrapper[4776]: I1208 09:58:31.248110 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x6mx4" Dec 08 09:58:31 crc kubenswrapper[4776]: I1208 09:58:31.248382 4776 scope.go:117] "RemoveContainer" containerID="a5115baa596ebf0df4ec0903279de0ea6127ff6f9a70dd5cc58b1483d9a14523" Dec 08 09:58:31 crc kubenswrapper[4776]: I1208 09:58:31.248367 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6mx4" event={"ID":"bc63b4e7-df88-41f6-ac1e-bdf3be2668db","Type":"ContainerDied","Data":"fbf467f9854b4de50c51607a451a6341ee658b7f3d3648a4126fb4f7ea92e4bd"} Dec 08 09:58:31 crc kubenswrapper[4776]: I1208 09:58:31.280023 4776 scope.go:117] "RemoveContainer" containerID="7a27d01fdd124dad34f97fda4183d90cbd731126e969e19d54d9325fe3179ced" Dec 08 09:58:31 crc kubenswrapper[4776]: I1208 09:58:31.292342 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x6mx4"] Dec 08 09:58:31 crc kubenswrapper[4776]: I1208 09:58:31.305140 4776 scope.go:117] "RemoveContainer" containerID="55b1800be1c984f46d6b31ea6dc1b55922d5323db0dd9fa03caaab9bba814f62" Dec 08 09:58:31 crc kubenswrapper[4776]: I1208 09:58:31.305191 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-x6mx4"] Dec 08 09:58:31 crc kubenswrapper[4776]: I1208 09:58:31.367609 4776 scope.go:117] "RemoveContainer" containerID="a5115baa596ebf0df4ec0903279de0ea6127ff6f9a70dd5cc58b1483d9a14523" Dec 08 
09:58:31 crc kubenswrapper[4776]: E1208 09:58:31.368061 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5115baa596ebf0df4ec0903279de0ea6127ff6f9a70dd5cc58b1483d9a14523\": container with ID starting with a5115baa596ebf0df4ec0903279de0ea6127ff6f9a70dd5cc58b1483d9a14523 not found: ID does not exist" containerID="a5115baa596ebf0df4ec0903279de0ea6127ff6f9a70dd5cc58b1483d9a14523" Dec 08 09:58:31 crc kubenswrapper[4776]: I1208 09:58:31.368097 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5115baa596ebf0df4ec0903279de0ea6127ff6f9a70dd5cc58b1483d9a14523"} err="failed to get container status \"a5115baa596ebf0df4ec0903279de0ea6127ff6f9a70dd5cc58b1483d9a14523\": rpc error: code = NotFound desc = could not find container \"a5115baa596ebf0df4ec0903279de0ea6127ff6f9a70dd5cc58b1483d9a14523\": container with ID starting with a5115baa596ebf0df4ec0903279de0ea6127ff6f9a70dd5cc58b1483d9a14523 not found: ID does not exist" Dec 08 09:58:31 crc kubenswrapper[4776]: I1208 09:58:31.368119 4776 scope.go:117] "RemoveContainer" containerID="7a27d01fdd124dad34f97fda4183d90cbd731126e969e19d54d9325fe3179ced" Dec 08 09:58:31 crc kubenswrapper[4776]: E1208 09:58:31.368627 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a27d01fdd124dad34f97fda4183d90cbd731126e969e19d54d9325fe3179ced\": container with ID starting with 7a27d01fdd124dad34f97fda4183d90cbd731126e969e19d54d9325fe3179ced not found: ID does not exist" containerID="7a27d01fdd124dad34f97fda4183d90cbd731126e969e19d54d9325fe3179ced" Dec 08 09:58:31 crc kubenswrapper[4776]: I1208 09:58:31.368676 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a27d01fdd124dad34f97fda4183d90cbd731126e969e19d54d9325fe3179ced"} err="failed to get container status 
\"7a27d01fdd124dad34f97fda4183d90cbd731126e969e19d54d9325fe3179ced\": rpc error: code = NotFound desc = could not find container \"7a27d01fdd124dad34f97fda4183d90cbd731126e969e19d54d9325fe3179ced\": container with ID starting with 7a27d01fdd124dad34f97fda4183d90cbd731126e969e19d54d9325fe3179ced not found: ID does not exist" Dec 08 09:58:31 crc kubenswrapper[4776]: I1208 09:58:31.368710 4776 scope.go:117] "RemoveContainer" containerID="55b1800be1c984f46d6b31ea6dc1b55922d5323db0dd9fa03caaab9bba814f62" Dec 08 09:58:31 crc kubenswrapper[4776]: E1208 09:58:31.369020 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55b1800be1c984f46d6b31ea6dc1b55922d5323db0dd9fa03caaab9bba814f62\": container with ID starting with 55b1800be1c984f46d6b31ea6dc1b55922d5323db0dd9fa03caaab9bba814f62 not found: ID does not exist" containerID="55b1800be1c984f46d6b31ea6dc1b55922d5323db0dd9fa03caaab9bba814f62" Dec 08 09:58:31 crc kubenswrapper[4776]: I1208 09:58:31.369045 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55b1800be1c984f46d6b31ea6dc1b55922d5323db0dd9fa03caaab9bba814f62"} err="failed to get container status \"55b1800be1c984f46d6b31ea6dc1b55922d5323db0dd9fa03caaab9bba814f62\": rpc error: code = NotFound desc = could not find container \"55b1800be1c984f46d6b31ea6dc1b55922d5323db0dd9fa03caaab9bba814f62\": container with ID starting with 55b1800be1c984f46d6b31ea6dc1b55922d5323db0dd9fa03caaab9bba814f62 not found: ID does not exist" Dec 08 09:58:32 crc kubenswrapper[4776]: I1208 09:58:32.365872 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc63b4e7-df88-41f6-ac1e-bdf3be2668db" path="/var/lib/kubelet/pods/bc63b4e7-df88-41f6-ac1e-bdf3be2668db/volumes" Dec 08 09:58:32 crc kubenswrapper[4776]: I1208 09:58:32.383918 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-2cpj6" Dec 08 09:58:32 crc kubenswrapper[4776]: I1208 09:58:32.436587 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2cpj6" Dec 08 09:58:33 crc kubenswrapper[4776]: I1208 09:58:33.830318 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2cpj6"] Dec 08 09:58:34 crc kubenswrapper[4776]: I1208 09:58:34.277935 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2cpj6" podUID="59ac2f7c-b24e-40cc-908b-27d7282bc680" containerName="registry-server" containerID="cri-o://a5563014169a7bfd6b4fd9a977e1dd7809c3bed080ae14b5bb493a620cede999" gracePeriod=2 Dec 08 09:58:34 crc kubenswrapper[4776]: I1208 09:58:34.805703 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2cpj6" Dec 08 09:58:34 crc kubenswrapper[4776]: I1208 09:58:34.949485 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5vmm\" (UniqueName: \"kubernetes.io/projected/59ac2f7c-b24e-40cc-908b-27d7282bc680-kube-api-access-s5vmm\") pod \"59ac2f7c-b24e-40cc-908b-27d7282bc680\" (UID: \"59ac2f7c-b24e-40cc-908b-27d7282bc680\") " Dec 08 09:58:34 crc kubenswrapper[4776]: I1208 09:58:34.949564 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59ac2f7c-b24e-40cc-908b-27d7282bc680-catalog-content\") pod \"59ac2f7c-b24e-40cc-908b-27d7282bc680\" (UID: \"59ac2f7c-b24e-40cc-908b-27d7282bc680\") " Dec 08 09:58:34 crc kubenswrapper[4776]: I1208 09:58:34.949614 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59ac2f7c-b24e-40cc-908b-27d7282bc680-utilities\") pod \"59ac2f7c-b24e-40cc-908b-27d7282bc680\" (UID: 
\"59ac2f7c-b24e-40cc-908b-27d7282bc680\") " Dec 08 09:58:34 crc kubenswrapper[4776]: I1208 09:58:34.958954 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59ac2f7c-b24e-40cc-908b-27d7282bc680-utilities" (OuterVolumeSpecName: "utilities") pod "59ac2f7c-b24e-40cc-908b-27d7282bc680" (UID: "59ac2f7c-b24e-40cc-908b-27d7282bc680"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:58:34 crc kubenswrapper[4776]: I1208 09:58:34.982155 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59ac2f7c-b24e-40cc-908b-27d7282bc680-kube-api-access-s5vmm" (OuterVolumeSpecName: "kube-api-access-s5vmm") pod "59ac2f7c-b24e-40cc-908b-27d7282bc680" (UID: "59ac2f7c-b24e-40cc-908b-27d7282bc680"). InnerVolumeSpecName "kube-api-access-s5vmm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 09:58:35 crc kubenswrapper[4776]: I1208 09:58:35.053444 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59ac2f7c-b24e-40cc-908b-27d7282bc680-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 09:58:35 crc kubenswrapper[4776]: I1208 09:58:35.053489 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5vmm\" (UniqueName: \"kubernetes.io/projected/59ac2f7c-b24e-40cc-908b-27d7282bc680-kube-api-access-s5vmm\") on node \"crc\" DevicePath \"\"" Dec 08 09:58:35 crc kubenswrapper[4776]: I1208 09:58:35.098547 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59ac2f7c-b24e-40cc-908b-27d7282bc680-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "59ac2f7c-b24e-40cc-908b-27d7282bc680" (UID: "59ac2f7c-b24e-40cc-908b-27d7282bc680"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 09:58:35 crc kubenswrapper[4776]: I1208 09:58:35.156220 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59ac2f7c-b24e-40cc-908b-27d7282bc680-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 09:58:35 crc kubenswrapper[4776]: I1208 09:58:35.291074 4776 generic.go:334] "Generic (PLEG): container finished" podID="59ac2f7c-b24e-40cc-908b-27d7282bc680" containerID="a5563014169a7bfd6b4fd9a977e1dd7809c3bed080ae14b5bb493a620cede999" exitCode=0 Dec 08 09:58:35 crc kubenswrapper[4776]: I1208 09:58:35.291131 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2cpj6" event={"ID":"59ac2f7c-b24e-40cc-908b-27d7282bc680","Type":"ContainerDied","Data":"a5563014169a7bfd6b4fd9a977e1dd7809c3bed080ae14b5bb493a620cede999"} Dec 08 09:58:35 crc kubenswrapper[4776]: I1208 09:58:35.291220 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2cpj6" event={"ID":"59ac2f7c-b24e-40cc-908b-27d7282bc680","Type":"ContainerDied","Data":"5c6c31db278c79a112f486c91011fd5b5ca908fa065fbc9f7a359b056d907ab8"} Dec 08 09:58:35 crc kubenswrapper[4776]: I1208 09:58:35.291152 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2cpj6" Dec 08 09:58:35 crc kubenswrapper[4776]: I1208 09:58:35.291248 4776 scope.go:117] "RemoveContainer" containerID="a5563014169a7bfd6b4fd9a977e1dd7809c3bed080ae14b5bb493a620cede999" Dec 08 09:58:35 crc kubenswrapper[4776]: I1208 09:58:35.331083 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2cpj6"] Dec 08 09:58:35 crc kubenswrapper[4776]: I1208 09:58:35.336306 4776 scope.go:117] "RemoveContainer" containerID="fda193c029bf51474e907b6cec131d11df7b39958874baf0d16eec554fb141ad" Dec 08 09:58:35 crc kubenswrapper[4776]: I1208 09:58:35.342572 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2cpj6"] Dec 08 09:58:35 crc kubenswrapper[4776]: I1208 09:58:35.370546 4776 scope.go:117] "RemoveContainer" containerID="a992dcdbf4e5d154ed609ef98c686bfe152131a87d227f9823f875f1b9b7e4dd" Dec 08 09:58:35 crc kubenswrapper[4776]: I1208 09:58:35.423072 4776 scope.go:117] "RemoveContainer" containerID="a5563014169a7bfd6b4fd9a977e1dd7809c3bed080ae14b5bb493a620cede999" Dec 08 09:58:35 crc kubenswrapper[4776]: E1208 09:58:35.423612 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5563014169a7bfd6b4fd9a977e1dd7809c3bed080ae14b5bb493a620cede999\": container with ID starting with a5563014169a7bfd6b4fd9a977e1dd7809c3bed080ae14b5bb493a620cede999 not found: ID does not exist" containerID="a5563014169a7bfd6b4fd9a977e1dd7809c3bed080ae14b5bb493a620cede999" Dec 08 09:58:35 crc kubenswrapper[4776]: I1208 09:58:35.423661 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5563014169a7bfd6b4fd9a977e1dd7809c3bed080ae14b5bb493a620cede999"} err="failed to get container status \"a5563014169a7bfd6b4fd9a977e1dd7809c3bed080ae14b5bb493a620cede999\": rpc error: code = NotFound desc = could not find container 
\"a5563014169a7bfd6b4fd9a977e1dd7809c3bed080ae14b5bb493a620cede999\": container with ID starting with a5563014169a7bfd6b4fd9a977e1dd7809c3bed080ae14b5bb493a620cede999 not found: ID does not exist" Dec 08 09:58:35 crc kubenswrapper[4776]: I1208 09:58:35.423692 4776 scope.go:117] "RemoveContainer" containerID="fda193c029bf51474e907b6cec131d11df7b39958874baf0d16eec554fb141ad" Dec 08 09:58:35 crc kubenswrapper[4776]: E1208 09:58:35.425371 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fda193c029bf51474e907b6cec131d11df7b39958874baf0d16eec554fb141ad\": container with ID starting with fda193c029bf51474e907b6cec131d11df7b39958874baf0d16eec554fb141ad not found: ID does not exist" containerID="fda193c029bf51474e907b6cec131d11df7b39958874baf0d16eec554fb141ad" Dec 08 09:58:35 crc kubenswrapper[4776]: I1208 09:58:35.425405 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fda193c029bf51474e907b6cec131d11df7b39958874baf0d16eec554fb141ad"} err="failed to get container status \"fda193c029bf51474e907b6cec131d11df7b39958874baf0d16eec554fb141ad\": rpc error: code = NotFound desc = could not find container \"fda193c029bf51474e907b6cec131d11df7b39958874baf0d16eec554fb141ad\": container with ID starting with fda193c029bf51474e907b6cec131d11df7b39958874baf0d16eec554fb141ad not found: ID does not exist" Dec 08 09:58:35 crc kubenswrapper[4776]: I1208 09:58:35.425421 4776 scope.go:117] "RemoveContainer" containerID="a992dcdbf4e5d154ed609ef98c686bfe152131a87d227f9823f875f1b9b7e4dd" Dec 08 09:58:35 crc kubenswrapper[4776]: E1208 09:58:35.425616 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a992dcdbf4e5d154ed609ef98c686bfe152131a87d227f9823f875f1b9b7e4dd\": container with ID starting with a992dcdbf4e5d154ed609ef98c686bfe152131a87d227f9823f875f1b9b7e4dd not found: ID does not exist" 
containerID="a992dcdbf4e5d154ed609ef98c686bfe152131a87d227f9823f875f1b9b7e4dd" Dec 08 09:58:35 crc kubenswrapper[4776]: I1208 09:58:35.425665 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a992dcdbf4e5d154ed609ef98c686bfe152131a87d227f9823f875f1b9b7e4dd"} err="failed to get container status \"a992dcdbf4e5d154ed609ef98c686bfe152131a87d227f9823f875f1b9b7e4dd\": rpc error: code = NotFound desc = could not find container \"a992dcdbf4e5d154ed609ef98c686bfe152131a87d227f9823f875f1b9b7e4dd\": container with ID starting with a992dcdbf4e5d154ed609ef98c686bfe152131a87d227f9823f875f1b9b7e4dd not found: ID does not exist" Dec 08 09:58:36 crc kubenswrapper[4776]: I1208 09:58:36.357590 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59ac2f7c-b24e-40cc-908b-27d7282bc680" path="/var/lib/kubelet/pods/59ac2f7c-b24e-40cc-908b-27d7282bc680/volumes" Dec 08 09:58:41 crc kubenswrapper[4776]: I1208 09:58:41.399738 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 09:58:41 crc kubenswrapper[4776]: I1208 09:58:41.400267 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 09:59:11 crc kubenswrapper[4776]: I1208 09:59:11.399658 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Dec 08 09:59:11 crc kubenswrapper[4776]: I1208 09:59:11.400447 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 09:59:41 crc kubenswrapper[4776]: I1208 09:59:41.398902 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 09:59:41 crc kubenswrapper[4776]: I1208 09:59:41.399501 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 09:59:41 crc kubenswrapper[4776]: I1208 09:59:41.399551 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" Dec 08 09:59:41 crc kubenswrapper[4776]: I1208 09:59:41.400635 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d69e4e5ed0ae6d0feba94b4d10f96b17acd36d7e4cc7e3e2616a1e9f287da6d1"} pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 08 09:59:41 crc kubenswrapper[4776]: I1208 09:59:41.400692 4776 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" containerID="cri-o://d69e4e5ed0ae6d0feba94b4d10f96b17acd36d7e4cc7e3e2616a1e9f287da6d1" gracePeriod=600 Dec 08 09:59:41 crc kubenswrapper[4776]: E1208 09:59:41.524564 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:59:42 crc kubenswrapper[4776]: I1208 09:59:42.027843 4776 generic.go:334] "Generic (PLEG): container finished" podID="c9788ab1-1031-4103-a769-a4b3177c7268" containerID="d69e4e5ed0ae6d0feba94b4d10f96b17acd36d7e4cc7e3e2616a1e9f287da6d1" exitCode=0 Dec 08 09:59:42 crc kubenswrapper[4776]: I1208 09:59:42.027914 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" event={"ID":"c9788ab1-1031-4103-a769-a4b3177c7268","Type":"ContainerDied","Data":"d69e4e5ed0ae6d0feba94b4d10f96b17acd36d7e4cc7e3e2616a1e9f287da6d1"} Dec 08 09:59:42 crc kubenswrapper[4776]: I1208 09:59:42.028293 4776 scope.go:117] "RemoveContainer" containerID="a1dd66adbd31bb54cded1863f87a95e445236b2f4953781aedd2c88ba5610dcb" Dec 08 09:59:42 crc kubenswrapper[4776]: I1208 09:59:42.029413 4776 scope.go:117] "RemoveContainer" containerID="d69e4e5ed0ae6d0feba94b4d10f96b17acd36d7e4cc7e3e2616a1e9f287da6d1" Dec 08 09:59:42 crc kubenswrapper[4776]: E1208 09:59:42.030093 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 09:59:56 crc kubenswrapper[4776]: I1208 09:59:56.344514 4776 scope.go:117] "RemoveContainer" containerID="d69e4e5ed0ae6d0feba94b4d10f96b17acd36d7e4cc7e3e2616a1e9f287da6d1" Dec 08 09:59:56 crc kubenswrapper[4776]: E1208 09:59:56.345736 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:00:00 crc kubenswrapper[4776]: I1208 10:00:00.182986 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29419800-767t8"] Dec 08 10:00:00 crc kubenswrapper[4776]: E1208 10:00:00.184314 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="550f668a-02b5-4f2c-be40-6e0fa7b78027" containerName="registry-server" Dec 08 10:00:00 crc kubenswrapper[4776]: I1208 10:00:00.184335 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="550f668a-02b5-4f2c-be40-6e0fa7b78027" containerName="registry-server" Dec 08 10:00:00 crc kubenswrapper[4776]: E1208 10:00:00.184375 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59ac2f7c-b24e-40cc-908b-27d7282bc680" containerName="extract-content" Dec 08 10:00:00 crc kubenswrapper[4776]: I1208 10:00:00.184384 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="59ac2f7c-b24e-40cc-908b-27d7282bc680" containerName="extract-content" Dec 08 10:00:00 crc kubenswrapper[4776]: E1208 10:00:00.184404 4776 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="bc63b4e7-df88-41f6-ac1e-bdf3be2668db" containerName="registry-server" Dec 08 10:00:00 crc kubenswrapper[4776]: I1208 10:00:00.184414 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc63b4e7-df88-41f6-ac1e-bdf3be2668db" containerName="registry-server" Dec 08 10:00:00 crc kubenswrapper[4776]: E1208 10:00:00.184443 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59ac2f7c-b24e-40cc-908b-27d7282bc680" containerName="registry-server" Dec 08 10:00:00 crc kubenswrapper[4776]: I1208 10:00:00.184451 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="59ac2f7c-b24e-40cc-908b-27d7282bc680" containerName="registry-server" Dec 08 10:00:00 crc kubenswrapper[4776]: E1208 10:00:00.184465 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="550f668a-02b5-4f2c-be40-6e0fa7b78027" containerName="extract-content" Dec 08 10:00:00 crc kubenswrapper[4776]: I1208 10:00:00.184473 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="550f668a-02b5-4f2c-be40-6e0fa7b78027" containerName="extract-content" Dec 08 10:00:00 crc kubenswrapper[4776]: E1208 10:00:00.184492 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="550f668a-02b5-4f2c-be40-6e0fa7b78027" containerName="extract-utilities" Dec 08 10:00:00 crc kubenswrapper[4776]: I1208 10:00:00.184501 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="550f668a-02b5-4f2c-be40-6e0fa7b78027" containerName="extract-utilities" Dec 08 10:00:00 crc kubenswrapper[4776]: E1208 10:00:00.184514 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc63b4e7-df88-41f6-ac1e-bdf3be2668db" containerName="extract-utilities" Dec 08 10:00:00 crc kubenswrapper[4776]: I1208 10:00:00.184522 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc63b4e7-df88-41f6-ac1e-bdf3be2668db" containerName="extract-utilities" Dec 08 10:00:00 crc kubenswrapper[4776]: E1208 10:00:00.184547 4776 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="bc63b4e7-df88-41f6-ac1e-bdf3be2668db" containerName="extract-content" Dec 08 10:00:00 crc kubenswrapper[4776]: I1208 10:00:00.184554 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc63b4e7-df88-41f6-ac1e-bdf3be2668db" containerName="extract-content" Dec 08 10:00:00 crc kubenswrapper[4776]: E1208 10:00:00.184566 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59ac2f7c-b24e-40cc-908b-27d7282bc680" containerName="extract-utilities" Dec 08 10:00:00 crc kubenswrapper[4776]: I1208 10:00:00.184574 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="59ac2f7c-b24e-40cc-908b-27d7282bc680" containerName="extract-utilities" Dec 08 10:00:00 crc kubenswrapper[4776]: I1208 10:00:00.184847 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="550f668a-02b5-4f2c-be40-6e0fa7b78027" containerName="registry-server" Dec 08 10:00:00 crc kubenswrapper[4776]: I1208 10:00:00.184880 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc63b4e7-df88-41f6-ac1e-bdf3be2668db" containerName="registry-server" Dec 08 10:00:00 crc kubenswrapper[4776]: I1208 10:00:00.184906 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="59ac2f7c-b24e-40cc-908b-27d7282bc680" containerName="registry-server" Dec 08 10:00:00 crc kubenswrapper[4776]: I1208 10:00:00.186036 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29419800-767t8" Dec 08 10:00:00 crc kubenswrapper[4776]: I1208 10:00:00.192742 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 08 10:00:00 crc kubenswrapper[4776]: I1208 10:00:00.193268 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 08 10:00:00 crc kubenswrapper[4776]: I1208 10:00:00.204748 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29419800-767t8"] Dec 08 10:00:00 crc kubenswrapper[4776]: I1208 10:00:00.288376 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmg5l\" (UniqueName: \"kubernetes.io/projected/e82e989b-1b77-4703-a7e5-3e1eb29825e4-kube-api-access-mmg5l\") pod \"collect-profiles-29419800-767t8\" (UID: \"e82e989b-1b77-4703-a7e5-3e1eb29825e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419800-767t8" Dec 08 10:00:00 crc kubenswrapper[4776]: I1208 10:00:00.288861 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e82e989b-1b77-4703-a7e5-3e1eb29825e4-config-volume\") pod \"collect-profiles-29419800-767t8\" (UID: \"e82e989b-1b77-4703-a7e5-3e1eb29825e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419800-767t8" Dec 08 10:00:00 crc kubenswrapper[4776]: I1208 10:00:00.288921 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e82e989b-1b77-4703-a7e5-3e1eb29825e4-secret-volume\") pod \"collect-profiles-29419800-767t8\" (UID: \"e82e989b-1b77-4703-a7e5-3e1eb29825e4\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29419800-767t8" Dec 08 10:00:00 crc kubenswrapper[4776]: I1208 10:00:00.392513 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmg5l\" (UniqueName: \"kubernetes.io/projected/e82e989b-1b77-4703-a7e5-3e1eb29825e4-kube-api-access-mmg5l\") pod \"collect-profiles-29419800-767t8\" (UID: \"e82e989b-1b77-4703-a7e5-3e1eb29825e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419800-767t8" Dec 08 10:00:00 crc kubenswrapper[4776]: I1208 10:00:00.392594 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e82e989b-1b77-4703-a7e5-3e1eb29825e4-config-volume\") pod \"collect-profiles-29419800-767t8\" (UID: \"e82e989b-1b77-4703-a7e5-3e1eb29825e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419800-767t8" Dec 08 10:00:00 crc kubenswrapper[4776]: I1208 10:00:00.392627 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e82e989b-1b77-4703-a7e5-3e1eb29825e4-secret-volume\") pod \"collect-profiles-29419800-767t8\" (UID: \"e82e989b-1b77-4703-a7e5-3e1eb29825e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419800-767t8" Dec 08 10:00:00 crc kubenswrapper[4776]: I1208 10:00:00.394139 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e82e989b-1b77-4703-a7e5-3e1eb29825e4-config-volume\") pod \"collect-profiles-29419800-767t8\" (UID: \"e82e989b-1b77-4703-a7e5-3e1eb29825e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419800-767t8" Dec 08 10:00:00 crc kubenswrapper[4776]: I1208 10:00:00.407152 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/e82e989b-1b77-4703-a7e5-3e1eb29825e4-secret-volume\") pod \"collect-profiles-29419800-767t8\" (UID: \"e82e989b-1b77-4703-a7e5-3e1eb29825e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419800-767t8" Dec 08 10:00:00 crc kubenswrapper[4776]: I1208 10:00:00.409776 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmg5l\" (UniqueName: \"kubernetes.io/projected/e82e989b-1b77-4703-a7e5-3e1eb29825e4-kube-api-access-mmg5l\") pod \"collect-profiles-29419800-767t8\" (UID: \"e82e989b-1b77-4703-a7e5-3e1eb29825e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419800-767t8" Dec 08 10:00:00 crc kubenswrapper[4776]: I1208 10:00:00.512135 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29419800-767t8" Dec 08 10:00:00 crc kubenswrapper[4776]: I1208 10:00:00.975206 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29419800-767t8"] Dec 08 10:00:00 crc kubenswrapper[4776]: W1208 10:00:00.985204 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode82e989b_1b77_4703_a7e5_3e1eb29825e4.slice/crio-0e3c245fe291e54425bab92e0fc8e5bbb8156ea93f8a9ef93bb4ba3cb1baf5bc WatchSource:0}: Error finding container 0e3c245fe291e54425bab92e0fc8e5bbb8156ea93f8a9ef93bb4ba3cb1baf5bc: Status 404 returned error can't find the container with id 0e3c245fe291e54425bab92e0fc8e5bbb8156ea93f8a9ef93bb4ba3cb1baf5bc Dec 08 10:00:01 crc kubenswrapper[4776]: I1208 10:00:01.233009 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29419800-767t8" event={"ID":"e82e989b-1b77-4703-a7e5-3e1eb29825e4","Type":"ContainerStarted","Data":"af1975d8e1236c6cc084e3f3a5afa67b939d1e6444f7e62cd400b5be35dfe1a9"} Dec 08 10:00:01 crc 
kubenswrapper[4776]: I1208 10:00:01.233362 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29419800-767t8" event={"ID":"e82e989b-1b77-4703-a7e5-3e1eb29825e4","Type":"ContainerStarted","Data":"0e3c245fe291e54425bab92e0fc8e5bbb8156ea93f8a9ef93bb4ba3cb1baf5bc"} Dec 08 10:00:01 crc kubenswrapper[4776]: I1208 10:00:01.257915 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29419800-767t8" podStartSLOduration=1.2578985010000001 podStartE2EDuration="1.257898501s" podCreationTimestamp="2025-12-08 10:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 10:00:01.246008007 +0000 UTC m=+3677.509233049" watchObservedRunningTime="2025-12-08 10:00:01.257898501 +0000 UTC m=+3677.521123523" Dec 08 10:00:02 crc kubenswrapper[4776]: I1208 10:00:02.245617 4776 generic.go:334] "Generic (PLEG): container finished" podID="e82e989b-1b77-4703-a7e5-3e1eb29825e4" containerID="af1975d8e1236c6cc084e3f3a5afa67b939d1e6444f7e62cd400b5be35dfe1a9" exitCode=0 Dec 08 10:00:02 crc kubenswrapper[4776]: I1208 10:00:02.246766 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29419800-767t8" event={"ID":"e82e989b-1b77-4703-a7e5-3e1eb29825e4","Type":"ContainerDied","Data":"af1975d8e1236c6cc084e3f3a5afa67b939d1e6444f7e62cd400b5be35dfe1a9"} Dec 08 10:00:03 crc kubenswrapper[4776]: I1208 10:00:03.698465 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29419800-767t8" Dec 08 10:00:03 crc kubenswrapper[4776]: I1208 10:00:03.786272 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e82e989b-1b77-4703-a7e5-3e1eb29825e4-config-volume\") pod \"e82e989b-1b77-4703-a7e5-3e1eb29825e4\" (UID: \"e82e989b-1b77-4703-a7e5-3e1eb29825e4\") " Dec 08 10:00:03 crc kubenswrapper[4776]: I1208 10:00:03.786333 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e82e989b-1b77-4703-a7e5-3e1eb29825e4-secret-volume\") pod \"e82e989b-1b77-4703-a7e5-3e1eb29825e4\" (UID: \"e82e989b-1b77-4703-a7e5-3e1eb29825e4\") " Dec 08 10:00:03 crc kubenswrapper[4776]: I1208 10:00:03.786468 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmg5l\" (UniqueName: \"kubernetes.io/projected/e82e989b-1b77-4703-a7e5-3e1eb29825e4-kube-api-access-mmg5l\") pod \"e82e989b-1b77-4703-a7e5-3e1eb29825e4\" (UID: \"e82e989b-1b77-4703-a7e5-3e1eb29825e4\") " Dec 08 10:00:03 crc kubenswrapper[4776]: I1208 10:00:03.788204 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e82e989b-1b77-4703-a7e5-3e1eb29825e4-config-volume" (OuterVolumeSpecName: "config-volume") pod "e82e989b-1b77-4703-a7e5-3e1eb29825e4" (UID: "e82e989b-1b77-4703-a7e5-3e1eb29825e4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 10:00:03 crc kubenswrapper[4776]: I1208 10:00:03.801013 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e82e989b-1b77-4703-a7e5-3e1eb29825e4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e82e989b-1b77-4703-a7e5-3e1eb29825e4" (UID: "e82e989b-1b77-4703-a7e5-3e1eb29825e4"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 10:00:03 crc kubenswrapper[4776]: I1208 10:00:03.804081 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e82e989b-1b77-4703-a7e5-3e1eb29825e4-kube-api-access-mmg5l" (OuterVolumeSpecName: "kube-api-access-mmg5l") pod "e82e989b-1b77-4703-a7e5-3e1eb29825e4" (UID: "e82e989b-1b77-4703-a7e5-3e1eb29825e4"). InnerVolumeSpecName "kube-api-access-mmg5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 10:00:03 crc kubenswrapper[4776]: I1208 10:00:03.888461 4776 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e82e989b-1b77-4703-a7e5-3e1eb29825e4-config-volume\") on node \"crc\" DevicePath \"\"" Dec 08 10:00:03 crc kubenswrapper[4776]: I1208 10:00:03.888489 4776 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e82e989b-1b77-4703-a7e5-3e1eb29825e4-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 08 10:00:03 crc kubenswrapper[4776]: I1208 10:00:03.888499 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmg5l\" (UniqueName: \"kubernetes.io/projected/e82e989b-1b77-4703-a7e5-3e1eb29825e4-kube-api-access-mmg5l\") on node \"crc\" DevicePath \"\"" Dec 08 10:00:04 crc kubenswrapper[4776]: I1208 10:00:04.280462 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29419800-767t8" event={"ID":"e82e989b-1b77-4703-a7e5-3e1eb29825e4","Type":"ContainerDied","Data":"0e3c245fe291e54425bab92e0fc8e5bbb8156ea93f8a9ef93bb4ba3cb1baf5bc"} Dec 08 10:00:04 crc kubenswrapper[4776]: I1208 10:00:04.280503 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e3c245fe291e54425bab92e0fc8e5bbb8156ea93f8a9ef93bb4ba3cb1baf5bc" Dec 08 10:00:04 crc kubenswrapper[4776]: I1208 10:00:04.280558 4776 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29419800-767t8" Dec 08 10:00:04 crc kubenswrapper[4776]: I1208 10:00:04.332425 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29419755-kfkjv"] Dec 08 10:00:04 crc kubenswrapper[4776]: I1208 10:00:04.343419 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29419755-kfkjv"] Dec 08 10:00:04 crc kubenswrapper[4776]: I1208 10:00:04.363636 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ce3381a-0bc7-4098-9c55-87bba4519ad8" path="/var/lib/kubelet/pods/0ce3381a-0bc7-4098-9c55-87bba4519ad8/volumes" Dec 08 10:00:09 crc kubenswrapper[4776]: I1208 10:00:09.344024 4776 scope.go:117] "RemoveContainer" containerID="d69e4e5ed0ae6d0feba94b4d10f96b17acd36d7e4cc7e3e2616a1e9f287da6d1" Dec 08 10:00:09 crc kubenswrapper[4776]: E1208 10:00:09.345423 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:00:18 crc kubenswrapper[4776]: I1208 10:00:18.784039 4776 scope.go:117] "RemoveContainer" containerID="65dc4c88e5462f7bacac85606e7d58240f6fb6053e8cd91d2dd0b98b06880905" Dec 08 10:00:24 crc kubenswrapper[4776]: I1208 10:00:24.344468 4776 scope.go:117] "RemoveContainer" containerID="d69e4e5ed0ae6d0feba94b4d10f96b17acd36d7e4cc7e3e2616a1e9f287da6d1" Dec 08 10:00:24 crc kubenswrapper[4776]: E1208 10:00:24.345472 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:00:35 crc kubenswrapper[4776]: I1208 10:00:35.344405 4776 scope.go:117] "RemoveContainer" containerID="d69e4e5ed0ae6d0feba94b4d10f96b17acd36d7e4cc7e3e2616a1e9f287da6d1" Dec 08 10:00:35 crc kubenswrapper[4776]: E1208 10:00:35.345253 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:00:46 crc kubenswrapper[4776]: I1208 10:00:46.344208 4776 scope.go:117] "RemoveContainer" containerID="d69e4e5ed0ae6d0feba94b4d10f96b17acd36d7e4cc7e3e2616a1e9f287da6d1" Dec 08 10:00:46 crc kubenswrapper[4776]: E1208 10:00:46.345045 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:01:00 crc kubenswrapper[4776]: I1208 10:01:00.175776 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29419801-s7tdk"] Dec 08 10:01:00 crc kubenswrapper[4776]: E1208 10:01:00.177538 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e82e989b-1b77-4703-a7e5-3e1eb29825e4" containerName="collect-profiles" Dec 08 10:01:00 
crc kubenswrapper[4776]: I1208 10:01:00.177564 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="e82e989b-1b77-4703-a7e5-3e1eb29825e4" containerName="collect-profiles" Dec 08 10:01:00 crc kubenswrapper[4776]: I1208 10:01:00.177977 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="e82e989b-1b77-4703-a7e5-3e1eb29825e4" containerName="collect-profiles" Dec 08 10:01:00 crc kubenswrapper[4776]: I1208 10:01:00.179271 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29419801-s7tdk" Dec 08 10:01:00 crc kubenswrapper[4776]: I1208 10:01:00.188424 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29419801-s7tdk"] Dec 08 10:01:00 crc kubenswrapper[4776]: I1208 10:01:00.290929 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e-config-data\") pod \"keystone-cron-29419801-s7tdk\" (UID: \"53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e\") " pod="openstack/keystone-cron-29419801-s7tdk" Dec 08 10:01:00 crc kubenswrapper[4776]: I1208 10:01:00.291105 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5gq9\" (UniqueName: \"kubernetes.io/projected/53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e-kube-api-access-m5gq9\") pod \"keystone-cron-29419801-s7tdk\" (UID: \"53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e\") " pod="openstack/keystone-cron-29419801-s7tdk" Dec 08 10:01:00 crc kubenswrapper[4776]: I1208 10:01:00.291165 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e-fernet-keys\") pod \"keystone-cron-29419801-s7tdk\" (UID: \"53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e\") " pod="openstack/keystone-cron-29419801-s7tdk" Dec 08 10:01:00 crc kubenswrapper[4776]: I1208 
10:01:00.291212 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e-combined-ca-bundle\") pod \"keystone-cron-29419801-s7tdk\" (UID: \"53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e\") " pod="openstack/keystone-cron-29419801-s7tdk" Dec 08 10:01:00 crc kubenswrapper[4776]: I1208 10:01:00.392458 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5gq9\" (UniqueName: \"kubernetes.io/projected/53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e-kube-api-access-m5gq9\") pod \"keystone-cron-29419801-s7tdk\" (UID: \"53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e\") " pod="openstack/keystone-cron-29419801-s7tdk" Dec 08 10:01:00 crc kubenswrapper[4776]: I1208 10:01:00.392587 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e-fernet-keys\") pod \"keystone-cron-29419801-s7tdk\" (UID: \"53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e\") " pod="openstack/keystone-cron-29419801-s7tdk" Dec 08 10:01:00 crc kubenswrapper[4776]: I1208 10:01:00.392620 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e-combined-ca-bundle\") pod \"keystone-cron-29419801-s7tdk\" (UID: \"53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e\") " pod="openstack/keystone-cron-29419801-s7tdk" Dec 08 10:01:00 crc kubenswrapper[4776]: I1208 10:01:00.392694 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e-config-data\") pod \"keystone-cron-29419801-s7tdk\" (UID: \"53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e\") " pod="openstack/keystone-cron-29419801-s7tdk" Dec 08 10:01:00 crc kubenswrapper[4776]: I1208 10:01:00.404436 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e-fernet-keys\") pod \"keystone-cron-29419801-s7tdk\" (UID: \"53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e\") " pod="openstack/keystone-cron-29419801-s7tdk" Dec 08 10:01:00 crc kubenswrapper[4776]: I1208 10:01:00.404760 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e-config-data\") pod \"keystone-cron-29419801-s7tdk\" (UID: \"53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e\") " pod="openstack/keystone-cron-29419801-s7tdk" Dec 08 10:01:00 crc kubenswrapper[4776]: I1208 10:01:00.405443 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e-combined-ca-bundle\") pod \"keystone-cron-29419801-s7tdk\" (UID: \"53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e\") " pod="openstack/keystone-cron-29419801-s7tdk" Dec 08 10:01:00 crc kubenswrapper[4776]: I1208 10:01:00.410108 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5gq9\" (UniqueName: \"kubernetes.io/projected/53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e-kube-api-access-m5gq9\") pod \"keystone-cron-29419801-s7tdk\" (UID: \"53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e\") " pod="openstack/keystone-cron-29419801-s7tdk" Dec 08 10:01:00 crc kubenswrapper[4776]: I1208 10:01:00.509570 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29419801-s7tdk" Dec 08 10:01:00 crc kubenswrapper[4776]: I1208 10:01:00.993650 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29419801-s7tdk"] Dec 08 10:01:01 crc kubenswrapper[4776]: W1208 10:01:01.001529 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53d2eccd_b6c4_4870_9ca0_f43dc8f0ce8e.slice/crio-21910e1b9d3ef1511041e643a82154c1273c793f5923e1939cf9c53177c45709 WatchSource:0}: Error finding container 21910e1b9d3ef1511041e643a82154c1273c793f5923e1939cf9c53177c45709: Status 404 returned error can't find the container with id 21910e1b9d3ef1511041e643a82154c1273c793f5923e1939cf9c53177c45709 Dec 08 10:01:01 crc kubenswrapper[4776]: I1208 10:01:01.345502 4776 scope.go:117] "RemoveContainer" containerID="d69e4e5ed0ae6d0feba94b4d10f96b17acd36d7e4cc7e3e2616a1e9f287da6d1" Dec 08 10:01:01 crc kubenswrapper[4776]: E1208 10:01:01.345990 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:01:01 crc kubenswrapper[4776]: I1208 10:01:01.902960 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29419801-s7tdk" event={"ID":"53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e","Type":"ContainerStarted","Data":"83c8a0168a310d1660f1f32c645e076455c206a1390d96483f0ecc4666bf10d5"} Dec 08 10:01:01 crc kubenswrapper[4776]: I1208 10:01:01.903374 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29419801-s7tdk" 
event={"ID":"53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e","Type":"ContainerStarted","Data":"21910e1b9d3ef1511041e643a82154c1273c793f5923e1939cf9c53177c45709"} Dec 08 10:01:01 crc kubenswrapper[4776]: I1208 10:01:01.925126 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29419801-s7tdk" podStartSLOduration=1.925111784 podStartE2EDuration="1.925111784s" podCreationTimestamp="2025-12-08 10:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 10:01:01.920121368 +0000 UTC m=+3738.183346390" watchObservedRunningTime="2025-12-08 10:01:01.925111784 +0000 UTC m=+3738.188336806" Dec 08 10:01:03 crc kubenswrapper[4776]: I1208 10:01:03.927351 4776 generic.go:334] "Generic (PLEG): container finished" podID="53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e" containerID="83c8a0168a310d1660f1f32c645e076455c206a1390d96483f0ecc4666bf10d5" exitCode=0 Dec 08 10:01:03 crc kubenswrapper[4776]: I1208 10:01:03.927429 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29419801-s7tdk" event={"ID":"53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e","Type":"ContainerDied","Data":"83c8a0168a310d1660f1f32c645e076455c206a1390d96483f0ecc4666bf10d5"} Dec 08 10:01:05 crc kubenswrapper[4776]: I1208 10:01:05.340913 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29419801-s7tdk" Dec 08 10:01:05 crc kubenswrapper[4776]: I1208 10:01:05.413956 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e-config-data\") pod \"53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e\" (UID: \"53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e\") " Dec 08 10:01:05 crc kubenswrapper[4776]: I1208 10:01:05.414517 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e-combined-ca-bundle\") pod \"53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e\" (UID: \"53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e\") " Dec 08 10:01:05 crc kubenswrapper[4776]: I1208 10:01:05.414680 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5gq9\" (UniqueName: \"kubernetes.io/projected/53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e-kube-api-access-m5gq9\") pod \"53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e\" (UID: \"53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e\") " Dec 08 10:01:05 crc kubenswrapper[4776]: I1208 10:01:05.414754 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e-fernet-keys\") pod \"53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e\" (UID: \"53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e\") " Dec 08 10:01:05 crc kubenswrapper[4776]: I1208 10:01:05.420121 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e" (UID: "53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 10:01:05 crc kubenswrapper[4776]: I1208 10:01:05.420491 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e-kube-api-access-m5gq9" (OuterVolumeSpecName: "kube-api-access-m5gq9") pod "53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e" (UID: "53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e"). InnerVolumeSpecName "kube-api-access-m5gq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 10:01:05 crc kubenswrapper[4776]: I1208 10:01:05.447938 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e" (UID: "53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 10:01:05 crc kubenswrapper[4776]: I1208 10:01:05.479768 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e-config-data" (OuterVolumeSpecName: "config-data") pod "53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e" (UID: "53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 10:01:05 crc kubenswrapper[4776]: I1208 10:01:05.517813 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 10:01:05 crc kubenswrapper[4776]: I1208 10:01:05.517850 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 10:01:05 crc kubenswrapper[4776]: I1208 10:01:05.517863 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5gq9\" (UniqueName: \"kubernetes.io/projected/53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e-kube-api-access-m5gq9\") on node \"crc\" DevicePath \"\"" Dec 08 10:01:05 crc kubenswrapper[4776]: I1208 10:01:05.517877 4776 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 08 10:01:05 crc kubenswrapper[4776]: I1208 10:01:05.949617 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29419801-s7tdk" event={"ID":"53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e","Type":"ContainerDied","Data":"21910e1b9d3ef1511041e643a82154c1273c793f5923e1939cf9c53177c45709"} Dec 08 10:01:05 crc kubenswrapper[4776]: I1208 10:01:05.949658 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21910e1b9d3ef1511041e643a82154c1273c793f5923e1939cf9c53177c45709" Dec 08 10:01:05 crc kubenswrapper[4776]: I1208 10:01:05.949684 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29419801-s7tdk" Dec 08 10:01:13 crc kubenswrapper[4776]: I1208 10:01:13.343916 4776 scope.go:117] "RemoveContainer" containerID="d69e4e5ed0ae6d0feba94b4d10f96b17acd36d7e4cc7e3e2616a1e9f287da6d1" Dec 08 10:01:13 crc kubenswrapper[4776]: E1208 10:01:13.344752 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:01:28 crc kubenswrapper[4776]: I1208 10:01:28.345187 4776 scope.go:117] "RemoveContainer" containerID="d69e4e5ed0ae6d0feba94b4d10f96b17acd36d7e4cc7e3e2616a1e9f287da6d1" Dec 08 10:01:28 crc kubenswrapper[4776]: E1208 10:01:28.346145 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:01:43 crc kubenswrapper[4776]: I1208 10:01:43.343924 4776 scope.go:117] "RemoveContainer" containerID="d69e4e5ed0ae6d0feba94b4d10f96b17acd36d7e4cc7e3e2616a1e9f287da6d1" Dec 08 10:01:43 crc kubenswrapper[4776]: E1208 10:01:43.344924 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:01:54 crc kubenswrapper[4776]: I1208 10:01:54.352894 4776 scope.go:117] "RemoveContainer" containerID="d69e4e5ed0ae6d0feba94b4d10f96b17acd36d7e4cc7e3e2616a1e9f287da6d1" Dec 08 10:01:54 crc kubenswrapper[4776]: E1208 10:01:54.353805 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:02:02 crc kubenswrapper[4776]: E1208 10:02:02.549983 4776 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.82:60134->38.102.83.82:46339: write tcp 38.102.83.82:60134->38.102.83.82:46339: write: broken pipe Dec 08 10:02:07 crc kubenswrapper[4776]: I1208 10:02:07.344458 4776 scope.go:117] "RemoveContainer" containerID="d69e4e5ed0ae6d0feba94b4d10f96b17acd36d7e4cc7e3e2616a1e9f287da6d1" Dec 08 10:02:07 crc kubenswrapper[4776]: E1208 10:02:07.345486 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:02:18 crc kubenswrapper[4776]: I1208 10:02:18.344100 4776 scope.go:117] "RemoveContainer" containerID="d69e4e5ed0ae6d0feba94b4d10f96b17acd36d7e4cc7e3e2616a1e9f287da6d1" Dec 08 10:02:18 crc kubenswrapper[4776]: E1208 10:02:18.344949 4776 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:02:31 crc kubenswrapper[4776]: I1208 10:02:31.344559 4776 scope.go:117] "RemoveContainer" containerID="d69e4e5ed0ae6d0feba94b4d10f96b17acd36d7e4cc7e3e2616a1e9f287da6d1" Dec 08 10:02:31 crc kubenswrapper[4776]: E1208 10:02:31.345345 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:02:46 crc kubenswrapper[4776]: I1208 10:02:46.344353 4776 scope.go:117] "RemoveContainer" containerID="d69e4e5ed0ae6d0feba94b4d10f96b17acd36d7e4cc7e3e2616a1e9f287da6d1" Dec 08 10:02:46 crc kubenswrapper[4776]: E1208 10:02:46.345017 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:02:58 crc kubenswrapper[4776]: I1208 10:02:58.344351 4776 scope.go:117] "RemoveContainer" containerID="d69e4e5ed0ae6d0feba94b4d10f96b17acd36d7e4cc7e3e2616a1e9f287da6d1" Dec 08 10:02:58 crc kubenswrapper[4776]: E1208 10:02:58.345656 4776 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:03:10 crc kubenswrapper[4776]: I1208 10:03:10.344461 4776 scope.go:117] "RemoveContainer" containerID="d69e4e5ed0ae6d0feba94b4d10f96b17acd36d7e4cc7e3e2616a1e9f287da6d1" Dec 08 10:03:10 crc kubenswrapper[4776]: E1208 10:03:10.345270 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:03:22 crc kubenswrapper[4776]: I1208 10:03:22.343593 4776 scope.go:117] "RemoveContainer" containerID="d69e4e5ed0ae6d0feba94b4d10f96b17acd36d7e4cc7e3e2616a1e9f287da6d1" Dec 08 10:03:22 crc kubenswrapper[4776]: E1208 10:03:22.344432 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:03:34 crc kubenswrapper[4776]: I1208 10:03:34.350977 4776 scope.go:117] "RemoveContainer" containerID="d69e4e5ed0ae6d0feba94b4d10f96b17acd36d7e4cc7e3e2616a1e9f287da6d1" Dec 08 10:03:34 crc kubenswrapper[4776]: E1208 
10:03:34.351831 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:03:46 crc kubenswrapper[4776]: I1208 10:03:46.345233 4776 scope.go:117] "RemoveContainer" containerID="d69e4e5ed0ae6d0feba94b4d10f96b17acd36d7e4cc7e3e2616a1e9f287da6d1" Dec 08 10:03:46 crc kubenswrapper[4776]: E1208 10:03:46.346452 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:04:01 crc kubenswrapper[4776]: I1208 10:04:01.344340 4776 scope.go:117] "RemoveContainer" containerID="d69e4e5ed0ae6d0feba94b4d10f96b17acd36d7e4cc7e3e2616a1e9f287da6d1" Dec 08 10:04:01 crc kubenswrapper[4776]: E1208 10:04:01.345029 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:04:16 crc kubenswrapper[4776]: I1208 10:04:16.344315 4776 scope.go:117] "RemoveContainer" containerID="d69e4e5ed0ae6d0feba94b4d10f96b17acd36d7e4cc7e3e2616a1e9f287da6d1" Dec 08 10:04:16 crc 
kubenswrapper[4776]: E1208 10:04:16.345156 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:04:31 crc kubenswrapper[4776]: I1208 10:04:31.344027 4776 scope.go:117] "RemoveContainer" containerID="d69e4e5ed0ae6d0feba94b4d10f96b17acd36d7e4cc7e3e2616a1e9f287da6d1" Dec 08 10:04:31 crc kubenswrapper[4776]: E1208 10:04:31.344908 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:04:46 crc kubenswrapper[4776]: I1208 10:04:46.344423 4776 scope.go:117] "RemoveContainer" containerID="d69e4e5ed0ae6d0feba94b4d10f96b17acd36d7e4cc7e3e2616a1e9f287da6d1" Dec 08 10:04:47 crc kubenswrapper[4776]: I1208 10:04:47.341534 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" event={"ID":"c9788ab1-1031-4103-a769-a4b3177c7268","Type":"ContainerStarted","Data":"90adb4348f501214a70951f12e6327c6cd5766de3c2a25e42b686c34dce1ea8f"} Dec 08 10:07:11 crc kubenswrapper[4776]: I1208 10:07:11.398798 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Dec 08 10:07:11 crc kubenswrapper[4776]: I1208 10:07:11.399309 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 10:07:41 crc kubenswrapper[4776]: I1208 10:07:41.399776 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 10:07:41 crc kubenswrapper[4776]: I1208 10:07:41.400429 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 10:08:11 crc kubenswrapper[4776]: I1208 10:08:11.398751 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 10:08:11 crc kubenswrapper[4776]: I1208 10:08:11.399423 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 10:08:11 crc kubenswrapper[4776]: I1208 10:08:11.399490 4776 kubelet.go:2542] 
"SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" Dec 08 10:08:11 crc kubenswrapper[4776]: I1208 10:08:11.400683 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"90adb4348f501214a70951f12e6327c6cd5766de3c2a25e42b686c34dce1ea8f"} pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 08 10:08:11 crc kubenswrapper[4776]: I1208 10:08:11.400765 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" containerID="cri-o://90adb4348f501214a70951f12e6327c6cd5766de3c2a25e42b686c34dce1ea8f" gracePeriod=600 Dec 08 10:08:11 crc kubenswrapper[4776]: I1208 10:08:11.553113 4776 generic.go:334] "Generic (PLEG): container finished" podID="c9788ab1-1031-4103-a769-a4b3177c7268" containerID="90adb4348f501214a70951f12e6327c6cd5766de3c2a25e42b686c34dce1ea8f" exitCode=0 Dec 08 10:08:11 crc kubenswrapper[4776]: I1208 10:08:11.553154 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" event={"ID":"c9788ab1-1031-4103-a769-a4b3177c7268","Type":"ContainerDied","Data":"90adb4348f501214a70951f12e6327c6cd5766de3c2a25e42b686c34dce1ea8f"} Dec 08 10:08:11 crc kubenswrapper[4776]: I1208 10:08:11.553261 4776 scope.go:117] "RemoveContainer" containerID="d69e4e5ed0ae6d0feba94b4d10f96b17acd36d7e4cc7e3e2616a1e9f287da6d1" Dec 08 10:08:12 crc kubenswrapper[4776]: I1208 10:08:12.563990 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" 
event={"ID":"c9788ab1-1031-4103-a769-a4b3177c7268","Type":"ContainerStarted","Data":"448474efd555c5bc5445de386b34183d50e7a6a8112936335b71be6c5bf4d95c"} Dec 08 10:08:17 crc kubenswrapper[4776]: I1208 10:08:17.634457 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hxnvk"] Dec 08 10:08:17 crc kubenswrapper[4776]: E1208 10:08:17.635715 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e" containerName="keystone-cron" Dec 08 10:08:17 crc kubenswrapper[4776]: I1208 10:08:17.635731 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e" containerName="keystone-cron" Dec 08 10:08:17 crc kubenswrapper[4776]: I1208 10:08:17.636071 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e" containerName="keystone-cron" Dec 08 10:08:17 crc kubenswrapper[4776]: I1208 10:08:17.638363 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hxnvk" Dec 08 10:08:17 crc kubenswrapper[4776]: I1208 10:08:17.647921 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hxnvk"] Dec 08 10:08:17 crc kubenswrapper[4776]: I1208 10:08:17.712784 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e319243f-9c57-42e3-9a61-1fa04655a3dd-utilities\") pod \"redhat-marketplace-hxnvk\" (UID: \"e319243f-9c57-42e3-9a61-1fa04655a3dd\") " pod="openshift-marketplace/redhat-marketplace-hxnvk" Dec 08 10:08:17 crc kubenswrapper[4776]: I1208 10:08:17.712953 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e319243f-9c57-42e3-9a61-1fa04655a3dd-catalog-content\") pod \"redhat-marketplace-hxnvk\" (UID: \"e319243f-9c57-42e3-9a61-1fa04655a3dd\") " pod="openshift-marketplace/redhat-marketplace-hxnvk" Dec 08 10:08:17 crc kubenswrapper[4776]: I1208 10:08:17.712998 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb9gn\" (UniqueName: \"kubernetes.io/projected/e319243f-9c57-42e3-9a61-1fa04655a3dd-kube-api-access-qb9gn\") pod \"redhat-marketplace-hxnvk\" (UID: \"e319243f-9c57-42e3-9a61-1fa04655a3dd\") " pod="openshift-marketplace/redhat-marketplace-hxnvk" Dec 08 10:08:17 crc kubenswrapper[4776]: I1208 10:08:17.815660 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e319243f-9c57-42e3-9a61-1fa04655a3dd-utilities\") pod \"redhat-marketplace-hxnvk\" (UID: \"e319243f-9c57-42e3-9a61-1fa04655a3dd\") " pod="openshift-marketplace/redhat-marketplace-hxnvk" Dec 08 10:08:17 crc kubenswrapper[4776]: I1208 10:08:17.815809 4776 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e319243f-9c57-42e3-9a61-1fa04655a3dd-catalog-content\") pod \"redhat-marketplace-hxnvk\" (UID: \"e319243f-9c57-42e3-9a61-1fa04655a3dd\") " pod="openshift-marketplace/redhat-marketplace-hxnvk" Dec 08 10:08:17 crc kubenswrapper[4776]: I1208 10:08:17.815848 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qb9gn\" (UniqueName: \"kubernetes.io/projected/e319243f-9c57-42e3-9a61-1fa04655a3dd-kube-api-access-qb9gn\") pod \"redhat-marketplace-hxnvk\" (UID: \"e319243f-9c57-42e3-9a61-1fa04655a3dd\") " pod="openshift-marketplace/redhat-marketplace-hxnvk" Dec 08 10:08:17 crc kubenswrapper[4776]: I1208 10:08:17.816890 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e319243f-9c57-42e3-9a61-1fa04655a3dd-utilities\") pod \"redhat-marketplace-hxnvk\" (UID: \"e319243f-9c57-42e3-9a61-1fa04655a3dd\") " pod="openshift-marketplace/redhat-marketplace-hxnvk" Dec 08 10:08:17 crc kubenswrapper[4776]: I1208 10:08:17.816983 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e319243f-9c57-42e3-9a61-1fa04655a3dd-catalog-content\") pod \"redhat-marketplace-hxnvk\" (UID: \"e319243f-9c57-42e3-9a61-1fa04655a3dd\") " pod="openshift-marketplace/redhat-marketplace-hxnvk" Dec 08 10:08:17 crc kubenswrapper[4776]: I1208 10:08:17.835709 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb9gn\" (UniqueName: \"kubernetes.io/projected/e319243f-9c57-42e3-9a61-1fa04655a3dd-kube-api-access-qb9gn\") pod \"redhat-marketplace-hxnvk\" (UID: \"e319243f-9c57-42e3-9a61-1fa04655a3dd\") " pod="openshift-marketplace/redhat-marketplace-hxnvk" Dec 08 10:08:17 crc kubenswrapper[4776]: I1208 10:08:17.966735 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hxnvk" Dec 08 10:08:18 crc kubenswrapper[4776]: I1208 10:08:18.450518 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hxnvk"] Dec 08 10:08:18 crc kubenswrapper[4776]: I1208 10:08:18.624266 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hxnvk" event={"ID":"e319243f-9c57-42e3-9a61-1fa04655a3dd","Type":"ContainerStarted","Data":"fe052b52fbde7d534d962d90d094cec303a1f6f3ad3db37048c6885ea6958780"} Dec 08 10:08:19 crc kubenswrapper[4776]: I1208 10:08:19.638329 4776 generic.go:334] "Generic (PLEG): container finished" podID="e319243f-9c57-42e3-9a61-1fa04655a3dd" containerID="04439a4071b793c9c544377738037e0b3d069df44ac4cbd46678dfae120558c4" exitCode=0 Dec 08 10:08:19 crc kubenswrapper[4776]: I1208 10:08:19.638557 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hxnvk" event={"ID":"e319243f-9c57-42e3-9a61-1fa04655a3dd","Type":"ContainerDied","Data":"04439a4071b793c9c544377738037e0b3d069df44ac4cbd46678dfae120558c4"} Dec 08 10:08:19 crc kubenswrapper[4776]: I1208 10:08:19.646413 4776 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 08 10:08:20 crc kubenswrapper[4776]: I1208 10:08:20.655637 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hxnvk" event={"ID":"e319243f-9c57-42e3-9a61-1fa04655a3dd","Type":"ContainerStarted","Data":"3d89bb2f90cd986c451157a37474049cf963186526c6f69090610051bb52c2dd"} Dec 08 10:08:20 crc kubenswrapper[4776]: I1208 10:08:20.816347 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tlr7z"] Dec 08 10:08:20 crc kubenswrapper[4776]: I1208 10:08:20.819375 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tlr7z" Dec 08 10:08:20 crc kubenswrapper[4776]: I1208 10:08:20.827730 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tlr7z"] Dec 08 10:08:20 crc kubenswrapper[4776]: I1208 10:08:20.901468 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88e377d4-3fae-47df-a36a-dd1885a0e9d8-utilities\") pod \"redhat-operators-tlr7z\" (UID: \"88e377d4-3fae-47df-a36a-dd1885a0e9d8\") " pod="openshift-marketplace/redhat-operators-tlr7z" Dec 08 10:08:20 crc kubenswrapper[4776]: I1208 10:08:20.901666 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88e377d4-3fae-47df-a36a-dd1885a0e9d8-catalog-content\") pod \"redhat-operators-tlr7z\" (UID: \"88e377d4-3fae-47df-a36a-dd1885a0e9d8\") " pod="openshift-marketplace/redhat-operators-tlr7z" Dec 08 10:08:20 crc kubenswrapper[4776]: I1208 10:08:20.901699 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7lf4\" (UniqueName: \"kubernetes.io/projected/88e377d4-3fae-47df-a36a-dd1885a0e9d8-kube-api-access-q7lf4\") pod \"redhat-operators-tlr7z\" (UID: \"88e377d4-3fae-47df-a36a-dd1885a0e9d8\") " pod="openshift-marketplace/redhat-operators-tlr7z" Dec 08 10:08:21 crc kubenswrapper[4776]: I1208 10:08:21.003472 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88e377d4-3fae-47df-a36a-dd1885a0e9d8-catalog-content\") pod \"redhat-operators-tlr7z\" (UID: \"88e377d4-3fae-47df-a36a-dd1885a0e9d8\") " pod="openshift-marketplace/redhat-operators-tlr7z" Dec 08 10:08:21 crc kubenswrapper[4776]: I1208 10:08:21.003541 4776 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-q7lf4\" (UniqueName: \"kubernetes.io/projected/88e377d4-3fae-47df-a36a-dd1885a0e9d8-kube-api-access-q7lf4\") pod \"redhat-operators-tlr7z\" (UID: \"88e377d4-3fae-47df-a36a-dd1885a0e9d8\") " pod="openshift-marketplace/redhat-operators-tlr7z" Dec 08 10:08:21 crc kubenswrapper[4776]: I1208 10:08:21.003648 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88e377d4-3fae-47df-a36a-dd1885a0e9d8-utilities\") pod \"redhat-operators-tlr7z\" (UID: \"88e377d4-3fae-47df-a36a-dd1885a0e9d8\") " pod="openshift-marketplace/redhat-operators-tlr7z" Dec 08 10:08:21 crc kubenswrapper[4776]: I1208 10:08:21.004416 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88e377d4-3fae-47df-a36a-dd1885a0e9d8-catalog-content\") pod \"redhat-operators-tlr7z\" (UID: \"88e377d4-3fae-47df-a36a-dd1885a0e9d8\") " pod="openshift-marketplace/redhat-operators-tlr7z" Dec 08 10:08:21 crc kubenswrapper[4776]: I1208 10:08:21.004633 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88e377d4-3fae-47df-a36a-dd1885a0e9d8-utilities\") pod \"redhat-operators-tlr7z\" (UID: \"88e377d4-3fae-47df-a36a-dd1885a0e9d8\") " pod="openshift-marketplace/redhat-operators-tlr7z" Dec 08 10:08:21 crc kubenswrapper[4776]: I1208 10:08:21.023850 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7lf4\" (UniqueName: \"kubernetes.io/projected/88e377d4-3fae-47df-a36a-dd1885a0e9d8-kube-api-access-q7lf4\") pod \"redhat-operators-tlr7z\" (UID: \"88e377d4-3fae-47df-a36a-dd1885a0e9d8\") " pod="openshift-marketplace/redhat-operators-tlr7z" Dec 08 10:08:21 crc kubenswrapper[4776]: I1208 10:08:21.139622 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tlr7z" Dec 08 10:08:21 crc kubenswrapper[4776]: I1208 10:08:21.418296 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9cqsr"] Dec 08 10:08:21 crc kubenswrapper[4776]: I1208 10:08:21.421535 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9cqsr" Dec 08 10:08:21 crc kubenswrapper[4776]: I1208 10:08:21.434873 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9cqsr"] Dec 08 10:08:21 crc kubenswrapper[4776]: I1208 10:08:21.522278 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9m4g\" (UniqueName: \"kubernetes.io/projected/02174a84-0da7-48ab-84a3-3183fc40daee-kube-api-access-b9m4g\") pod \"community-operators-9cqsr\" (UID: \"02174a84-0da7-48ab-84a3-3183fc40daee\") " pod="openshift-marketplace/community-operators-9cqsr" Dec 08 10:08:21 crc kubenswrapper[4776]: I1208 10:08:21.522458 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02174a84-0da7-48ab-84a3-3183fc40daee-catalog-content\") pod \"community-operators-9cqsr\" (UID: \"02174a84-0da7-48ab-84a3-3183fc40daee\") " pod="openshift-marketplace/community-operators-9cqsr" Dec 08 10:08:21 crc kubenswrapper[4776]: I1208 10:08:21.522589 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02174a84-0da7-48ab-84a3-3183fc40daee-utilities\") pod \"community-operators-9cqsr\" (UID: \"02174a84-0da7-48ab-84a3-3183fc40daee\") " pod="openshift-marketplace/community-operators-9cqsr" Dec 08 10:08:21 crc kubenswrapper[4776]: W1208 10:08:21.596586 4776 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88e377d4_3fae_47df_a36a_dd1885a0e9d8.slice/crio-f87e5f452bcc0766c9812e445e7a964709a62af9732ed49dc13ca6edb605dbd5 WatchSource:0}: Error finding container f87e5f452bcc0766c9812e445e7a964709a62af9732ed49dc13ca6edb605dbd5: Status 404 returned error can't find the container with id f87e5f452bcc0766c9812e445e7a964709a62af9732ed49dc13ca6edb605dbd5 Dec 08 10:08:21 crc kubenswrapper[4776]: I1208 10:08:21.618011 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tlr7z"] Dec 08 10:08:21 crc kubenswrapper[4776]: I1208 10:08:21.626040 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02174a84-0da7-48ab-84a3-3183fc40daee-catalog-content\") pod \"community-operators-9cqsr\" (UID: \"02174a84-0da7-48ab-84a3-3183fc40daee\") " pod="openshift-marketplace/community-operators-9cqsr" Dec 08 10:08:21 crc kubenswrapper[4776]: I1208 10:08:21.626163 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02174a84-0da7-48ab-84a3-3183fc40daee-utilities\") pod \"community-operators-9cqsr\" (UID: \"02174a84-0da7-48ab-84a3-3183fc40daee\") " pod="openshift-marketplace/community-operators-9cqsr" Dec 08 10:08:21 crc kubenswrapper[4776]: I1208 10:08:21.626236 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9m4g\" (UniqueName: \"kubernetes.io/projected/02174a84-0da7-48ab-84a3-3183fc40daee-kube-api-access-b9m4g\") pod \"community-operators-9cqsr\" (UID: \"02174a84-0da7-48ab-84a3-3183fc40daee\") " pod="openshift-marketplace/community-operators-9cqsr" Dec 08 10:08:21 crc kubenswrapper[4776]: I1208 10:08:21.626674 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/02174a84-0da7-48ab-84a3-3183fc40daee-catalog-content\") pod \"community-operators-9cqsr\" (UID: \"02174a84-0da7-48ab-84a3-3183fc40daee\") " pod="openshift-marketplace/community-operators-9cqsr" Dec 08 10:08:21 crc kubenswrapper[4776]: I1208 10:08:21.626862 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02174a84-0da7-48ab-84a3-3183fc40daee-utilities\") pod \"community-operators-9cqsr\" (UID: \"02174a84-0da7-48ab-84a3-3183fc40daee\") " pod="openshift-marketplace/community-operators-9cqsr" Dec 08 10:08:21 crc kubenswrapper[4776]: I1208 10:08:21.650078 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9m4g\" (UniqueName: \"kubernetes.io/projected/02174a84-0da7-48ab-84a3-3183fc40daee-kube-api-access-b9m4g\") pod \"community-operators-9cqsr\" (UID: \"02174a84-0da7-48ab-84a3-3183fc40daee\") " pod="openshift-marketplace/community-operators-9cqsr" Dec 08 10:08:21 crc kubenswrapper[4776]: I1208 10:08:21.677043 4776 generic.go:334] "Generic (PLEG): container finished" podID="e319243f-9c57-42e3-9a61-1fa04655a3dd" containerID="3d89bb2f90cd986c451157a37474049cf963186526c6f69090610051bb52c2dd" exitCode=0 Dec 08 10:08:21 crc kubenswrapper[4776]: I1208 10:08:21.677118 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hxnvk" event={"ID":"e319243f-9c57-42e3-9a61-1fa04655a3dd","Type":"ContainerDied","Data":"3d89bb2f90cd986c451157a37474049cf963186526c6f69090610051bb52c2dd"} Dec 08 10:08:21 crc kubenswrapper[4776]: I1208 10:08:21.686407 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tlr7z" event={"ID":"88e377d4-3fae-47df-a36a-dd1885a0e9d8","Type":"ContainerStarted","Data":"f87e5f452bcc0766c9812e445e7a964709a62af9732ed49dc13ca6edb605dbd5"} Dec 08 10:08:21 crc kubenswrapper[4776]: I1208 10:08:21.754272 4776 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-9cqsr" Dec 08 10:08:22 crc kubenswrapper[4776]: I1208 10:08:22.256785 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9cqsr"] Dec 08 10:08:22 crc kubenswrapper[4776]: W1208 10:08:22.262515 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02174a84_0da7_48ab_84a3_3183fc40daee.slice/crio-8189639a9bc17848acf9a208f615221092fbea16eb45b89099adeff4d1d25ba5 WatchSource:0}: Error finding container 8189639a9bc17848acf9a208f615221092fbea16eb45b89099adeff4d1d25ba5: Status 404 returned error can't find the container with id 8189639a9bc17848acf9a208f615221092fbea16eb45b89099adeff4d1d25ba5 Dec 08 10:08:22 crc kubenswrapper[4776]: I1208 10:08:22.697761 4776 generic.go:334] "Generic (PLEG): container finished" podID="02174a84-0da7-48ab-84a3-3183fc40daee" containerID="4d3e1e4845d5698633ade57ed94a921b7765e4054fc5e3ee88f3b0183b5c81a1" exitCode=0 Dec 08 10:08:22 crc kubenswrapper[4776]: I1208 10:08:22.697821 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9cqsr" event={"ID":"02174a84-0da7-48ab-84a3-3183fc40daee","Type":"ContainerDied","Data":"4d3e1e4845d5698633ade57ed94a921b7765e4054fc5e3ee88f3b0183b5c81a1"} Dec 08 10:08:22 crc kubenswrapper[4776]: I1208 10:08:22.697848 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9cqsr" event={"ID":"02174a84-0da7-48ab-84a3-3183fc40daee","Type":"ContainerStarted","Data":"8189639a9bc17848acf9a208f615221092fbea16eb45b89099adeff4d1d25ba5"} Dec 08 10:08:22 crc kubenswrapper[4776]: I1208 10:08:22.700978 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hxnvk" 
event={"ID":"e319243f-9c57-42e3-9a61-1fa04655a3dd","Type":"ContainerStarted","Data":"e6cdbdcd3c1a9cbf36cd9f483fbac2d09e2147638a83038e9476132fdcc4decf"} Dec 08 10:08:22 crc kubenswrapper[4776]: I1208 10:08:22.703092 4776 generic.go:334] "Generic (PLEG): container finished" podID="88e377d4-3fae-47df-a36a-dd1885a0e9d8" containerID="e5ad980b910a8543c7b4fc657b82671ca1006b5fe7a691f5af4889dfbc3a5e6c" exitCode=0 Dec 08 10:08:22 crc kubenswrapper[4776]: I1208 10:08:22.703140 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tlr7z" event={"ID":"88e377d4-3fae-47df-a36a-dd1885a0e9d8","Type":"ContainerDied","Data":"e5ad980b910a8543c7b4fc657b82671ca1006b5fe7a691f5af4889dfbc3a5e6c"} Dec 08 10:08:22 crc kubenswrapper[4776]: I1208 10:08:22.736680 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hxnvk" podStartSLOduration=3.253824346 podStartE2EDuration="5.736663265s" podCreationTimestamp="2025-12-08 10:08:17 +0000 UTC" firstStartedPulling="2025-12-08 10:08:19.643759097 +0000 UTC m=+4175.906984119" lastFinishedPulling="2025-12-08 10:08:22.126598016 +0000 UTC m=+4178.389823038" observedRunningTime="2025-12-08 10:08:22.732728538 +0000 UTC m=+4178.995953560" watchObservedRunningTime="2025-12-08 10:08:22.736663265 +0000 UTC m=+4178.999888287" Dec 08 10:08:23 crc kubenswrapper[4776]: I1208 10:08:23.714768 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9cqsr" event={"ID":"02174a84-0da7-48ab-84a3-3183fc40daee","Type":"ContainerStarted","Data":"de9735edc0d83f0e39ce8ea8eef18c70b94ea092d002119567d70fa735a7cd63"} Dec 08 10:08:23 crc kubenswrapper[4776]: I1208 10:08:23.720074 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tlr7z" 
event={"ID":"88e377d4-3fae-47df-a36a-dd1885a0e9d8","Type":"ContainerStarted","Data":"56fa026169f321f31320854cff49b0f6e70599e42ab7cc8477222248574e9a9b"} Dec 08 10:08:25 crc kubenswrapper[4776]: I1208 10:08:25.740200 4776 generic.go:334] "Generic (PLEG): container finished" podID="02174a84-0da7-48ab-84a3-3183fc40daee" containerID="de9735edc0d83f0e39ce8ea8eef18c70b94ea092d002119567d70fa735a7cd63" exitCode=0 Dec 08 10:08:25 crc kubenswrapper[4776]: I1208 10:08:25.740220 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9cqsr" event={"ID":"02174a84-0da7-48ab-84a3-3183fc40daee","Type":"ContainerDied","Data":"de9735edc0d83f0e39ce8ea8eef18c70b94ea092d002119567d70fa735a7cd63"} Dec 08 10:08:27 crc kubenswrapper[4776]: I1208 10:08:27.767223 4776 generic.go:334] "Generic (PLEG): container finished" podID="88e377d4-3fae-47df-a36a-dd1885a0e9d8" containerID="56fa026169f321f31320854cff49b0f6e70599e42ab7cc8477222248574e9a9b" exitCode=0 Dec 08 10:08:27 crc kubenswrapper[4776]: I1208 10:08:27.767287 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tlr7z" event={"ID":"88e377d4-3fae-47df-a36a-dd1885a0e9d8","Type":"ContainerDied","Data":"56fa026169f321f31320854cff49b0f6e70599e42ab7cc8477222248574e9a9b"} Dec 08 10:08:27 crc kubenswrapper[4776]: I1208 10:08:27.771086 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9cqsr" event={"ID":"02174a84-0da7-48ab-84a3-3183fc40daee","Type":"ContainerStarted","Data":"5a3ecf8098fd6cec793da1362b7f7bba3c1c6b10b202b6efb2abbf524088703e"} Dec 08 10:08:27 crc kubenswrapper[4776]: I1208 10:08:27.817115 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9cqsr" podStartSLOduration=3.074996837 podStartE2EDuration="6.81708763s" podCreationTimestamp="2025-12-08 10:08:21 +0000 UTC" firstStartedPulling="2025-12-08 10:08:22.700063038 +0000 UTC 
m=+4178.963288060" lastFinishedPulling="2025-12-08 10:08:26.442153831 +0000 UTC m=+4182.705378853" observedRunningTime="2025-12-08 10:08:27.810391068 +0000 UTC m=+4184.073616090" watchObservedRunningTime="2025-12-08 10:08:27.81708763 +0000 UTC m=+4184.080312652" Dec 08 10:08:27 crc kubenswrapper[4776]: I1208 10:08:27.967502 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hxnvk" Dec 08 10:08:27 crc kubenswrapper[4776]: I1208 10:08:27.967573 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hxnvk" Dec 08 10:08:28 crc kubenswrapper[4776]: I1208 10:08:28.030902 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hxnvk" Dec 08 10:08:28 crc kubenswrapper[4776]: I1208 10:08:28.785031 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tlr7z" event={"ID":"88e377d4-3fae-47df-a36a-dd1885a0e9d8","Type":"ContainerStarted","Data":"e7ca594b17fe5092e49691349e3fcbf14aa0e247d58b8c0e447e1d053c6a5382"} Dec 08 10:08:28 crc kubenswrapper[4776]: I1208 10:08:28.810699 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tlr7z" podStartSLOduration=3.376489666 podStartE2EDuration="8.810678404s" podCreationTimestamp="2025-12-08 10:08:20 +0000 UTC" firstStartedPulling="2025-12-08 10:08:22.705341361 +0000 UTC m=+4178.968566383" lastFinishedPulling="2025-12-08 10:08:28.139530099 +0000 UTC m=+4184.402755121" observedRunningTime="2025-12-08 10:08:28.806425338 +0000 UTC m=+4185.069650360" watchObservedRunningTime="2025-12-08 10:08:28.810678404 +0000 UTC m=+4185.073903426" Dec 08 10:08:28 crc kubenswrapper[4776]: I1208 10:08:28.838743 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hxnvk" Dec 08 10:08:30 crc 
kubenswrapper[4776]: I1208 10:08:30.807913 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hxnvk"] Dec 08 10:08:30 crc kubenswrapper[4776]: I1208 10:08:30.808757 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hxnvk" podUID="e319243f-9c57-42e3-9a61-1fa04655a3dd" containerName="registry-server" containerID="cri-o://e6cdbdcd3c1a9cbf36cd9f483fbac2d09e2147638a83038e9476132fdcc4decf" gracePeriod=2 Dec 08 10:08:31 crc kubenswrapper[4776]: I1208 10:08:31.139920 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tlr7z" Dec 08 10:08:31 crc kubenswrapper[4776]: I1208 10:08:31.140592 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tlr7z" Dec 08 10:08:31 crc kubenswrapper[4776]: I1208 10:08:31.346419 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hxnvk" Dec 08 10:08:31 crc kubenswrapper[4776]: I1208 10:08:31.464473 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qb9gn\" (UniqueName: \"kubernetes.io/projected/e319243f-9c57-42e3-9a61-1fa04655a3dd-kube-api-access-qb9gn\") pod \"e319243f-9c57-42e3-9a61-1fa04655a3dd\" (UID: \"e319243f-9c57-42e3-9a61-1fa04655a3dd\") " Dec 08 10:08:31 crc kubenswrapper[4776]: I1208 10:08:31.464556 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e319243f-9c57-42e3-9a61-1fa04655a3dd-catalog-content\") pod \"e319243f-9c57-42e3-9a61-1fa04655a3dd\" (UID: \"e319243f-9c57-42e3-9a61-1fa04655a3dd\") " Dec 08 10:08:31 crc kubenswrapper[4776]: I1208 10:08:31.464660 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e319243f-9c57-42e3-9a61-1fa04655a3dd-utilities\") pod \"e319243f-9c57-42e3-9a61-1fa04655a3dd\" (UID: \"e319243f-9c57-42e3-9a61-1fa04655a3dd\") " Dec 08 10:08:31 crc kubenswrapper[4776]: I1208 10:08:31.465277 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e319243f-9c57-42e3-9a61-1fa04655a3dd-utilities" (OuterVolumeSpecName: "utilities") pod "e319243f-9c57-42e3-9a61-1fa04655a3dd" (UID: "e319243f-9c57-42e3-9a61-1fa04655a3dd"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 10:08:31 crc kubenswrapper[4776]: I1208 10:08:31.465500 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e319243f-9c57-42e3-9a61-1fa04655a3dd-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 10:08:31 crc kubenswrapper[4776]: I1208 10:08:31.470865 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e319243f-9c57-42e3-9a61-1fa04655a3dd-kube-api-access-qb9gn" (OuterVolumeSpecName: "kube-api-access-qb9gn") pod "e319243f-9c57-42e3-9a61-1fa04655a3dd" (UID: "e319243f-9c57-42e3-9a61-1fa04655a3dd"). InnerVolumeSpecName "kube-api-access-qb9gn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 10:08:31 crc kubenswrapper[4776]: I1208 10:08:31.484910 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e319243f-9c57-42e3-9a61-1fa04655a3dd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e319243f-9c57-42e3-9a61-1fa04655a3dd" (UID: "e319243f-9c57-42e3-9a61-1fa04655a3dd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 10:08:31 crc kubenswrapper[4776]: I1208 10:08:31.568374 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qb9gn\" (UniqueName: \"kubernetes.io/projected/e319243f-9c57-42e3-9a61-1fa04655a3dd-kube-api-access-qb9gn\") on node \"crc\" DevicePath \"\"" Dec 08 10:08:31 crc kubenswrapper[4776]: I1208 10:08:31.568409 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e319243f-9c57-42e3-9a61-1fa04655a3dd-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 10:08:31 crc kubenswrapper[4776]: I1208 10:08:31.755304 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9cqsr" Dec 08 10:08:31 crc kubenswrapper[4776]: I1208 10:08:31.755355 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9cqsr" Dec 08 10:08:31 crc kubenswrapper[4776]: I1208 10:08:31.810499 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9cqsr" Dec 08 10:08:31 crc kubenswrapper[4776]: I1208 10:08:31.840237 4776 generic.go:334] "Generic (PLEG): container finished" podID="e319243f-9c57-42e3-9a61-1fa04655a3dd" containerID="e6cdbdcd3c1a9cbf36cd9f483fbac2d09e2147638a83038e9476132fdcc4decf" exitCode=0 Dec 08 10:08:31 crc kubenswrapper[4776]: I1208 10:08:31.840423 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hxnvk" event={"ID":"e319243f-9c57-42e3-9a61-1fa04655a3dd","Type":"ContainerDied","Data":"e6cdbdcd3c1a9cbf36cd9f483fbac2d09e2147638a83038e9476132fdcc4decf"} Dec 08 10:08:31 crc kubenswrapper[4776]: I1208 10:08:31.840475 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hxnvk" 
event={"ID":"e319243f-9c57-42e3-9a61-1fa04655a3dd","Type":"ContainerDied","Data":"fe052b52fbde7d534d962d90d094cec303a1f6f3ad3db37048c6885ea6958780"} Dec 08 10:08:31 crc kubenswrapper[4776]: I1208 10:08:31.840501 4776 scope.go:117] "RemoveContainer" containerID="e6cdbdcd3c1a9cbf36cd9f483fbac2d09e2147638a83038e9476132fdcc4decf" Dec 08 10:08:31 crc kubenswrapper[4776]: I1208 10:08:31.840626 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hxnvk" Dec 08 10:08:31 crc kubenswrapper[4776]: I1208 10:08:31.879995 4776 scope.go:117] "RemoveContainer" containerID="3d89bb2f90cd986c451157a37474049cf963186526c6f69090610051bb52c2dd" Dec 08 10:08:31 crc kubenswrapper[4776]: I1208 10:08:31.884995 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hxnvk"] Dec 08 10:08:31 crc kubenswrapper[4776]: I1208 10:08:31.906987 4776 scope.go:117] "RemoveContainer" containerID="04439a4071b793c9c544377738037e0b3d069df44ac4cbd46678dfae120558c4" Dec 08 10:08:31 crc kubenswrapper[4776]: I1208 10:08:31.911003 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hxnvk"] Dec 08 10:08:31 crc kubenswrapper[4776]: I1208 10:08:31.977761 4776 scope.go:117] "RemoveContainer" containerID="e6cdbdcd3c1a9cbf36cd9f483fbac2d09e2147638a83038e9476132fdcc4decf" Dec 08 10:08:31 crc kubenswrapper[4776]: E1208 10:08:31.978349 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6cdbdcd3c1a9cbf36cd9f483fbac2d09e2147638a83038e9476132fdcc4decf\": container with ID starting with e6cdbdcd3c1a9cbf36cd9f483fbac2d09e2147638a83038e9476132fdcc4decf not found: ID does not exist" containerID="e6cdbdcd3c1a9cbf36cd9f483fbac2d09e2147638a83038e9476132fdcc4decf" Dec 08 10:08:31 crc kubenswrapper[4776]: I1208 10:08:31.978392 4776 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"e6cdbdcd3c1a9cbf36cd9f483fbac2d09e2147638a83038e9476132fdcc4decf"} err="failed to get container status \"e6cdbdcd3c1a9cbf36cd9f483fbac2d09e2147638a83038e9476132fdcc4decf\": rpc error: code = NotFound desc = could not find container \"e6cdbdcd3c1a9cbf36cd9f483fbac2d09e2147638a83038e9476132fdcc4decf\": container with ID starting with e6cdbdcd3c1a9cbf36cd9f483fbac2d09e2147638a83038e9476132fdcc4decf not found: ID does not exist" Dec 08 10:08:31 crc kubenswrapper[4776]: I1208 10:08:31.978418 4776 scope.go:117] "RemoveContainer" containerID="3d89bb2f90cd986c451157a37474049cf963186526c6f69090610051bb52c2dd" Dec 08 10:08:31 crc kubenswrapper[4776]: E1208 10:08:31.978972 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d89bb2f90cd986c451157a37474049cf963186526c6f69090610051bb52c2dd\": container with ID starting with 3d89bb2f90cd986c451157a37474049cf963186526c6f69090610051bb52c2dd not found: ID does not exist" containerID="3d89bb2f90cd986c451157a37474049cf963186526c6f69090610051bb52c2dd" Dec 08 10:08:31 crc kubenswrapper[4776]: I1208 10:08:31.978996 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d89bb2f90cd986c451157a37474049cf963186526c6f69090610051bb52c2dd"} err="failed to get container status \"3d89bb2f90cd986c451157a37474049cf963186526c6f69090610051bb52c2dd\": rpc error: code = NotFound desc = could not find container \"3d89bb2f90cd986c451157a37474049cf963186526c6f69090610051bb52c2dd\": container with ID starting with 3d89bb2f90cd986c451157a37474049cf963186526c6f69090610051bb52c2dd not found: ID does not exist" Dec 08 10:08:31 crc kubenswrapper[4776]: I1208 10:08:31.979011 4776 scope.go:117] "RemoveContainer" containerID="04439a4071b793c9c544377738037e0b3d069df44ac4cbd46678dfae120558c4" Dec 08 10:08:31 crc kubenswrapper[4776]: E1208 10:08:31.979368 4776 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"04439a4071b793c9c544377738037e0b3d069df44ac4cbd46678dfae120558c4\": container with ID starting with 04439a4071b793c9c544377738037e0b3d069df44ac4cbd46678dfae120558c4 not found: ID does not exist" containerID="04439a4071b793c9c544377738037e0b3d069df44ac4cbd46678dfae120558c4" Dec 08 10:08:31 crc kubenswrapper[4776]: I1208 10:08:31.979386 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04439a4071b793c9c544377738037e0b3d069df44ac4cbd46678dfae120558c4"} err="failed to get container status \"04439a4071b793c9c544377738037e0b3d069df44ac4cbd46678dfae120558c4\": rpc error: code = NotFound desc = could not find container \"04439a4071b793c9c544377738037e0b3d069df44ac4cbd46678dfae120558c4\": container with ID starting with 04439a4071b793c9c544377738037e0b3d069df44ac4cbd46678dfae120558c4 not found: ID does not exist" Dec 08 10:08:32 crc kubenswrapper[4776]: I1208 10:08:32.192100 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tlr7z" podUID="88e377d4-3fae-47df-a36a-dd1885a0e9d8" containerName="registry-server" probeResult="failure" output=< Dec 08 10:08:32 crc kubenswrapper[4776]: timeout: failed to connect service ":50051" within 1s Dec 08 10:08:32 crc kubenswrapper[4776]: > Dec 08 10:08:32 crc kubenswrapper[4776]: I1208 10:08:32.360136 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e319243f-9c57-42e3-9a61-1fa04655a3dd" path="/var/lib/kubelet/pods/e319243f-9c57-42e3-9a61-1fa04655a3dd/volumes" Dec 08 10:08:41 crc kubenswrapper[4776]: I1208 10:08:41.192328 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tlr7z" Dec 08 10:08:41 crc kubenswrapper[4776]: I1208 10:08:41.244383 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tlr7z" Dec 08 
10:08:41 crc kubenswrapper[4776]: I1208 10:08:41.427096 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tlr7z"] Dec 08 10:08:41 crc kubenswrapper[4776]: I1208 10:08:41.818147 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9cqsr" Dec 08 10:08:42 crc kubenswrapper[4776]: I1208 10:08:42.963121 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tlr7z" podUID="88e377d4-3fae-47df-a36a-dd1885a0e9d8" containerName="registry-server" containerID="cri-o://e7ca594b17fe5092e49691349e3fcbf14aa0e247d58b8c0e447e1d053c6a5382" gracePeriod=2 Dec 08 10:08:43 crc kubenswrapper[4776]: I1208 10:08:43.623076 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tlr7z" Dec 08 10:08:43 crc kubenswrapper[4776]: I1208 10:08:43.800526 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88e377d4-3fae-47df-a36a-dd1885a0e9d8-catalog-content\") pod \"88e377d4-3fae-47df-a36a-dd1885a0e9d8\" (UID: \"88e377d4-3fae-47df-a36a-dd1885a0e9d8\") " Dec 08 10:08:43 crc kubenswrapper[4776]: I1208 10:08:43.800582 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88e377d4-3fae-47df-a36a-dd1885a0e9d8-utilities\") pod \"88e377d4-3fae-47df-a36a-dd1885a0e9d8\" (UID: \"88e377d4-3fae-47df-a36a-dd1885a0e9d8\") " Dec 08 10:08:43 crc kubenswrapper[4776]: I1208 10:08:43.800866 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7lf4\" (UniqueName: \"kubernetes.io/projected/88e377d4-3fae-47df-a36a-dd1885a0e9d8-kube-api-access-q7lf4\") pod \"88e377d4-3fae-47df-a36a-dd1885a0e9d8\" (UID: \"88e377d4-3fae-47df-a36a-dd1885a0e9d8\") " Dec 08 
10:08:43 crc kubenswrapper[4776]: I1208 10:08:43.802114 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88e377d4-3fae-47df-a36a-dd1885a0e9d8-utilities" (OuterVolumeSpecName: "utilities") pod "88e377d4-3fae-47df-a36a-dd1885a0e9d8" (UID: "88e377d4-3fae-47df-a36a-dd1885a0e9d8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 10:08:43 crc kubenswrapper[4776]: I1208 10:08:43.807852 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88e377d4-3fae-47df-a36a-dd1885a0e9d8-kube-api-access-q7lf4" (OuterVolumeSpecName: "kube-api-access-q7lf4") pod "88e377d4-3fae-47df-a36a-dd1885a0e9d8" (UID: "88e377d4-3fae-47df-a36a-dd1885a0e9d8"). InnerVolumeSpecName "kube-api-access-q7lf4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 10:08:43 crc kubenswrapper[4776]: I1208 10:08:43.833078 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9cqsr"] Dec 08 10:08:43 crc kubenswrapper[4776]: I1208 10:08:43.833498 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9cqsr" podUID="02174a84-0da7-48ab-84a3-3183fc40daee" containerName="registry-server" containerID="cri-o://5a3ecf8098fd6cec793da1362b7f7bba3c1c6b10b202b6efb2abbf524088703e" gracePeriod=2 Dec 08 10:08:43 crc kubenswrapper[4776]: I1208 10:08:43.903510 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7lf4\" (UniqueName: \"kubernetes.io/projected/88e377d4-3fae-47df-a36a-dd1885a0e9d8-kube-api-access-q7lf4\") on node \"crc\" DevicePath \"\"" Dec 08 10:08:43 crc kubenswrapper[4776]: I1208 10:08:43.903546 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88e377d4-3fae-47df-a36a-dd1885a0e9d8-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 10:08:43 crc 
kubenswrapper[4776]: I1208 10:08:43.918069 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88e377d4-3fae-47df-a36a-dd1885a0e9d8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "88e377d4-3fae-47df-a36a-dd1885a0e9d8" (UID: "88e377d4-3fae-47df-a36a-dd1885a0e9d8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 10:08:43 crc kubenswrapper[4776]: I1208 10:08:43.977307 4776 generic.go:334] "Generic (PLEG): container finished" podID="88e377d4-3fae-47df-a36a-dd1885a0e9d8" containerID="e7ca594b17fe5092e49691349e3fcbf14aa0e247d58b8c0e447e1d053c6a5382" exitCode=0 Dec 08 10:08:43 crc kubenswrapper[4776]: I1208 10:08:43.977358 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tlr7z" Dec 08 10:08:43 crc kubenswrapper[4776]: I1208 10:08:43.977376 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tlr7z" event={"ID":"88e377d4-3fae-47df-a36a-dd1885a0e9d8","Type":"ContainerDied","Data":"e7ca594b17fe5092e49691349e3fcbf14aa0e247d58b8c0e447e1d053c6a5382"} Dec 08 10:08:43 crc kubenswrapper[4776]: I1208 10:08:43.977980 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tlr7z" event={"ID":"88e377d4-3fae-47df-a36a-dd1885a0e9d8","Type":"ContainerDied","Data":"f87e5f452bcc0766c9812e445e7a964709a62af9732ed49dc13ca6edb605dbd5"} Dec 08 10:08:43 crc kubenswrapper[4776]: I1208 10:08:43.977998 4776 scope.go:117] "RemoveContainer" containerID="e7ca594b17fe5092e49691349e3fcbf14aa0e247d58b8c0e447e1d053c6a5382" Dec 08 10:08:43 crc kubenswrapper[4776]: I1208 10:08:43.986568 4776 generic.go:334] "Generic (PLEG): container finished" podID="02174a84-0da7-48ab-84a3-3183fc40daee" containerID="5a3ecf8098fd6cec793da1362b7f7bba3c1c6b10b202b6efb2abbf524088703e" exitCode=0 Dec 08 10:08:43 crc kubenswrapper[4776]: I1208 
10:08:43.986603 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9cqsr" event={"ID":"02174a84-0da7-48ab-84a3-3183fc40daee","Type":"ContainerDied","Data":"5a3ecf8098fd6cec793da1362b7f7bba3c1c6b10b202b6efb2abbf524088703e"} Dec 08 10:08:44 crc kubenswrapper[4776]: I1208 10:08:44.005184 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88e377d4-3fae-47df-a36a-dd1885a0e9d8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 10:08:44 crc kubenswrapper[4776]: I1208 10:08:44.019111 4776 scope.go:117] "RemoveContainer" containerID="56fa026169f321f31320854cff49b0f6e70599e42ab7cc8477222248574e9a9b" Dec 08 10:08:44 crc kubenswrapper[4776]: I1208 10:08:44.022919 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tlr7z"] Dec 08 10:08:44 crc kubenswrapper[4776]: I1208 10:08:44.034132 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tlr7z"] Dec 08 10:08:44 crc kubenswrapper[4776]: I1208 10:08:44.045768 4776 scope.go:117] "RemoveContainer" containerID="e5ad980b910a8543c7b4fc657b82671ca1006b5fe7a691f5af4889dfbc3a5e6c" Dec 08 10:08:44 crc kubenswrapper[4776]: I1208 10:08:44.066613 4776 scope.go:117] "RemoveContainer" containerID="e7ca594b17fe5092e49691349e3fcbf14aa0e247d58b8c0e447e1d053c6a5382" Dec 08 10:08:44 crc kubenswrapper[4776]: E1208 10:08:44.069107 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7ca594b17fe5092e49691349e3fcbf14aa0e247d58b8c0e447e1d053c6a5382\": container with ID starting with e7ca594b17fe5092e49691349e3fcbf14aa0e247d58b8c0e447e1d053c6a5382 not found: ID does not exist" containerID="e7ca594b17fe5092e49691349e3fcbf14aa0e247d58b8c0e447e1d053c6a5382" Dec 08 10:08:44 crc kubenswrapper[4776]: I1208 10:08:44.069139 4776 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"e7ca594b17fe5092e49691349e3fcbf14aa0e247d58b8c0e447e1d053c6a5382"} err="failed to get container status \"e7ca594b17fe5092e49691349e3fcbf14aa0e247d58b8c0e447e1d053c6a5382\": rpc error: code = NotFound desc = could not find container \"e7ca594b17fe5092e49691349e3fcbf14aa0e247d58b8c0e447e1d053c6a5382\": container with ID starting with e7ca594b17fe5092e49691349e3fcbf14aa0e247d58b8c0e447e1d053c6a5382 not found: ID does not exist" Dec 08 10:08:44 crc kubenswrapper[4776]: I1208 10:08:44.069159 4776 scope.go:117] "RemoveContainer" containerID="56fa026169f321f31320854cff49b0f6e70599e42ab7cc8477222248574e9a9b" Dec 08 10:08:44 crc kubenswrapper[4776]: E1208 10:08:44.069579 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56fa026169f321f31320854cff49b0f6e70599e42ab7cc8477222248574e9a9b\": container with ID starting with 56fa026169f321f31320854cff49b0f6e70599e42ab7cc8477222248574e9a9b not found: ID does not exist" containerID="56fa026169f321f31320854cff49b0f6e70599e42ab7cc8477222248574e9a9b" Dec 08 10:08:44 crc kubenswrapper[4776]: I1208 10:08:44.069619 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56fa026169f321f31320854cff49b0f6e70599e42ab7cc8477222248574e9a9b"} err="failed to get container status \"56fa026169f321f31320854cff49b0f6e70599e42ab7cc8477222248574e9a9b\": rpc error: code = NotFound desc = could not find container \"56fa026169f321f31320854cff49b0f6e70599e42ab7cc8477222248574e9a9b\": container with ID starting with 56fa026169f321f31320854cff49b0f6e70599e42ab7cc8477222248574e9a9b not found: ID does not exist" Dec 08 10:08:44 crc kubenswrapper[4776]: I1208 10:08:44.069647 4776 scope.go:117] "RemoveContainer" containerID="e5ad980b910a8543c7b4fc657b82671ca1006b5fe7a691f5af4889dfbc3a5e6c" Dec 08 10:08:44 crc kubenswrapper[4776]: E1208 10:08:44.069900 4776 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5ad980b910a8543c7b4fc657b82671ca1006b5fe7a691f5af4889dfbc3a5e6c\": container with ID starting with e5ad980b910a8543c7b4fc657b82671ca1006b5fe7a691f5af4889dfbc3a5e6c not found: ID does not exist" containerID="e5ad980b910a8543c7b4fc657b82671ca1006b5fe7a691f5af4889dfbc3a5e6c" Dec 08 10:08:44 crc kubenswrapper[4776]: I1208 10:08:44.069925 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5ad980b910a8543c7b4fc657b82671ca1006b5fe7a691f5af4889dfbc3a5e6c"} err="failed to get container status \"e5ad980b910a8543c7b4fc657b82671ca1006b5fe7a691f5af4889dfbc3a5e6c\": rpc error: code = NotFound desc = could not find container \"e5ad980b910a8543c7b4fc657b82671ca1006b5fe7a691f5af4889dfbc3a5e6c\": container with ID starting with e5ad980b910a8543c7b4fc657b82671ca1006b5fe7a691f5af4889dfbc3a5e6c not found: ID does not exist" Dec 08 10:08:44 crc kubenswrapper[4776]: I1208 10:08:44.352784 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9cqsr" Dec 08 10:08:44 crc kubenswrapper[4776]: I1208 10:08:44.360884 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88e377d4-3fae-47df-a36a-dd1885a0e9d8" path="/var/lib/kubelet/pods/88e377d4-3fae-47df-a36a-dd1885a0e9d8/volumes" Dec 08 10:08:44 crc kubenswrapper[4776]: I1208 10:08:44.515467 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02174a84-0da7-48ab-84a3-3183fc40daee-utilities\") pod \"02174a84-0da7-48ab-84a3-3183fc40daee\" (UID: \"02174a84-0da7-48ab-84a3-3183fc40daee\") " Dec 08 10:08:44 crc kubenswrapper[4776]: I1208 10:08:44.516206 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02174a84-0da7-48ab-84a3-3183fc40daee-catalog-content\") pod \"02174a84-0da7-48ab-84a3-3183fc40daee\" (UID: \"02174a84-0da7-48ab-84a3-3183fc40daee\") " Dec 08 10:08:44 crc kubenswrapper[4776]: I1208 10:08:44.516328 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9m4g\" (UniqueName: \"kubernetes.io/projected/02174a84-0da7-48ab-84a3-3183fc40daee-kube-api-access-b9m4g\") pod \"02174a84-0da7-48ab-84a3-3183fc40daee\" (UID: \"02174a84-0da7-48ab-84a3-3183fc40daee\") " Dec 08 10:08:44 crc kubenswrapper[4776]: I1208 10:08:44.516515 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02174a84-0da7-48ab-84a3-3183fc40daee-utilities" (OuterVolumeSpecName: "utilities") pod "02174a84-0da7-48ab-84a3-3183fc40daee" (UID: "02174a84-0da7-48ab-84a3-3183fc40daee"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 10:08:44 crc kubenswrapper[4776]: I1208 10:08:44.517219 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02174a84-0da7-48ab-84a3-3183fc40daee-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 10:08:44 crc kubenswrapper[4776]: I1208 10:08:44.521372 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02174a84-0da7-48ab-84a3-3183fc40daee-kube-api-access-b9m4g" (OuterVolumeSpecName: "kube-api-access-b9m4g") pod "02174a84-0da7-48ab-84a3-3183fc40daee" (UID: "02174a84-0da7-48ab-84a3-3183fc40daee"). InnerVolumeSpecName "kube-api-access-b9m4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 10:08:44 crc kubenswrapper[4776]: I1208 10:08:44.575486 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02174a84-0da7-48ab-84a3-3183fc40daee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "02174a84-0da7-48ab-84a3-3183fc40daee" (UID: "02174a84-0da7-48ab-84a3-3183fc40daee"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 10:08:44 crc kubenswrapper[4776]: I1208 10:08:44.619375 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02174a84-0da7-48ab-84a3-3183fc40daee-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 10:08:44 crc kubenswrapper[4776]: I1208 10:08:44.619406 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9m4g\" (UniqueName: \"kubernetes.io/projected/02174a84-0da7-48ab-84a3-3183fc40daee-kube-api-access-b9m4g\") on node \"crc\" DevicePath \"\"" Dec 08 10:08:45 crc kubenswrapper[4776]: I1208 10:08:45.000097 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9cqsr" event={"ID":"02174a84-0da7-48ab-84a3-3183fc40daee","Type":"ContainerDied","Data":"8189639a9bc17848acf9a208f615221092fbea16eb45b89099adeff4d1d25ba5"} Dec 08 10:08:45 crc kubenswrapper[4776]: I1208 10:08:45.000144 4776 scope.go:117] "RemoveContainer" containerID="5a3ecf8098fd6cec793da1362b7f7bba3c1c6b10b202b6efb2abbf524088703e" Dec 08 10:08:45 crc kubenswrapper[4776]: I1208 10:08:45.000919 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9cqsr" Dec 08 10:08:45 crc kubenswrapper[4776]: I1208 10:08:45.032801 4776 scope.go:117] "RemoveContainer" containerID="de9735edc0d83f0e39ce8ea8eef18c70b94ea092d002119567d70fa735a7cd63" Dec 08 10:08:45 crc kubenswrapper[4776]: I1208 10:08:45.039725 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9cqsr"] Dec 08 10:08:45 crc kubenswrapper[4776]: I1208 10:08:45.054699 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9cqsr"] Dec 08 10:08:45 crc kubenswrapper[4776]: I1208 10:08:45.071748 4776 scope.go:117] "RemoveContainer" containerID="4d3e1e4845d5698633ade57ed94a921b7765e4054fc5e3ee88f3b0183b5c81a1" Dec 08 10:08:45 crc kubenswrapper[4776]: E1208 10:08:45.234398 4776 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88e377d4_3fae_47df_a36a_dd1885a0e9d8.slice\": RecentStats: unable to find data in memory cache]" Dec 08 10:08:46 crc kubenswrapper[4776]: I1208 10:08:46.372955 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02174a84-0da7-48ab-84a3-3183fc40daee" path="/var/lib/kubelet/pods/02174a84-0da7-48ab-84a3-3183fc40daee/volumes" Dec 08 10:08:48 crc kubenswrapper[4776]: E1208 10:08:48.271094 4776 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88e377d4_3fae_47df_a36a_dd1885a0e9d8.slice\": RecentStats: unable to find data in memory cache]" Dec 08 10:08:48 crc kubenswrapper[4776]: E1208 10:08:48.271139 4776 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88e377d4_3fae_47df_a36a_dd1885a0e9d8.slice\": 
RecentStats: unable to find data in memory cache]" Dec 08 10:08:51 crc kubenswrapper[4776]: E1208 10:08:51.242362 4776 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88e377d4_3fae_47df_a36a_dd1885a0e9d8.slice\": RecentStats: unable to find data in memory cache]" Dec 08 10:09:00 crc kubenswrapper[4776]: E1208 10:09:00.502885 4776 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88e377d4_3fae_47df_a36a_dd1885a0e9d8.slice\": RecentStats: unable to find data in memory cache]" Dec 08 10:09:01 crc kubenswrapper[4776]: E1208 10:09:01.288445 4776 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88e377d4_3fae_47df_a36a_dd1885a0e9d8.slice\": RecentStats: unable to find data in memory cache]" Dec 08 10:09:11 crc kubenswrapper[4776]: I1208 10:09:11.105199 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-w6qd4"] Dec 08 10:09:11 crc kubenswrapper[4776]: E1208 10:09:11.106521 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e319243f-9c57-42e3-9a61-1fa04655a3dd" containerName="extract-content" Dec 08 10:09:11 crc kubenswrapper[4776]: I1208 10:09:11.106544 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="e319243f-9c57-42e3-9a61-1fa04655a3dd" containerName="extract-content" Dec 08 10:09:11 crc kubenswrapper[4776]: E1208 10:09:11.106566 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88e377d4-3fae-47df-a36a-dd1885a0e9d8" containerName="registry-server" Dec 08 10:09:11 crc kubenswrapper[4776]: I1208 10:09:11.106578 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="88e377d4-3fae-47df-a36a-dd1885a0e9d8" 
containerName="registry-server" Dec 08 10:09:11 crc kubenswrapper[4776]: E1208 10:09:11.106607 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02174a84-0da7-48ab-84a3-3183fc40daee" containerName="extract-utilities" Dec 08 10:09:11 crc kubenswrapper[4776]: I1208 10:09:11.106618 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="02174a84-0da7-48ab-84a3-3183fc40daee" containerName="extract-utilities" Dec 08 10:09:11 crc kubenswrapper[4776]: E1208 10:09:11.106635 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e319243f-9c57-42e3-9a61-1fa04655a3dd" containerName="extract-utilities" Dec 08 10:09:11 crc kubenswrapper[4776]: I1208 10:09:11.106646 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="e319243f-9c57-42e3-9a61-1fa04655a3dd" containerName="extract-utilities" Dec 08 10:09:11 crc kubenswrapper[4776]: E1208 10:09:11.106695 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88e377d4-3fae-47df-a36a-dd1885a0e9d8" containerName="extract-utilities" Dec 08 10:09:11 crc kubenswrapper[4776]: I1208 10:09:11.106707 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="88e377d4-3fae-47df-a36a-dd1885a0e9d8" containerName="extract-utilities" Dec 08 10:09:11 crc kubenswrapper[4776]: E1208 10:09:11.106747 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e319243f-9c57-42e3-9a61-1fa04655a3dd" containerName="registry-server" Dec 08 10:09:11 crc kubenswrapper[4776]: I1208 10:09:11.106757 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="e319243f-9c57-42e3-9a61-1fa04655a3dd" containerName="registry-server" Dec 08 10:09:11 crc kubenswrapper[4776]: E1208 10:09:11.106770 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88e377d4-3fae-47df-a36a-dd1885a0e9d8" containerName="extract-content" Dec 08 10:09:11 crc kubenswrapper[4776]: I1208 10:09:11.106780 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="88e377d4-3fae-47df-a36a-dd1885a0e9d8" 
containerName="extract-content" Dec 08 10:09:11 crc kubenswrapper[4776]: E1208 10:09:11.106802 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02174a84-0da7-48ab-84a3-3183fc40daee" containerName="extract-content" Dec 08 10:09:11 crc kubenswrapper[4776]: I1208 10:09:11.106814 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="02174a84-0da7-48ab-84a3-3183fc40daee" containerName="extract-content" Dec 08 10:09:11 crc kubenswrapper[4776]: E1208 10:09:11.106838 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02174a84-0da7-48ab-84a3-3183fc40daee" containerName="registry-server" Dec 08 10:09:11 crc kubenswrapper[4776]: I1208 10:09:11.106850 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="02174a84-0da7-48ab-84a3-3183fc40daee" containerName="registry-server" Dec 08 10:09:11 crc kubenswrapper[4776]: I1208 10:09:11.107298 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="e319243f-9c57-42e3-9a61-1fa04655a3dd" containerName="registry-server" Dec 08 10:09:11 crc kubenswrapper[4776]: I1208 10:09:11.107321 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="02174a84-0da7-48ab-84a3-3183fc40daee" containerName="registry-server" Dec 08 10:09:11 crc kubenswrapper[4776]: I1208 10:09:11.107372 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="88e377d4-3fae-47df-a36a-dd1885a0e9d8" containerName="registry-server" Dec 08 10:09:11 crc kubenswrapper[4776]: I1208 10:09:11.113131 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w6qd4" Dec 08 10:09:11 crc kubenswrapper[4776]: I1208 10:09:11.118618 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w6qd4"] Dec 08 10:09:11 crc kubenswrapper[4776]: I1208 10:09:11.247669 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw6fp\" (UniqueName: \"kubernetes.io/projected/48f53bd2-4c4f-4f22-bc6b-8379ba894c35-kube-api-access-hw6fp\") pod \"certified-operators-w6qd4\" (UID: \"48f53bd2-4c4f-4f22-bc6b-8379ba894c35\") " pod="openshift-marketplace/certified-operators-w6qd4" Dec 08 10:09:11 crc kubenswrapper[4776]: I1208 10:09:11.248213 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48f53bd2-4c4f-4f22-bc6b-8379ba894c35-catalog-content\") pod \"certified-operators-w6qd4\" (UID: \"48f53bd2-4c4f-4f22-bc6b-8379ba894c35\") " pod="openshift-marketplace/certified-operators-w6qd4" Dec 08 10:09:11 crc kubenswrapper[4776]: I1208 10:09:11.248454 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48f53bd2-4c4f-4f22-bc6b-8379ba894c35-utilities\") pod \"certified-operators-w6qd4\" (UID: \"48f53bd2-4c4f-4f22-bc6b-8379ba894c35\") " pod="openshift-marketplace/certified-operators-w6qd4" Dec 08 10:09:11 crc kubenswrapper[4776]: I1208 10:09:11.350517 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48f53bd2-4c4f-4f22-bc6b-8379ba894c35-utilities\") pod \"certified-operators-w6qd4\" (UID: \"48f53bd2-4c4f-4f22-bc6b-8379ba894c35\") " pod="openshift-marketplace/certified-operators-w6qd4" Dec 08 10:09:11 crc kubenswrapper[4776]: I1208 10:09:11.350627 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hw6fp\" (UniqueName: \"kubernetes.io/projected/48f53bd2-4c4f-4f22-bc6b-8379ba894c35-kube-api-access-hw6fp\") pod \"certified-operators-w6qd4\" (UID: \"48f53bd2-4c4f-4f22-bc6b-8379ba894c35\") " pod="openshift-marketplace/certified-operators-w6qd4" Dec 08 10:09:11 crc kubenswrapper[4776]: I1208 10:09:11.350728 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48f53bd2-4c4f-4f22-bc6b-8379ba894c35-catalog-content\") pod \"certified-operators-w6qd4\" (UID: \"48f53bd2-4c4f-4f22-bc6b-8379ba894c35\") " pod="openshift-marketplace/certified-operators-w6qd4" Dec 08 10:09:11 crc kubenswrapper[4776]: I1208 10:09:11.350896 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48f53bd2-4c4f-4f22-bc6b-8379ba894c35-utilities\") pod \"certified-operators-w6qd4\" (UID: \"48f53bd2-4c4f-4f22-bc6b-8379ba894c35\") " pod="openshift-marketplace/certified-operators-w6qd4" Dec 08 10:09:11 crc kubenswrapper[4776]: I1208 10:09:11.351157 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48f53bd2-4c4f-4f22-bc6b-8379ba894c35-catalog-content\") pod \"certified-operators-w6qd4\" (UID: \"48f53bd2-4c4f-4f22-bc6b-8379ba894c35\") " pod="openshift-marketplace/certified-operators-w6qd4" Dec 08 10:09:11 crc kubenswrapper[4776]: I1208 10:09:11.390878 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw6fp\" (UniqueName: \"kubernetes.io/projected/48f53bd2-4c4f-4f22-bc6b-8379ba894c35-kube-api-access-hw6fp\") pod \"certified-operators-w6qd4\" (UID: \"48f53bd2-4c4f-4f22-bc6b-8379ba894c35\") " pod="openshift-marketplace/certified-operators-w6qd4" Dec 08 10:09:11 crc kubenswrapper[4776]: I1208 10:09:11.439137 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w6qd4" Dec 08 10:09:11 crc kubenswrapper[4776]: E1208 10:09:11.556756 4776 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88e377d4_3fae_47df_a36a_dd1885a0e9d8.slice\": RecentStats: unable to find data in memory cache]" Dec 08 10:09:12 crc kubenswrapper[4776]: I1208 10:09:12.037148 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w6qd4"] Dec 08 10:09:12 crc kubenswrapper[4776]: W1208 10:09:12.039451 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48f53bd2_4c4f_4f22_bc6b_8379ba894c35.slice/crio-f5c8e482368018ac18efcc61181721644c97bc5aed0f0a77860150a2f7561bbc WatchSource:0}: Error finding container f5c8e482368018ac18efcc61181721644c97bc5aed0f0a77860150a2f7561bbc: Status 404 returned error can't find the container with id f5c8e482368018ac18efcc61181721644c97bc5aed0f0a77860150a2f7561bbc Dec 08 10:09:12 crc kubenswrapper[4776]: I1208 10:09:12.288137 4776 generic.go:334] "Generic (PLEG): container finished" podID="48f53bd2-4c4f-4f22-bc6b-8379ba894c35" containerID="4e1ec315892d93934f2a2d24979fc7a801c770e0496251da04cdabd979db7a0a" exitCode=0 Dec 08 10:09:12 crc kubenswrapper[4776]: I1208 10:09:12.288226 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w6qd4" event={"ID":"48f53bd2-4c4f-4f22-bc6b-8379ba894c35","Type":"ContainerDied","Data":"4e1ec315892d93934f2a2d24979fc7a801c770e0496251da04cdabd979db7a0a"} Dec 08 10:09:12 crc kubenswrapper[4776]: I1208 10:09:12.288492 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w6qd4" 
event={"ID":"48f53bd2-4c4f-4f22-bc6b-8379ba894c35","Type":"ContainerStarted","Data":"f5c8e482368018ac18efcc61181721644c97bc5aed0f0a77860150a2f7561bbc"} Dec 08 10:09:13 crc kubenswrapper[4776]: I1208 10:09:13.301147 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w6qd4" event={"ID":"48f53bd2-4c4f-4f22-bc6b-8379ba894c35","Type":"ContainerStarted","Data":"bdc1f990e5cfd6dffa15922cf7bf18285f40f18ec238fed16fb734525e130c61"} Dec 08 10:09:14 crc kubenswrapper[4776]: I1208 10:09:14.317726 4776 generic.go:334] "Generic (PLEG): container finished" podID="48f53bd2-4c4f-4f22-bc6b-8379ba894c35" containerID="bdc1f990e5cfd6dffa15922cf7bf18285f40f18ec238fed16fb734525e130c61" exitCode=0 Dec 08 10:09:14 crc kubenswrapper[4776]: I1208 10:09:14.317824 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w6qd4" event={"ID":"48f53bd2-4c4f-4f22-bc6b-8379ba894c35","Type":"ContainerDied","Data":"bdc1f990e5cfd6dffa15922cf7bf18285f40f18ec238fed16fb734525e130c61"} Dec 08 10:09:15 crc kubenswrapper[4776]: E1208 10:09:15.234901 4776 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88e377d4_3fae_47df_a36a_dd1885a0e9d8.slice\": RecentStats: unable to find data in memory cache]" Dec 08 10:09:15 crc kubenswrapper[4776]: I1208 10:09:15.331525 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w6qd4" event={"ID":"48f53bd2-4c4f-4f22-bc6b-8379ba894c35","Type":"ContainerStarted","Data":"cbaf6d137a1bfc6a4fdf08106bb3cca7b7c288fe2d55f61efd4402150f41cf5f"} Dec 08 10:09:15 crc kubenswrapper[4776]: I1208 10:09:15.356107 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-w6qd4" podStartSLOduration=1.872522427 podStartE2EDuration="4.356089875s" 
podCreationTimestamp="2025-12-08 10:09:11 +0000 UTC" firstStartedPulling="2025-12-08 10:09:12.29043234 +0000 UTC m=+4228.553657372" lastFinishedPulling="2025-12-08 10:09:14.773999798 +0000 UTC m=+4231.037224820" observedRunningTime="2025-12-08 10:09:15.354191193 +0000 UTC m=+4231.617416295" watchObservedRunningTime="2025-12-08 10:09:15.356089875 +0000 UTC m=+4231.619314897" Dec 08 10:09:21 crc kubenswrapper[4776]: I1208 10:09:21.439773 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-w6qd4" Dec 08 10:09:21 crc kubenswrapper[4776]: I1208 10:09:21.441420 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-w6qd4" Dec 08 10:09:21 crc kubenswrapper[4776]: I1208 10:09:21.498050 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-w6qd4" Dec 08 10:09:21 crc kubenswrapper[4776]: E1208 10:09:21.847861 4776 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88e377d4_3fae_47df_a36a_dd1885a0e9d8.slice\": RecentStats: unable to find data in memory cache]" Dec 08 10:09:22 crc kubenswrapper[4776]: I1208 10:09:22.470975 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-w6qd4" Dec 08 10:09:25 crc kubenswrapper[4776]: I1208 10:09:25.941597 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w6qd4"] Dec 08 10:09:25 crc kubenswrapper[4776]: I1208 10:09:25.942449 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-w6qd4" podUID="48f53bd2-4c4f-4f22-bc6b-8379ba894c35" containerName="registry-server" 
containerID="cri-o://cbaf6d137a1bfc6a4fdf08106bb3cca7b7c288fe2d55f61efd4402150f41cf5f" gracePeriod=2 Dec 08 10:09:26 crc kubenswrapper[4776]: I1208 10:09:26.457037 4776 generic.go:334] "Generic (PLEG): container finished" podID="48f53bd2-4c4f-4f22-bc6b-8379ba894c35" containerID="cbaf6d137a1bfc6a4fdf08106bb3cca7b7c288fe2d55f61efd4402150f41cf5f" exitCode=0 Dec 08 10:09:26 crc kubenswrapper[4776]: I1208 10:09:26.457302 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w6qd4" event={"ID":"48f53bd2-4c4f-4f22-bc6b-8379ba894c35","Type":"ContainerDied","Data":"cbaf6d137a1bfc6a4fdf08106bb3cca7b7c288fe2d55f61efd4402150f41cf5f"} Dec 08 10:09:26 crc kubenswrapper[4776]: I1208 10:09:26.457328 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w6qd4" event={"ID":"48f53bd2-4c4f-4f22-bc6b-8379ba894c35","Type":"ContainerDied","Data":"f5c8e482368018ac18efcc61181721644c97bc5aed0f0a77860150a2f7561bbc"} Dec 08 10:09:26 crc kubenswrapper[4776]: I1208 10:09:26.457340 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5c8e482368018ac18efcc61181721644c97bc5aed0f0a77860150a2f7561bbc" Dec 08 10:09:26 crc kubenswrapper[4776]: I1208 10:09:26.490061 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w6qd4" Dec 08 10:09:26 crc kubenswrapper[4776]: I1208 10:09:26.607379 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48f53bd2-4c4f-4f22-bc6b-8379ba894c35-catalog-content\") pod \"48f53bd2-4c4f-4f22-bc6b-8379ba894c35\" (UID: \"48f53bd2-4c4f-4f22-bc6b-8379ba894c35\") " Dec 08 10:09:26 crc kubenswrapper[4776]: I1208 10:09:26.607600 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hw6fp\" (UniqueName: \"kubernetes.io/projected/48f53bd2-4c4f-4f22-bc6b-8379ba894c35-kube-api-access-hw6fp\") pod \"48f53bd2-4c4f-4f22-bc6b-8379ba894c35\" (UID: \"48f53bd2-4c4f-4f22-bc6b-8379ba894c35\") " Dec 08 10:09:26 crc kubenswrapper[4776]: I1208 10:09:26.607727 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48f53bd2-4c4f-4f22-bc6b-8379ba894c35-utilities\") pod \"48f53bd2-4c4f-4f22-bc6b-8379ba894c35\" (UID: \"48f53bd2-4c4f-4f22-bc6b-8379ba894c35\") " Dec 08 10:09:26 crc kubenswrapper[4776]: I1208 10:09:26.608529 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48f53bd2-4c4f-4f22-bc6b-8379ba894c35-utilities" (OuterVolumeSpecName: "utilities") pod "48f53bd2-4c4f-4f22-bc6b-8379ba894c35" (UID: "48f53bd2-4c4f-4f22-bc6b-8379ba894c35"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 10:09:26 crc kubenswrapper[4776]: I1208 10:09:26.608638 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48f53bd2-4c4f-4f22-bc6b-8379ba894c35-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 10:09:26 crc kubenswrapper[4776]: I1208 10:09:26.616466 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48f53bd2-4c4f-4f22-bc6b-8379ba894c35-kube-api-access-hw6fp" (OuterVolumeSpecName: "kube-api-access-hw6fp") pod "48f53bd2-4c4f-4f22-bc6b-8379ba894c35" (UID: "48f53bd2-4c4f-4f22-bc6b-8379ba894c35"). InnerVolumeSpecName "kube-api-access-hw6fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 10:09:26 crc kubenswrapper[4776]: I1208 10:09:26.659625 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48f53bd2-4c4f-4f22-bc6b-8379ba894c35-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "48f53bd2-4c4f-4f22-bc6b-8379ba894c35" (UID: "48f53bd2-4c4f-4f22-bc6b-8379ba894c35"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 10:09:26 crc kubenswrapper[4776]: I1208 10:09:26.711947 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48f53bd2-4c4f-4f22-bc6b-8379ba894c35-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 10:09:26 crc kubenswrapper[4776]: I1208 10:09:26.711988 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hw6fp\" (UniqueName: \"kubernetes.io/projected/48f53bd2-4c4f-4f22-bc6b-8379ba894c35-kube-api-access-hw6fp\") on node \"crc\" DevicePath \"\"" Dec 08 10:09:27 crc kubenswrapper[4776]: I1208 10:09:27.467620 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w6qd4" Dec 08 10:09:27 crc kubenswrapper[4776]: I1208 10:09:27.503863 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w6qd4"] Dec 08 10:09:27 crc kubenswrapper[4776]: I1208 10:09:27.514460 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-w6qd4"] Dec 08 10:09:28 crc kubenswrapper[4776]: I1208 10:09:28.357843 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48f53bd2-4c4f-4f22-bc6b-8379ba894c35" path="/var/lib/kubelet/pods/48f53bd2-4c4f-4f22-bc6b-8379ba894c35/volumes" Dec 08 10:09:30 crc kubenswrapper[4776]: E1208 10:09:30.510455 4776 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88e377d4_3fae_47df_a36a_dd1885a0e9d8.slice\": RecentStats: unable to find data in memory cache]" Dec 08 10:09:31 crc kubenswrapper[4776]: E1208 10:09:31.898842 4776 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88e377d4_3fae_47df_a36a_dd1885a0e9d8.slice\": RecentStats: unable to find data in memory cache]" Dec 08 10:09:42 crc kubenswrapper[4776]: E1208 10:09:42.224155 4776 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88e377d4_3fae_47df_a36a_dd1885a0e9d8.slice\": RecentStats: unable to find data in memory cache]" Dec 08 10:10:11 crc kubenswrapper[4776]: I1208 10:10:11.399426 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Dec 08 10:10:11 crc kubenswrapper[4776]: I1208 10:10:11.399953 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 10:10:16 crc kubenswrapper[4776]: E1208 10:10:16.557635 4776 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.82:60416->38.102.83.82:46339: write tcp 38.102.83.82:60416->38.102.83.82:46339: write: connection reset by peer Dec 08 10:10:41 crc kubenswrapper[4776]: I1208 10:10:41.399604 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 10:10:41 crc kubenswrapper[4776]: I1208 10:10:41.400141 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 10:11:11 crc kubenswrapper[4776]: I1208 10:11:11.398664 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 10:11:11 crc kubenswrapper[4776]: I1208 10:11:11.399088 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" 
podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 10:11:11 crc kubenswrapper[4776]: I1208 10:11:11.399125 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" Dec 08 10:11:11 crc kubenswrapper[4776]: I1208 10:11:11.399598 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"448474efd555c5bc5445de386b34183d50e7a6a8112936335b71be6c5bf4d95c"} pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 08 10:11:11 crc kubenswrapper[4776]: I1208 10:11:11.399644 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" containerID="cri-o://448474efd555c5bc5445de386b34183d50e7a6a8112936335b71be6c5bf4d95c" gracePeriod=600 Dec 08 10:11:11 crc kubenswrapper[4776]: E1208 10:11:11.526697 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:11:11 crc kubenswrapper[4776]: I1208 10:11:11.539696 4776 generic.go:334] "Generic (PLEG): container finished" podID="c9788ab1-1031-4103-a769-a4b3177c7268" containerID="448474efd555c5bc5445de386b34183d50e7a6a8112936335b71be6c5bf4d95c" exitCode=0 Dec 08 
10:11:11 crc kubenswrapper[4776]: I1208 10:11:11.539749 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" event={"ID":"c9788ab1-1031-4103-a769-a4b3177c7268","Type":"ContainerDied","Data":"448474efd555c5bc5445de386b34183d50e7a6a8112936335b71be6c5bf4d95c"} Dec 08 10:11:11 crc kubenswrapper[4776]: I1208 10:11:11.539786 4776 scope.go:117] "RemoveContainer" containerID="90adb4348f501214a70951f12e6327c6cd5766de3c2a25e42b686c34dce1ea8f" Dec 08 10:11:11 crc kubenswrapper[4776]: I1208 10:11:11.541709 4776 scope.go:117] "RemoveContainer" containerID="448474efd555c5bc5445de386b34183d50e7a6a8112936335b71be6c5bf4d95c" Dec 08 10:11:11 crc kubenswrapper[4776]: E1208 10:11:11.542135 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:11:22 crc kubenswrapper[4776]: I1208 10:11:22.344474 4776 scope.go:117] "RemoveContainer" containerID="448474efd555c5bc5445de386b34183d50e7a6a8112936335b71be6c5bf4d95c" Dec 08 10:11:22 crc kubenswrapper[4776]: E1208 10:11:22.346691 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:11:33 crc kubenswrapper[4776]: I1208 10:11:33.344317 4776 scope.go:117] "RemoveContainer" 
containerID="448474efd555c5bc5445de386b34183d50e7a6a8112936335b71be6c5bf4d95c" Dec 08 10:11:33 crc kubenswrapper[4776]: E1208 10:11:33.345234 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:11:46 crc kubenswrapper[4776]: I1208 10:11:46.344115 4776 scope.go:117] "RemoveContainer" containerID="448474efd555c5bc5445de386b34183d50e7a6a8112936335b71be6c5bf4d95c" Dec 08 10:11:46 crc kubenswrapper[4776]: E1208 10:11:46.345079 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:11:57 crc kubenswrapper[4776]: I1208 10:11:57.344165 4776 scope.go:117] "RemoveContainer" containerID="448474efd555c5bc5445de386b34183d50e7a6a8112936335b71be6c5bf4d95c" Dec 08 10:11:57 crc kubenswrapper[4776]: E1208 10:11:57.347409 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:12:10 crc kubenswrapper[4776]: I1208 10:12:10.348444 4776 scope.go:117] 
"RemoveContainer" containerID="448474efd555c5bc5445de386b34183d50e7a6a8112936335b71be6c5bf4d95c" Dec 08 10:12:10 crc kubenswrapper[4776]: E1208 10:12:10.349746 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:12:24 crc kubenswrapper[4776]: I1208 10:12:24.352674 4776 scope.go:117] "RemoveContainer" containerID="448474efd555c5bc5445de386b34183d50e7a6a8112936335b71be6c5bf4d95c" Dec 08 10:12:24 crc kubenswrapper[4776]: E1208 10:12:24.353491 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:12:38 crc kubenswrapper[4776]: I1208 10:12:38.343551 4776 scope.go:117] "RemoveContainer" containerID="448474efd555c5bc5445de386b34183d50e7a6a8112936335b71be6c5bf4d95c" Dec 08 10:12:38 crc kubenswrapper[4776]: E1208 10:12:38.344426 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:12:51 crc kubenswrapper[4776]: I1208 10:12:51.344660 
4776 scope.go:117] "RemoveContainer" containerID="448474efd555c5bc5445de386b34183d50e7a6a8112936335b71be6c5bf4d95c" Dec 08 10:12:51 crc kubenswrapper[4776]: E1208 10:12:51.345476 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:13:03 crc kubenswrapper[4776]: I1208 10:13:03.345137 4776 scope.go:117] "RemoveContainer" containerID="448474efd555c5bc5445de386b34183d50e7a6a8112936335b71be6c5bf4d95c" Dec 08 10:13:03 crc kubenswrapper[4776]: E1208 10:13:03.346142 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:13:15 crc kubenswrapper[4776]: I1208 10:13:15.343615 4776 scope.go:117] "RemoveContainer" containerID="448474efd555c5bc5445de386b34183d50e7a6a8112936335b71be6c5bf4d95c" Dec 08 10:13:15 crc kubenswrapper[4776]: E1208 10:13:15.344619 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:13:26 crc kubenswrapper[4776]: I1208 
10:13:26.343753 4776 scope.go:117] "RemoveContainer" containerID="448474efd555c5bc5445de386b34183d50e7a6a8112936335b71be6c5bf4d95c" Dec 08 10:13:26 crc kubenswrapper[4776]: E1208 10:13:26.344753 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:13:39 crc kubenswrapper[4776]: I1208 10:13:39.343582 4776 scope.go:117] "RemoveContainer" containerID="448474efd555c5bc5445de386b34183d50e7a6a8112936335b71be6c5bf4d95c" Dec 08 10:13:39 crc kubenswrapper[4776]: E1208 10:13:39.345353 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:13:51 crc kubenswrapper[4776]: I1208 10:13:51.343811 4776 scope.go:117] "RemoveContainer" containerID="448474efd555c5bc5445de386b34183d50e7a6a8112936335b71be6c5bf4d95c" Dec 08 10:13:51 crc kubenswrapper[4776]: E1208 10:13:51.344911 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:14:02 crc 
kubenswrapper[4776]: I1208 10:14:02.349833 4776 scope.go:117] "RemoveContainer" containerID="448474efd555c5bc5445de386b34183d50e7a6a8112936335b71be6c5bf4d95c" Dec 08 10:14:02 crc kubenswrapper[4776]: E1208 10:14:02.351029 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:14:16 crc kubenswrapper[4776]: I1208 10:14:16.343930 4776 scope.go:117] "RemoveContainer" containerID="448474efd555c5bc5445de386b34183d50e7a6a8112936335b71be6c5bf4d95c" Dec 08 10:14:16 crc kubenswrapper[4776]: E1208 10:14:16.344815 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:14:27 crc kubenswrapper[4776]: I1208 10:14:27.344561 4776 scope.go:117] "RemoveContainer" containerID="448474efd555c5bc5445de386b34183d50e7a6a8112936335b71be6c5bf4d95c" Dec 08 10:14:27 crc kubenswrapper[4776]: E1208 10:14:27.345697 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 
08 10:14:41 crc kubenswrapper[4776]: I1208 10:14:41.343557 4776 scope.go:117] "RemoveContainer" containerID="448474efd555c5bc5445de386b34183d50e7a6a8112936335b71be6c5bf4d95c" Dec 08 10:14:41 crc kubenswrapper[4776]: E1208 10:14:41.344537 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:14:52 crc kubenswrapper[4776]: I1208 10:14:52.344508 4776 scope.go:117] "RemoveContainer" containerID="448474efd555c5bc5445de386b34183d50e7a6a8112936335b71be6c5bf4d95c" Dec 08 10:14:52 crc kubenswrapper[4776]: E1208 10:14:52.345398 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:15:00 crc kubenswrapper[4776]: I1208 10:15:00.171095 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29419815-jl58v"] Dec 08 10:15:00 crc kubenswrapper[4776]: E1208 10:15:00.173953 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48f53bd2-4c4f-4f22-bc6b-8379ba894c35" containerName="extract-utilities" Dec 08 10:15:00 crc kubenswrapper[4776]: I1208 10:15:00.174164 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="48f53bd2-4c4f-4f22-bc6b-8379ba894c35" containerName="extract-utilities" Dec 08 10:15:00 crc kubenswrapper[4776]: E1208 10:15:00.174319 4776 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48f53bd2-4c4f-4f22-bc6b-8379ba894c35" containerName="extract-content" Dec 08 10:15:00 crc kubenswrapper[4776]: I1208 10:15:00.174499 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="48f53bd2-4c4f-4f22-bc6b-8379ba894c35" containerName="extract-content" Dec 08 10:15:00 crc kubenswrapper[4776]: E1208 10:15:00.174613 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48f53bd2-4c4f-4f22-bc6b-8379ba894c35" containerName="registry-server" Dec 08 10:15:00 crc kubenswrapper[4776]: I1208 10:15:00.174723 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="48f53bd2-4c4f-4f22-bc6b-8379ba894c35" containerName="registry-server" Dec 08 10:15:00 crc kubenswrapper[4776]: I1208 10:15:00.175300 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="48f53bd2-4c4f-4f22-bc6b-8379ba894c35" containerName="registry-server" Dec 08 10:15:00 crc kubenswrapper[4776]: I1208 10:15:00.176244 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29419815-jl58v" Dec 08 10:15:00 crc kubenswrapper[4776]: I1208 10:15:00.178728 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 08 10:15:00 crc kubenswrapper[4776]: I1208 10:15:00.182140 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 08 10:15:00 crc kubenswrapper[4776]: I1208 10:15:00.206840 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29419815-jl58v"] Dec 08 10:15:00 crc kubenswrapper[4776]: I1208 10:15:00.251243 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq2kp\" (UniqueName: \"kubernetes.io/projected/7f4f6e93-2983-430e-9f4a-1087cc78429f-kube-api-access-cq2kp\") pod \"collect-profiles-29419815-jl58v\" (UID: \"7f4f6e93-2983-430e-9f4a-1087cc78429f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419815-jl58v" Dec 08 10:15:00 crc kubenswrapper[4776]: I1208 10:15:00.251730 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7f4f6e93-2983-430e-9f4a-1087cc78429f-secret-volume\") pod \"collect-profiles-29419815-jl58v\" (UID: \"7f4f6e93-2983-430e-9f4a-1087cc78429f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419815-jl58v" Dec 08 10:15:00 crc kubenswrapper[4776]: I1208 10:15:00.251783 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7f4f6e93-2983-430e-9f4a-1087cc78429f-config-volume\") pod \"collect-profiles-29419815-jl58v\" (UID: \"7f4f6e93-2983-430e-9f4a-1087cc78429f\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29419815-jl58v" Dec 08 10:15:00 crc kubenswrapper[4776]: I1208 10:15:00.354251 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7f4f6e93-2983-430e-9f4a-1087cc78429f-secret-volume\") pod \"collect-profiles-29419815-jl58v\" (UID: \"7f4f6e93-2983-430e-9f4a-1087cc78429f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419815-jl58v" Dec 08 10:15:00 crc kubenswrapper[4776]: I1208 10:15:00.354311 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7f4f6e93-2983-430e-9f4a-1087cc78429f-config-volume\") pod \"collect-profiles-29419815-jl58v\" (UID: \"7f4f6e93-2983-430e-9f4a-1087cc78429f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419815-jl58v" Dec 08 10:15:00 crc kubenswrapper[4776]: I1208 10:15:00.354358 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cq2kp\" (UniqueName: \"kubernetes.io/projected/7f4f6e93-2983-430e-9f4a-1087cc78429f-kube-api-access-cq2kp\") pod \"collect-profiles-29419815-jl58v\" (UID: \"7f4f6e93-2983-430e-9f4a-1087cc78429f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419815-jl58v" Dec 08 10:15:00 crc kubenswrapper[4776]: I1208 10:15:00.355535 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7f4f6e93-2983-430e-9f4a-1087cc78429f-config-volume\") pod \"collect-profiles-29419815-jl58v\" (UID: \"7f4f6e93-2983-430e-9f4a-1087cc78429f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419815-jl58v" Dec 08 10:15:00 crc kubenswrapper[4776]: I1208 10:15:00.370676 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/7f4f6e93-2983-430e-9f4a-1087cc78429f-secret-volume\") pod \"collect-profiles-29419815-jl58v\" (UID: \"7f4f6e93-2983-430e-9f4a-1087cc78429f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419815-jl58v" Dec 08 10:15:00 crc kubenswrapper[4776]: I1208 10:15:00.377488 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq2kp\" (UniqueName: \"kubernetes.io/projected/7f4f6e93-2983-430e-9f4a-1087cc78429f-kube-api-access-cq2kp\") pod \"collect-profiles-29419815-jl58v\" (UID: \"7f4f6e93-2983-430e-9f4a-1087cc78429f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419815-jl58v" Dec 08 10:15:00 crc kubenswrapper[4776]: I1208 10:15:00.509201 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29419815-jl58v" Dec 08 10:15:00 crc kubenswrapper[4776]: I1208 10:15:00.987339 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29419815-jl58v"] Dec 08 10:15:01 crc kubenswrapper[4776]: I1208 10:15:01.017873 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29419815-jl58v" event={"ID":"7f4f6e93-2983-430e-9f4a-1087cc78429f","Type":"ContainerStarted","Data":"2e003af76090c6ec957539b3e6d4ccdf8cdfab6e44a8c617ceb66839fcdb89c8"} Dec 08 10:15:02 crc kubenswrapper[4776]: I1208 10:15:02.030453 4776 generic.go:334] "Generic (PLEG): container finished" podID="7f4f6e93-2983-430e-9f4a-1087cc78429f" containerID="faa1be26d04f80634a9b9f33645d6f6a9626482d2c65b0740aed44a2c8353d8d" exitCode=0 Dec 08 10:15:02 crc kubenswrapper[4776]: I1208 10:15:02.030531 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29419815-jl58v" 
event={"ID":"7f4f6e93-2983-430e-9f4a-1087cc78429f","Type":"ContainerDied","Data":"faa1be26d04f80634a9b9f33645d6f6a9626482d2c65b0740aed44a2c8353d8d"} Dec 08 10:15:03 crc kubenswrapper[4776]: I1208 10:15:03.478035 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29419815-jl58v" Dec 08 10:15:03 crc kubenswrapper[4776]: I1208 10:15:03.632625 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7f4f6e93-2983-430e-9f4a-1087cc78429f-config-volume\") pod \"7f4f6e93-2983-430e-9f4a-1087cc78429f\" (UID: \"7f4f6e93-2983-430e-9f4a-1087cc78429f\") " Dec 08 10:15:03 crc kubenswrapper[4776]: I1208 10:15:03.632694 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cq2kp\" (UniqueName: \"kubernetes.io/projected/7f4f6e93-2983-430e-9f4a-1087cc78429f-kube-api-access-cq2kp\") pod \"7f4f6e93-2983-430e-9f4a-1087cc78429f\" (UID: \"7f4f6e93-2983-430e-9f4a-1087cc78429f\") " Dec 08 10:15:03 crc kubenswrapper[4776]: I1208 10:15:03.632758 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7f4f6e93-2983-430e-9f4a-1087cc78429f-secret-volume\") pod \"7f4f6e93-2983-430e-9f4a-1087cc78429f\" (UID: \"7f4f6e93-2983-430e-9f4a-1087cc78429f\") " Dec 08 10:15:03 crc kubenswrapper[4776]: I1208 10:15:03.634622 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f4f6e93-2983-430e-9f4a-1087cc78429f-config-volume" (OuterVolumeSpecName: "config-volume") pod "7f4f6e93-2983-430e-9f4a-1087cc78429f" (UID: "7f4f6e93-2983-430e-9f4a-1087cc78429f"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 10:15:03 crc kubenswrapper[4776]: I1208 10:15:03.640679 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f4f6e93-2983-430e-9f4a-1087cc78429f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7f4f6e93-2983-430e-9f4a-1087cc78429f" (UID: "7f4f6e93-2983-430e-9f4a-1087cc78429f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 10:15:03 crc kubenswrapper[4776]: I1208 10:15:03.641011 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f4f6e93-2983-430e-9f4a-1087cc78429f-kube-api-access-cq2kp" (OuterVolumeSpecName: "kube-api-access-cq2kp") pod "7f4f6e93-2983-430e-9f4a-1087cc78429f" (UID: "7f4f6e93-2983-430e-9f4a-1087cc78429f"). InnerVolumeSpecName "kube-api-access-cq2kp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 10:15:03 crc kubenswrapper[4776]: I1208 10:15:03.735499 4776 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7f4f6e93-2983-430e-9f4a-1087cc78429f-config-volume\") on node \"crc\" DevicePath \"\"" Dec 08 10:15:03 crc kubenswrapper[4776]: I1208 10:15:03.735543 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cq2kp\" (UniqueName: \"kubernetes.io/projected/7f4f6e93-2983-430e-9f4a-1087cc78429f-kube-api-access-cq2kp\") on node \"crc\" DevicePath \"\"" Dec 08 10:15:03 crc kubenswrapper[4776]: I1208 10:15:03.735557 4776 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7f4f6e93-2983-430e-9f4a-1087cc78429f-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 08 10:15:04 crc kubenswrapper[4776]: I1208 10:15:04.052400 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29419815-jl58v" 
event={"ID":"7f4f6e93-2983-430e-9f4a-1087cc78429f","Type":"ContainerDied","Data":"2e003af76090c6ec957539b3e6d4ccdf8cdfab6e44a8c617ceb66839fcdb89c8"} Dec 08 10:15:04 crc kubenswrapper[4776]: I1208 10:15:04.052758 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e003af76090c6ec957539b3e6d4ccdf8cdfab6e44a8c617ceb66839fcdb89c8" Dec 08 10:15:04 crc kubenswrapper[4776]: I1208 10:15:04.052448 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29419815-jl58v" Dec 08 10:15:04 crc kubenswrapper[4776]: I1208 10:15:04.351516 4776 scope.go:117] "RemoveContainer" containerID="448474efd555c5bc5445de386b34183d50e7a6a8112936335b71be6c5bf4d95c" Dec 08 10:15:04 crc kubenswrapper[4776]: E1208 10:15:04.352056 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:15:04 crc kubenswrapper[4776]: I1208 10:15:04.563746 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29419770-58r5q"] Dec 08 10:15:04 crc kubenswrapper[4776]: I1208 10:15:04.575133 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29419770-58r5q"] Dec 08 10:15:06 crc kubenswrapper[4776]: I1208 10:15:06.362920 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54152b91-017c-44e9-8c79-f0cf0befb065" path="/var/lib/kubelet/pods/54152b91-017c-44e9-8c79-f0cf0befb065/volumes" Dec 08 10:15:16 crc kubenswrapper[4776]: I1208 10:15:16.392547 4776 trace.go:236] Trace[913585230]: 
"Calculate volume metrics of storage for pod openshift-logging/logging-loki-index-gateway-0" (08-Dec-2025 10:15:15.158) (total time: 1233ms): Dec 08 10:15:16 crc kubenswrapper[4776]: Trace[913585230]: [1.233356249s] [1.233356249s] END Dec 08 10:15:17 crc kubenswrapper[4776]: I1208 10:15:17.344477 4776 scope.go:117] "RemoveContainer" containerID="448474efd555c5bc5445de386b34183d50e7a6a8112936335b71be6c5bf4d95c" Dec 08 10:15:17 crc kubenswrapper[4776]: E1208 10:15:17.345049 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:15:19 crc kubenswrapper[4776]: I1208 10:15:19.211665 4776 scope.go:117] "RemoveContainer" containerID="cbaf6d137a1bfc6a4fdf08106bb3cca7b7c288fe2d55f61efd4402150f41cf5f" Dec 08 10:15:19 crc kubenswrapper[4776]: I1208 10:15:19.236044 4776 scope.go:117] "RemoveContainer" containerID="4e1ec315892d93934f2a2d24979fc7a801c770e0496251da04cdabd979db7a0a" Dec 08 10:15:19 crc kubenswrapper[4776]: I1208 10:15:19.259986 4776 scope.go:117] "RemoveContainer" containerID="bdc1f990e5cfd6dffa15922cf7bf18285f40f18ec238fed16fb734525e130c61" Dec 08 10:15:19 crc kubenswrapper[4776]: I1208 10:15:19.319793 4776 scope.go:117] "RemoveContainer" containerID="61e1263b8f4ce9bcca325447358f5ff8611804f1cac07fd9c2079efc299187e8" Dec 08 10:15:30 crc kubenswrapper[4776]: I1208 10:15:30.343510 4776 scope.go:117] "RemoveContainer" containerID="448474efd555c5bc5445de386b34183d50e7a6a8112936335b71be6c5bf4d95c" Dec 08 10:15:30 crc kubenswrapper[4776]: E1208 10:15:30.344372 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:15:43 crc kubenswrapper[4776]: I1208 10:15:43.344579 4776 scope.go:117] "RemoveContainer" containerID="448474efd555c5bc5445de386b34183d50e7a6a8112936335b71be6c5bf4d95c" Dec 08 10:15:43 crc kubenswrapper[4776]: E1208 10:15:43.345718 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:15:54 crc kubenswrapper[4776]: I1208 10:15:54.353225 4776 scope.go:117] "RemoveContainer" containerID="448474efd555c5bc5445de386b34183d50e7a6a8112936335b71be6c5bf4d95c" Dec 08 10:15:54 crc kubenswrapper[4776]: E1208 10:15:54.354195 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:15:56 crc kubenswrapper[4776]: I1208 10:15:56.532319 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Dec 08 10:15:56 crc kubenswrapper[4776]: E1208 10:15:56.533855 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f4f6e93-2983-430e-9f4a-1087cc78429f" 
containerName="collect-profiles" Dec 08 10:15:56 crc kubenswrapper[4776]: I1208 10:15:56.533896 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f4f6e93-2983-430e-9f4a-1087cc78429f" containerName="collect-profiles" Dec 08 10:15:56 crc kubenswrapper[4776]: I1208 10:15:56.534254 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f4f6e93-2983-430e-9f4a-1087cc78429f" containerName="collect-profiles" Dec 08 10:15:56 crc kubenswrapper[4776]: I1208 10:15:56.535410 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 08 10:15:56 crc kubenswrapper[4776]: I1208 10:15:56.538411 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-8cddf" Dec 08 10:15:56 crc kubenswrapper[4776]: I1208 10:15:56.538484 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Dec 08 10:15:56 crc kubenswrapper[4776]: I1208 10:15:56.538754 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 08 10:15:56 crc kubenswrapper[4776]: I1208 10:15:56.539055 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Dec 08 10:15:56 crc kubenswrapper[4776]: I1208 10:15:56.545478 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 08 10:15:56 crc kubenswrapper[4776]: I1208 10:15:56.634395 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9c3d4f25-4353-4b82-8de9-ee14a2f05076-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"9c3d4f25-4353-4b82-8de9-ee14a2f05076\") " pod="openstack/tempest-tests-tempest" Dec 08 10:15:56 crc kubenswrapper[4776]: I1208 10:15:56.634728 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8fp8\" (UniqueName: \"kubernetes.io/projected/9c3d4f25-4353-4b82-8de9-ee14a2f05076-kube-api-access-q8fp8\") pod \"tempest-tests-tempest\" (UID: \"9c3d4f25-4353-4b82-8de9-ee14a2f05076\") " pod="openstack/tempest-tests-tempest" Dec 08 10:15:56 crc kubenswrapper[4776]: I1208 10:15:56.634796 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/9c3d4f25-4353-4b82-8de9-ee14a2f05076-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"9c3d4f25-4353-4b82-8de9-ee14a2f05076\") " pod="openstack/tempest-tests-tempest" Dec 08 10:15:56 crc kubenswrapper[4776]: I1208 10:15:56.634962 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"9c3d4f25-4353-4b82-8de9-ee14a2f05076\") " pod="openstack/tempest-tests-tempest" Dec 08 10:15:56 crc kubenswrapper[4776]: I1208 10:15:56.635046 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9c3d4f25-4353-4b82-8de9-ee14a2f05076-config-data\") pod \"tempest-tests-tempest\" (UID: \"9c3d4f25-4353-4b82-8de9-ee14a2f05076\") " pod="openstack/tempest-tests-tempest" Dec 08 10:15:56 crc kubenswrapper[4776]: I1208 10:15:56.635151 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/9c3d4f25-4353-4b82-8de9-ee14a2f05076-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"9c3d4f25-4353-4b82-8de9-ee14a2f05076\") " pod="openstack/tempest-tests-tempest" Dec 08 10:15:56 crc kubenswrapper[4776]: I1208 10:15:56.635305 4776 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/9c3d4f25-4353-4b82-8de9-ee14a2f05076-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"9c3d4f25-4353-4b82-8de9-ee14a2f05076\") " pod="openstack/tempest-tests-tempest" Dec 08 10:15:56 crc kubenswrapper[4776]: I1208 10:15:56.635481 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9c3d4f25-4353-4b82-8de9-ee14a2f05076-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"9c3d4f25-4353-4b82-8de9-ee14a2f05076\") " pod="openstack/tempest-tests-tempest" Dec 08 10:15:56 crc kubenswrapper[4776]: I1208 10:15:56.635555 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9c3d4f25-4353-4b82-8de9-ee14a2f05076-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"9c3d4f25-4353-4b82-8de9-ee14a2f05076\") " pod="openstack/tempest-tests-tempest" Dec 08 10:15:56 crc kubenswrapper[4776]: I1208 10:15:56.737258 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9c3d4f25-4353-4b82-8de9-ee14a2f05076-config-data\") pod \"tempest-tests-tempest\" (UID: \"9c3d4f25-4353-4b82-8de9-ee14a2f05076\") " pod="openstack/tempest-tests-tempest" Dec 08 10:15:56 crc kubenswrapper[4776]: I1208 10:15:56.737352 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/9c3d4f25-4353-4b82-8de9-ee14a2f05076-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"9c3d4f25-4353-4b82-8de9-ee14a2f05076\") " pod="openstack/tempest-tests-tempest" Dec 08 10:15:56 crc kubenswrapper[4776]: I1208 10:15:56.737388 4776 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/9c3d4f25-4353-4b82-8de9-ee14a2f05076-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"9c3d4f25-4353-4b82-8de9-ee14a2f05076\") " pod="openstack/tempest-tests-tempest" Dec 08 10:15:56 crc kubenswrapper[4776]: I1208 10:15:56.737446 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9c3d4f25-4353-4b82-8de9-ee14a2f05076-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"9c3d4f25-4353-4b82-8de9-ee14a2f05076\") " pod="openstack/tempest-tests-tempest" Dec 08 10:15:56 crc kubenswrapper[4776]: I1208 10:15:56.737487 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9c3d4f25-4353-4b82-8de9-ee14a2f05076-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"9c3d4f25-4353-4b82-8de9-ee14a2f05076\") " pod="openstack/tempest-tests-tempest" Dec 08 10:15:56 crc kubenswrapper[4776]: I1208 10:15:56.737514 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9c3d4f25-4353-4b82-8de9-ee14a2f05076-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"9c3d4f25-4353-4b82-8de9-ee14a2f05076\") " pod="openstack/tempest-tests-tempest" Dec 08 10:15:56 crc kubenswrapper[4776]: I1208 10:15:56.737542 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8fp8\" (UniqueName: \"kubernetes.io/projected/9c3d4f25-4353-4b82-8de9-ee14a2f05076-kube-api-access-q8fp8\") pod \"tempest-tests-tempest\" (UID: \"9c3d4f25-4353-4b82-8de9-ee14a2f05076\") " pod="openstack/tempest-tests-tempest" Dec 08 10:15:56 crc kubenswrapper[4776]: I1208 10:15:56.737594 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: 
\"kubernetes.io/empty-dir/9c3d4f25-4353-4b82-8de9-ee14a2f05076-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"9c3d4f25-4353-4b82-8de9-ee14a2f05076\") " pod="openstack/tempest-tests-tempest" Dec 08 10:15:56 crc kubenswrapper[4776]: I1208 10:15:56.737677 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"9c3d4f25-4353-4b82-8de9-ee14a2f05076\") " pod="openstack/tempest-tests-tempest" Dec 08 10:15:56 crc kubenswrapper[4776]: I1208 10:15:56.737906 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/9c3d4f25-4353-4b82-8de9-ee14a2f05076-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"9c3d4f25-4353-4b82-8de9-ee14a2f05076\") " pod="openstack/tempest-tests-tempest" Dec 08 10:15:56 crc kubenswrapper[4776]: I1208 10:15:56.738426 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9c3d4f25-4353-4b82-8de9-ee14a2f05076-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"9c3d4f25-4353-4b82-8de9-ee14a2f05076\") " pod="openstack/tempest-tests-tempest" Dec 08 10:15:56 crc kubenswrapper[4776]: I1208 10:15:56.738571 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/9c3d4f25-4353-4b82-8de9-ee14a2f05076-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"9c3d4f25-4353-4b82-8de9-ee14a2f05076\") " pod="openstack/tempest-tests-tempest" Dec 08 10:15:56 crc kubenswrapper[4776]: I1208 10:15:56.739009 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod 
\"tempest-tests-tempest\" (UID: \"9c3d4f25-4353-4b82-8de9-ee14a2f05076\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/tempest-tests-tempest" Dec 08 10:15:56 crc kubenswrapper[4776]: I1208 10:15:56.739412 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9c3d4f25-4353-4b82-8de9-ee14a2f05076-config-data\") pod \"tempest-tests-tempest\" (UID: \"9c3d4f25-4353-4b82-8de9-ee14a2f05076\") " pod="openstack/tempest-tests-tempest" Dec 08 10:15:56 crc kubenswrapper[4776]: I1208 10:15:56.953037 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9c3d4f25-4353-4b82-8de9-ee14a2f05076-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"9c3d4f25-4353-4b82-8de9-ee14a2f05076\") " pod="openstack/tempest-tests-tempest" Dec 08 10:15:56 crc kubenswrapper[4776]: I1208 10:15:56.957742 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/9c3d4f25-4353-4b82-8de9-ee14a2f05076-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"9c3d4f25-4353-4b82-8de9-ee14a2f05076\") " pod="openstack/tempest-tests-tempest" Dec 08 10:15:56 crc kubenswrapper[4776]: I1208 10:15:56.959333 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9c3d4f25-4353-4b82-8de9-ee14a2f05076-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"9c3d4f25-4353-4b82-8de9-ee14a2f05076\") " pod="openstack/tempest-tests-tempest" Dec 08 10:15:56 crc kubenswrapper[4776]: I1208 10:15:56.982939 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8fp8\" (UniqueName: \"kubernetes.io/projected/9c3d4f25-4353-4b82-8de9-ee14a2f05076-kube-api-access-q8fp8\") pod \"tempest-tests-tempest\" (UID: \"9c3d4f25-4353-4b82-8de9-ee14a2f05076\") " pod="openstack/tempest-tests-tempest" Dec 08 10:15:56 
crc kubenswrapper[4776]: I1208 10:15:56.994592 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"9c3d4f25-4353-4b82-8de9-ee14a2f05076\") " pod="openstack/tempest-tests-tempest" Dec 08 10:15:57 crc kubenswrapper[4776]: I1208 10:15:57.163734 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 08 10:15:57 crc kubenswrapper[4776]: I1208 10:15:57.620759 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 08 10:15:57 crc kubenswrapper[4776]: I1208 10:15:57.628693 4776 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 08 10:15:58 crc kubenswrapper[4776]: I1208 10:15:58.630601 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"9c3d4f25-4353-4b82-8de9-ee14a2f05076","Type":"ContainerStarted","Data":"a81d3bad193bc41d0862cc4e230258edaec8cc554e77f509de10211782b648c2"} Dec 08 10:16:09 crc kubenswrapper[4776]: I1208 10:16:09.345065 4776 scope.go:117] "RemoveContainer" containerID="448474efd555c5bc5445de386b34183d50e7a6a8112936335b71be6c5bf4d95c" Dec 08 10:16:09 crc kubenswrapper[4776]: E1208 10:16:09.346623 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:16:20 crc kubenswrapper[4776]: I1208 10:16:20.344584 4776 scope.go:117] "RemoveContainer" containerID="448474efd555c5bc5445de386b34183d50e7a6a8112936335b71be6c5bf4d95c" Dec 08 10:16:25 
crc kubenswrapper[4776]: E1208 10:16:25.849632 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Dec 08 10:16:25 crc kubenswrapper[4776]: E1208 10:16:25.851640 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,R
ecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q8fp8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(9c3d4f25-4353-4b82-8de9-ee14a2f05076): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 08 10:16:25 crc kubenswrapper[4776]: E1208 10:16:25.852869 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="9c3d4f25-4353-4b82-8de9-ee14a2f05076" Dec 08 10:16:25 crc kubenswrapper[4776]: E1208 
10:16:25.956095 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="9c3d4f25-4353-4b82-8de9-ee14a2f05076" Dec 08 10:16:26 crc kubenswrapper[4776]: I1208 10:16:26.966370 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" event={"ID":"c9788ab1-1031-4103-a769-a4b3177c7268","Type":"ContainerStarted","Data":"25f229d16585db3e47cf1694dd543abb708db87dbc5a4391839049281b460bc9"} Dec 08 10:16:41 crc kubenswrapper[4776]: I1208 10:16:41.109690 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"9c3d4f25-4353-4b82-8de9-ee14a2f05076","Type":"ContainerStarted","Data":"157cf2545ec6bd46b1bfefd27109b4c9ebba70710924cdeb18081d269f535371"} Dec 08 10:16:41 crc kubenswrapper[4776]: I1208 10:16:41.143975 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.974697235 podStartE2EDuration="46.143948249s" podCreationTimestamp="2025-12-08 10:15:55 +0000 UTC" firstStartedPulling="2025-12-08 10:15:57.628223952 +0000 UTC m=+4633.891448974" lastFinishedPulling="2025-12-08 10:16:39.797474966 +0000 UTC m=+4676.060699988" observedRunningTime="2025-12-08 10:16:41.125506454 +0000 UTC m=+4677.388731476" watchObservedRunningTime="2025-12-08 10:16:41.143948249 +0000 UTC m=+4677.407173271" Dec 08 10:18:41 crc kubenswrapper[4776]: I1208 10:18:41.399940 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 
10:18:41 crc kubenswrapper[4776]: I1208 10:18:41.401372 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 10:18:52 crc kubenswrapper[4776]: I1208 10:18:52.703732 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-78j2g"] Dec 08 10:18:52 crc kubenswrapper[4776]: I1208 10:18:52.709314 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-78j2g" Dec 08 10:18:52 crc kubenswrapper[4776]: I1208 10:18:52.789998 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt9bx\" (UniqueName: \"kubernetes.io/projected/9d3ff118-25c8-43d3-87ad-437fe201abf6-kube-api-access-gt9bx\") pod \"community-operators-78j2g\" (UID: \"9d3ff118-25c8-43d3-87ad-437fe201abf6\") " pod="openshift-marketplace/community-operators-78j2g" Dec 08 10:18:52 crc kubenswrapper[4776]: I1208 10:18:52.790069 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d3ff118-25c8-43d3-87ad-437fe201abf6-catalog-content\") pod \"community-operators-78j2g\" (UID: \"9d3ff118-25c8-43d3-87ad-437fe201abf6\") " pod="openshift-marketplace/community-operators-78j2g" Dec 08 10:18:52 crc kubenswrapper[4776]: I1208 10:18:52.790100 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d3ff118-25c8-43d3-87ad-437fe201abf6-utilities\") pod \"community-operators-78j2g\" (UID: \"9d3ff118-25c8-43d3-87ad-437fe201abf6\") " pod="openshift-marketplace/community-operators-78j2g" 
Dec 08 10:18:52 crc kubenswrapper[4776]: I1208 10:18:52.892070 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gt9bx\" (UniqueName: \"kubernetes.io/projected/9d3ff118-25c8-43d3-87ad-437fe201abf6-kube-api-access-gt9bx\") pod \"community-operators-78j2g\" (UID: \"9d3ff118-25c8-43d3-87ad-437fe201abf6\") " pod="openshift-marketplace/community-operators-78j2g" Dec 08 10:18:52 crc kubenswrapper[4776]: I1208 10:18:52.892131 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d3ff118-25c8-43d3-87ad-437fe201abf6-catalog-content\") pod \"community-operators-78j2g\" (UID: \"9d3ff118-25c8-43d3-87ad-437fe201abf6\") " pod="openshift-marketplace/community-operators-78j2g" Dec 08 10:18:52 crc kubenswrapper[4776]: I1208 10:18:52.892156 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d3ff118-25c8-43d3-87ad-437fe201abf6-utilities\") pod \"community-operators-78j2g\" (UID: \"9d3ff118-25c8-43d3-87ad-437fe201abf6\") " pod="openshift-marketplace/community-operators-78j2g" Dec 08 10:18:52 crc kubenswrapper[4776]: I1208 10:18:52.893762 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d3ff118-25c8-43d3-87ad-437fe201abf6-utilities\") pod \"community-operators-78j2g\" (UID: \"9d3ff118-25c8-43d3-87ad-437fe201abf6\") " pod="openshift-marketplace/community-operators-78j2g" Dec 08 10:18:52 crc kubenswrapper[4776]: I1208 10:18:52.894089 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d3ff118-25c8-43d3-87ad-437fe201abf6-catalog-content\") pod \"community-operators-78j2g\" (UID: \"9d3ff118-25c8-43d3-87ad-437fe201abf6\") " pod="openshift-marketplace/community-operators-78j2g" Dec 08 10:18:52 crc kubenswrapper[4776]: 
I1208 10:18:52.924837 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-78j2g"] Dec 08 10:18:52 crc kubenswrapper[4776]: I1208 10:18:52.926276 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt9bx\" (UniqueName: \"kubernetes.io/projected/9d3ff118-25c8-43d3-87ad-437fe201abf6-kube-api-access-gt9bx\") pod \"community-operators-78j2g\" (UID: \"9d3ff118-25c8-43d3-87ad-437fe201abf6\") " pod="openshift-marketplace/community-operators-78j2g" Dec 08 10:18:53 crc kubenswrapper[4776]: I1208 10:18:53.032560 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-78j2g" Dec 08 10:18:53 crc kubenswrapper[4776]: I1208 10:18:53.818538 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-78j2g"] Dec 08 10:18:54 crc kubenswrapper[4776]: I1208 10:18:54.543436 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-78j2g" event={"ID":"9d3ff118-25c8-43d3-87ad-437fe201abf6","Type":"ContainerDied","Data":"a5d34d183f589076615ff79744c5e7dcae62951f7757714a90b04b1fe84cf609"} Dec 08 10:18:54 crc kubenswrapper[4776]: I1208 10:18:54.543610 4776 generic.go:334] "Generic (PLEG): container finished" podID="9d3ff118-25c8-43d3-87ad-437fe201abf6" containerID="a5d34d183f589076615ff79744c5e7dcae62951f7757714a90b04b1fe84cf609" exitCode=0 Dec 08 10:18:54 crc kubenswrapper[4776]: I1208 10:18:54.543891 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-78j2g" event={"ID":"9d3ff118-25c8-43d3-87ad-437fe201abf6","Type":"ContainerStarted","Data":"25a512c68dca42e8a0951b13060f7a270aa6a5c0caec7adee4797c32b97ae0f3"} Dec 08 10:18:56 crc kubenswrapper[4776]: I1208 10:18:56.577093 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-78j2g" 
event={"ID":"9d3ff118-25c8-43d3-87ad-437fe201abf6","Type":"ContainerStarted","Data":"c865ae1eb668a3fc5ac9d932d7ded47921b774d5aa69d9101c86420cba2ee678"} Dec 08 10:18:57 crc kubenswrapper[4776]: I1208 10:18:57.588677 4776 generic.go:334] "Generic (PLEG): container finished" podID="9d3ff118-25c8-43d3-87ad-437fe201abf6" containerID="c865ae1eb668a3fc5ac9d932d7ded47921b774d5aa69d9101c86420cba2ee678" exitCode=0 Dec 08 10:18:57 crc kubenswrapper[4776]: I1208 10:18:57.588756 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-78j2g" event={"ID":"9d3ff118-25c8-43d3-87ad-437fe201abf6","Type":"ContainerDied","Data":"c865ae1eb668a3fc5ac9d932d7ded47921b774d5aa69d9101c86420cba2ee678"} Dec 08 10:18:58 crc kubenswrapper[4776]: I1208 10:18:58.600846 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-78j2g" event={"ID":"9d3ff118-25c8-43d3-87ad-437fe201abf6","Type":"ContainerStarted","Data":"d0327c9a598797c595199e68afe777aba529989ad81fbf74f53a1d0d1835a915"} Dec 08 10:18:58 crc kubenswrapper[4776]: I1208 10:18:58.621464 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-78j2g" podStartSLOduration=3.194289963 podStartE2EDuration="6.621159141s" podCreationTimestamp="2025-12-08 10:18:52 +0000 UTC" firstStartedPulling="2025-12-08 10:18:54.545906077 +0000 UTC m=+4810.809131099" lastFinishedPulling="2025-12-08 10:18:57.972775255 +0000 UTC m=+4814.236000277" observedRunningTime="2025-12-08 10:18:58.61932601 +0000 UTC m=+4814.882551042" watchObservedRunningTime="2025-12-08 10:18:58.621159141 +0000 UTC m=+4814.884384163" Dec 08 10:19:03 crc kubenswrapper[4776]: I1208 10:19:03.032917 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-78j2g" Dec 08 10:19:03 crc kubenswrapper[4776]: I1208 10:19:03.034398 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/community-operators-78j2g" Dec 08 10:19:04 crc kubenswrapper[4776]: I1208 10:19:04.088488 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-78j2g" podUID="9d3ff118-25c8-43d3-87ad-437fe201abf6" containerName="registry-server" probeResult="failure" output=< Dec 08 10:19:04 crc kubenswrapper[4776]: timeout: failed to connect service ":50051" within 1s Dec 08 10:19:04 crc kubenswrapper[4776]: > Dec 08 10:19:08 crc kubenswrapper[4776]: I1208 10:19:08.887847 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-brdh2"] Dec 08 10:19:08 crc kubenswrapper[4776]: I1208 10:19:08.895336 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-brdh2" Dec 08 10:19:08 crc kubenswrapper[4776]: I1208 10:19:08.986809 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0736884-b859-47c4-8c0e-99e40185f52a-utilities\") pod \"redhat-operators-brdh2\" (UID: \"d0736884-b859-47c4-8c0e-99e40185f52a\") " pod="openshift-marketplace/redhat-operators-brdh2" Dec 08 10:19:08 crc kubenswrapper[4776]: I1208 10:19:08.986938 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7tjz\" (UniqueName: \"kubernetes.io/projected/d0736884-b859-47c4-8c0e-99e40185f52a-kube-api-access-b7tjz\") pod \"redhat-operators-brdh2\" (UID: \"d0736884-b859-47c4-8c0e-99e40185f52a\") " pod="openshift-marketplace/redhat-operators-brdh2" Dec 08 10:19:08 crc kubenswrapper[4776]: I1208 10:19:08.986977 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0736884-b859-47c4-8c0e-99e40185f52a-catalog-content\") pod \"redhat-operators-brdh2\" (UID: 
\"d0736884-b859-47c4-8c0e-99e40185f52a\") " pod="openshift-marketplace/redhat-operators-brdh2" Dec 08 10:19:09 crc kubenswrapper[4776]: I1208 10:19:09.089601 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0736884-b859-47c4-8c0e-99e40185f52a-utilities\") pod \"redhat-operators-brdh2\" (UID: \"d0736884-b859-47c4-8c0e-99e40185f52a\") " pod="openshift-marketplace/redhat-operators-brdh2" Dec 08 10:19:09 crc kubenswrapper[4776]: I1208 10:19:09.089718 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7tjz\" (UniqueName: \"kubernetes.io/projected/d0736884-b859-47c4-8c0e-99e40185f52a-kube-api-access-b7tjz\") pod \"redhat-operators-brdh2\" (UID: \"d0736884-b859-47c4-8c0e-99e40185f52a\") " pod="openshift-marketplace/redhat-operators-brdh2" Dec 08 10:19:09 crc kubenswrapper[4776]: I1208 10:19:09.089760 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0736884-b859-47c4-8c0e-99e40185f52a-catalog-content\") pod \"redhat-operators-brdh2\" (UID: \"d0736884-b859-47c4-8c0e-99e40185f52a\") " pod="openshift-marketplace/redhat-operators-brdh2" Dec 08 10:19:09 crc kubenswrapper[4776]: I1208 10:19:09.094002 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0736884-b859-47c4-8c0e-99e40185f52a-utilities\") pod \"redhat-operators-brdh2\" (UID: \"d0736884-b859-47c4-8c0e-99e40185f52a\") " pod="openshift-marketplace/redhat-operators-brdh2" Dec 08 10:19:09 crc kubenswrapper[4776]: I1208 10:19:09.094928 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0736884-b859-47c4-8c0e-99e40185f52a-catalog-content\") pod \"redhat-operators-brdh2\" (UID: \"d0736884-b859-47c4-8c0e-99e40185f52a\") " 
pod="openshift-marketplace/redhat-operators-brdh2" Dec 08 10:19:09 crc kubenswrapper[4776]: I1208 10:19:09.128865 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-brdh2"] Dec 08 10:19:09 crc kubenswrapper[4776]: I1208 10:19:09.139623 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7tjz\" (UniqueName: \"kubernetes.io/projected/d0736884-b859-47c4-8c0e-99e40185f52a-kube-api-access-b7tjz\") pod \"redhat-operators-brdh2\" (UID: \"d0736884-b859-47c4-8c0e-99e40185f52a\") " pod="openshift-marketplace/redhat-operators-brdh2" Dec 08 10:19:09 crc kubenswrapper[4776]: I1208 10:19:09.224720 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-brdh2" Dec 08 10:19:10 crc kubenswrapper[4776]: I1208 10:19:10.110039 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-brdh2"] Dec 08 10:19:10 crc kubenswrapper[4776]: I1208 10:19:10.761151 4776 generic.go:334] "Generic (PLEG): container finished" podID="d0736884-b859-47c4-8c0e-99e40185f52a" containerID="dcea67017c22e5266dc84df2ff930ee54538d9849aee2d49d683cae803cedc9e" exitCode=0 Dec 08 10:19:10 crc kubenswrapper[4776]: I1208 10:19:10.762356 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-brdh2" event={"ID":"d0736884-b859-47c4-8c0e-99e40185f52a","Type":"ContainerDied","Data":"dcea67017c22e5266dc84df2ff930ee54538d9849aee2d49d683cae803cedc9e"} Dec 08 10:19:10 crc kubenswrapper[4776]: I1208 10:19:10.762483 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-brdh2" event={"ID":"d0736884-b859-47c4-8c0e-99e40185f52a","Type":"ContainerStarted","Data":"dd52045e210a76e3161c7c2070b0d940f59436960d4c7c14137ff1e43766b1bb"} Dec 08 10:19:11 crc kubenswrapper[4776]: I1208 10:19:11.399529 4776 patch_prober.go:28] interesting 
pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 10:19:11 crc kubenswrapper[4776]: I1208 10:19:11.399952 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 10:19:11 crc kubenswrapper[4776]: I1208 10:19:11.774066 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-brdh2" event={"ID":"d0736884-b859-47c4-8c0e-99e40185f52a","Type":"ContainerStarted","Data":"58641cd1d8ac65a6ec865b673a3c8c686af5e477ebaa1ddb05dc2e93beb130cc"} Dec 08 10:19:13 crc kubenswrapper[4776]: I1208 10:19:13.283493 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-78j2g" Dec 08 10:19:13 crc kubenswrapper[4776]: I1208 10:19:13.352064 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-78j2g" Dec 08 10:19:14 crc kubenswrapper[4776]: I1208 10:19:14.185224 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-78j2g"] Dec 08 10:19:14 crc kubenswrapper[4776]: I1208 10:19:14.839285 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-78j2g" podUID="9d3ff118-25c8-43d3-87ad-437fe201abf6" containerName="registry-server" containerID="cri-o://d0327c9a598797c595199e68afe777aba529989ad81fbf74f53a1d0d1835a915" gracePeriod=2 Dec 08 10:19:15 crc kubenswrapper[4776]: I1208 10:19:15.849071 4776 generic.go:334] "Generic (PLEG): 
container finished" podID="9d3ff118-25c8-43d3-87ad-437fe201abf6" containerID="d0327c9a598797c595199e68afe777aba529989ad81fbf74f53a1d0d1835a915" exitCode=0 Dec 08 10:19:15 crc kubenswrapper[4776]: I1208 10:19:15.849226 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-78j2g" event={"ID":"9d3ff118-25c8-43d3-87ad-437fe201abf6","Type":"ContainerDied","Data":"d0327c9a598797c595199e68afe777aba529989ad81fbf74f53a1d0d1835a915"} Dec 08 10:19:16 crc kubenswrapper[4776]: I1208 10:19:16.863526 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-78j2g" event={"ID":"9d3ff118-25c8-43d3-87ad-437fe201abf6","Type":"ContainerDied","Data":"25a512c68dca42e8a0951b13060f7a270aa6a5c0caec7adee4797c32b97ae0f3"} Dec 08 10:19:16 crc kubenswrapper[4776]: I1208 10:19:16.865628 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25a512c68dca42e8a0951b13060f7a270aa6a5c0caec7adee4797c32b97ae0f3" Dec 08 10:19:16 crc kubenswrapper[4776]: I1208 10:19:16.888688 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-78j2g" Dec 08 10:19:17 crc kubenswrapper[4776]: I1208 10:19:17.081032 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d3ff118-25c8-43d3-87ad-437fe201abf6-utilities\") pod \"9d3ff118-25c8-43d3-87ad-437fe201abf6\" (UID: \"9d3ff118-25c8-43d3-87ad-437fe201abf6\") " Dec 08 10:19:17 crc kubenswrapper[4776]: I1208 10:19:17.081455 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gt9bx\" (UniqueName: \"kubernetes.io/projected/9d3ff118-25c8-43d3-87ad-437fe201abf6-kube-api-access-gt9bx\") pod \"9d3ff118-25c8-43d3-87ad-437fe201abf6\" (UID: \"9d3ff118-25c8-43d3-87ad-437fe201abf6\") " Dec 08 10:19:17 crc kubenswrapper[4776]: I1208 10:19:17.081684 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d3ff118-25c8-43d3-87ad-437fe201abf6-catalog-content\") pod \"9d3ff118-25c8-43d3-87ad-437fe201abf6\" (UID: \"9d3ff118-25c8-43d3-87ad-437fe201abf6\") " Dec 08 10:19:17 crc kubenswrapper[4776]: I1208 10:19:17.083398 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d3ff118-25c8-43d3-87ad-437fe201abf6-utilities" (OuterVolumeSpecName: "utilities") pod "9d3ff118-25c8-43d3-87ad-437fe201abf6" (UID: "9d3ff118-25c8-43d3-87ad-437fe201abf6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 10:19:17 crc kubenswrapper[4776]: I1208 10:19:17.098713 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d3ff118-25c8-43d3-87ad-437fe201abf6-kube-api-access-gt9bx" (OuterVolumeSpecName: "kube-api-access-gt9bx") pod "9d3ff118-25c8-43d3-87ad-437fe201abf6" (UID: "9d3ff118-25c8-43d3-87ad-437fe201abf6"). InnerVolumeSpecName "kube-api-access-gt9bx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 10:19:17 crc kubenswrapper[4776]: I1208 10:19:17.138569 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d3ff118-25c8-43d3-87ad-437fe201abf6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9d3ff118-25c8-43d3-87ad-437fe201abf6" (UID: "9d3ff118-25c8-43d3-87ad-437fe201abf6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 10:19:17 crc kubenswrapper[4776]: I1208 10:19:17.184829 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d3ff118-25c8-43d3-87ad-437fe201abf6-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 10:19:17 crc kubenswrapper[4776]: I1208 10:19:17.184879 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gt9bx\" (UniqueName: \"kubernetes.io/projected/9d3ff118-25c8-43d3-87ad-437fe201abf6-kube-api-access-gt9bx\") on node \"crc\" DevicePath \"\"" Dec 08 10:19:17 crc kubenswrapper[4776]: I1208 10:19:17.184895 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d3ff118-25c8-43d3-87ad-437fe201abf6-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 10:19:17 crc kubenswrapper[4776]: I1208 10:19:17.878866 4776 generic.go:334] "Generic (PLEG): container finished" podID="d0736884-b859-47c4-8c0e-99e40185f52a" containerID="58641cd1d8ac65a6ec865b673a3c8c686af5e477ebaa1ddb05dc2e93beb130cc" exitCode=0 Dec 08 10:19:17 crc kubenswrapper[4776]: I1208 10:19:17.879665 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-brdh2" event={"ID":"d0736884-b859-47c4-8c0e-99e40185f52a","Type":"ContainerDied","Data":"58641cd1d8ac65a6ec865b673a3c8c686af5e477ebaa1ddb05dc2e93beb130cc"} Dec 08 10:19:17 crc kubenswrapper[4776]: I1208 10:19:17.879774 4776 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/community-operators-78j2g" Dec 08 10:19:17 crc kubenswrapper[4776]: I1208 10:19:17.996958 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-78j2g"] Dec 08 10:19:18 crc kubenswrapper[4776]: I1208 10:19:18.010976 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-78j2g"] Dec 08 10:19:18 crc kubenswrapper[4776]: I1208 10:19:18.373591 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d3ff118-25c8-43d3-87ad-437fe201abf6" path="/var/lib/kubelet/pods/9d3ff118-25c8-43d3-87ad-437fe201abf6/volumes" Dec 08 10:19:18 crc kubenswrapper[4776]: I1208 10:19:18.895565 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-brdh2" event={"ID":"d0736884-b859-47c4-8c0e-99e40185f52a","Type":"ContainerStarted","Data":"247a3cf900c769e6de328c57887f998f9d27d029001235839d6d672759a6f8d5"} Dec 08 10:19:18 crc kubenswrapper[4776]: I1208 10:19:18.992744 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-brdh2" podStartSLOduration=3.423000316 podStartE2EDuration="10.992235416s" podCreationTimestamp="2025-12-08 10:19:08 +0000 UTC" firstStartedPulling="2025-12-08 10:19:10.772539923 +0000 UTC m=+4827.035764945" lastFinishedPulling="2025-12-08 10:19:18.341775033 +0000 UTC m=+4834.605000045" observedRunningTime="2025-12-08 10:19:18.988534944 +0000 UTC m=+4835.251759986" watchObservedRunningTime="2025-12-08 10:19:18.992235416 +0000 UTC m=+4835.255460438" Dec 08 10:19:19 crc kubenswrapper[4776]: I1208 10:19:19.225752 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-brdh2" Dec 08 10:19:19 crc kubenswrapper[4776]: I1208 10:19:19.225810 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-brdh2" Dec 08 10:19:20 crc kubenswrapper[4776]: I1208 10:19:20.488497 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-brdh2" podUID="d0736884-b859-47c4-8c0e-99e40185f52a" containerName="registry-server" probeResult="failure" output=< Dec 08 10:19:20 crc kubenswrapper[4776]: timeout: failed to connect service ":50051" within 1s Dec 08 10:19:20 crc kubenswrapper[4776]: > Dec 08 10:19:30 crc kubenswrapper[4776]: I1208 10:19:30.278382 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-brdh2" podUID="d0736884-b859-47c4-8c0e-99e40185f52a" containerName="registry-server" probeResult="failure" output=< Dec 08 10:19:30 crc kubenswrapper[4776]: timeout: failed to connect service ":50051" within 1s Dec 08 10:19:30 crc kubenswrapper[4776]: > Dec 08 10:19:39 crc kubenswrapper[4776]: I1208 10:19:39.283946 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-brdh2" Dec 08 10:19:39 crc kubenswrapper[4776]: I1208 10:19:39.353340 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-brdh2" Dec 08 10:19:39 crc kubenswrapper[4776]: I1208 10:19:39.704470 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-brdh2"] Dec 08 10:19:41 crc kubenswrapper[4776]: I1208 10:19:41.132258 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-brdh2" podUID="d0736884-b859-47c4-8c0e-99e40185f52a" containerName="registry-server" containerID="cri-o://247a3cf900c769e6de328c57887f998f9d27d029001235839d6d672759a6f8d5" gracePeriod=2 Dec 08 10:19:41 crc kubenswrapper[4776]: I1208 10:19:41.399256 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 10:19:41 crc kubenswrapper[4776]: I1208 10:19:41.400600 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 10:19:41 crc kubenswrapper[4776]: I1208 10:19:41.400680 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" Dec 08 10:19:41 crc kubenswrapper[4776]: I1208 10:19:41.402053 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"25f229d16585db3e47cf1694dd543abb708db87dbc5a4391839049281b460bc9"} pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 08 10:19:41 crc kubenswrapper[4776]: I1208 10:19:41.402158 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" containerID="cri-o://25f229d16585db3e47cf1694dd543abb708db87dbc5a4391839049281b460bc9" gracePeriod=600 Dec 08 10:19:42 crc kubenswrapper[4776]: I1208 10:19:42.016103 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-brdh2" Dec 08 10:19:42 crc kubenswrapper[4776]: I1208 10:19:42.078082 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0736884-b859-47c4-8c0e-99e40185f52a-catalog-content\") pod \"d0736884-b859-47c4-8c0e-99e40185f52a\" (UID: \"d0736884-b859-47c4-8c0e-99e40185f52a\") " Dec 08 10:19:42 crc kubenswrapper[4776]: I1208 10:19:42.078330 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0736884-b859-47c4-8c0e-99e40185f52a-utilities\") pod \"d0736884-b859-47c4-8c0e-99e40185f52a\" (UID: \"d0736884-b859-47c4-8c0e-99e40185f52a\") " Dec 08 10:19:42 crc kubenswrapper[4776]: I1208 10:19:42.078460 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7tjz\" (UniqueName: \"kubernetes.io/projected/d0736884-b859-47c4-8c0e-99e40185f52a-kube-api-access-b7tjz\") pod \"d0736884-b859-47c4-8c0e-99e40185f52a\" (UID: \"d0736884-b859-47c4-8c0e-99e40185f52a\") " Dec 08 10:19:42 crc kubenswrapper[4776]: I1208 10:19:42.081425 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0736884-b859-47c4-8c0e-99e40185f52a-utilities" (OuterVolumeSpecName: "utilities") pod "d0736884-b859-47c4-8c0e-99e40185f52a" (UID: "d0736884-b859-47c4-8c0e-99e40185f52a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 10:19:42 crc kubenswrapper[4776]: I1208 10:19:42.090973 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0736884-b859-47c4-8c0e-99e40185f52a-kube-api-access-b7tjz" (OuterVolumeSpecName: "kube-api-access-b7tjz") pod "d0736884-b859-47c4-8c0e-99e40185f52a" (UID: "d0736884-b859-47c4-8c0e-99e40185f52a"). InnerVolumeSpecName "kube-api-access-b7tjz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 10:19:42 crc kubenswrapper[4776]: I1208 10:19:42.174999 4776 generic.go:334] "Generic (PLEG): container finished" podID="d0736884-b859-47c4-8c0e-99e40185f52a" containerID="247a3cf900c769e6de328c57887f998f9d27d029001235839d6d672759a6f8d5" exitCode=0 Dec 08 10:19:42 crc kubenswrapper[4776]: I1208 10:19:42.175044 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-brdh2" event={"ID":"d0736884-b859-47c4-8c0e-99e40185f52a","Type":"ContainerDied","Data":"247a3cf900c769e6de328c57887f998f9d27d029001235839d6d672759a6f8d5"} Dec 08 10:19:42 crc kubenswrapper[4776]: I1208 10:19:42.175084 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-brdh2" event={"ID":"d0736884-b859-47c4-8c0e-99e40185f52a","Type":"ContainerDied","Data":"dd52045e210a76e3161c7c2070b0d940f59436960d4c7c14137ff1e43766b1bb"} Dec 08 10:19:42 crc kubenswrapper[4776]: I1208 10:19:42.175087 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-brdh2" Dec 08 10:19:42 crc kubenswrapper[4776]: I1208 10:19:42.175514 4776 scope.go:117] "RemoveContainer" containerID="247a3cf900c769e6de328c57887f998f9d27d029001235839d6d672759a6f8d5" Dec 08 10:19:42 crc kubenswrapper[4776]: I1208 10:19:42.179857 4776 generic.go:334] "Generic (PLEG): container finished" podID="c9788ab1-1031-4103-a769-a4b3177c7268" containerID="25f229d16585db3e47cf1694dd543abb708db87dbc5a4391839049281b460bc9" exitCode=0 Dec 08 10:19:42 crc kubenswrapper[4776]: I1208 10:19:42.179892 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" event={"ID":"c9788ab1-1031-4103-a769-a4b3177c7268","Type":"ContainerDied","Data":"25f229d16585db3e47cf1694dd543abb708db87dbc5a4391839049281b460bc9"} Dec 08 10:19:42 crc kubenswrapper[4776]: I1208 10:19:42.179912 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" event={"ID":"c9788ab1-1031-4103-a769-a4b3177c7268","Type":"ContainerStarted","Data":"ba2de61c4584af05bb1541b86bdc85deedaeb51f4cbdde10a32dbe542154d0e0"} Dec 08 10:19:42 crc kubenswrapper[4776]: I1208 10:19:42.188467 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0736884-b859-47c4-8c0e-99e40185f52a-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 10:19:42 crc kubenswrapper[4776]: I1208 10:19:42.188511 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7tjz\" (UniqueName: \"kubernetes.io/projected/d0736884-b859-47c4-8c0e-99e40185f52a-kube-api-access-b7tjz\") on node \"crc\" DevicePath \"\"" Dec 08 10:19:42 crc kubenswrapper[4776]: I1208 10:19:42.212866 4776 scope.go:117] "RemoveContainer" containerID="58641cd1d8ac65a6ec865b673a3c8c686af5e477ebaa1ddb05dc2e93beb130cc" Dec 08 10:19:42 crc kubenswrapper[4776]: I1208 10:19:42.245639 4776 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0736884-b859-47c4-8c0e-99e40185f52a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d0736884-b859-47c4-8c0e-99e40185f52a" (UID: "d0736884-b859-47c4-8c0e-99e40185f52a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 10:19:42 crc kubenswrapper[4776]: I1208 10:19:42.248996 4776 scope.go:117] "RemoveContainer" containerID="dcea67017c22e5266dc84df2ff930ee54538d9849aee2d49d683cae803cedc9e" Dec 08 10:19:42 crc kubenswrapper[4776]: I1208 10:19:42.290652 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0736884-b859-47c4-8c0e-99e40185f52a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 10:19:42 crc kubenswrapper[4776]: I1208 10:19:42.329400 4776 scope.go:117] "RemoveContainer" containerID="247a3cf900c769e6de328c57887f998f9d27d029001235839d6d672759a6f8d5" Dec 08 10:19:42 crc kubenswrapper[4776]: E1208 10:19:42.330867 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"247a3cf900c769e6de328c57887f998f9d27d029001235839d6d672759a6f8d5\": container with ID starting with 247a3cf900c769e6de328c57887f998f9d27d029001235839d6d672759a6f8d5 not found: ID does not exist" containerID="247a3cf900c769e6de328c57887f998f9d27d029001235839d6d672759a6f8d5" Dec 08 10:19:42 crc kubenswrapper[4776]: I1208 10:19:42.330919 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"247a3cf900c769e6de328c57887f998f9d27d029001235839d6d672759a6f8d5"} err="failed to get container status \"247a3cf900c769e6de328c57887f998f9d27d029001235839d6d672759a6f8d5\": rpc error: code = NotFound desc = could not find container \"247a3cf900c769e6de328c57887f998f9d27d029001235839d6d672759a6f8d5\": container with ID starting with 
247a3cf900c769e6de328c57887f998f9d27d029001235839d6d672759a6f8d5 not found: ID does not exist" Dec 08 10:19:42 crc kubenswrapper[4776]: I1208 10:19:42.330951 4776 scope.go:117] "RemoveContainer" containerID="58641cd1d8ac65a6ec865b673a3c8c686af5e477ebaa1ddb05dc2e93beb130cc" Dec 08 10:19:42 crc kubenswrapper[4776]: E1208 10:19:42.331562 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58641cd1d8ac65a6ec865b673a3c8c686af5e477ebaa1ddb05dc2e93beb130cc\": container with ID starting with 58641cd1d8ac65a6ec865b673a3c8c686af5e477ebaa1ddb05dc2e93beb130cc not found: ID does not exist" containerID="58641cd1d8ac65a6ec865b673a3c8c686af5e477ebaa1ddb05dc2e93beb130cc" Dec 08 10:19:42 crc kubenswrapper[4776]: I1208 10:19:42.331601 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58641cd1d8ac65a6ec865b673a3c8c686af5e477ebaa1ddb05dc2e93beb130cc"} err="failed to get container status \"58641cd1d8ac65a6ec865b673a3c8c686af5e477ebaa1ddb05dc2e93beb130cc\": rpc error: code = NotFound desc = could not find container \"58641cd1d8ac65a6ec865b673a3c8c686af5e477ebaa1ddb05dc2e93beb130cc\": container with ID starting with 58641cd1d8ac65a6ec865b673a3c8c686af5e477ebaa1ddb05dc2e93beb130cc not found: ID does not exist" Dec 08 10:19:42 crc kubenswrapper[4776]: I1208 10:19:42.331623 4776 scope.go:117] "RemoveContainer" containerID="dcea67017c22e5266dc84df2ff930ee54538d9849aee2d49d683cae803cedc9e" Dec 08 10:19:42 crc kubenswrapper[4776]: E1208 10:19:42.332183 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcea67017c22e5266dc84df2ff930ee54538d9849aee2d49d683cae803cedc9e\": container with ID starting with dcea67017c22e5266dc84df2ff930ee54538d9849aee2d49d683cae803cedc9e not found: ID does not exist" containerID="dcea67017c22e5266dc84df2ff930ee54538d9849aee2d49d683cae803cedc9e" Dec 08 10:19:42 crc 
kubenswrapper[4776]: I1208 10:19:42.332229 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcea67017c22e5266dc84df2ff930ee54538d9849aee2d49d683cae803cedc9e"} err="failed to get container status \"dcea67017c22e5266dc84df2ff930ee54538d9849aee2d49d683cae803cedc9e\": rpc error: code = NotFound desc = could not find container \"dcea67017c22e5266dc84df2ff930ee54538d9849aee2d49d683cae803cedc9e\": container with ID starting with dcea67017c22e5266dc84df2ff930ee54538d9849aee2d49d683cae803cedc9e not found: ID does not exist" Dec 08 10:19:42 crc kubenswrapper[4776]: I1208 10:19:42.332276 4776 scope.go:117] "RemoveContainer" containerID="448474efd555c5bc5445de386b34183d50e7a6a8112936335b71be6c5bf4d95c" Dec 08 10:19:42 crc kubenswrapper[4776]: I1208 10:19:42.507940 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-brdh2"] Dec 08 10:19:42 crc kubenswrapper[4776]: I1208 10:19:42.519215 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-brdh2"] Dec 08 10:19:44 crc kubenswrapper[4776]: I1208 10:19:44.359335 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0736884-b859-47c4-8c0e-99e40185f52a" path="/var/lib/kubelet/pods/d0736884-b859-47c4-8c0e-99e40185f52a/volumes" Dec 08 10:20:30 crc kubenswrapper[4776]: I1208 10:20:30.341191 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vzzx6"] Dec 08 10:20:30 crc kubenswrapper[4776]: E1208 10:20:30.343053 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d3ff118-25c8-43d3-87ad-437fe201abf6" containerName="extract-content" Dec 08 10:20:30 crc kubenswrapper[4776]: I1208 10:20:30.343081 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d3ff118-25c8-43d3-87ad-437fe201abf6" containerName="extract-content" Dec 08 10:20:30 crc kubenswrapper[4776]: E1208 10:20:30.343109 4776 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="d0736884-b859-47c4-8c0e-99e40185f52a" containerName="extract-utilities" Dec 08 10:20:30 crc kubenswrapper[4776]: I1208 10:20:30.343119 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0736884-b859-47c4-8c0e-99e40185f52a" containerName="extract-utilities" Dec 08 10:20:30 crc kubenswrapper[4776]: E1208 10:20:30.343145 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d3ff118-25c8-43d3-87ad-437fe201abf6" containerName="extract-utilities" Dec 08 10:20:30 crc kubenswrapper[4776]: I1208 10:20:30.343157 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d3ff118-25c8-43d3-87ad-437fe201abf6" containerName="extract-utilities" Dec 08 10:20:30 crc kubenswrapper[4776]: E1208 10:20:30.343211 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0736884-b859-47c4-8c0e-99e40185f52a" containerName="registry-server" Dec 08 10:20:30 crc kubenswrapper[4776]: I1208 10:20:30.343222 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0736884-b859-47c4-8c0e-99e40185f52a" containerName="registry-server" Dec 08 10:20:30 crc kubenswrapper[4776]: E1208 10:20:30.343241 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0736884-b859-47c4-8c0e-99e40185f52a" containerName="extract-content" Dec 08 10:20:30 crc kubenswrapper[4776]: I1208 10:20:30.343249 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0736884-b859-47c4-8c0e-99e40185f52a" containerName="extract-content" Dec 08 10:20:30 crc kubenswrapper[4776]: E1208 10:20:30.343290 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d3ff118-25c8-43d3-87ad-437fe201abf6" containerName="registry-server" Dec 08 10:20:30 crc kubenswrapper[4776]: I1208 10:20:30.343297 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d3ff118-25c8-43d3-87ad-437fe201abf6" containerName="registry-server" Dec 08 10:20:30 crc kubenswrapper[4776]: I1208 10:20:30.343758 4776 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="9d3ff118-25c8-43d3-87ad-437fe201abf6" containerName="registry-server" Dec 08 10:20:30 crc kubenswrapper[4776]: I1208 10:20:30.343805 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0736884-b859-47c4-8c0e-99e40185f52a" containerName="registry-server" Dec 08 10:20:30 crc kubenswrapper[4776]: I1208 10:20:30.346394 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vzzx6" Dec 08 10:20:30 crc kubenswrapper[4776]: I1208 10:20:30.362951 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vzzx6"] Dec 08 10:20:30 crc kubenswrapper[4776]: I1208 10:20:30.439487 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4932e55-4ad1-4094-8a74-4562ce240c6a-utilities\") pod \"certified-operators-vzzx6\" (UID: \"f4932e55-4ad1-4094-8a74-4562ce240c6a\") " pod="openshift-marketplace/certified-operators-vzzx6" Dec 08 10:20:30 crc kubenswrapper[4776]: I1208 10:20:30.439787 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgtl8\" (UniqueName: \"kubernetes.io/projected/f4932e55-4ad1-4094-8a74-4562ce240c6a-kube-api-access-jgtl8\") pod \"certified-operators-vzzx6\" (UID: \"f4932e55-4ad1-4094-8a74-4562ce240c6a\") " pod="openshift-marketplace/certified-operators-vzzx6" Dec 08 10:20:30 crc kubenswrapper[4776]: I1208 10:20:30.439850 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4932e55-4ad1-4094-8a74-4562ce240c6a-catalog-content\") pod \"certified-operators-vzzx6\" (UID: \"f4932e55-4ad1-4094-8a74-4562ce240c6a\") " pod="openshift-marketplace/certified-operators-vzzx6" Dec 08 10:20:30 crc kubenswrapper[4776]: I1208 10:20:30.542318 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4932e55-4ad1-4094-8a74-4562ce240c6a-utilities\") pod \"certified-operators-vzzx6\" (UID: \"f4932e55-4ad1-4094-8a74-4562ce240c6a\") " pod="openshift-marketplace/certified-operators-vzzx6" Dec 08 10:20:30 crc kubenswrapper[4776]: I1208 10:20:30.542469 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgtl8\" (UniqueName: \"kubernetes.io/projected/f4932e55-4ad1-4094-8a74-4562ce240c6a-kube-api-access-jgtl8\") pod \"certified-operators-vzzx6\" (UID: \"f4932e55-4ad1-4094-8a74-4562ce240c6a\") " pod="openshift-marketplace/certified-operators-vzzx6" Dec 08 10:20:30 crc kubenswrapper[4776]: I1208 10:20:30.542526 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4932e55-4ad1-4094-8a74-4562ce240c6a-catalog-content\") pod \"certified-operators-vzzx6\" (UID: \"f4932e55-4ad1-4094-8a74-4562ce240c6a\") " pod="openshift-marketplace/certified-operators-vzzx6" Dec 08 10:20:30 crc kubenswrapper[4776]: I1208 10:20:30.543618 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4932e55-4ad1-4094-8a74-4562ce240c6a-catalog-content\") pod \"certified-operators-vzzx6\" (UID: \"f4932e55-4ad1-4094-8a74-4562ce240c6a\") " pod="openshift-marketplace/certified-operators-vzzx6" Dec 08 10:20:30 crc kubenswrapper[4776]: I1208 10:20:30.543917 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4932e55-4ad1-4094-8a74-4562ce240c6a-utilities\") pod \"certified-operators-vzzx6\" (UID: \"f4932e55-4ad1-4094-8a74-4562ce240c6a\") " pod="openshift-marketplace/certified-operators-vzzx6" Dec 08 10:20:30 crc kubenswrapper[4776]: I1208 10:20:30.564526 4776 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-jgtl8\" (UniqueName: \"kubernetes.io/projected/f4932e55-4ad1-4094-8a74-4562ce240c6a-kube-api-access-jgtl8\") pod \"certified-operators-vzzx6\" (UID: \"f4932e55-4ad1-4094-8a74-4562ce240c6a\") " pod="openshift-marketplace/certified-operators-vzzx6" Dec 08 10:20:30 crc kubenswrapper[4776]: I1208 10:20:30.671605 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vzzx6" Dec 08 10:20:31 crc kubenswrapper[4776]: I1208 10:20:31.148866 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vzzx6"] Dec 08 10:20:31 crc kubenswrapper[4776]: I1208 10:20:31.691089 4776 generic.go:334] "Generic (PLEG): container finished" podID="f4932e55-4ad1-4094-8a74-4562ce240c6a" containerID="1097de8b6696e2f7c8dc2882ced9b9e47189f55d9c1acb26d90f6b05b7bc28d3" exitCode=0 Dec 08 10:20:31 crc kubenswrapper[4776]: I1208 10:20:31.691200 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzzx6" event={"ID":"f4932e55-4ad1-4094-8a74-4562ce240c6a","Type":"ContainerDied","Data":"1097de8b6696e2f7c8dc2882ced9b9e47189f55d9c1acb26d90f6b05b7bc28d3"} Dec 08 10:20:31 crc kubenswrapper[4776]: I1208 10:20:31.691485 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzzx6" event={"ID":"f4932e55-4ad1-4094-8a74-4562ce240c6a","Type":"ContainerStarted","Data":"1162415a7d9ce199d6abe0e57fadc8a8a81c5ee7ad3c9317542abaf083c7ad20"} Dec 08 10:20:32 crc kubenswrapper[4776]: I1208 10:20:32.702610 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzzx6" event={"ID":"f4932e55-4ad1-4094-8a74-4562ce240c6a","Type":"ContainerStarted","Data":"9cad1d3ee3f346932f0e328ec4acc1853143924157f8da5a02bcced2b9d485b8"} Dec 08 10:20:34 crc kubenswrapper[4776]: I1208 10:20:34.724142 4776 generic.go:334] "Generic (PLEG): 
container finished" podID="f4932e55-4ad1-4094-8a74-4562ce240c6a" containerID="9cad1d3ee3f346932f0e328ec4acc1853143924157f8da5a02bcced2b9d485b8" exitCode=0 Dec 08 10:20:34 crc kubenswrapper[4776]: I1208 10:20:34.724215 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzzx6" event={"ID":"f4932e55-4ad1-4094-8a74-4562ce240c6a","Type":"ContainerDied","Data":"9cad1d3ee3f346932f0e328ec4acc1853143924157f8da5a02bcced2b9d485b8"} Dec 08 10:20:35 crc kubenswrapper[4776]: I1208 10:20:35.739028 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzzx6" event={"ID":"f4932e55-4ad1-4094-8a74-4562ce240c6a","Type":"ContainerStarted","Data":"c097092ef30126ca0a648ad5d3053668e0551ba25e954448f00038acce4ca04d"} Dec 08 10:20:35 crc kubenswrapper[4776]: I1208 10:20:35.765816 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vzzx6" podStartSLOduration=2.347111441 podStartE2EDuration="5.765797506s" podCreationTimestamp="2025-12-08 10:20:30 +0000 UTC" firstStartedPulling="2025-12-08 10:20:31.693750842 +0000 UTC m=+4907.956975864" lastFinishedPulling="2025-12-08 10:20:35.112436907 +0000 UTC m=+4911.375661929" observedRunningTime="2025-12-08 10:20:35.760047978 +0000 UTC m=+4912.023273030" watchObservedRunningTime="2025-12-08 10:20:35.765797506 +0000 UTC m=+4912.029022528" Dec 08 10:20:40 crc kubenswrapper[4776]: I1208 10:20:40.672673 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vzzx6" Dec 08 10:20:40 crc kubenswrapper[4776]: I1208 10:20:40.673219 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vzzx6" Dec 08 10:20:40 crc kubenswrapper[4776]: I1208 10:20:40.756406 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-vzzx6" Dec 08 10:20:40 crc kubenswrapper[4776]: I1208 10:20:40.846316 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vzzx6" Dec 08 10:20:40 crc kubenswrapper[4776]: I1208 10:20:40.993674 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vzzx6"] Dec 08 10:20:42 crc kubenswrapper[4776]: I1208 10:20:42.811828 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vzzx6" podUID="f4932e55-4ad1-4094-8a74-4562ce240c6a" containerName="registry-server" containerID="cri-o://c097092ef30126ca0a648ad5d3053668e0551ba25e954448f00038acce4ca04d" gracePeriod=2 Dec 08 10:20:43 crc kubenswrapper[4776]: I1208 10:20:43.445014 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vzzx6" Dec 08 10:20:43 crc kubenswrapper[4776]: I1208 10:20:43.547449 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgtl8\" (UniqueName: \"kubernetes.io/projected/f4932e55-4ad1-4094-8a74-4562ce240c6a-kube-api-access-jgtl8\") pod \"f4932e55-4ad1-4094-8a74-4562ce240c6a\" (UID: \"f4932e55-4ad1-4094-8a74-4562ce240c6a\") " Dec 08 10:20:43 crc kubenswrapper[4776]: I1208 10:20:43.547680 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4932e55-4ad1-4094-8a74-4562ce240c6a-utilities\") pod \"f4932e55-4ad1-4094-8a74-4562ce240c6a\" (UID: \"f4932e55-4ad1-4094-8a74-4562ce240c6a\") " Dec 08 10:20:43 crc kubenswrapper[4776]: I1208 10:20:43.547936 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4932e55-4ad1-4094-8a74-4562ce240c6a-catalog-content\") pod 
\"f4932e55-4ad1-4094-8a74-4562ce240c6a\" (UID: \"f4932e55-4ad1-4094-8a74-4562ce240c6a\") " Dec 08 10:20:43 crc kubenswrapper[4776]: I1208 10:20:43.548797 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4932e55-4ad1-4094-8a74-4562ce240c6a-utilities" (OuterVolumeSpecName: "utilities") pod "f4932e55-4ad1-4094-8a74-4562ce240c6a" (UID: "f4932e55-4ad1-4094-8a74-4562ce240c6a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 10:20:43 crc kubenswrapper[4776]: I1208 10:20:43.555510 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4932e55-4ad1-4094-8a74-4562ce240c6a-kube-api-access-jgtl8" (OuterVolumeSpecName: "kube-api-access-jgtl8") pod "f4932e55-4ad1-4094-8a74-4562ce240c6a" (UID: "f4932e55-4ad1-4094-8a74-4562ce240c6a"). InnerVolumeSpecName "kube-api-access-jgtl8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 10:20:43 crc kubenswrapper[4776]: I1208 10:20:43.597899 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4932e55-4ad1-4094-8a74-4562ce240c6a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f4932e55-4ad1-4094-8a74-4562ce240c6a" (UID: "f4932e55-4ad1-4094-8a74-4562ce240c6a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 10:20:43 crc kubenswrapper[4776]: I1208 10:20:43.650857 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgtl8\" (UniqueName: \"kubernetes.io/projected/f4932e55-4ad1-4094-8a74-4562ce240c6a-kube-api-access-jgtl8\") on node \"crc\" DevicePath \"\"" Dec 08 10:20:43 crc kubenswrapper[4776]: I1208 10:20:43.650894 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4932e55-4ad1-4094-8a74-4562ce240c6a-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 10:20:43 crc kubenswrapper[4776]: I1208 10:20:43.650905 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4932e55-4ad1-4094-8a74-4562ce240c6a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 10:20:43 crc kubenswrapper[4776]: I1208 10:20:43.824237 4776 generic.go:334] "Generic (PLEG): container finished" podID="f4932e55-4ad1-4094-8a74-4562ce240c6a" containerID="c097092ef30126ca0a648ad5d3053668e0551ba25e954448f00038acce4ca04d" exitCode=0 Dec 08 10:20:43 crc kubenswrapper[4776]: I1208 10:20:43.824368 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vzzx6" Dec 08 10:20:43 crc kubenswrapper[4776]: I1208 10:20:43.824404 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzzx6" event={"ID":"f4932e55-4ad1-4094-8a74-4562ce240c6a","Type":"ContainerDied","Data":"c097092ef30126ca0a648ad5d3053668e0551ba25e954448f00038acce4ca04d"} Dec 08 10:20:43 crc kubenswrapper[4776]: I1208 10:20:43.825256 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzzx6" event={"ID":"f4932e55-4ad1-4094-8a74-4562ce240c6a","Type":"ContainerDied","Data":"1162415a7d9ce199d6abe0e57fadc8a8a81c5ee7ad3c9317542abaf083c7ad20"} Dec 08 10:20:43 crc kubenswrapper[4776]: I1208 10:20:43.825293 4776 scope.go:117] "RemoveContainer" containerID="c097092ef30126ca0a648ad5d3053668e0551ba25e954448f00038acce4ca04d" Dec 08 10:20:43 crc kubenswrapper[4776]: I1208 10:20:43.859761 4776 scope.go:117] "RemoveContainer" containerID="9cad1d3ee3f346932f0e328ec4acc1853143924157f8da5a02bcced2b9d485b8" Dec 08 10:20:43 crc kubenswrapper[4776]: I1208 10:20:43.864246 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vzzx6"] Dec 08 10:20:43 crc kubenswrapper[4776]: I1208 10:20:43.874969 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vzzx6"] Dec 08 10:20:43 crc kubenswrapper[4776]: I1208 10:20:43.902632 4776 scope.go:117] "RemoveContainer" containerID="1097de8b6696e2f7c8dc2882ced9b9e47189f55d9c1acb26d90f6b05b7bc28d3" Dec 08 10:20:43 crc kubenswrapper[4776]: I1208 10:20:43.936104 4776 scope.go:117] "RemoveContainer" containerID="c097092ef30126ca0a648ad5d3053668e0551ba25e954448f00038acce4ca04d" Dec 08 10:20:43 crc kubenswrapper[4776]: E1208 10:20:43.936676 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c097092ef30126ca0a648ad5d3053668e0551ba25e954448f00038acce4ca04d\": container with ID starting with c097092ef30126ca0a648ad5d3053668e0551ba25e954448f00038acce4ca04d not found: ID does not exist" containerID="c097092ef30126ca0a648ad5d3053668e0551ba25e954448f00038acce4ca04d" Dec 08 10:20:43 crc kubenswrapper[4776]: I1208 10:20:43.936723 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c097092ef30126ca0a648ad5d3053668e0551ba25e954448f00038acce4ca04d"} err="failed to get container status \"c097092ef30126ca0a648ad5d3053668e0551ba25e954448f00038acce4ca04d\": rpc error: code = NotFound desc = could not find container \"c097092ef30126ca0a648ad5d3053668e0551ba25e954448f00038acce4ca04d\": container with ID starting with c097092ef30126ca0a648ad5d3053668e0551ba25e954448f00038acce4ca04d not found: ID does not exist" Dec 08 10:20:43 crc kubenswrapper[4776]: I1208 10:20:43.936756 4776 scope.go:117] "RemoveContainer" containerID="9cad1d3ee3f346932f0e328ec4acc1853143924157f8da5a02bcced2b9d485b8" Dec 08 10:20:43 crc kubenswrapper[4776]: E1208 10:20:43.937207 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cad1d3ee3f346932f0e328ec4acc1853143924157f8da5a02bcced2b9d485b8\": container with ID starting with 9cad1d3ee3f346932f0e328ec4acc1853143924157f8da5a02bcced2b9d485b8 not found: ID does not exist" containerID="9cad1d3ee3f346932f0e328ec4acc1853143924157f8da5a02bcced2b9d485b8" Dec 08 10:20:43 crc kubenswrapper[4776]: I1208 10:20:43.937242 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cad1d3ee3f346932f0e328ec4acc1853143924157f8da5a02bcced2b9d485b8"} err="failed to get container status \"9cad1d3ee3f346932f0e328ec4acc1853143924157f8da5a02bcced2b9d485b8\": rpc error: code = NotFound desc = could not find container \"9cad1d3ee3f346932f0e328ec4acc1853143924157f8da5a02bcced2b9d485b8\": container with ID 
starting with 9cad1d3ee3f346932f0e328ec4acc1853143924157f8da5a02bcced2b9d485b8 not found: ID does not exist" Dec 08 10:20:43 crc kubenswrapper[4776]: I1208 10:20:43.937268 4776 scope.go:117] "RemoveContainer" containerID="1097de8b6696e2f7c8dc2882ced9b9e47189f55d9c1acb26d90f6b05b7bc28d3" Dec 08 10:20:43 crc kubenswrapper[4776]: E1208 10:20:43.937721 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1097de8b6696e2f7c8dc2882ced9b9e47189f55d9c1acb26d90f6b05b7bc28d3\": container with ID starting with 1097de8b6696e2f7c8dc2882ced9b9e47189f55d9c1acb26d90f6b05b7bc28d3 not found: ID does not exist" containerID="1097de8b6696e2f7c8dc2882ced9b9e47189f55d9c1acb26d90f6b05b7bc28d3" Dec 08 10:20:43 crc kubenswrapper[4776]: I1208 10:20:43.937754 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1097de8b6696e2f7c8dc2882ced9b9e47189f55d9c1acb26d90f6b05b7bc28d3"} err="failed to get container status \"1097de8b6696e2f7c8dc2882ced9b9e47189f55d9c1acb26d90f6b05b7bc28d3\": rpc error: code = NotFound desc = could not find container \"1097de8b6696e2f7c8dc2882ced9b9e47189f55d9c1acb26d90f6b05b7bc28d3\": container with ID starting with 1097de8b6696e2f7c8dc2882ced9b9e47189f55d9c1acb26d90f6b05b7bc28d3 not found: ID does not exist" Dec 08 10:20:44 crc kubenswrapper[4776]: I1208 10:20:44.359648 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4932e55-4ad1-4094-8a74-4562ce240c6a" path="/var/lib/kubelet/pods/f4932e55-4ad1-4094-8a74-4562ce240c6a/volumes" Dec 08 10:21:41 crc kubenswrapper[4776]: I1208 10:21:41.398725 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 10:21:41 crc kubenswrapper[4776]: I1208 
10:21:41.399363 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 10:22:11 crc kubenswrapper[4776]: I1208 10:22:11.398802 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 10:22:11 crc kubenswrapper[4776]: I1208 10:22:11.399425 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 10:22:41 crc kubenswrapper[4776]: I1208 10:22:41.399456 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 10:22:41 crc kubenswrapper[4776]: I1208 10:22:41.399968 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 10:22:41 crc kubenswrapper[4776]: I1208 10:22:41.400013 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" Dec 08 10:22:41 crc kubenswrapper[4776]: I1208 10:22:41.400891 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ba2de61c4584af05bb1541b86bdc85deedaeb51f4cbdde10a32dbe542154d0e0"} pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 08 10:22:41 crc kubenswrapper[4776]: I1208 10:22:41.400956 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" containerID="cri-o://ba2de61c4584af05bb1541b86bdc85deedaeb51f4cbdde10a32dbe542154d0e0" gracePeriod=600 Dec 08 10:22:41 crc kubenswrapper[4776]: E1208 10:22:41.524688 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:22:42 crc kubenswrapper[4776]: I1208 10:22:42.130943 4776 generic.go:334] "Generic (PLEG): container finished" podID="c9788ab1-1031-4103-a769-a4b3177c7268" containerID="ba2de61c4584af05bb1541b86bdc85deedaeb51f4cbdde10a32dbe542154d0e0" exitCode=0 Dec 08 10:22:42 crc kubenswrapper[4776]: I1208 10:22:42.131292 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" event={"ID":"c9788ab1-1031-4103-a769-a4b3177c7268","Type":"ContainerDied","Data":"ba2de61c4584af05bb1541b86bdc85deedaeb51f4cbdde10a32dbe542154d0e0"} Dec 08 10:22:42 crc 
kubenswrapper[4776]: I1208 10:22:42.131470 4776 scope.go:117] "RemoveContainer" containerID="25f229d16585db3e47cf1694dd543abb708db87dbc5a4391839049281b460bc9" Dec 08 10:22:42 crc kubenswrapper[4776]: I1208 10:22:42.132257 4776 scope.go:117] "RemoveContainer" containerID="ba2de61c4584af05bb1541b86bdc85deedaeb51f4cbdde10a32dbe542154d0e0" Dec 08 10:22:42 crc kubenswrapper[4776]: E1208 10:22:42.132563 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:22:55 crc kubenswrapper[4776]: I1208 10:22:55.344160 4776 scope.go:117] "RemoveContainer" containerID="ba2de61c4584af05bb1541b86bdc85deedaeb51f4cbdde10a32dbe542154d0e0" Dec 08 10:22:55 crc kubenswrapper[4776]: E1208 10:22:55.344999 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:23:09 crc kubenswrapper[4776]: I1208 10:23:09.344381 4776 scope.go:117] "RemoveContainer" containerID="ba2de61c4584af05bb1541b86bdc85deedaeb51f4cbdde10a32dbe542154d0e0" Dec 08 10:23:09 crc kubenswrapper[4776]: E1208 10:23:09.345162 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268"
Dec 08 10:23:18 crc kubenswrapper[4776]: I1208 10:23:18.752899 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="425d947a-2a85-4a03-853f-a60f54515a57" containerName="galera" probeResult="failure" output="command timed out"
Dec 08 10:23:20 crc kubenswrapper[4776]: I1208 10:23:20.344905 4776 scope.go:117] "RemoveContainer" containerID="ba2de61c4584af05bb1541b86bdc85deedaeb51f4cbdde10a32dbe542154d0e0"
Dec 08 10:23:20 crc kubenswrapper[4776]: E1208 10:23:20.345744 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268"
Dec 08 10:23:32 crc kubenswrapper[4776]: I1208 10:23:32.343659 4776 scope.go:117] "RemoveContainer" containerID="ba2de61c4584af05bb1541b86bdc85deedaeb51f4cbdde10a32dbe542154d0e0"
Dec 08 10:23:32 crc kubenswrapper[4776]: E1208 10:23:32.344341 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268"
Dec 08 10:23:46 crc kubenswrapper[4776]: I1208 10:23:46.343824 4776 scope.go:117] "RemoveContainer" containerID="ba2de61c4584af05bb1541b86bdc85deedaeb51f4cbdde10a32dbe542154d0e0"
Dec 08 10:23:46 crc kubenswrapper[4776]: E1208 10:23:46.344588 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268"
Dec 08 10:23:57 crc kubenswrapper[4776]: I1208 10:23:57.343975 4776 scope.go:117] "RemoveContainer" containerID="ba2de61c4584af05bb1541b86bdc85deedaeb51f4cbdde10a32dbe542154d0e0"
Dec 08 10:23:57 crc kubenswrapper[4776]: E1208 10:23:57.345353 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268"
Dec 08 10:24:09 crc kubenswrapper[4776]: I1208 10:24:09.344099 4776 scope.go:117] "RemoveContainer" containerID="ba2de61c4584af05bb1541b86bdc85deedaeb51f4cbdde10a32dbe542154d0e0"
Dec 08 10:24:09 crc kubenswrapper[4776]: E1208 10:24:09.344941 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268"
Dec 08 10:24:20 crc kubenswrapper[4776]: I1208 10:24:20.344445 4776 scope.go:117] "RemoveContainer" containerID="ba2de61c4584af05bb1541b86bdc85deedaeb51f4cbdde10a32dbe542154d0e0"
Dec 08 10:24:20 crc kubenswrapper[4776]: E1208 10:24:20.345307 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268"
Dec 08 10:24:32 crc kubenswrapper[4776]: I1208 10:24:32.344487 4776 scope.go:117] "RemoveContainer" containerID="ba2de61c4584af05bb1541b86bdc85deedaeb51f4cbdde10a32dbe542154d0e0"
Dec 08 10:24:32 crc kubenswrapper[4776]: E1208 10:24:32.345430 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268"
Dec 08 10:24:47 crc kubenswrapper[4776]: I1208 10:24:47.344224 4776 scope.go:117] "RemoveContainer" containerID="ba2de61c4584af05bb1541b86bdc85deedaeb51f4cbdde10a32dbe542154d0e0"
Dec 08 10:24:47 crc kubenswrapper[4776]: E1208 10:24:47.345886 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268"
Dec 08 10:25:01 crc kubenswrapper[4776]: I1208 10:25:01.343436 4776 scope.go:117] "RemoveContainer" containerID="ba2de61c4584af05bb1541b86bdc85deedaeb51f4cbdde10a32dbe542154d0e0"
Dec 08 10:25:01 crc kubenswrapper[4776]: E1208 10:25:01.344292 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268"
Dec 08 10:25:15 crc kubenswrapper[4776]: I1208 10:25:15.343949 4776 scope.go:117] "RemoveContainer" containerID="ba2de61c4584af05bb1541b86bdc85deedaeb51f4cbdde10a32dbe542154d0e0"
Dec 08 10:25:15 crc kubenswrapper[4776]: E1208 10:25:15.344761 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268"
Dec 08 10:25:19 crc kubenswrapper[4776]: I1208 10:25:19.779694 4776 scope.go:117] "RemoveContainer" containerID="c865ae1eb668a3fc5ac9d932d7ded47921b774d5aa69d9101c86420cba2ee678"
Dec 08 10:25:19 crc kubenswrapper[4776]: I1208 10:25:19.911761 4776 scope.go:117] "RemoveContainer" containerID="a5d34d183f589076615ff79744c5e7dcae62951f7757714a90b04b1fe84cf609"
Dec 08 10:25:19 crc kubenswrapper[4776]: I1208 10:25:19.981268 4776 scope.go:117] "RemoveContainer" containerID="d0327c9a598797c595199e68afe777aba529989ad81fbf74f53a1d0d1835a915"
Dec 08 10:25:27 crc kubenswrapper[4776]: I1208 10:25:27.144952 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bpx8r"]
Dec 08 10:25:27 crc kubenswrapper[4776]: E1208 10:25:27.145955 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4932e55-4ad1-4094-8a74-4562ce240c6a" containerName="extract-content"
Dec 08 10:25:27 crc kubenswrapper[4776]: I1208 10:25:27.145970 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4932e55-4ad1-4094-8a74-4562ce240c6a" containerName="extract-content"
Dec 08 10:25:27 crc kubenswrapper[4776]: E1208 10:25:27.145990 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4932e55-4ad1-4094-8a74-4562ce240c6a" containerName="registry-server"
Dec 08 10:25:27 crc kubenswrapper[4776]: I1208 10:25:27.145998 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4932e55-4ad1-4094-8a74-4562ce240c6a" containerName="registry-server"
Dec 08 10:25:27 crc kubenswrapper[4776]: E1208 10:25:27.146020 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4932e55-4ad1-4094-8a74-4562ce240c6a" containerName="extract-utilities"
Dec 08 10:25:27 crc kubenswrapper[4776]: I1208 10:25:27.146027 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4932e55-4ad1-4094-8a74-4562ce240c6a" containerName="extract-utilities"
Dec 08 10:25:27 crc kubenswrapper[4776]: I1208 10:25:27.146276 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4932e55-4ad1-4094-8a74-4562ce240c6a" containerName="registry-server"
Dec 08 10:25:27 crc kubenswrapper[4776]: I1208 10:25:27.148289 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bpx8r"
Dec 08 10:25:27 crc kubenswrapper[4776]: I1208 10:25:27.156754 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bpx8r"]
Dec 08 10:25:27 crc kubenswrapper[4776]: I1208 10:25:27.220976 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ee06227-2151-4c20-b709-9d661222cf64-catalog-content\") pod \"redhat-marketplace-bpx8r\" (UID: \"1ee06227-2151-4c20-b709-9d661222cf64\") " pod="openshift-marketplace/redhat-marketplace-bpx8r"
Dec 08 10:25:27 crc kubenswrapper[4776]: I1208 10:25:27.221112 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzgl6\" (UniqueName: \"kubernetes.io/projected/1ee06227-2151-4c20-b709-9d661222cf64-kube-api-access-jzgl6\") pod \"redhat-marketplace-bpx8r\" (UID: \"1ee06227-2151-4c20-b709-9d661222cf64\") " pod="openshift-marketplace/redhat-marketplace-bpx8r"
Dec 08 10:25:27 crc kubenswrapper[4776]: I1208 10:25:27.221234 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ee06227-2151-4c20-b709-9d661222cf64-utilities\") pod \"redhat-marketplace-bpx8r\" (UID: \"1ee06227-2151-4c20-b709-9d661222cf64\") " pod="openshift-marketplace/redhat-marketplace-bpx8r"
Dec 08 10:25:27 crc kubenswrapper[4776]: I1208 10:25:27.324452 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzgl6\" (UniqueName: \"kubernetes.io/projected/1ee06227-2151-4c20-b709-9d661222cf64-kube-api-access-jzgl6\") pod \"redhat-marketplace-bpx8r\" (UID: \"1ee06227-2151-4c20-b709-9d661222cf64\") " pod="openshift-marketplace/redhat-marketplace-bpx8r"
Dec 08 10:25:27 crc kubenswrapper[4776]: I1208 10:25:27.324603 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ee06227-2151-4c20-b709-9d661222cf64-utilities\") pod \"redhat-marketplace-bpx8r\" (UID: \"1ee06227-2151-4c20-b709-9d661222cf64\") " pod="openshift-marketplace/redhat-marketplace-bpx8r"
Dec 08 10:25:27 crc kubenswrapper[4776]: I1208 10:25:27.324723 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ee06227-2151-4c20-b709-9d661222cf64-catalog-content\") pod \"redhat-marketplace-bpx8r\" (UID: \"1ee06227-2151-4c20-b709-9d661222cf64\") " pod="openshift-marketplace/redhat-marketplace-bpx8r"
Dec 08 10:25:27 crc kubenswrapper[4776]: I1208 10:25:27.325486 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ee06227-2151-4c20-b709-9d661222cf64-utilities\") pod \"redhat-marketplace-bpx8r\" (UID: \"1ee06227-2151-4c20-b709-9d661222cf64\") " pod="openshift-marketplace/redhat-marketplace-bpx8r"
Dec 08 10:25:27 crc kubenswrapper[4776]: I1208 10:25:27.325514 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ee06227-2151-4c20-b709-9d661222cf64-catalog-content\") pod \"redhat-marketplace-bpx8r\" (UID: \"1ee06227-2151-4c20-b709-9d661222cf64\") " pod="openshift-marketplace/redhat-marketplace-bpx8r"
Dec 08 10:25:27 crc kubenswrapper[4776]: I1208 10:25:27.343999 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzgl6\" (UniqueName: \"kubernetes.io/projected/1ee06227-2151-4c20-b709-9d661222cf64-kube-api-access-jzgl6\") pod \"redhat-marketplace-bpx8r\" (UID: \"1ee06227-2151-4c20-b709-9d661222cf64\") " pod="openshift-marketplace/redhat-marketplace-bpx8r"
Dec 08 10:25:27 crc kubenswrapper[4776]: I1208 10:25:27.346947 4776 scope.go:117] "RemoveContainer" containerID="ba2de61c4584af05bb1541b86bdc85deedaeb51f4cbdde10a32dbe542154d0e0"
Dec 08 10:25:27 crc kubenswrapper[4776]: E1208 10:25:27.347315 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268"
Dec 08 10:25:27 crc kubenswrapper[4776]: I1208 10:25:27.473739 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bpx8r"
Dec 08 10:25:28 crc kubenswrapper[4776]: I1208 10:25:27.999682 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bpx8r"]
Dec 08 10:25:28 crc kubenswrapper[4776]: I1208 10:25:28.815710 4776 generic.go:334] "Generic (PLEG): container finished" podID="1ee06227-2151-4c20-b709-9d661222cf64" containerID="9165780ab2172eaa0a687e055f8ca0136ca0d1fbe057afd5f1e124ee045b54e1" exitCode=0
Dec 08 10:25:28 crc kubenswrapper[4776]: I1208 10:25:28.815774 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bpx8r" event={"ID":"1ee06227-2151-4c20-b709-9d661222cf64","Type":"ContainerDied","Data":"9165780ab2172eaa0a687e055f8ca0136ca0d1fbe057afd5f1e124ee045b54e1"}
Dec 08 10:25:28 crc kubenswrapper[4776]: I1208 10:25:28.816051 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bpx8r" event={"ID":"1ee06227-2151-4c20-b709-9d661222cf64","Type":"ContainerStarted","Data":"ba7fe2f98ed41f1ee69927a7b3465dbfea0f9e1a229f31e9a7050ba9a50b3c67"}
Dec 08 10:25:28 crc kubenswrapper[4776]: I1208 10:25:28.818784 4776 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 08 10:25:29 crc kubenswrapper[4776]: I1208 10:25:29.826154 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bpx8r" event={"ID":"1ee06227-2151-4c20-b709-9d661222cf64","Type":"ContainerStarted","Data":"09e6d41555530b697fdf0645dc4688c6502bffee99c1c732f1293fb6dc8c2806"}
Dec 08 10:25:30 crc kubenswrapper[4776]: E1208 10:25:30.518010 4776 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ee06227_2151_4c20_b709_9d661222cf64.slice/crio-conmon-09e6d41555530b697fdf0645dc4688c6502bffee99c1c732f1293fb6dc8c2806.scope\": RecentStats: unable to find data in memory cache]"
Dec 08 10:25:30 crc kubenswrapper[4776]: I1208 10:25:30.839739 4776 generic.go:334] "Generic (PLEG): container finished" podID="1ee06227-2151-4c20-b709-9d661222cf64" containerID="09e6d41555530b697fdf0645dc4688c6502bffee99c1c732f1293fb6dc8c2806" exitCode=0
Dec 08 10:25:30 crc kubenswrapper[4776]: I1208 10:25:30.839805 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bpx8r" event={"ID":"1ee06227-2151-4c20-b709-9d661222cf64","Type":"ContainerDied","Data":"09e6d41555530b697fdf0645dc4688c6502bffee99c1c732f1293fb6dc8c2806"}
Dec 08 10:25:31 crc kubenswrapper[4776]: I1208 10:25:31.856237 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bpx8r" event={"ID":"1ee06227-2151-4c20-b709-9d661222cf64","Type":"ContainerStarted","Data":"79535774c54dc8d3ea7e620587e64e9e7a10cd2fabd628abfd8eb2e2970594f9"}
Dec 08 10:25:31 crc kubenswrapper[4776]: I1208 10:25:31.878593 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bpx8r" podStartSLOduration=2.445861012 podStartE2EDuration="4.878575661s" podCreationTimestamp="2025-12-08 10:25:27 +0000 UTC" firstStartedPulling="2025-12-08 10:25:28.818344512 +0000 UTC m=+5205.081569534" lastFinishedPulling="2025-12-08 10:25:31.251059161 +0000 UTC m=+5207.514284183" observedRunningTime="2025-12-08 10:25:31.871142848 +0000 UTC m=+5208.134367870" watchObservedRunningTime="2025-12-08 10:25:31.878575661 +0000 UTC m=+5208.141800683"
Dec 08 10:25:37 crc kubenswrapper[4776]: I1208 10:25:37.474273 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bpx8r"
Dec 08 10:25:37 crc kubenswrapper[4776]: I1208 10:25:37.474829 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bpx8r"
Dec 08 10:25:37 crc kubenswrapper[4776]: I1208 10:25:37.540940 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bpx8r"
Dec 08 10:25:37 crc kubenswrapper[4776]: I1208 10:25:37.985573 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bpx8r"
Dec 08 10:25:38 crc kubenswrapper[4776]: I1208 10:25:38.344145 4776 scope.go:117] "RemoveContainer" containerID="ba2de61c4584af05bb1541b86bdc85deedaeb51f4cbdde10a32dbe542154d0e0"
Dec 08 10:25:38 crc kubenswrapper[4776]: E1208 10:25:38.344599 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268"
Dec 08 10:25:41 crc kubenswrapper[4776]: I1208 10:25:41.095851 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bpx8r"]
Dec 08 10:25:41 crc kubenswrapper[4776]: I1208 10:25:41.096646 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bpx8r" podUID="1ee06227-2151-4c20-b709-9d661222cf64" containerName="registry-server" containerID="cri-o://79535774c54dc8d3ea7e620587e64e9e7a10cd2fabd628abfd8eb2e2970594f9" gracePeriod=2
Dec 08 10:25:41 crc kubenswrapper[4776]: I1208 10:25:41.723882 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bpx8r"
Dec 08 10:25:41 crc kubenswrapper[4776]: I1208 10:25:41.782505 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzgl6\" (UniqueName: \"kubernetes.io/projected/1ee06227-2151-4c20-b709-9d661222cf64-kube-api-access-jzgl6\") pod \"1ee06227-2151-4c20-b709-9d661222cf64\" (UID: \"1ee06227-2151-4c20-b709-9d661222cf64\") "
Dec 08 10:25:41 crc kubenswrapper[4776]: I1208 10:25:41.782870 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ee06227-2151-4c20-b709-9d661222cf64-utilities\") pod \"1ee06227-2151-4c20-b709-9d661222cf64\" (UID: \"1ee06227-2151-4c20-b709-9d661222cf64\") "
Dec 08 10:25:41 crc kubenswrapper[4776]: I1208 10:25:41.783160 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ee06227-2151-4c20-b709-9d661222cf64-catalog-content\") pod \"1ee06227-2151-4c20-b709-9d661222cf64\" (UID: \"1ee06227-2151-4c20-b709-9d661222cf64\") "
Dec 08 10:25:41 crc kubenswrapper[4776]: I1208 10:25:41.783619 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ee06227-2151-4c20-b709-9d661222cf64-utilities" (OuterVolumeSpecName: "utilities") pod "1ee06227-2151-4c20-b709-9d661222cf64" (UID: "1ee06227-2151-4c20-b709-9d661222cf64"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 08 10:25:41 crc kubenswrapper[4776]: I1208 10:25:41.784876 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ee06227-2151-4c20-b709-9d661222cf64-utilities\") on node \"crc\" DevicePath \"\""
Dec 08 10:25:41 crc kubenswrapper[4776]: I1208 10:25:41.793886 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ee06227-2151-4c20-b709-9d661222cf64-kube-api-access-jzgl6" (OuterVolumeSpecName: "kube-api-access-jzgl6") pod "1ee06227-2151-4c20-b709-9d661222cf64" (UID: "1ee06227-2151-4c20-b709-9d661222cf64"). InnerVolumeSpecName "kube-api-access-jzgl6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 08 10:25:41 crc kubenswrapper[4776]: I1208 10:25:41.805863 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ee06227-2151-4c20-b709-9d661222cf64-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1ee06227-2151-4c20-b709-9d661222cf64" (UID: "1ee06227-2151-4c20-b709-9d661222cf64"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 08 10:25:41 crc kubenswrapper[4776]: I1208 10:25:41.886698 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ee06227-2151-4c20-b709-9d661222cf64-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 08 10:25:41 crc kubenswrapper[4776]: I1208 10:25:41.886734 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzgl6\" (UniqueName: \"kubernetes.io/projected/1ee06227-2151-4c20-b709-9d661222cf64-kube-api-access-jzgl6\") on node \"crc\" DevicePath \"\""
Dec 08 10:25:41 crc kubenswrapper[4776]: I1208 10:25:41.976840 4776 generic.go:334] "Generic (PLEG): container finished" podID="1ee06227-2151-4c20-b709-9d661222cf64" containerID="79535774c54dc8d3ea7e620587e64e9e7a10cd2fabd628abfd8eb2e2970594f9" exitCode=0
Dec 08 10:25:41 crc kubenswrapper[4776]: I1208 10:25:41.976893 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bpx8r" event={"ID":"1ee06227-2151-4c20-b709-9d661222cf64","Type":"ContainerDied","Data":"79535774c54dc8d3ea7e620587e64e9e7a10cd2fabd628abfd8eb2e2970594f9"}
Dec 08 10:25:41 crc kubenswrapper[4776]: I1208 10:25:41.976941 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bpx8r" event={"ID":"1ee06227-2151-4c20-b709-9d661222cf64","Type":"ContainerDied","Data":"ba7fe2f98ed41f1ee69927a7b3465dbfea0f9e1a229f31e9a7050ba9a50b3c67"}
Dec 08 10:25:41 crc kubenswrapper[4776]: I1208 10:25:41.976968 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bpx8r"
Dec 08 10:25:41 crc kubenswrapper[4776]: I1208 10:25:41.976980 4776 scope.go:117] "RemoveContainer" containerID="79535774c54dc8d3ea7e620587e64e9e7a10cd2fabd628abfd8eb2e2970594f9"
Dec 08 10:25:42 crc kubenswrapper[4776]: I1208 10:25:42.002159 4776 scope.go:117] "RemoveContainer" containerID="09e6d41555530b697fdf0645dc4688c6502bffee99c1c732f1293fb6dc8c2806"
Dec 08 10:25:42 crc kubenswrapper[4776]: I1208 10:25:42.016701 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bpx8r"]
Dec 08 10:25:42 crc kubenswrapper[4776]: I1208 10:25:42.028977 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bpx8r"]
Dec 08 10:25:42 crc kubenswrapper[4776]: I1208 10:25:42.036220 4776 scope.go:117] "RemoveContainer" containerID="9165780ab2172eaa0a687e055f8ca0136ca0d1fbe057afd5f1e124ee045b54e1"
Dec 08 10:25:42 crc kubenswrapper[4776]: I1208 10:25:42.084453 4776 scope.go:117] "RemoveContainer" containerID="79535774c54dc8d3ea7e620587e64e9e7a10cd2fabd628abfd8eb2e2970594f9"
Dec 08 10:25:42 crc kubenswrapper[4776]: E1208 10:25:42.084944 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79535774c54dc8d3ea7e620587e64e9e7a10cd2fabd628abfd8eb2e2970594f9\": container with ID starting with 79535774c54dc8d3ea7e620587e64e9e7a10cd2fabd628abfd8eb2e2970594f9 not found: ID does not exist" containerID="79535774c54dc8d3ea7e620587e64e9e7a10cd2fabd628abfd8eb2e2970594f9"
Dec 08 10:25:42 crc kubenswrapper[4776]: I1208 10:25:42.085041 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79535774c54dc8d3ea7e620587e64e9e7a10cd2fabd628abfd8eb2e2970594f9"} err="failed to get container status \"79535774c54dc8d3ea7e620587e64e9e7a10cd2fabd628abfd8eb2e2970594f9\": rpc error: code = NotFound desc = could not find container \"79535774c54dc8d3ea7e620587e64e9e7a10cd2fabd628abfd8eb2e2970594f9\": container with ID starting with 79535774c54dc8d3ea7e620587e64e9e7a10cd2fabd628abfd8eb2e2970594f9 not found: ID does not exist"
Dec 08 10:25:42 crc kubenswrapper[4776]: I1208 10:25:42.085237 4776 scope.go:117] "RemoveContainer" containerID="09e6d41555530b697fdf0645dc4688c6502bffee99c1c732f1293fb6dc8c2806"
Dec 08 10:25:42 crc kubenswrapper[4776]: E1208 10:25:42.085796 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09e6d41555530b697fdf0645dc4688c6502bffee99c1c732f1293fb6dc8c2806\": container with ID starting with 09e6d41555530b697fdf0645dc4688c6502bffee99c1c732f1293fb6dc8c2806 not found: ID does not exist" containerID="09e6d41555530b697fdf0645dc4688c6502bffee99c1c732f1293fb6dc8c2806"
Dec 08 10:25:42 crc kubenswrapper[4776]: I1208 10:25:42.085849 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09e6d41555530b697fdf0645dc4688c6502bffee99c1c732f1293fb6dc8c2806"} err="failed to get container status \"09e6d41555530b697fdf0645dc4688c6502bffee99c1c732f1293fb6dc8c2806\": rpc error: code = NotFound desc = could not find container \"09e6d41555530b697fdf0645dc4688c6502bffee99c1c732f1293fb6dc8c2806\": container with ID starting with 09e6d41555530b697fdf0645dc4688c6502bffee99c1c732f1293fb6dc8c2806 not found: ID does not exist"
Dec 08 10:25:42 crc kubenswrapper[4776]: I1208 10:25:42.085870 4776 scope.go:117] "RemoveContainer" containerID="9165780ab2172eaa0a687e055f8ca0136ca0d1fbe057afd5f1e124ee045b54e1"
Dec 08 10:25:42 crc kubenswrapper[4776]: E1208 10:25:42.086098 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9165780ab2172eaa0a687e055f8ca0136ca0d1fbe057afd5f1e124ee045b54e1\": container with ID starting with 9165780ab2172eaa0a687e055f8ca0136ca0d1fbe057afd5f1e124ee045b54e1 not found: ID does not exist" containerID="9165780ab2172eaa0a687e055f8ca0136ca0d1fbe057afd5f1e124ee045b54e1"
Dec 08 10:25:42 crc kubenswrapper[4776]: I1208 10:25:42.086120 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9165780ab2172eaa0a687e055f8ca0136ca0d1fbe057afd5f1e124ee045b54e1"} err="failed to get container status \"9165780ab2172eaa0a687e055f8ca0136ca0d1fbe057afd5f1e124ee045b54e1\": rpc error: code = NotFound desc = could not find container \"9165780ab2172eaa0a687e055f8ca0136ca0d1fbe057afd5f1e124ee045b54e1\": container with ID starting with 9165780ab2172eaa0a687e055f8ca0136ca0d1fbe057afd5f1e124ee045b54e1 not found: ID does not exist"
Dec 08 10:25:42 crc kubenswrapper[4776]: I1208 10:25:42.355376 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ee06227-2151-4c20-b709-9d661222cf64" path="/var/lib/kubelet/pods/1ee06227-2151-4c20-b709-9d661222cf64/volumes"
Dec 08 10:25:53 crc kubenswrapper[4776]: I1208 10:25:53.344533 4776 scope.go:117] "RemoveContainer" containerID="ba2de61c4584af05bb1541b86bdc85deedaeb51f4cbdde10a32dbe542154d0e0"
Dec 08 10:25:53 crc kubenswrapper[4776]: E1208 10:25:53.345602 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268"
Dec 08 10:26:07 crc kubenswrapper[4776]: I1208 10:26:07.344821 4776 scope.go:117] "RemoveContainer" containerID="ba2de61c4584af05bb1541b86bdc85deedaeb51f4cbdde10a32dbe542154d0e0"
Dec 08 10:26:07 crc kubenswrapper[4776]: E1208 10:26:07.345935 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268"
Dec 08 10:26:19 crc kubenswrapper[4776]: I1208 10:26:19.344697 4776 scope.go:117] "RemoveContainer" containerID="ba2de61c4584af05bb1541b86bdc85deedaeb51f4cbdde10a32dbe542154d0e0"
Dec 08 10:26:19 crc kubenswrapper[4776]: E1208 10:26:19.345458 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268"
Dec 08 10:26:32 crc kubenswrapper[4776]: I1208 10:26:32.343646 4776 scope.go:117] "RemoveContainer" containerID="ba2de61c4584af05bb1541b86bdc85deedaeb51f4cbdde10a32dbe542154d0e0"
Dec 08 10:26:32 crc kubenswrapper[4776]: E1208 10:26:32.344641 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268"
Dec 08 10:26:45 crc kubenswrapper[4776]: I1208 10:26:45.344133 4776 scope.go:117] "RemoveContainer" containerID="ba2de61c4584af05bb1541b86bdc85deedaeb51f4cbdde10a32dbe542154d0e0"
Dec 08 10:26:45 crc kubenswrapper[4776]: E1208 10:26:45.344882 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268"
Dec 08 10:26:58 crc kubenswrapper[4776]: I1208 10:26:58.344582 4776 scope.go:117] "RemoveContainer" containerID="ba2de61c4584af05bb1541b86bdc85deedaeb51f4cbdde10a32dbe542154d0e0"
Dec 08 10:26:58 crc kubenswrapper[4776]: E1208 10:26:58.345366 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268"
Dec 08 10:27:13 crc kubenswrapper[4776]: I1208 10:27:13.343958 4776 scope.go:117] "RemoveContainer" containerID="ba2de61c4584af05bb1541b86bdc85deedaeb51f4cbdde10a32dbe542154d0e0"
Dec 08 10:27:13 crc kubenswrapper[4776]: E1208 10:27:13.344963 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268"
Dec 08 10:27:25 crc kubenswrapper[4776]: I1208 10:27:25.343658 4776 scope.go:117] "RemoveContainer" containerID="ba2de61c4584af05bb1541b86bdc85deedaeb51f4cbdde10a32dbe542154d0e0"
Dec 08 10:27:25 crc kubenswrapper[4776]: E1208 10:27:25.344335 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268"
Dec 08 10:27:38 crc kubenswrapper[4776]: I1208 10:27:38.344045 4776 scope.go:117] "RemoveContainer" containerID="ba2de61c4584af05bb1541b86bdc85deedaeb51f4cbdde10a32dbe542154d0e0"
Dec 08 10:27:38 crc kubenswrapper[4776]: E1208 10:27:38.344879 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268"
Dec 08 10:27:51 crc kubenswrapper[4776]: I1208 10:27:51.344304 4776 scope.go:117] "RemoveContainer" containerID="ba2de61c4584af05bb1541b86bdc85deedaeb51f4cbdde10a32dbe542154d0e0"
Dec 08 10:27:53 crc kubenswrapper[4776]: I1208 10:27:53.162342 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" event={"ID":"c9788ab1-1031-4103-a769-a4b3177c7268","Type":"ContainerStarted","Data":"64895e0778b6450b6c551f116e434d2513cfc42f72a2ea700de1a1fdd4e6f67b"}
Dec 08 10:29:26 crc kubenswrapper[4776]: I1208 10:29:26.792554 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-h7zl5"]
Dec 08 10:29:26 crc kubenswrapper[4776]: E1208 10:29:26.795143 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ee06227-2151-4c20-b709-9d661222cf64" containerName="registry-server"
Dec 08 10:29:26 crc kubenswrapper[4776]: I1208 10:29:26.795162 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ee06227-2151-4c20-b709-9d661222cf64" containerName="registry-server"
Dec 08 10:29:26 crc kubenswrapper[4776]: E1208 10:29:26.795195 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ee06227-2151-4c20-b709-9d661222cf64" containerName="extract-content"
Dec 08 10:29:26 crc kubenswrapper[4776]: I1208 10:29:26.795203 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ee06227-2151-4c20-b709-9d661222cf64" containerName="extract-content"
Dec 08 10:29:26 crc kubenswrapper[4776]: E1208 10:29:26.795269 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ee06227-2151-4c20-b709-9d661222cf64" containerName="extract-utilities"
Dec 08 10:29:26 crc kubenswrapper[4776]: I1208 10:29:26.795277 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ee06227-2151-4c20-b709-9d661222cf64" containerName="extract-utilities"
Dec 08 10:29:26 crc kubenswrapper[4776]: I1208 10:29:26.795598 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ee06227-2151-4c20-b709-9d661222cf64" containerName="registry-server"
Dec 08 10:29:26 crc kubenswrapper[4776]: I1208 10:29:26.797738 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h7zl5"
Dec 08 10:29:26 crc kubenswrapper[4776]: I1208 10:29:26.806642 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h7zl5"]
Dec 08 10:29:26 crc kubenswrapper[4776]: I1208 10:29:26.937241 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b833137-7802-4857-a80e-4700ae7bb897-catalog-content\") pod \"community-operators-h7zl5\" (UID: \"6b833137-7802-4857-a80e-4700ae7bb897\") " pod="openshift-marketplace/community-operators-h7zl5"
Dec 08 10:29:26 crc kubenswrapper[4776]: I1208 10:29:26.939497 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9m4w\" (UniqueName: \"kubernetes.io/projected/6b833137-7802-4857-a80e-4700ae7bb897-kube-api-access-c9m4w\") pod \"community-operators-h7zl5\" (UID: \"6b833137-7802-4857-a80e-4700ae7bb897\") " pod="openshift-marketplace/community-operators-h7zl5"
Dec 08 10:29:26 crc kubenswrapper[4776]: I1208 10:29:26.939564 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b833137-7802-4857-a80e-4700ae7bb897-utilities\") pod \"community-operators-h7zl5\" (UID: \"6b833137-7802-4857-a80e-4700ae7bb897\") " pod="openshift-marketplace/community-operators-h7zl5"
Dec 08 10:29:27 crc kubenswrapper[4776]: I1208 10:29:27.042053 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b833137-7802-4857-a80e-4700ae7bb897-catalog-content\") pod \"community-operators-h7zl5\" (UID: \"6b833137-7802-4857-a80e-4700ae7bb897\") " pod="openshift-marketplace/community-operators-h7zl5"
Dec 08 10:29:27 crc kubenswrapper[4776]: I1208 10:29:27.042185 4776 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"kube-api-access-c9m4w\" (UniqueName: \"kubernetes.io/projected/6b833137-7802-4857-a80e-4700ae7bb897-kube-api-access-c9m4w\") pod \"community-operators-h7zl5\" (UID: \"6b833137-7802-4857-a80e-4700ae7bb897\") " pod="openshift-marketplace/community-operators-h7zl5" Dec 08 10:29:27 crc kubenswrapper[4776]: I1208 10:29:27.042212 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b833137-7802-4857-a80e-4700ae7bb897-utilities\") pod \"community-operators-h7zl5\" (UID: \"6b833137-7802-4857-a80e-4700ae7bb897\") " pod="openshift-marketplace/community-operators-h7zl5" Dec 08 10:29:27 crc kubenswrapper[4776]: I1208 10:29:27.042766 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b833137-7802-4857-a80e-4700ae7bb897-catalog-content\") pod \"community-operators-h7zl5\" (UID: \"6b833137-7802-4857-a80e-4700ae7bb897\") " pod="openshift-marketplace/community-operators-h7zl5" Dec 08 10:29:27 crc kubenswrapper[4776]: I1208 10:29:27.042775 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b833137-7802-4857-a80e-4700ae7bb897-utilities\") pod \"community-operators-h7zl5\" (UID: \"6b833137-7802-4857-a80e-4700ae7bb897\") " pod="openshift-marketplace/community-operators-h7zl5" Dec 08 10:29:27 crc kubenswrapper[4776]: I1208 10:29:27.062425 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9m4w\" (UniqueName: \"kubernetes.io/projected/6b833137-7802-4857-a80e-4700ae7bb897-kube-api-access-c9m4w\") pod \"community-operators-h7zl5\" (UID: \"6b833137-7802-4857-a80e-4700ae7bb897\") " pod="openshift-marketplace/community-operators-h7zl5" Dec 08 10:29:27 crc kubenswrapper[4776]: I1208 10:29:27.128758 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h7zl5" Dec 08 10:29:27 crc kubenswrapper[4776]: I1208 10:29:27.697937 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h7zl5"] Dec 08 10:29:28 crc kubenswrapper[4776]: W1208 10:29:28.558804 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b833137_7802_4857_a80e_4700ae7bb897.slice/crio-325cfb74ed452cbac8d556766148986c6d6ec3b37215bde7ca7f579a1f058c39 WatchSource:0}: Error finding container 325cfb74ed452cbac8d556766148986c6d6ec3b37215bde7ca7f579a1f058c39: Status 404 returned error can't find the container with id 325cfb74ed452cbac8d556766148986c6d6ec3b37215bde7ca7f579a1f058c39 Dec 08 10:29:29 crc kubenswrapper[4776]: I1208 10:29:29.197209 4776 generic.go:334] "Generic (PLEG): container finished" podID="6b833137-7802-4857-a80e-4700ae7bb897" containerID="2b194fc930a6675b8422084561ef4be8759fc7e3d9e47b02f85666d7e6d0781a" exitCode=0 Dec 08 10:29:29 crc kubenswrapper[4776]: I1208 10:29:29.197258 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h7zl5" event={"ID":"6b833137-7802-4857-a80e-4700ae7bb897","Type":"ContainerDied","Data":"2b194fc930a6675b8422084561ef4be8759fc7e3d9e47b02f85666d7e6d0781a"} Dec 08 10:29:29 crc kubenswrapper[4776]: I1208 10:29:29.197292 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h7zl5" event={"ID":"6b833137-7802-4857-a80e-4700ae7bb897","Type":"ContainerStarted","Data":"325cfb74ed452cbac8d556766148986c6d6ec3b37215bde7ca7f579a1f058c39"} Dec 08 10:29:30 crc kubenswrapper[4776]: I1208 10:29:30.209796 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h7zl5" 
event={"ID":"6b833137-7802-4857-a80e-4700ae7bb897","Type":"ContainerStarted","Data":"c7a102c0ea8aa7785d84ce9068ad3ec9f4a60f6a87949657d80b951601700af4"} Dec 08 10:29:31 crc kubenswrapper[4776]: I1208 10:29:31.223639 4776 generic.go:334] "Generic (PLEG): container finished" podID="6b833137-7802-4857-a80e-4700ae7bb897" containerID="c7a102c0ea8aa7785d84ce9068ad3ec9f4a60f6a87949657d80b951601700af4" exitCode=0 Dec 08 10:29:31 crc kubenswrapper[4776]: I1208 10:29:31.223732 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h7zl5" event={"ID":"6b833137-7802-4857-a80e-4700ae7bb897","Type":"ContainerDied","Data":"c7a102c0ea8aa7785d84ce9068ad3ec9f4a60f6a87949657d80b951601700af4"} Dec 08 10:29:32 crc kubenswrapper[4776]: I1208 10:29:32.235945 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h7zl5" event={"ID":"6b833137-7802-4857-a80e-4700ae7bb897","Type":"ContainerStarted","Data":"951f480cfc23e099bb4eb938f447e3d91094c646d0cb8e5dd5a19fd4f2c19353"} Dec 08 10:29:32 crc kubenswrapper[4776]: I1208 10:29:32.262029 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-h7zl5" podStartSLOduration=3.869982528 podStartE2EDuration="6.262008578s" podCreationTimestamp="2025-12-08 10:29:26 +0000 UTC" firstStartedPulling="2025-12-08 10:29:29.198738692 +0000 UTC m=+5445.461963714" lastFinishedPulling="2025-12-08 10:29:31.590764722 +0000 UTC m=+5447.853989764" observedRunningTime="2025-12-08 10:29:32.25256552 +0000 UTC m=+5448.515790532" watchObservedRunningTime="2025-12-08 10:29:32.262008578 +0000 UTC m=+5448.525233600" Dec 08 10:29:37 crc kubenswrapper[4776]: I1208 10:29:37.129504 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-h7zl5" Dec 08 10:29:37 crc kubenswrapper[4776]: I1208 10:29:37.130061 4776 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/community-operators-h7zl5" Dec 08 10:29:37 crc kubenswrapper[4776]: I1208 10:29:37.188314 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-h7zl5" Dec 08 10:29:37 crc kubenswrapper[4776]: I1208 10:29:37.336664 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-h7zl5" Dec 08 10:29:37 crc kubenswrapper[4776]: I1208 10:29:37.422786 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h7zl5"] Dec 08 10:29:39 crc kubenswrapper[4776]: I1208 10:29:39.309158 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-h7zl5" podUID="6b833137-7802-4857-a80e-4700ae7bb897" containerName="registry-server" containerID="cri-o://951f480cfc23e099bb4eb938f447e3d91094c646d0cb8e5dd5a19fd4f2c19353" gracePeriod=2 Dec 08 10:29:39 crc kubenswrapper[4776]: I1208 10:29:39.863670 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h7zl5" Dec 08 10:29:39 crc kubenswrapper[4776]: I1208 10:29:39.949025 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b833137-7802-4857-a80e-4700ae7bb897-utilities\") pod \"6b833137-7802-4857-a80e-4700ae7bb897\" (UID: \"6b833137-7802-4857-a80e-4700ae7bb897\") " Dec 08 10:29:39 crc kubenswrapper[4776]: I1208 10:29:39.949236 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9m4w\" (UniqueName: \"kubernetes.io/projected/6b833137-7802-4857-a80e-4700ae7bb897-kube-api-access-c9m4w\") pod \"6b833137-7802-4857-a80e-4700ae7bb897\" (UID: \"6b833137-7802-4857-a80e-4700ae7bb897\") " Dec 08 10:29:39 crc kubenswrapper[4776]: I1208 10:29:39.951271 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b833137-7802-4857-a80e-4700ae7bb897-utilities" (OuterVolumeSpecName: "utilities") pod "6b833137-7802-4857-a80e-4700ae7bb897" (UID: "6b833137-7802-4857-a80e-4700ae7bb897"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 10:29:39 crc kubenswrapper[4776]: I1208 10:29:39.972704 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b833137-7802-4857-a80e-4700ae7bb897-kube-api-access-c9m4w" (OuterVolumeSpecName: "kube-api-access-c9m4w") pod "6b833137-7802-4857-a80e-4700ae7bb897" (UID: "6b833137-7802-4857-a80e-4700ae7bb897"). InnerVolumeSpecName "kube-api-access-c9m4w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 10:29:40 crc kubenswrapper[4776]: I1208 10:29:40.051272 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b833137-7802-4857-a80e-4700ae7bb897-catalog-content\") pod \"6b833137-7802-4857-a80e-4700ae7bb897\" (UID: \"6b833137-7802-4857-a80e-4700ae7bb897\") " Dec 08 10:29:40 crc kubenswrapper[4776]: I1208 10:29:40.051878 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b833137-7802-4857-a80e-4700ae7bb897-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 10:29:40 crc kubenswrapper[4776]: I1208 10:29:40.051900 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9m4w\" (UniqueName: \"kubernetes.io/projected/6b833137-7802-4857-a80e-4700ae7bb897-kube-api-access-c9m4w\") on node \"crc\" DevicePath \"\"" Dec 08 10:29:40 crc kubenswrapper[4776]: I1208 10:29:40.118080 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b833137-7802-4857-a80e-4700ae7bb897-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6b833137-7802-4857-a80e-4700ae7bb897" (UID: "6b833137-7802-4857-a80e-4700ae7bb897"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 10:29:40 crc kubenswrapper[4776]: I1208 10:29:40.154382 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b833137-7802-4857-a80e-4700ae7bb897-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 10:29:40 crc kubenswrapper[4776]: I1208 10:29:40.319749 4776 generic.go:334] "Generic (PLEG): container finished" podID="6b833137-7802-4857-a80e-4700ae7bb897" containerID="951f480cfc23e099bb4eb938f447e3d91094c646d0cb8e5dd5a19fd4f2c19353" exitCode=0 Dec 08 10:29:40 crc kubenswrapper[4776]: I1208 10:29:40.319791 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h7zl5" Dec 08 10:29:40 crc kubenswrapper[4776]: I1208 10:29:40.319793 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h7zl5" event={"ID":"6b833137-7802-4857-a80e-4700ae7bb897","Type":"ContainerDied","Data":"951f480cfc23e099bb4eb938f447e3d91094c646d0cb8e5dd5a19fd4f2c19353"} Dec 08 10:29:40 crc kubenswrapper[4776]: I1208 10:29:40.319829 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h7zl5" event={"ID":"6b833137-7802-4857-a80e-4700ae7bb897","Type":"ContainerDied","Data":"325cfb74ed452cbac8d556766148986c6d6ec3b37215bde7ca7f579a1f058c39"} Dec 08 10:29:40 crc kubenswrapper[4776]: I1208 10:29:40.319848 4776 scope.go:117] "RemoveContainer" containerID="951f480cfc23e099bb4eb938f447e3d91094c646d0cb8e5dd5a19fd4f2c19353" Dec 08 10:29:40 crc kubenswrapper[4776]: I1208 10:29:40.342286 4776 scope.go:117] "RemoveContainer" containerID="c7a102c0ea8aa7785d84ce9068ad3ec9f4a60f6a87949657d80b951601700af4" Dec 08 10:29:40 crc kubenswrapper[4776]: I1208 10:29:40.368546 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h7zl5"] Dec 08 10:29:40 crc kubenswrapper[4776]: 
I1208 10:29:40.376435 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-h7zl5"] Dec 08 10:29:40 crc kubenswrapper[4776]: I1208 10:29:40.382257 4776 scope.go:117] "RemoveContainer" containerID="2b194fc930a6675b8422084561ef4be8759fc7e3d9e47b02f85666d7e6d0781a" Dec 08 10:29:40 crc kubenswrapper[4776]: I1208 10:29:40.439968 4776 scope.go:117] "RemoveContainer" containerID="951f480cfc23e099bb4eb938f447e3d91094c646d0cb8e5dd5a19fd4f2c19353" Dec 08 10:29:40 crc kubenswrapper[4776]: E1208 10:29:40.440806 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"951f480cfc23e099bb4eb938f447e3d91094c646d0cb8e5dd5a19fd4f2c19353\": container with ID starting with 951f480cfc23e099bb4eb938f447e3d91094c646d0cb8e5dd5a19fd4f2c19353 not found: ID does not exist" containerID="951f480cfc23e099bb4eb938f447e3d91094c646d0cb8e5dd5a19fd4f2c19353" Dec 08 10:29:40 crc kubenswrapper[4776]: I1208 10:29:40.441094 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"951f480cfc23e099bb4eb938f447e3d91094c646d0cb8e5dd5a19fd4f2c19353"} err="failed to get container status \"951f480cfc23e099bb4eb938f447e3d91094c646d0cb8e5dd5a19fd4f2c19353\": rpc error: code = NotFound desc = could not find container \"951f480cfc23e099bb4eb938f447e3d91094c646d0cb8e5dd5a19fd4f2c19353\": container with ID starting with 951f480cfc23e099bb4eb938f447e3d91094c646d0cb8e5dd5a19fd4f2c19353 not found: ID does not exist" Dec 08 10:29:40 crc kubenswrapper[4776]: I1208 10:29:40.441227 4776 scope.go:117] "RemoveContainer" containerID="c7a102c0ea8aa7785d84ce9068ad3ec9f4a60f6a87949657d80b951601700af4" Dec 08 10:29:40 crc kubenswrapper[4776]: E1208 10:29:40.441794 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7a102c0ea8aa7785d84ce9068ad3ec9f4a60f6a87949657d80b951601700af4\": container 
with ID starting with c7a102c0ea8aa7785d84ce9068ad3ec9f4a60f6a87949657d80b951601700af4 not found: ID does not exist" containerID="c7a102c0ea8aa7785d84ce9068ad3ec9f4a60f6a87949657d80b951601700af4" Dec 08 10:29:40 crc kubenswrapper[4776]: I1208 10:29:40.441834 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7a102c0ea8aa7785d84ce9068ad3ec9f4a60f6a87949657d80b951601700af4"} err="failed to get container status \"c7a102c0ea8aa7785d84ce9068ad3ec9f4a60f6a87949657d80b951601700af4\": rpc error: code = NotFound desc = could not find container \"c7a102c0ea8aa7785d84ce9068ad3ec9f4a60f6a87949657d80b951601700af4\": container with ID starting with c7a102c0ea8aa7785d84ce9068ad3ec9f4a60f6a87949657d80b951601700af4 not found: ID does not exist" Dec 08 10:29:40 crc kubenswrapper[4776]: I1208 10:29:40.441863 4776 scope.go:117] "RemoveContainer" containerID="2b194fc930a6675b8422084561ef4be8759fc7e3d9e47b02f85666d7e6d0781a" Dec 08 10:29:40 crc kubenswrapper[4776]: E1208 10:29:40.442369 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b194fc930a6675b8422084561ef4be8759fc7e3d9e47b02f85666d7e6d0781a\": container with ID starting with 2b194fc930a6675b8422084561ef4be8759fc7e3d9e47b02f85666d7e6d0781a not found: ID does not exist" containerID="2b194fc930a6675b8422084561ef4be8759fc7e3d9e47b02f85666d7e6d0781a" Dec 08 10:29:40 crc kubenswrapper[4776]: I1208 10:29:40.442390 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b194fc930a6675b8422084561ef4be8759fc7e3d9e47b02f85666d7e6d0781a"} err="failed to get container status \"2b194fc930a6675b8422084561ef4be8759fc7e3d9e47b02f85666d7e6d0781a\": rpc error: code = NotFound desc = could not find container \"2b194fc930a6675b8422084561ef4be8759fc7e3d9e47b02f85666d7e6d0781a\": container with ID starting with 2b194fc930a6675b8422084561ef4be8759fc7e3d9e47b02f85666d7e6d0781a not 
found: ID does not exist" Dec 08 10:29:42 crc kubenswrapper[4776]: I1208 10:29:42.365091 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b833137-7802-4857-a80e-4700ae7bb897" path="/var/lib/kubelet/pods/6b833137-7802-4857-a80e-4700ae7bb897/volumes" Dec 08 10:30:00 crc kubenswrapper[4776]: I1208 10:30:00.222914 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29419830-l5nvf"] Dec 08 10:30:00 crc kubenswrapper[4776]: E1208 10:30:00.224064 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b833137-7802-4857-a80e-4700ae7bb897" containerName="extract-utilities" Dec 08 10:30:00 crc kubenswrapper[4776]: I1208 10:30:00.224080 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b833137-7802-4857-a80e-4700ae7bb897" containerName="extract-utilities" Dec 08 10:30:00 crc kubenswrapper[4776]: E1208 10:30:00.224117 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b833137-7802-4857-a80e-4700ae7bb897" containerName="extract-content" Dec 08 10:30:00 crc kubenswrapper[4776]: I1208 10:30:00.224125 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b833137-7802-4857-a80e-4700ae7bb897" containerName="extract-content" Dec 08 10:30:00 crc kubenswrapper[4776]: E1208 10:30:00.224151 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b833137-7802-4857-a80e-4700ae7bb897" containerName="registry-server" Dec 08 10:30:00 crc kubenswrapper[4776]: I1208 10:30:00.224159 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b833137-7802-4857-a80e-4700ae7bb897" containerName="registry-server" Dec 08 10:30:00 crc kubenswrapper[4776]: I1208 10:30:00.224509 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b833137-7802-4857-a80e-4700ae7bb897" containerName="registry-server" Dec 08 10:30:00 crc kubenswrapper[4776]: I1208 10:30:00.225691 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29419830-l5nvf" Dec 08 10:30:00 crc kubenswrapper[4776]: I1208 10:30:00.245210 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 08 10:30:00 crc kubenswrapper[4776]: I1208 10:30:00.249214 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 08 10:30:00 crc kubenswrapper[4776]: I1208 10:30:00.359545 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c3b913a4-1ab9-4ed0-9ecc-08df1b91dbf9-config-volume\") pod \"collect-profiles-29419830-l5nvf\" (UID: \"c3b913a4-1ab9-4ed0-9ecc-08df1b91dbf9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419830-l5nvf" Dec 08 10:30:00 crc kubenswrapper[4776]: I1208 10:30:00.379832 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c3b913a4-1ab9-4ed0-9ecc-08df1b91dbf9-secret-volume\") pod \"collect-profiles-29419830-l5nvf\" (UID: \"c3b913a4-1ab9-4ed0-9ecc-08df1b91dbf9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419830-l5nvf" Dec 08 10:30:00 crc kubenswrapper[4776]: I1208 10:30:00.379937 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfc5v\" (UniqueName: \"kubernetes.io/projected/c3b913a4-1ab9-4ed0-9ecc-08df1b91dbf9-kube-api-access-kfc5v\") pod \"collect-profiles-29419830-l5nvf\" (UID: \"c3b913a4-1ab9-4ed0-9ecc-08df1b91dbf9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419830-l5nvf" Dec 08 10:30:00 crc kubenswrapper[4776]: I1208 10:30:00.466294 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29419830-l5nvf"] Dec 08 10:30:00 crc kubenswrapper[4776]: I1208 10:30:00.482133 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfc5v\" (UniqueName: \"kubernetes.io/projected/c3b913a4-1ab9-4ed0-9ecc-08df1b91dbf9-kube-api-access-kfc5v\") pod \"collect-profiles-29419830-l5nvf\" (UID: \"c3b913a4-1ab9-4ed0-9ecc-08df1b91dbf9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419830-l5nvf" Dec 08 10:30:00 crc kubenswrapper[4776]: I1208 10:30:00.482526 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c3b913a4-1ab9-4ed0-9ecc-08df1b91dbf9-config-volume\") pod \"collect-profiles-29419830-l5nvf\" (UID: \"c3b913a4-1ab9-4ed0-9ecc-08df1b91dbf9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419830-l5nvf" Dec 08 10:30:00 crc kubenswrapper[4776]: I1208 10:30:00.482580 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c3b913a4-1ab9-4ed0-9ecc-08df1b91dbf9-secret-volume\") pod \"collect-profiles-29419830-l5nvf\" (UID: \"c3b913a4-1ab9-4ed0-9ecc-08df1b91dbf9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419830-l5nvf" Dec 08 10:30:00 crc kubenswrapper[4776]: I1208 10:30:00.484044 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c3b913a4-1ab9-4ed0-9ecc-08df1b91dbf9-config-volume\") pod \"collect-profiles-29419830-l5nvf\" (UID: \"c3b913a4-1ab9-4ed0-9ecc-08df1b91dbf9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419830-l5nvf" Dec 08 10:30:00 crc kubenswrapper[4776]: I1208 10:30:00.489515 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/c3b913a4-1ab9-4ed0-9ecc-08df1b91dbf9-secret-volume\") pod \"collect-profiles-29419830-l5nvf\" (UID: \"c3b913a4-1ab9-4ed0-9ecc-08df1b91dbf9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419830-l5nvf" Dec 08 10:30:00 crc kubenswrapper[4776]: I1208 10:30:00.509805 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfc5v\" (UniqueName: \"kubernetes.io/projected/c3b913a4-1ab9-4ed0-9ecc-08df1b91dbf9-kube-api-access-kfc5v\") pod \"collect-profiles-29419830-l5nvf\" (UID: \"c3b913a4-1ab9-4ed0-9ecc-08df1b91dbf9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419830-l5nvf" Dec 08 10:30:00 crc kubenswrapper[4776]: I1208 10:30:00.572764 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29419830-l5nvf" Dec 08 10:30:01 crc kubenswrapper[4776]: I1208 10:30:01.055722 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29419830-l5nvf"] Dec 08 10:30:01 crc kubenswrapper[4776]: I1208 10:30:01.568522 4776 generic.go:334] "Generic (PLEG): container finished" podID="c3b913a4-1ab9-4ed0-9ecc-08df1b91dbf9" containerID="bfc10deba860bf0e3257ab3b9fe8c7e1a1c6c75b38c72ea61e3ff28ba69bb378" exitCode=0 Dec 08 10:30:01 crc kubenswrapper[4776]: I1208 10:30:01.568588 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29419830-l5nvf" event={"ID":"c3b913a4-1ab9-4ed0-9ecc-08df1b91dbf9","Type":"ContainerDied","Data":"bfc10deba860bf0e3257ab3b9fe8c7e1a1c6c75b38c72ea61e3ff28ba69bb378"} Dec 08 10:30:01 crc kubenswrapper[4776]: I1208 10:30:01.568805 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29419830-l5nvf" 
event={"ID":"c3b913a4-1ab9-4ed0-9ecc-08df1b91dbf9","Type":"ContainerStarted","Data":"05d981d105ca4bfee0cd767846583c3c23e79d2fe13c4b5391005fc470f6a676"} Dec 08 10:30:03 crc kubenswrapper[4776]: I1208 10:30:03.022570 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29419830-l5nvf" Dec 08 10:30:03 crc kubenswrapper[4776]: I1208 10:30:03.051606 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c3b913a4-1ab9-4ed0-9ecc-08df1b91dbf9-secret-volume\") pod \"c3b913a4-1ab9-4ed0-9ecc-08df1b91dbf9\" (UID: \"c3b913a4-1ab9-4ed0-9ecc-08df1b91dbf9\") " Dec 08 10:30:03 crc kubenswrapper[4776]: I1208 10:30:03.051813 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfc5v\" (UniqueName: \"kubernetes.io/projected/c3b913a4-1ab9-4ed0-9ecc-08df1b91dbf9-kube-api-access-kfc5v\") pod \"c3b913a4-1ab9-4ed0-9ecc-08df1b91dbf9\" (UID: \"c3b913a4-1ab9-4ed0-9ecc-08df1b91dbf9\") " Dec 08 10:30:03 crc kubenswrapper[4776]: I1208 10:30:03.053296 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c3b913a4-1ab9-4ed0-9ecc-08df1b91dbf9-config-volume\") pod \"c3b913a4-1ab9-4ed0-9ecc-08df1b91dbf9\" (UID: \"c3b913a4-1ab9-4ed0-9ecc-08df1b91dbf9\") " Dec 08 10:30:03 crc kubenswrapper[4776]: I1208 10:30:03.055478 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3b913a4-1ab9-4ed0-9ecc-08df1b91dbf9-config-volume" (OuterVolumeSpecName: "config-volume") pod "c3b913a4-1ab9-4ed0-9ecc-08df1b91dbf9" (UID: "c3b913a4-1ab9-4ed0-9ecc-08df1b91dbf9"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 10:30:03 crc kubenswrapper[4776]: I1208 10:30:03.064141 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3b913a4-1ab9-4ed0-9ecc-08df1b91dbf9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c3b913a4-1ab9-4ed0-9ecc-08df1b91dbf9" (UID: "c3b913a4-1ab9-4ed0-9ecc-08df1b91dbf9"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 10:30:03 crc kubenswrapper[4776]: I1208 10:30:03.069987 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3b913a4-1ab9-4ed0-9ecc-08df1b91dbf9-kube-api-access-kfc5v" (OuterVolumeSpecName: "kube-api-access-kfc5v") pod "c3b913a4-1ab9-4ed0-9ecc-08df1b91dbf9" (UID: "c3b913a4-1ab9-4ed0-9ecc-08df1b91dbf9"). InnerVolumeSpecName "kube-api-access-kfc5v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 10:30:03 crc kubenswrapper[4776]: I1208 10:30:03.157046 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfc5v\" (UniqueName: \"kubernetes.io/projected/c3b913a4-1ab9-4ed0-9ecc-08df1b91dbf9-kube-api-access-kfc5v\") on node \"crc\" DevicePath \"\"" Dec 08 10:30:03 crc kubenswrapper[4776]: I1208 10:30:03.157084 4776 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c3b913a4-1ab9-4ed0-9ecc-08df1b91dbf9-config-volume\") on node \"crc\" DevicePath \"\"" Dec 08 10:30:03 crc kubenswrapper[4776]: I1208 10:30:03.157097 4776 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c3b913a4-1ab9-4ed0-9ecc-08df1b91dbf9-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 08 10:30:03 crc kubenswrapper[4776]: I1208 10:30:03.590215 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29419830-l5nvf" 
event={"ID":"c3b913a4-1ab9-4ed0-9ecc-08df1b91dbf9","Type":"ContainerDied","Data":"05d981d105ca4bfee0cd767846583c3c23e79d2fe13c4b5391005fc470f6a676"} Dec 08 10:30:03 crc kubenswrapper[4776]: I1208 10:30:03.590574 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05d981d105ca4bfee0cd767846583c3c23e79d2fe13c4b5391005fc470f6a676" Dec 08 10:30:03 crc kubenswrapper[4776]: I1208 10:30:03.590347 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29419830-l5nvf" Dec 08 10:30:04 crc kubenswrapper[4776]: I1208 10:30:04.102706 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29419785-tvxx2"] Dec 08 10:30:04 crc kubenswrapper[4776]: I1208 10:30:04.113755 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29419785-tvxx2"] Dec 08 10:30:04 crc kubenswrapper[4776]: I1208 10:30:04.360352 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48ff7820-516e-4d30-9d51-e9a9c7582c81" path="/var/lib/kubelet/pods/48ff7820-516e-4d30-9d51-e9a9c7582c81/volumes" Dec 08 10:30:11 crc kubenswrapper[4776]: I1208 10:30:11.399580 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 10:30:11 crc kubenswrapper[4776]: I1208 10:30:11.400189 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 10:30:20 crc 
kubenswrapper[4776]: I1208 10:30:20.177235 4776 scope.go:117] "RemoveContainer" containerID="2e8034c8fe50566431108811532369c91f213b857be245374ec4126f41f80494" Dec 08 10:30:32 crc kubenswrapper[4776]: I1208 10:30:32.636038 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-d5dnr"] Dec 08 10:30:32 crc kubenswrapper[4776]: E1208 10:30:32.637595 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3b913a4-1ab9-4ed0-9ecc-08df1b91dbf9" containerName="collect-profiles" Dec 08 10:30:32 crc kubenswrapper[4776]: I1208 10:30:32.637626 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3b913a4-1ab9-4ed0-9ecc-08df1b91dbf9" containerName="collect-profiles" Dec 08 10:30:32 crc kubenswrapper[4776]: I1208 10:30:32.637936 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3b913a4-1ab9-4ed0-9ecc-08df1b91dbf9" containerName="collect-profiles" Dec 08 10:30:32 crc kubenswrapper[4776]: I1208 10:30:32.639975 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d5dnr" Dec 08 10:30:32 crc kubenswrapper[4776]: I1208 10:30:32.654380 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d5dnr"] Dec 08 10:30:32 crc kubenswrapper[4776]: I1208 10:30:32.757659 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnjgq\" (UniqueName: \"kubernetes.io/projected/885bc336-6858-43f4-b63b-155ed1f06b60-kube-api-access-cnjgq\") pod \"redhat-operators-d5dnr\" (UID: \"885bc336-6858-43f4-b63b-155ed1f06b60\") " pod="openshift-marketplace/redhat-operators-d5dnr" Dec 08 10:30:32 crc kubenswrapper[4776]: I1208 10:30:32.757718 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/885bc336-6858-43f4-b63b-155ed1f06b60-utilities\") pod \"redhat-operators-d5dnr\" (UID: \"885bc336-6858-43f4-b63b-155ed1f06b60\") " pod="openshift-marketplace/redhat-operators-d5dnr" Dec 08 10:30:32 crc kubenswrapper[4776]: I1208 10:30:32.757800 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/885bc336-6858-43f4-b63b-155ed1f06b60-catalog-content\") pod \"redhat-operators-d5dnr\" (UID: \"885bc336-6858-43f4-b63b-155ed1f06b60\") " pod="openshift-marketplace/redhat-operators-d5dnr" Dec 08 10:30:32 crc kubenswrapper[4776]: I1208 10:30:32.860057 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/885bc336-6858-43f4-b63b-155ed1f06b60-catalog-content\") pod \"redhat-operators-d5dnr\" (UID: \"885bc336-6858-43f4-b63b-155ed1f06b60\") " pod="openshift-marketplace/redhat-operators-d5dnr" Dec 08 10:30:32 crc kubenswrapper[4776]: I1208 10:30:32.860278 4776 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-cnjgq\" (UniqueName: \"kubernetes.io/projected/885bc336-6858-43f4-b63b-155ed1f06b60-kube-api-access-cnjgq\") pod \"redhat-operators-d5dnr\" (UID: \"885bc336-6858-43f4-b63b-155ed1f06b60\") " pod="openshift-marketplace/redhat-operators-d5dnr" Dec 08 10:30:32 crc kubenswrapper[4776]: I1208 10:30:32.860317 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/885bc336-6858-43f4-b63b-155ed1f06b60-utilities\") pod \"redhat-operators-d5dnr\" (UID: \"885bc336-6858-43f4-b63b-155ed1f06b60\") " pod="openshift-marketplace/redhat-operators-d5dnr" Dec 08 10:30:32 crc kubenswrapper[4776]: I1208 10:30:32.860626 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/885bc336-6858-43f4-b63b-155ed1f06b60-catalog-content\") pod \"redhat-operators-d5dnr\" (UID: \"885bc336-6858-43f4-b63b-155ed1f06b60\") " pod="openshift-marketplace/redhat-operators-d5dnr" Dec 08 10:30:32 crc kubenswrapper[4776]: I1208 10:30:32.860714 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/885bc336-6858-43f4-b63b-155ed1f06b60-utilities\") pod \"redhat-operators-d5dnr\" (UID: \"885bc336-6858-43f4-b63b-155ed1f06b60\") " pod="openshift-marketplace/redhat-operators-d5dnr" Dec 08 10:30:32 crc kubenswrapper[4776]: I1208 10:30:32.953667 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnjgq\" (UniqueName: \"kubernetes.io/projected/885bc336-6858-43f4-b63b-155ed1f06b60-kube-api-access-cnjgq\") pod \"redhat-operators-d5dnr\" (UID: \"885bc336-6858-43f4-b63b-155ed1f06b60\") " pod="openshift-marketplace/redhat-operators-d5dnr" Dec 08 10:30:32 crc kubenswrapper[4776]: I1208 10:30:32.983730 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d5dnr" Dec 08 10:30:33 crc kubenswrapper[4776]: I1208 10:30:33.547258 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d5dnr"] Dec 08 10:30:33 crc kubenswrapper[4776]: I1208 10:30:33.896499 4776 generic.go:334] "Generic (PLEG): container finished" podID="885bc336-6858-43f4-b63b-155ed1f06b60" containerID="d8144c73536963213370353730216246c218e8194ca7964440b1636875902bd4" exitCode=0 Dec 08 10:30:33 crc kubenswrapper[4776]: I1208 10:30:33.896553 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d5dnr" event={"ID":"885bc336-6858-43f4-b63b-155ed1f06b60","Type":"ContainerDied","Data":"d8144c73536963213370353730216246c218e8194ca7964440b1636875902bd4"} Dec 08 10:30:33 crc kubenswrapper[4776]: I1208 10:30:33.896582 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d5dnr" event={"ID":"885bc336-6858-43f4-b63b-155ed1f06b60","Type":"ContainerStarted","Data":"59220ff54d0c4cc2d0340d23f1fd41fbe28429e2a7f029489a3da6d491267cac"} Dec 08 10:30:33 crc kubenswrapper[4776]: I1208 10:30:33.898922 4776 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 08 10:30:38 crc kubenswrapper[4776]: I1208 10:30:38.896969 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-5cfbc4c5f-hhnf9" podUID="aa5389fb-4ae8-45b1-baaf-18f2fea3f61c" containerName="proxy-server" probeResult="failure" output="HTTP probe failed with statuscode: 502" Dec 08 10:30:41 crc kubenswrapper[4776]: I1208 10:30:41.399665 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 10:30:41 crc 
kubenswrapper[4776]: I1208 10:30:41.400346 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 10:30:46 crc kubenswrapper[4776]: I1208 10:30:46.039702 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d5dnr" event={"ID":"885bc336-6858-43f4-b63b-155ed1f06b60","Type":"ContainerStarted","Data":"23313f37de4dac4c34a261c69b74a907d676c80c8fa9f29d127e3eae4f93dfcf"} Dec 08 10:30:50 crc kubenswrapper[4776]: I1208 10:30:50.085553 4776 generic.go:334] "Generic (PLEG): container finished" podID="885bc336-6858-43f4-b63b-155ed1f06b60" containerID="23313f37de4dac4c34a261c69b74a907d676c80c8fa9f29d127e3eae4f93dfcf" exitCode=0 Dec 08 10:30:50 crc kubenswrapper[4776]: I1208 10:30:50.085651 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d5dnr" event={"ID":"885bc336-6858-43f4-b63b-155ed1f06b60","Type":"ContainerDied","Data":"23313f37de4dac4c34a261c69b74a907d676c80c8fa9f29d127e3eae4f93dfcf"} Dec 08 10:30:51 crc kubenswrapper[4776]: I1208 10:30:51.126433 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d5dnr" event={"ID":"885bc336-6858-43f4-b63b-155ed1f06b60","Type":"ContainerStarted","Data":"a7c56bb45763857e34e9e83fc4c0c7b4ec56c8f0d58363ac40aa1a827b6799d2"} Dec 08 10:30:51 crc kubenswrapper[4776]: I1208 10:30:51.156912 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-d5dnr" podStartSLOduration=2.1986009539999998 podStartE2EDuration="19.156889757s" podCreationTimestamp="2025-12-08 10:30:32 +0000 UTC" firstStartedPulling="2025-12-08 10:30:33.898648925 +0000 UTC m=+5510.161873947" 
lastFinishedPulling="2025-12-08 10:30:50.856937728 +0000 UTC m=+5527.120162750" observedRunningTime="2025-12-08 10:30:51.145422503 +0000 UTC m=+5527.408647525" watchObservedRunningTime="2025-12-08 10:30:51.156889757 +0000 UTC m=+5527.420114779" Dec 08 10:30:52 crc kubenswrapper[4776]: I1208 10:30:52.984323 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-d5dnr" Dec 08 10:30:52 crc kubenswrapper[4776]: I1208 10:30:52.984370 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-d5dnr" Dec 08 10:30:54 crc kubenswrapper[4776]: I1208 10:30:54.047291 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-d5dnr" podUID="885bc336-6858-43f4-b63b-155ed1f06b60" containerName="registry-server" probeResult="failure" output=< Dec 08 10:30:54 crc kubenswrapper[4776]: timeout: failed to connect service ":50051" within 1s Dec 08 10:30:54 crc kubenswrapper[4776]: > Dec 08 10:31:03 crc kubenswrapper[4776]: I1208 10:31:03.043732 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-d5dnr" Dec 08 10:31:03 crc kubenswrapper[4776]: I1208 10:31:03.091991 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-d5dnr" Dec 08 10:31:03 crc kubenswrapper[4776]: I1208 10:31:03.704776 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d5dnr"] Dec 08 10:31:03 crc kubenswrapper[4776]: I1208 10:31:03.833130 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wkhft"] Dec 08 10:31:03 crc kubenswrapper[4776]: I1208 10:31:03.833828 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wkhft" podUID="65dfa143-cdae-4009-9f9d-ec37dec2711a" 
containerName="registry-server" containerID="cri-o://cf5a18071920634401ef02e95baaa683aa47c59f4431bb34853d75df96ae35b2" gracePeriod=2 Dec 08 10:31:04 crc kubenswrapper[4776]: I1208 10:31:04.271214 4776 generic.go:334] "Generic (PLEG): container finished" podID="65dfa143-cdae-4009-9f9d-ec37dec2711a" containerID="cf5a18071920634401ef02e95baaa683aa47c59f4431bb34853d75df96ae35b2" exitCode=0 Dec 08 10:31:04 crc kubenswrapper[4776]: I1208 10:31:04.272605 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wkhft" event={"ID":"65dfa143-cdae-4009-9f9d-ec37dec2711a","Type":"ContainerDied","Data":"cf5a18071920634401ef02e95baaa683aa47c59f4431bb34853d75df96ae35b2"} Dec 08 10:31:04 crc kubenswrapper[4776]: I1208 10:31:04.497088 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wkhft" Dec 08 10:31:04 crc kubenswrapper[4776]: I1208 10:31:04.538819 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65dfa143-cdae-4009-9f9d-ec37dec2711a-utilities\") pod \"65dfa143-cdae-4009-9f9d-ec37dec2711a\" (UID: \"65dfa143-cdae-4009-9f9d-ec37dec2711a\") " Dec 08 10:31:04 crc kubenswrapper[4776]: I1208 10:31:04.539009 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqvs2\" (UniqueName: \"kubernetes.io/projected/65dfa143-cdae-4009-9f9d-ec37dec2711a-kube-api-access-sqvs2\") pod \"65dfa143-cdae-4009-9f9d-ec37dec2711a\" (UID: \"65dfa143-cdae-4009-9f9d-ec37dec2711a\") " Dec 08 10:31:04 crc kubenswrapper[4776]: I1208 10:31:04.539070 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65dfa143-cdae-4009-9f9d-ec37dec2711a-catalog-content\") pod \"65dfa143-cdae-4009-9f9d-ec37dec2711a\" (UID: \"65dfa143-cdae-4009-9f9d-ec37dec2711a\") " Dec 08 10:31:04 crc 
kubenswrapper[4776]: I1208 10:31:04.541724 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65dfa143-cdae-4009-9f9d-ec37dec2711a-utilities" (OuterVolumeSpecName: "utilities") pod "65dfa143-cdae-4009-9f9d-ec37dec2711a" (UID: "65dfa143-cdae-4009-9f9d-ec37dec2711a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 10:31:04 crc kubenswrapper[4776]: I1208 10:31:04.551398 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65dfa143-cdae-4009-9f9d-ec37dec2711a-kube-api-access-sqvs2" (OuterVolumeSpecName: "kube-api-access-sqvs2") pod "65dfa143-cdae-4009-9f9d-ec37dec2711a" (UID: "65dfa143-cdae-4009-9f9d-ec37dec2711a"). InnerVolumeSpecName "kube-api-access-sqvs2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 10:31:04 crc kubenswrapper[4776]: I1208 10:31:04.650584 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqvs2\" (UniqueName: \"kubernetes.io/projected/65dfa143-cdae-4009-9f9d-ec37dec2711a-kube-api-access-sqvs2\") on node \"crc\" DevicePath \"\"" Dec 08 10:31:04 crc kubenswrapper[4776]: I1208 10:31:04.650945 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65dfa143-cdae-4009-9f9d-ec37dec2711a-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 10:31:04 crc kubenswrapper[4776]: I1208 10:31:04.697600 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65dfa143-cdae-4009-9f9d-ec37dec2711a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "65dfa143-cdae-4009-9f9d-ec37dec2711a" (UID: "65dfa143-cdae-4009-9f9d-ec37dec2711a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 10:31:04 crc kubenswrapper[4776]: I1208 10:31:04.752982 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65dfa143-cdae-4009-9f9d-ec37dec2711a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 10:31:05 crc kubenswrapper[4776]: I1208 10:31:05.285238 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wkhft" event={"ID":"65dfa143-cdae-4009-9f9d-ec37dec2711a","Type":"ContainerDied","Data":"b6b23dc6168fd44c5a6c975ff992f248251b51845b2674994a462cdb0dfa49a2"} Dec 08 10:31:05 crc kubenswrapper[4776]: I1208 10:31:05.285291 4776 scope.go:117] "RemoveContainer" containerID="cf5a18071920634401ef02e95baaa683aa47c59f4431bb34853d75df96ae35b2" Dec 08 10:31:05 crc kubenswrapper[4776]: I1208 10:31:05.285367 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wkhft" Dec 08 10:31:05 crc kubenswrapper[4776]: I1208 10:31:05.316149 4776 scope.go:117] "RemoveContainer" containerID="e4f0fd5b4d4b13c28b71913919420afc99b2725f3ede74f5a894fce9374686e2" Dec 08 10:31:05 crc kubenswrapper[4776]: I1208 10:31:05.323135 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wkhft"] Dec 08 10:31:05 crc kubenswrapper[4776]: I1208 10:31:05.336424 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wkhft"] Dec 08 10:31:05 crc kubenswrapper[4776]: I1208 10:31:05.342356 4776 scope.go:117] "RemoveContainer" containerID="3142d94fd6e0c2cc0ce5b0d356886b77b149691ea585f9bf9b472f80238bab1e" Dec 08 10:31:06 crc kubenswrapper[4776]: I1208 10:31:06.357992 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65dfa143-cdae-4009-9f9d-ec37dec2711a" path="/var/lib/kubelet/pods/65dfa143-cdae-4009-9f9d-ec37dec2711a/volumes" Dec 08 10:31:11 crc 
kubenswrapper[4776]: I1208 10:31:11.399000 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 10:31:11 crc kubenswrapper[4776]: I1208 10:31:11.399603 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 10:31:11 crc kubenswrapper[4776]: I1208 10:31:11.399657 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" Dec 08 10:31:11 crc kubenswrapper[4776]: I1208 10:31:11.401373 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"64895e0778b6450b6c551f116e434d2513cfc42f72a2ea700de1a1fdd4e6f67b"} pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 08 10:31:11 crc kubenswrapper[4776]: I1208 10:31:11.401521 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" containerID="cri-o://64895e0778b6450b6c551f116e434d2513cfc42f72a2ea700de1a1fdd4e6f67b" gracePeriod=600 Dec 08 10:31:12 crc kubenswrapper[4776]: I1208 10:31:12.361840 4776 generic.go:334] "Generic (PLEG): container finished" podID="c9788ab1-1031-4103-a769-a4b3177c7268" 
containerID="64895e0778b6450b6c551f116e434d2513cfc42f72a2ea700de1a1fdd4e6f67b" exitCode=0 Dec 08 10:31:12 crc kubenswrapper[4776]: I1208 10:31:12.361874 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" event={"ID":"c9788ab1-1031-4103-a769-a4b3177c7268","Type":"ContainerDied","Data":"64895e0778b6450b6c551f116e434d2513cfc42f72a2ea700de1a1fdd4e6f67b"} Dec 08 10:31:12 crc kubenswrapper[4776]: I1208 10:31:12.362261 4776 scope.go:117] "RemoveContainer" containerID="ba2de61c4584af05bb1541b86bdc85deedaeb51f4cbdde10a32dbe542154d0e0" Dec 08 10:31:13 crc kubenswrapper[4776]: I1208 10:31:13.373481 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" event={"ID":"c9788ab1-1031-4103-a769-a4b3177c7268","Type":"ContainerStarted","Data":"8c1c73971d87140e44cc9241fea6de13f5f6f0f3ea06fa74b8d3e01c0b4278b7"} Dec 08 10:31:33 crc kubenswrapper[4776]: I1208 10:31:33.724333 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-w86h2"] Dec 08 10:31:33 crc kubenswrapper[4776]: E1208 10:31:33.725639 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65dfa143-cdae-4009-9f9d-ec37dec2711a" containerName="registry-server" Dec 08 10:31:33 crc kubenswrapper[4776]: I1208 10:31:33.725657 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="65dfa143-cdae-4009-9f9d-ec37dec2711a" containerName="registry-server" Dec 08 10:31:33 crc kubenswrapper[4776]: E1208 10:31:33.725685 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65dfa143-cdae-4009-9f9d-ec37dec2711a" containerName="extract-utilities" Dec 08 10:31:33 crc kubenswrapper[4776]: I1208 10:31:33.725691 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="65dfa143-cdae-4009-9f9d-ec37dec2711a" containerName="extract-utilities" Dec 08 10:31:33 crc kubenswrapper[4776]: E1208 10:31:33.725722 4776 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="65dfa143-cdae-4009-9f9d-ec37dec2711a" containerName="extract-content" Dec 08 10:31:33 crc kubenswrapper[4776]: I1208 10:31:33.725727 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="65dfa143-cdae-4009-9f9d-ec37dec2711a" containerName="extract-content" Dec 08 10:31:33 crc kubenswrapper[4776]: I1208 10:31:33.725947 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="65dfa143-cdae-4009-9f9d-ec37dec2711a" containerName="registry-server" Dec 08 10:31:33 crc kubenswrapper[4776]: I1208 10:31:33.728843 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w86h2" Dec 08 10:31:33 crc kubenswrapper[4776]: I1208 10:31:33.745547 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w86h2"] Dec 08 10:31:33 crc kubenswrapper[4776]: I1208 10:31:33.847459 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58776d44-4160-43f0-afd9-00f3eb74cb92-catalog-content\") pod \"certified-operators-w86h2\" (UID: \"58776d44-4160-43f0-afd9-00f3eb74cb92\") " pod="openshift-marketplace/certified-operators-w86h2" Dec 08 10:31:33 crc kubenswrapper[4776]: I1208 10:31:33.847687 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58776d44-4160-43f0-afd9-00f3eb74cb92-utilities\") pod \"certified-operators-w86h2\" (UID: \"58776d44-4160-43f0-afd9-00f3eb74cb92\") " pod="openshift-marketplace/certified-operators-w86h2" Dec 08 10:31:33 crc kubenswrapper[4776]: I1208 10:31:33.847801 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvhch\" (UniqueName: \"kubernetes.io/projected/58776d44-4160-43f0-afd9-00f3eb74cb92-kube-api-access-kvhch\") pod 
\"certified-operators-w86h2\" (UID: \"58776d44-4160-43f0-afd9-00f3eb74cb92\") " pod="openshift-marketplace/certified-operators-w86h2" Dec 08 10:31:33 crc kubenswrapper[4776]: I1208 10:31:33.949663 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58776d44-4160-43f0-afd9-00f3eb74cb92-utilities\") pod \"certified-operators-w86h2\" (UID: \"58776d44-4160-43f0-afd9-00f3eb74cb92\") " pod="openshift-marketplace/certified-operators-w86h2" Dec 08 10:31:33 crc kubenswrapper[4776]: I1208 10:31:33.949797 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvhch\" (UniqueName: \"kubernetes.io/projected/58776d44-4160-43f0-afd9-00f3eb74cb92-kube-api-access-kvhch\") pod \"certified-operators-w86h2\" (UID: \"58776d44-4160-43f0-afd9-00f3eb74cb92\") " pod="openshift-marketplace/certified-operators-w86h2" Dec 08 10:31:33 crc kubenswrapper[4776]: I1208 10:31:33.949869 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58776d44-4160-43f0-afd9-00f3eb74cb92-catalog-content\") pod \"certified-operators-w86h2\" (UID: \"58776d44-4160-43f0-afd9-00f3eb74cb92\") " pod="openshift-marketplace/certified-operators-w86h2" Dec 08 10:31:33 crc kubenswrapper[4776]: I1208 10:31:33.950853 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58776d44-4160-43f0-afd9-00f3eb74cb92-utilities\") pod \"certified-operators-w86h2\" (UID: \"58776d44-4160-43f0-afd9-00f3eb74cb92\") " pod="openshift-marketplace/certified-operators-w86h2" Dec 08 10:31:33 crc kubenswrapper[4776]: I1208 10:31:33.950944 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58776d44-4160-43f0-afd9-00f3eb74cb92-catalog-content\") pod \"certified-operators-w86h2\" (UID: 
\"58776d44-4160-43f0-afd9-00f3eb74cb92\") " pod="openshift-marketplace/certified-operators-w86h2" Dec 08 10:31:34 crc kubenswrapper[4776]: I1208 10:31:34.053386 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvhch\" (UniqueName: \"kubernetes.io/projected/58776d44-4160-43f0-afd9-00f3eb74cb92-kube-api-access-kvhch\") pod \"certified-operators-w86h2\" (UID: \"58776d44-4160-43f0-afd9-00f3eb74cb92\") " pod="openshift-marketplace/certified-operators-w86h2" Dec 08 10:31:34 crc kubenswrapper[4776]: I1208 10:31:34.062656 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w86h2" Dec 08 10:31:34 crc kubenswrapper[4776]: I1208 10:31:34.634406 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w86h2"] Dec 08 10:31:35 crc kubenswrapper[4776]: I1208 10:31:35.610258 4776 generic.go:334] "Generic (PLEG): container finished" podID="58776d44-4160-43f0-afd9-00f3eb74cb92" containerID="529af47f5fb06b40f5904154e06621a1e0fa6a35476b2065c38b36258a0cf978" exitCode=0 Dec 08 10:31:35 crc kubenswrapper[4776]: I1208 10:31:35.610283 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w86h2" event={"ID":"58776d44-4160-43f0-afd9-00f3eb74cb92","Type":"ContainerDied","Data":"529af47f5fb06b40f5904154e06621a1e0fa6a35476b2065c38b36258a0cf978"} Dec 08 10:31:35 crc kubenswrapper[4776]: I1208 10:31:35.610669 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w86h2" event={"ID":"58776d44-4160-43f0-afd9-00f3eb74cb92","Type":"ContainerStarted","Data":"6d32d6f8c7262c643aeac65a681fa5650312541ad50dcfa0458fa0605fca766c"} Dec 08 10:31:36 crc kubenswrapper[4776]: I1208 10:31:36.624280 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w86h2" 
event={"ID":"58776d44-4160-43f0-afd9-00f3eb74cb92","Type":"ContainerStarted","Data":"aa1483f661c5b74bdf0a9bc488f669a569d068c5da843d272911726ec1c11735"} Dec 08 10:31:38 crc kubenswrapper[4776]: I1208 10:31:38.644089 4776 generic.go:334] "Generic (PLEG): container finished" podID="58776d44-4160-43f0-afd9-00f3eb74cb92" containerID="aa1483f661c5b74bdf0a9bc488f669a569d068c5da843d272911726ec1c11735" exitCode=0 Dec 08 10:31:38 crc kubenswrapper[4776]: I1208 10:31:38.644167 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w86h2" event={"ID":"58776d44-4160-43f0-afd9-00f3eb74cb92","Type":"ContainerDied","Data":"aa1483f661c5b74bdf0a9bc488f669a569d068c5da843d272911726ec1c11735"} Dec 08 10:31:39 crc kubenswrapper[4776]: I1208 10:31:39.656659 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w86h2" event={"ID":"58776d44-4160-43f0-afd9-00f3eb74cb92","Type":"ContainerStarted","Data":"7747fc76f906fa6b03a3afe9140a61bcf2a9a595ea21d6c80bdd6a3765884642"} Dec 08 10:31:39 crc kubenswrapper[4776]: I1208 10:31:39.671769 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-w86h2" podStartSLOduration=3.271125815 podStartE2EDuration="6.671754103s" podCreationTimestamp="2025-12-08 10:31:33 +0000 UTC" firstStartedPulling="2025-12-08 10:31:35.612454233 +0000 UTC m=+5571.875679255" lastFinishedPulling="2025-12-08 10:31:39.013082481 +0000 UTC m=+5575.276307543" observedRunningTime="2025-12-08 10:31:39.670474369 +0000 UTC m=+5575.933699391" watchObservedRunningTime="2025-12-08 10:31:39.671754103 +0000 UTC m=+5575.934979125" Dec 08 10:31:44 crc kubenswrapper[4776]: I1208 10:31:44.063620 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-w86h2" Dec 08 10:31:44 crc kubenswrapper[4776]: I1208 10:31:44.064243 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-w86h2" Dec 08 10:31:44 crc kubenswrapper[4776]: I1208 10:31:44.122051 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-w86h2" Dec 08 10:31:44 crc kubenswrapper[4776]: I1208 10:31:44.759144 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-w86h2" Dec 08 10:31:44 crc kubenswrapper[4776]: I1208 10:31:44.801797 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w86h2"] Dec 08 10:31:46 crc kubenswrapper[4776]: I1208 10:31:46.732930 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-w86h2" podUID="58776d44-4160-43f0-afd9-00f3eb74cb92" containerName="registry-server" containerID="cri-o://7747fc76f906fa6b03a3afe9140a61bcf2a9a595ea21d6c80bdd6a3765884642" gracePeriod=2 Dec 08 10:31:47 crc kubenswrapper[4776]: I1208 10:31:47.236672 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w86h2" Dec 08 10:31:47 crc kubenswrapper[4776]: I1208 10:31:47.375597 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58776d44-4160-43f0-afd9-00f3eb74cb92-catalog-content\") pod \"58776d44-4160-43f0-afd9-00f3eb74cb92\" (UID: \"58776d44-4160-43f0-afd9-00f3eb74cb92\") " Dec 08 10:31:47 crc kubenswrapper[4776]: I1208 10:31:47.375970 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvhch\" (UniqueName: \"kubernetes.io/projected/58776d44-4160-43f0-afd9-00f3eb74cb92-kube-api-access-kvhch\") pod \"58776d44-4160-43f0-afd9-00f3eb74cb92\" (UID: \"58776d44-4160-43f0-afd9-00f3eb74cb92\") " Dec 08 10:31:47 crc kubenswrapper[4776]: I1208 10:31:47.376422 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58776d44-4160-43f0-afd9-00f3eb74cb92-utilities\") pod \"58776d44-4160-43f0-afd9-00f3eb74cb92\" (UID: \"58776d44-4160-43f0-afd9-00f3eb74cb92\") " Dec 08 10:31:47 crc kubenswrapper[4776]: I1208 10:31:47.377015 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58776d44-4160-43f0-afd9-00f3eb74cb92-utilities" (OuterVolumeSpecName: "utilities") pod "58776d44-4160-43f0-afd9-00f3eb74cb92" (UID: "58776d44-4160-43f0-afd9-00f3eb74cb92"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 10:31:47 crc kubenswrapper[4776]: I1208 10:31:47.377262 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58776d44-4160-43f0-afd9-00f3eb74cb92-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 10:31:47 crc kubenswrapper[4776]: I1208 10:31:47.381979 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58776d44-4160-43f0-afd9-00f3eb74cb92-kube-api-access-kvhch" (OuterVolumeSpecName: "kube-api-access-kvhch") pod "58776d44-4160-43f0-afd9-00f3eb74cb92" (UID: "58776d44-4160-43f0-afd9-00f3eb74cb92"). InnerVolumeSpecName "kube-api-access-kvhch". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 10:31:47 crc kubenswrapper[4776]: I1208 10:31:47.430454 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58776d44-4160-43f0-afd9-00f3eb74cb92-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "58776d44-4160-43f0-afd9-00f3eb74cb92" (UID: "58776d44-4160-43f0-afd9-00f3eb74cb92"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 10:31:47 crc kubenswrapper[4776]: I1208 10:31:47.479461 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58776d44-4160-43f0-afd9-00f3eb74cb92-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 10:31:47 crc kubenswrapper[4776]: I1208 10:31:47.479502 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvhch\" (UniqueName: \"kubernetes.io/projected/58776d44-4160-43f0-afd9-00f3eb74cb92-kube-api-access-kvhch\") on node \"crc\" DevicePath \"\"" Dec 08 10:31:47 crc kubenswrapper[4776]: I1208 10:31:47.744361 4776 generic.go:334] "Generic (PLEG): container finished" podID="58776d44-4160-43f0-afd9-00f3eb74cb92" containerID="7747fc76f906fa6b03a3afe9140a61bcf2a9a595ea21d6c80bdd6a3765884642" exitCode=0 Dec 08 10:31:47 crc kubenswrapper[4776]: I1208 10:31:47.744465 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w86h2" Dec 08 10:31:47 crc kubenswrapper[4776]: I1208 10:31:47.744472 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w86h2" event={"ID":"58776d44-4160-43f0-afd9-00f3eb74cb92","Type":"ContainerDied","Data":"7747fc76f906fa6b03a3afe9140a61bcf2a9a595ea21d6c80bdd6a3765884642"} Dec 08 10:31:47 crc kubenswrapper[4776]: I1208 10:31:47.745393 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w86h2" event={"ID":"58776d44-4160-43f0-afd9-00f3eb74cb92","Type":"ContainerDied","Data":"6d32d6f8c7262c643aeac65a681fa5650312541ad50dcfa0458fa0605fca766c"} Dec 08 10:31:47 crc kubenswrapper[4776]: I1208 10:31:47.745416 4776 scope.go:117] "RemoveContainer" containerID="7747fc76f906fa6b03a3afe9140a61bcf2a9a595ea21d6c80bdd6a3765884642" Dec 08 10:31:47 crc kubenswrapper[4776]: I1208 10:31:47.777454 4776 scope.go:117] "RemoveContainer" 
containerID="aa1483f661c5b74bdf0a9bc488f669a569d068c5da843d272911726ec1c11735" Dec 08 10:31:47 crc kubenswrapper[4776]: I1208 10:31:47.785840 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w86h2"] Dec 08 10:31:47 crc kubenswrapper[4776]: I1208 10:31:47.798659 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-w86h2"] Dec 08 10:31:47 crc kubenswrapper[4776]: I1208 10:31:47.813961 4776 scope.go:117] "RemoveContainer" containerID="529af47f5fb06b40f5904154e06621a1e0fa6a35476b2065c38b36258a0cf978" Dec 08 10:31:47 crc kubenswrapper[4776]: I1208 10:31:47.850229 4776 scope.go:117] "RemoveContainer" containerID="7747fc76f906fa6b03a3afe9140a61bcf2a9a595ea21d6c80bdd6a3765884642" Dec 08 10:31:47 crc kubenswrapper[4776]: E1208 10:31:47.850606 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7747fc76f906fa6b03a3afe9140a61bcf2a9a595ea21d6c80bdd6a3765884642\": container with ID starting with 7747fc76f906fa6b03a3afe9140a61bcf2a9a595ea21d6c80bdd6a3765884642 not found: ID does not exist" containerID="7747fc76f906fa6b03a3afe9140a61bcf2a9a595ea21d6c80bdd6a3765884642" Dec 08 10:31:47 crc kubenswrapper[4776]: I1208 10:31:47.850724 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7747fc76f906fa6b03a3afe9140a61bcf2a9a595ea21d6c80bdd6a3765884642"} err="failed to get container status \"7747fc76f906fa6b03a3afe9140a61bcf2a9a595ea21d6c80bdd6a3765884642\": rpc error: code = NotFound desc = could not find container \"7747fc76f906fa6b03a3afe9140a61bcf2a9a595ea21d6c80bdd6a3765884642\": container with ID starting with 7747fc76f906fa6b03a3afe9140a61bcf2a9a595ea21d6c80bdd6a3765884642 not found: ID does not exist" Dec 08 10:31:47 crc kubenswrapper[4776]: I1208 10:31:47.850837 4776 scope.go:117] "RemoveContainer" 
containerID="aa1483f661c5b74bdf0a9bc488f669a569d068c5da843d272911726ec1c11735" Dec 08 10:31:47 crc kubenswrapper[4776]: E1208 10:31:47.851209 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa1483f661c5b74bdf0a9bc488f669a569d068c5da843d272911726ec1c11735\": container with ID starting with aa1483f661c5b74bdf0a9bc488f669a569d068c5da843d272911726ec1c11735 not found: ID does not exist" containerID="aa1483f661c5b74bdf0a9bc488f669a569d068c5da843d272911726ec1c11735" Dec 08 10:31:47 crc kubenswrapper[4776]: I1208 10:31:47.851252 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa1483f661c5b74bdf0a9bc488f669a569d068c5da843d272911726ec1c11735"} err="failed to get container status \"aa1483f661c5b74bdf0a9bc488f669a569d068c5da843d272911726ec1c11735\": rpc error: code = NotFound desc = could not find container \"aa1483f661c5b74bdf0a9bc488f669a569d068c5da843d272911726ec1c11735\": container with ID starting with aa1483f661c5b74bdf0a9bc488f669a569d068c5da843d272911726ec1c11735 not found: ID does not exist" Dec 08 10:31:47 crc kubenswrapper[4776]: I1208 10:31:47.851279 4776 scope.go:117] "RemoveContainer" containerID="529af47f5fb06b40f5904154e06621a1e0fa6a35476b2065c38b36258a0cf978" Dec 08 10:31:47 crc kubenswrapper[4776]: E1208 10:31:47.851608 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"529af47f5fb06b40f5904154e06621a1e0fa6a35476b2065c38b36258a0cf978\": container with ID starting with 529af47f5fb06b40f5904154e06621a1e0fa6a35476b2065c38b36258a0cf978 not found: ID does not exist" containerID="529af47f5fb06b40f5904154e06621a1e0fa6a35476b2065c38b36258a0cf978" Dec 08 10:31:47 crc kubenswrapper[4776]: I1208 10:31:47.851649 4776 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"529af47f5fb06b40f5904154e06621a1e0fa6a35476b2065c38b36258a0cf978"} err="failed to get container status \"529af47f5fb06b40f5904154e06621a1e0fa6a35476b2065c38b36258a0cf978\": rpc error: code = NotFound desc = could not find container \"529af47f5fb06b40f5904154e06621a1e0fa6a35476b2065c38b36258a0cf978\": container with ID starting with 529af47f5fb06b40f5904154e06621a1e0fa6a35476b2065c38b36258a0cf978 not found: ID does not exist" Dec 08 10:31:48 crc kubenswrapper[4776]: I1208 10:31:48.357268 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58776d44-4160-43f0-afd9-00f3eb74cb92" path="/var/lib/kubelet/pods/58776d44-4160-43f0-afd9-00f3eb74cb92/volumes" Dec 08 10:33:03 crc kubenswrapper[4776]: I1208 10:33:03.530649 4776 generic.go:334] "Generic (PLEG): container finished" podID="9c3d4f25-4353-4b82-8de9-ee14a2f05076" containerID="157cf2545ec6bd46b1bfefd27109b4c9ebba70710924cdeb18081d269f535371" exitCode=0 Dec 08 10:33:03 crc kubenswrapper[4776]: I1208 10:33:03.530727 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"9c3d4f25-4353-4b82-8de9-ee14a2f05076","Type":"ContainerDied","Data":"157cf2545ec6bd46b1bfefd27109b4c9ebba70710924cdeb18081d269f535371"} Dec 08 10:33:04 crc kubenswrapper[4776]: I1208 10:33:04.972083 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 08 10:33:05 crc kubenswrapper[4776]: I1208 10:33:05.078903 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9c3d4f25-4353-4b82-8de9-ee14a2f05076-config-data\") pod \"9c3d4f25-4353-4b82-8de9-ee14a2f05076\" (UID: \"9c3d4f25-4353-4b82-8de9-ee14a2f05076\") " Dec 08 10:33:05 crc kubenswrapper[4776]: I1208 10:33:05.079062 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9c3d4f25-4353-4b82-8de9-ee14a2f05076-ssh-key\") pod \"9c3d4f25-4353-4b82-8de9-ee14a2f05076\" (UID: \"9c3d4f25-4353-4b82-8de9-ee14a2f05076\") " Dec 08 10:33:05 crc kubenswrapper[4776]: I1208 10:33:05.079421 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/9c3d4f25-4353-4b82-8de9-ee14a2f05076-test-operator-ephemeral-temporary\") pod \"9c3d4f25-4353-4b82-8de9-ee14a2f05076\" (UID: \"9c3d4f25-4353-4b82-8de9-ee14a2f05076\") " Dec 08 10:33:05 crc kubenswrapper[4776]: I1208 10:33:05.079593 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8fp8\" (UniqueName: \"kubernetes.io/projected/9c3d4f25-4353-4b82-8de9-ee14a2f05076-kube-api-access-q8fp8\") pod \"9c3d4f25-4353-4b82-8de9-ee14a2f05076\" (UID: \"9c3d4f25-4353-4b82-8de9-ee14a2f05076\") " Dec 08 10:33:05 crc kubenswrapper[4776]: I1208 10:33:05.079646 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"9c3d4f25-4353-4b82-8de9-ee14a2f05076\" (UID: \"9c3d4f25-4353-4b82-8de9-ee14a2f05076\") " Dec 08 10:33:05 crc kubenswrapper[4776]: I1208 10:33:05.079718 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9c3d4f25-4353-4b82-8de9-ee14a2f05076-openstack-config-secret\") pod \"9c3d4f25-4353-4b82-8de9-ee14a2f05076\" (UID: \"9c3d4f25-4353-4b82-8de9-ee14a2f05076\") " Dec 08 10:33:05 crc kubenswrapper[4776]: I1208 10:33:05.079835 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/9c3d4f25-4353-4b82-8de9-ee14a2f05076-test-operator-ephemeral-workdir\") pod \"9c3d4f25-4353-4b82-8de9-ee14a2f05076\" (UID: \"9c3d4f25-4353-4b82-8de9-ee14a2f05076\") " Dec 08 10:33:05 crc kubenswrapper[4776]: I1208 10:33:05.079930 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/9c3d4f25-4353-4b82-8de9-ee14a2f05076-ca-certs\") pod \"9c3d4f25-4353-4b82-8de9-ee14a2f05076\" (UID: \"9c3d4f25-4353-4b82-8de9-ee14a2f05076\") " Dec 08 10:33:05 crc kubenswrapper[4776]: I1208 10:33:05.080059 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9c3d4f25-4353-4b82-8de9-ee14a2f05076-openstack-config\") pod \"9c3d4f25-4353-4b82-8de9-ee14a2f05076\" (UID: \"9c3d4f25-4353-4b82-8de9-ee14a2f05076\") " Dec 08 10:33:05 crc kubenswrapper[4776]: I1208 10:33:05.079923 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c3d4f25-4353-4b82-8de9-ee14a2f05076-config-data" (OuterVolumeSpecName: "config-data") pod "9c3d4f25-4353-4b82-8de9-ee14a2f05076" (UID: "9c3d4f25-4353-4b82-8de9-ee14a2f05076"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 10:33:05 crc kubenswrapper[4776]: I1208 10:33:05.080828 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c3d4f25-4353-4b82-8de9-ee14a2f05076-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "9c3d4f25-4353-4b82-8de9-ee14a2f05076" (UID: "9c3d4f25-4353-4b82-8de9-ee14a2f05076"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 10:33:05 crc kubenswrapper[4776]: I1208 10:33:05.081444 4776 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/9c3d4f25-4353-4b82-8de9-ee14a2f05076-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Dec 08 10:33:05 crc kubenswrapper[4776]: I1208 10:33:05.081471 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9c3d4f25-4353-4b82-8de9-ee14a2f05076-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 10:33:05 crc kubenswrapper[4776]: I1208 10:33:05.084658 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "test-operator-logs") pod "9c3d4f25-4353-4b82-8de9-ee14a2f05076" (UID: "9c3d4f25-4353-4b82-8de9-ee14a2f05076"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 08 10:33:05 crc kubenswrapper[4776]: I1208 10:33:05.085403 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c3d4f25-4353-4b82-8de9-ee14a2f05076-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "9c3d4f25-4353-4b82-8de9-ee14a2f05076" (UID: "9c3d4f25-4353-4b82-8de9-ee14a2f05076"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 10:33:05 crc kubenswrapper[4776]: I1208 10:33:05.095306 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c3d4f25-4353-4b82-8de9-ee14a2f05076-kube-api-access-q8fp8" (OuterVolumeSpecName: "kube-api-access-q8fp8") pod "9c3d4f25-4353-4b82-8de9-ee14a2f05076" (UID: "9c3d4f25-4353-4b82-8de9-ee14a2f05076"). InnerVolumeSpecName "kube-api-access-q8fp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 10:33:05 crc kubenswrapper[4776]: I1208 10:33:05.113384 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c3d4f25-4353-4b82-8de9-ee14a2f05076-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9c3d4f25-4353-4b82-8de9-ee14a2f05076" (UID: "9c3d4f25-4353-4b82-8de9-ee14a2f05076"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 10:33:05 crc kubenswrapper[4776]: I1208 10:33:05.120475 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c3d4f25-4353-4b82-8de9-ee14a2f05076-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "9c3d4f25-4353-4b82-8de9-ee14a2f05076" (UID: "9c3d4f25-4353-4b82-8de9-ee14a2f05076"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 10:33:05 crc kubenswrapper[4776]: I1208 10:33:05.124702 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c3d4f25-4353-4b82-8de9-ee14a2f05076-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "9c3d4f25-4353-4b82-8de9-ee14a2f05076" (UID: "9c3d4f25-4353-4b82-8de9-ee14a2f05076"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 10:33:05 crc kubenswrapper[4776]: I1208 10:33:05.141611 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c3d4f25-4353-4b82-8de9-ee14a2f05076-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "9c3d4f25-4353-4b82-8de9-ee14a2f05076" (UID: "9c3d4f25-4353-4b82-8de9-ee14a2f05076"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 10:33:05 crc kubenswrapper[4776]: I1208 10:33:05.183816 4776 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/9c3d4f25-4353-4b82-8de9-ee14a2f05076-ca-certs\") on node \"crc\" DevicePath \"\"" Dec 08 10:33:05 crc kubenswrapper[4776]: I1208 10:33:05.183856 4776 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9c3d4f25-4353-4b82-8de9-ee14a2f05076-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 08 10:33:05 crc kubenswrapper[4776]: I1208 10:33:05.183868 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9c3d4f25-4353-4b82-8de9-ee14a2f05076-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 08 10:33:05 crc kubenswrapper[4776]: I1208 10:33:05.183881 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8fp8\" (UniqueName: \"kubernetes.io/projected/9c3d4f25-4353-4b82-8de9-ee14a2f05076-kube-api-access-q8fp8\") on node \"crc\" DevicePath \"\"" Dec 08 10:33:05 crc kubenswrapper[4776]: I1208 10:33:05.184426 4776 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Dec 08 10:33:05 crc kubenswrapper[4776]: I1208 10:33:05.184447 4776 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/9c3d4f25-4353-4b82-8de9-ee14a2f05076-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 08 10:33:05 crc kubenswrapper[4776]: I1208 10:33:05.184459 4776 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/9c3d4f25-4353-4b82-8de9-ee14a2f05076-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Dec 08 10:33:05 crc kubenswrapper[4776]: I1208 10:33:05.213976 4776 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Dec 08 10:33:05 crc kubenswrapper[4776]: I1208 10:33:05.287049 4776 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Dec 08 10:33:05 crc kubenswrapper[4776]: I1208 10:33:05.556128 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"9c3d4f25-4353-4b82-8de9-ee14a2f05076","Type":"ContainerDied","Data":"a81d3bad193bc41d0862cc4e230258edaec8cc554e77f509de10211782b648c2"} Dec 08 10:33:05 crc kubenswrapper[4776]: I1208 10:33:05.556165 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a81d3bad193bc41d0862cc4e230258edaec8cc554e77f509de10211782b648c2" Dec 08 10:33:05 crc kubenswrapper[4776]: I1208 10:33:05.556306 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 08 10:33:12 crc kubenswrapper[4776]: I1208 10:33:12.333893 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 08 10:33:12 crc kubenswrapper[4776]: E1208 10:33:12.335000 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c3d4f25-4353-4b82-8de9-ee14a2f05076" containerName="tempest-tests-tempest-tests-runner" Dec 08 10:33:12 crc kubenswrapper[4776]: I1208 10:33:12.335018 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c3d4f25-4353-4b82-8de9-ee14a2f05076" containerName="tempest-tests-tempest-tests-runner" Dec 08 10:33:12 crc kubenswrapper[4776]: E1208 10:33:12.335052 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58776d44-4160-43f0-afd9-00f3eb74cb92" containerName="extract-content" Dec 08 10:33:12 crc kubenswrapper[4776]: I1208 10:33:12.335060 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="58776d44-4160-43f0-afd9-00f3eb74cb92" containerName="extract-content" Dec 08 10:33:12 crc kubenswrapper[4776]: E1208 10:33:12.335085 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58776d44-4160-43f0-afd9-00f3eb74cb92" containerName="extract-utilities" Dec 08 10:33:12 crc kubenswrapper[4776]: I1208 10:33:12.335096 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="58776d44-4160-43f0-afd9-00f3eb74cb92" containerName="extract-utilities" Dec 08 10:33:12 crc kubenswrapper[4776]: E1208 10:33:12.335139 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58776d44-4160-43f0-afd9-00f3eb74cb92" containerName="registry-server" Dec 08 10:33:12 crc kubenswrapper[4776]: I1208 10:33:12.335146 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="58776d44-4160-43f0-afd9-00f3eb74cb92" containerName="registry-server" Dec 08 10:33:12 crc kubenswrapper[4776]: I1208 10:33:12.335467 4776 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="58776d44-4160-43f0-afd9-00f3eb74cb92" containerName="registry-server" Dec 08 10:33:12 crc kubenswrapper[4776]: I1208 10:33:12.335510 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c3d4f25-4353-4b82-8de9-ee14a2f05076" containerName="tempest-tests-tempest-tests-runner" Dec 08 10:33:12 crc kubenswrapper[4776]: I1208 10:33:12.336602 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 08 10:33:12 crc kubenswrapper[4776]: I1208 10:33:12.339458 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-8cddf" Dec 08 10:33:12 crc kubenswrapper[4776]: I1208 10:33:12.360307 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 08 10:33:12 crc kubenswrapper[4776]: I1208 10:33:12.475273 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnjw7\" (UniqueName: \"kubernetes.io/projected/dcfe5c37-ca0e-44d6-9051-bdf107f11cdb-kube-api-access-vnjw7\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"dcfe5c37-ca0e-44d6-9051-bdf107f11cdb\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 08 10:33:12 crc kubenswrapper[4776]: I1208 10:33:12.475317 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"dcfe5c37-ca0e-44d6-9051-bdf107f11cdb\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 08 10:33:12 crc kubenswrapper[4776]: I1208 10:33:12.577797 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnjw7\" (UniqueName: 
\"kubernetes.io/projected/dcfe5c37-ca0e-44d6-9051-bdf107f11cdb-kube-api-access-vnjw7\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"dcfe5c37-ca0e-44d6-9051-bdf107f11cdb\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 08 10:33:12 crc kubenswrapper[4776]: I1208 10:33:12.577853 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"dcfe5c37-ca0e-44d6-9051-bdf107f11cdb\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 08 10:33:12 crc kubenswrapper[4776]: I1208 10:33:12.579408 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"dcfe5c37-ca0e-44d6-9051-bdf107f11cdb\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 08 10:33:12 crc kubenswrapper[4776]: I1208 10:33:12.618746 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnjw7\" (UniqueName: \"kubernetes.io/projected/dcfe5c37-ca0e-44d6-9051-bdf107f11cdb-kube-api-access-vnjw7\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"dcfe5c37-ca0e-44d6-9051-bdf107f11cdb\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 08 10:33:12 crc kubenswrapper[4776]: I1208 10:33:12.645313 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"dcfe5c37-ca0e-44d6-9051-bdf107f11cdb\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 08 10:33:12 
crc kubenswrapper[4776]: I1208 10:33:12.678766 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 08 10:33:13 crc kubenswrapper[4776]: I1208 10:33:13.144978 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 08 10:33:13 crc kubenswrapper[4776]: I1208 10:33:13.657623 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"dcfe5c37-ca0e-44d6-9051-bdf107f11cdb","Type":"ContainerStarted","Data":"938a05d40b1313b2867ba6ae139a33f031396b8b5dcae05ccd53cba9631f5f5a"} Dec 08 10:33:14 crc kubenswrapper[4776]: I1208 10:33:14.676034 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"dcfe5c37-ca0e-44d6-9051-bdf107f11cdb","Type":"ContainerStarted","Data":"494b0e3f9d29d4c173122f8a2a7b21eba554caa8978f732ddc31e19dbb20446a"} Dec 08 10:33:14 crc kubenswrapper[4776]: I1208 10:33:14.701935 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.801099856 podStartE2EDuration="2.701912903s" podCreationTimestamp="2025-12-08 10:33:12 +0000 UTC" firstStartedPulling="2025-12-08 10:33:13.154620077 +0000 UTC m=+5669.417845099" lastFinishedPulling="2025-12-08 10:33:14.055433124 +0000 UTC m=+5670.318658146" observedRunningTime="2025-12-08 10:33:14.692588238 +0000 UTC m=+5670.955813260" watchObservedRunningTime="2025-12-08 10:33:14.701912903 +0000 UTC m=+5670.965137925" Dec 08 10:33:41 crc kubenswrapper[4776]: I1208 10:33:41.399629 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial 
tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 10:33:41 crc kubenswrapper[4776]: I1208 10:33:41.400055 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 10:33:44 crc kubenswrapper[4776]: I1208 10:33:44.376653 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xxxxw/must-gather-jk9pj"] Dec 08 10:33:44 crc kubenswrapper[4776]: I1208 10:33:44.379551 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xxxxw/must-gather-jk9pj" Dec 08 10:33:44 crc kubenswrapper[4776]: I1208 10:33:44.382460 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-xxxxw"/"kube-root-ca.crt" Dec 08 10:33:44 crc kubenswrapper[4776]: I1208 10:33:44.382557 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-xxxxw"/"default-dockercfg-s46fp" Dec 08 10:33:44 crc kubenswrapper[4776]: I1208 10:33:44.382702 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-xxxxw"/"openshift-service-ca.crt" Dec 08 10:33:44 crc kubenswrapper[4776]: I1208 10:33:44.390577 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xxxxw/must-gather-jk9pj"] Dec 08 10:33:44 crc kubenswrapper[4776]: I1208 10:33:44.561950 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3d1b6d17-c87a-4518-ae10-0fd52d9a854e-must-gather-output\") pod \"must-gather-jk9pj\" (UID: \"3d1b6d17-c87a-4518-ae10-0fd52d9a854e\") " pod="openshift-must-gather-xxxxw/must-gather-jk9pj" Dec 08 10:33:44 crc kubenswrapper[4776]: 
I1208 10:33:44.562293 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdt4r\" (UniqueName: \"kubernetes.io/projected/3d1b6d17-c87a-4518-ae10-0fd52d9a854e-kube-api-access-sdt4r\") pod \"must-gather-jk9pj\" (UID: \"3d1b6d17-c87a-4518-ae10-0fd52d9a854e\") " pod="openshift-must-gather-xxxxw/must-gather-jk9pj" Dec 08 10:33:44 crc kubenswrapper[4776]: I1208 10:33:44.664739 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdt4r\" (UniqueName: \"kubernetes.io/projected/3d1b6d17-c87a-4518-ae10-0fd52d9a854e-kube-api-access-sdt4r\") pod \"must-gather-jk9pj\" (UID: \"3d1b6d17-c87a-4518-ae10-0fd52d9a854e\") " pod="openshift-must-gather-xxxxw/must-gather-jk9pj" Dec 08 10:33:44 crc kubenswrapper[4776]: I1208 10:33:44.664902 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3d1b6d17-c87a-4518-ae10-0fd52d9a854e-must-gather-output\") pod \"must-gather-jk9pj\" (UID: \"3d1b6d17-c87a-4518-ae10-0fd52d9a854e\") " pod="openshift-must-gather-xxxxw/must-gather-jk9pj" Dec 08 10:33:44 crc kubenswrapper[4776]: I1208 10:33:44.665391 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3d1b6d17-c87a-4518-ae10-0fd52d9a854e-must-gather-output\") pod \"must-gather-jk9pj\" (UID: \"3d1b6d17-c87a-4518-ae10-0fd52d9a854e\") " pod="openshift-must-gather-xxxxw/must-gather-jk9pj" Dec 08 10:33:44 crc kubenswrapper[4776]: I1208 10:33:44.694689 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdt4r\" (UniqueName: \"kubernetes.io/projected/3d1b6d17-c87a-4518-ae10-0fd52d9a854e-kube-api-access-sdt4r\") pod \"must-gather-jk9pj\" (UID: \"3d1b6d17-c87a-4518-ae10-0fd52d9a854e\") " pod="openshift-must-gather-xxxxw/must-gather-jk9pj" Dec 08 10:33:44 crc kubenswrapper[4776]: I1208 
10:33:44.698923 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xxxxw/must-gather-jk9pj" Dec 08 10:33:45 crc kubenswrapper[4776]: I1208 10:33:45.213996 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xxxxw/must-gather-jk9pj"] Dec 08 10:33:46 crc kubenswrapper[4776]: I1208 10:33:46.019206 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xxxxw/must-gather-jk9pj" event={"ID":"3d1b6d17-c87a-4518-ae10-0fd52d9a854e","Type":"ContainerStarted","Data":"84b65c875b031a51cfb805663c4867bd59700c6abddb935fd202f5dc280d71dc"} Dec 08 10:33:53 crc kubenswrapper[4776]: I1208 10:33:53.199872 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xxxxw/must-gather-jk9pj" event={"ID":"3d1b6d17-c87a-4518-ae10-0fd52d9a854e","Type":"ContainerStarted","Data":"582b79ac9f273c883790a68c4f07210c8d49a6babb467e19e18665a96da4a485"} Dec 08 10:33:54 crc kubenswrapper[4776]: I1208 10:33:54.212625 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xxxxw/must-gather-jk9pj" event={"ID":"3d1b6d17-c87a-4518-ae10-0fd52d9a854e","Type":"ContainerStarted","Data":"8fa246deb9b833b81b62fc7546dc53e78644ee5e15d0c66b001103c9c32e939b"} Dec 08 10:33:54 crc kubenswrapper[4776]: I1208 10:33:54.238362 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xxxxw/must-gather-jk9pj" podStartSLOduration=2.631192152 podStartE2EDuration="10.238340555s" podCreationTimestamp="2025-12-08 10:33:44 +0000 UTC" firstStartedPulling="2025-12-08 10:33:45.220687342 +0000 UTC m=+5701.483912374" lastFinishedPulling="2025-12-08 10:33:52.827835755 +0000 UTC m=+5709.091060777" observedRunningTime="2025-12-08 10:33:54.227837768 +0000 UTC m=+5710.491062810" watchObservedRunningTime="2025-12-08 10:33:54.238340555 +0000 UTC m=+5710.501565577" Dec 08 10:33:58 crc kubenswrapper[4776]: I1208 10:33:58.076795 4776 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-must-gather-xxxxw/crc-debug-xvvv7"] Dec 08 10:33:58 crc kubenswrapper[4776]: I1208 10:33:58.078732 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xxxxw/crc-debug-xvvv7" Dec 08 10:33:58 crc kubenswrapper[4776]: I1208 10:33:58.125764 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg54x\" (UniqueName: \"kubernetes.io/projected/68425fcc-8fa9-4fae-ac94-97510b4a4947-kube-api-access-kg54x\") pod \"crc-debug-xvvv7\" (UID: \"68425fcc-8fa9-4fae-ac94-97510b4a4947\") " pod="openshift-must-gather-xxxxw/crc-debug-xvvv7" Dec 08 10:33:58 crc kubenswrapper[4776]: I1208 10:33:58.125965 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/68425fcc-8fa9-4fae-ac94-97510b4a4947-host\") pod \"crc-debug-xvvv7\" (UID: \"68425fcc-8fa9-4fae-ac94-97510b4a4947\") " pod="openshift-must-gather-xxxxw/crc-debug-xvvv7" Dec 08 10:33:58 crc kubenswrapper[4776]: I1208 10:33:58.228658 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kg54x\" (UniqueName: \"kubernetes.io/projected/68425fcc-8fa9-4fae-ac94-97510b4a4947-kube-api-access-kg54x\") pod \"crc-debug-xvvv7\" (UID: \"68425fcc-8fa9-4fae-ac94-97510b4a4947\") " pod="openshift-must-gather-xxxxw/crc-debug-xvvv7" Dec 08 10:33:58 crc kubenswrapper[4776]: I1208 10:33:58.228811 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/68425fcc-8fa9-4fae-ac94-97510b4a4947-host\") pod \"crc-debug-xvvv7\" (UID: \"68425fcc-8fa9-4fae-ac94-97510b4a4947\") " pod="openshift-must-gather-xxxxw/crc-debug-xvvv7" Dec 08 10:33:58 crc kubenswrapper[4776]: I1208 10:33:58.229311 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/68425fcc-8fa9-4fae-ac94-97510b4a4947-host\") pod \"crc-debug-xvvv7\" (UID: \"68425fcc-8fa9-4fae-ac94-97510b4a4947\") " pod="openshift-must-gather-xxxxw/crc-debug-xvvv7" Dec 08 10:33:58 crc kubenswrapper[4776]: I1208 10:33:58.256075 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kg54x\" (UniqueName: \"kubernetes.io/projected/68425fcc-8fa9-4fae-ac94-97510b4a4947-kube-api-access-kg54x\") pod \"crc-debug-xvvv7\" (UID: \"68425fcc-8fa9-4fae-ac94-97510b4a4947\") " pod="openshift-must-gather-xxxxw/crc-debug-xvvv7" Dec 08 10:33:58 crc kubenswrapper[4776]: I1208 10:33:58.400741 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xxxxw/crc-debug-xvvv7" Dec 08 10:33:59 crc kubenswrapper[4776]: I1208 10:33:59.266026 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xxxxw/crc-debug-xvvv7" event={"ID":"68425fcc-8fa9-4fae-ac94-97510b4a4947","Type":"ContainerStarted","Data":"950778be6fcdf0418a25f30fef672fd508a30e1f591f5b8204a007f8b0b31bfb"} Dec 08 10:34:10 crc kubenswrapper[4776]: I1208 10:34:10.395664 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xxxxw/crc-debug-xvvv7" event={"ID":"68425fcc-8fa9-4fae-ac94-97510b4a4947","Type":"ContainerStarted","Data":"48cd89e8bd75188efea96c620bb708bd06549285c9f5dd30eba16d2448de7387"} Dec 08 10:34:10 crc kubenswrapper[4776]: I1208 10:34:10.415014 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xxxxw/crc-debug-xvvv7" podStartSLOduration=1.500155084 podStartE2EDuration="12.414999289s" podCreationTimestamp="2025-12-08 10:33:58 +0000 UTC" firstStartedPulling="2025-12-08 10:33:58.4452693 +0000 UTC m=+5714.708494322" lastFinishedPulling="2025-12-08 10:34:09.360113505 +0000 UTC m=+5725.623338527" observedRunningTime="2025-12-08 10:34:10.413244802 +0000 UTC m=+5726.676469844" watchObservedRunningTime="2025-12-08 
10:34:10.414999289 +0000 UTC m=+5726.678224311" Dec 08 10:34:11 crc kubenswrapper[4776]: I1208 10:34:11.398492 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 10:34:11 crc kubenswrapper[4776]: I1208 10:34:11.399074 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 10:34:41 crc kubenswrapper[4776]: I1208 10:34:41.399191 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 10:34:41 crc kubenswrapper[4776]: I1208 10:34:41.399806 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 10:34:41 crc kubenswrapper[4776]: I1208 10:34:41.399852 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" Dec 08 10:34:41 crc kubenswrapper[4776]: I1208 10:34:41.400458 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"8c1c73971d87140e44cc9241fea6de13f5f6f0f3ea06fa74b8d3e01c0b4278b7"} pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 08 10:34:41 crc kubenswrapper[4776]: I1208 10:34:41.400511 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" containerID="cri-o://8c1c73971d87140e44cc9241fea6de13f5f6f0f3ea06fa74b8d3e01c0b4278b7" gracePeriod=600 Dec 08 10:34:41 crc kubenswrapper[4776]: E1208 10:34:41.530116 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:34:41 crc kubenswrapper[4776]: I1208 10:34:41.751402 4776 generic.go:334] "Generic (PLEG): container finished" podID="c9788ab1-1031-4103-a769-a4b3177c7268" containerID="8c1c73971d87140e44cc9241fea6de13f5f6f0f3ea06fa74b8d3e01c0b4278b7" exitCode=0 Dec 08 10:34:41 crc kubenswrapper[4776]: I1208 10:34:41.751442 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" event={"ID":"c9788ab1-1031-4103-a769-a4b3177c7268","Type":"ContainerDied","Data":"8c1c73971d87140e44cc9241fea6de13f5f6f0f3ea06fa74b8d3e01c0b4278b7"} Dec 08 10:34:41 crc kubenswrapper[4776]: I1208 10:34:41.751473 4776 scope.go:117] "RemoveContainer" containerID="64895e0778b6450b6c551f116e434d2513cfc42f72a2ea700de1a1fdd4e6f67b" Dec 08 10:34:41 crc kubenswrapper[4776]: I1208 10:34:41.752424 4776 
scope.go:117] "RemoveContainer" containerID="8c1c73971d87140e44cc9241fea6de13f5f6f0f3ea06fa74b8d3e01c0b4278b7" Dec 08 10:34:41 crc kubenswrapper[4776]: E1208 10:34:41.752701 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:34:56 crc kubenswrapper[4776]: I1208 10:34:56.344854 4776 scope.go:117] "RemoveContainer" containerID="8c1c73971d87140e44cc9241fea6de13f5f6f0f3ea06fa74b8d3e01c0b4278b7" Dec 08 10:34:56 crc kubenswrapper[4776]: E1208 10:34:56.345761 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:35:01 crc kubenswrapper[4776]: I1208 10:35:01.996568 4776 generic.go:334] "Generic (PLEG): container finished" podID="68425fcc-8fa9-4fae-ac94-97510b4a4947" containerID="48cd89e8bd75188efea96c620bb708bd06549285c9f5dd30eba16d2448de7387" exitCode=0 Dec 08 10:35:01 crc kubenswrapper[4776]: I1208 10:35:01.996659 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xxxxw/crc-debug-xvvv7" event={"ID":"68425fcc-8fa9-4fae-ac94-97510b4a4947","Type":"ContainerDied","Data":"48cd89e8bd75188efea96c620bb708bd06549285c9f5dd30eba16d2448de7387"} Dec 08 10:35:03 crc kubenswrapper[4776]: I1208 10:35:03.859619 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xxxxw/crc-debug-xvvv7" Dec 08 10:35:03 crc kubenswrapper[4776]: I1208 10:35:03.900805 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-xxxxw/crc-debug-xvvv7"] Dec 08 10:35:03 crc kubenswrapper[4776]: I1208 10:35:03.911198 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-xxxxw/crc-debug-xvvv7"] Dec 08 10:35:03 crc kubenswrapper[4776]: I1208 10:35:03.961364 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/68425fcc-8fa9-4fae-ac94-97510b4a4947-host\") pod \"68425fcc-8fa9-4fae-ac94-97510b4a4947\" (UID: \"68425fcc-8fa9-4fae-ac94-97510b4a4947\") " Dec 08 10:35:03 crc kubenswrapper[4776]: I1208 10:35:03.961563 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kg54x\" (UniqueName: \"kubernetes.io/projected/68425fcc-8fa9-4fae-ac94-97510b4a4947-kube-api-access-kg54x\") pod \"68425fcc-8fa9-4fae-ac94-97510b4a4947\" (UID: \"68425fcc-8fa9-4fae-ac94-97510b4a4947\") " Dec 08 10:35:03 crc kubenswrapper[4776]: I1208 10:35:03.961605 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/68425fcc-8fa9-4fae-ac94-97510b4a4947-host" (OuterVolumeSpecName: "host") pod "68425fcc-8fa9-4fae-ac94-97510b4a4947" (UID: "68425fcc-8fa9-4fae-ac94-97510b4a4947"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 10:35:03 crc kubenswrapper[4776]: I1208 10:35:03.962140 4776 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/68425fcc-8fa9-4fae-ac94-97510b4a4947-host\") on node \"crc\" DevicePath \"\"" Dec 08 10:35:03 crc kubenswrapper[4776]: I1208 10:35:03.968720 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68425fcc-8fa9-4fae-ac94-97510b4a4947-kube-api-access-kg54x" (OuterVolumeSpecName: "kube-api-access-kg54x") pod "68425fcc-8fa9-4fae-ac94-97510b4a4947" (UID: "68425fcc-8fa9-4fae-ac94-97510b4a4947"). InnerVolumeSpecName "kube-api-access-kg54x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 10:35:04 crc kubenswrapper[4776]: I1208 10:35:04.018194 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="950778be6fcdf0418a25f30fef672fd508a30e1f591f5b8204a007f8b0b31bfb" Dec 08 10:35:04 crc kubenswrapper[4776]: I1208 10:35:04.018246 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xxxxw/crc-debug-xvvv7" Dec 08 10:35:04 crc kubenswrapper[4776]: I1208 10:35:04.064601 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kg54x\" (UniqueName: \"kubernetes.io/projected/68425fcc-8fa9-4fae-ac94-97510b4a4947-kube-api-access-kg54x\") on node \"crc\" DevicePath \"\"" Dec 08 10:35:04 crc kubenswrapper[4776]: I1208 10:35:04.357012 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68425fcc-8fa9-4fae-ac94-97510b4a4947" path="/var/lib/kubelet/pods/68425fcc-8fa9-4fae-ac94-97510b4a4947/volumes" Dec 08 10:35:05 crc kubenswrapper[4776]: I1208 10:35:05.088428 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xxxxw/crc-debug-2kt49"] Dec 08 10:35:05 crc kubenswrapper[4776]: E1208 10:35:05.088930 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68425fcc-8fa9-4fae-ac94-97510b4a4947" containerName="container-00" Dec 08 10:35:05 crc kubenswrapper[4776]: I1208 10:35:05.088943 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="68425fcc-8fa9-4fae-ac94-97510b4a4947" containerName="container-00" Dec 08 10:35:05 crc kubenswrapper[4776]: I1208 10:35:05.089149 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="68425fcc-8fa9-4fae-ac94-97510b4a4947" containerName="container-00" Dec 08 10:35:05 crc kubenswrapper[4776]: I1208 10:35:05.089970 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xxxxw/crc-debug-2kt49" Dec 08 10:35:05 crc kubenswrapper[4776]: I1208 10:35:05.188063 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d88v2\" (UniqueName: \"kubernetes.io/projected/c5deeb12-87e2-4a15-9776-b34db2374298-kube-api-access-d88v2\") pod \"crc-debug-2kt49\" (UID: \"c5deeb12-87e2-4a15-9776-b34db2374298\") " pod="openshift-must-gather-xxxxw/crc-debug-2kt49" Dec 08 10:35:05 crc kubenswrapper[4776]: I1208 10:35:05.188590 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c5deeb12-87e2-4a15-9776-b34db2374298-host\") pod \"crc-debug-2kt49\" (UID: \"c5deeb12-87e2-4a15-9776-b34db2374298\") " pod="openshift-must-gather-xxxxw/crc-debug-2kt49" Dec 08 10:35:05 crc kubenswrapper[4776]: I1208 10:35:05.290482 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c5deeb12-87e2-4a15-9776-b34db2374298-host\") pod \"crc-debug-2kt49\" (UID: \"c5deeb12-87e2-4a15-9776-b34db2374298\") " pod="openshift-must-gather-xxxxw/crc-debug-2kt49" Dec 08 10:35:05 crc kubenswrapper[4776]: I1208 10:35:05.290598 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d88v2\" (UniqueName: \"kubernetes.io/projected/c5deeb12-87e2-4a15-9776-b34db2374298-kube-api-access-d88v2\") pod \"crc-debug-2kt49\" (UID: \"c5deeb12-87e2-4a15-9776-b34db2374298\") " pod="openshift-must-gather-xxxxw/crc-debug-2kt49" Dec 08 10:35:05 crc kubenswrapper[4776]: I1208 10:35:05.290864 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c5deeb12-87e2-4a15-9776-b34db2374298-host\") pod \"crc-debug-2kt49\" (UID: \"c5deeb12-87e2-4a15-9776-b34db2374298\") " pod="openshift-must-gather-xxxxw/crc-debug-2kt49" Dec 08 10:35:05 crc 
kubenswrapper[4776]: I1208 10:35:05.752712 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d88v2\" (UniqueName: \"kubernetes.io/projected/c5deeb12-87e2-4a15-9776-b34db2374298-kube-api-access-d88v2\") pod \"crc-debug-2kt49\" (UID: \"c5deeb12-87e2-4a15-9776-b34db2374298\") " pod="openshift-must-gather-xxxxw/crc-debug-2kt49" Dec 08 10:35:06 crc kubenswrapper[4776]: I1208 10:35:06.009910 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xxxxw/crc-debug-2kt49" Dec 08 10:35:07 crc kubenswrapper[4776]: I1208 10:35:07.048514 4776 generic.go:334] "Generic (PLEG): container finished" podID="c5deeb12-87e2-4a15-9776-b34db2374298" containerID="d2eff13474c0fdf262fd4d564cf6f37cd07e6b2f5bce2184a44f615010e88e0f" exitCode=0 Dec 08 10:35:07 crc kubenswrapper[4776]: I1208 10:35:07.048568 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xxxxw/crc-debug-2kt49" event={"ID":"c5deeb12-87e2-4a15-9776-b34db2374298","Type":"ContainerDied","Data":"d2eff13474c0fdf262fd4d564cf6f37cd07e6b2f5bce2184a44f615010e88e0f"} Dec 08 10:35:07 crc kubenswrapper[4776]: I1208 10:35:07.048601 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xxxxw/crc-debug-2kt49" event={"ID":"c5deeb12-87e2-4a15-9776-b34db2374298","Type":"ContainerStarted","Data":"c23a384e1984f18cb86ccd803958a9b4eb9fdc4ca4904aab00a8be2b8c7b80cd"} Dec 08 10:35:08 crc kubenswrapper[4776]: I1208 10:35:08.182563 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xxxxw/crc-debug-2kt49" Dec 08 10:35:08 crc kubenswrapper[4776]: I1208 10:35:08.259946 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c5deeb12-87e2-4a15-9776-b34db2374298-host\") pod \"c5deeb12-87e2-4a15-9776-b34db2374298\" (UID: \"c5deeb12-87e2-4a15-9776-b34db2374298\") " Dec 08 10:35:08 crc kubenswrapper[4776]: I1208 10:35:08.260066 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5deeb12-87e2-4a15-9776-b34db2374298-host" (OuterVolumeSpecName: "host") pod "c5deeb12-87e2-4a15-9776-b34db2374298" (UID: "c5deeb12-87e2-4a15-9776-b34db2374298"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 10:35:08 crc kubenswrapper[4776]: I1208 10:35:08.260156 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d88v2\" (UniqueName: \"kubernetes.io/projected/c5deeb12-87e2-4a15-9776-b34db2374298-kube-api-access-d88v2\") pod \"c5deeb12-87e2-4a15-9776-b34db2374298\" (UID: \"c5deeb12-87e2-4a15-9776-b34db2374298\") " Dec 08 10:35:08 crc kubenswrapper[4776]: I1208 10:35:08.260646 4776 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c5deeb12-87e2-4a15-9776-b34db2374298-host\") on node \"crc\" DevicePath \"\"" Dec 08 10:35:08 crc kubenswrapper[4776]: I1208 10:35:08.265387 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5deeb12-87e2-4a15-9776-b34db2374298-kube-api-access-d88v2" (OuterVolumeSpecName: "kube-api-access-d88v2") pod "c5deeb12-87e2-4a15-9776-b34db2374298" (UID: "c5deeb12-87e2-4a15-9776-b34db2374298"). InnerVolumeSpecName "kube-api-access-d88v2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 10:35:08 crc kubenswrapper[4776]: I1208 10:35:08.362214 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d88v2\" (UniqueName: \"kubernetes.io/projected/c5deeb12-87e2-4a15-9776-b34db2374298-kube-api-access-d88v2\") on node \"crc\" DevicePath \"\"" Dec 08 10:35:09 crc kubenswrapper[4776]: I1208 10:35:09.068215 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xxxxw/crc-debug-2kt49" event={"ID":"c5deeb12-87e2-4a15-9776-b34db2374298","Type":"ContainerDied","Data":"c23a384e1984f18cb86ccd803958a9b4eb9fdc4ca4904aab00a8be2b8c7b80cd"} Dec 08 10:35:09 crc kubenswrapper[4776]: I1208 10:35:09.068249 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xxxxw/crc-debug-2kt49" Dec 08 10:35:09 crc kubenswrapper[4776]: I1208 10:35:09.068260 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c23a384e1984f18cb86ccd803958a9b4eb9fdc4ca4904aab00a8be2b8c7b80cd" Dec 08 10:35:09 crc kubenswrapper[4776]: I1208 10:35:09.378452 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-xxxxw/crc-debug-2kt49"] Dec 08 10:35:09 crc kubenswrapper[4776]: I1208 10:35:09.389672 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-xxxxw/crc-debug-2kt49"] Dec 08 10:35:10 crc kubenswrapper[4776]: I1208 10:35:10.360102 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5deeb12-87e2-4a15-9776-b34db2374298" path="/var/lib/kubelet/pods/c5deeb12-87e2-4a15-9776-b34db2374298/volumes" Dec 08 10:35:10 crc kubenswrapper[4776]: I1208 10:35:10.560202 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xxxxw/crc-debug-4ktgw"] Dec 08 10:35:10 crc kubenswrapper[4776]: E1208 10:35:10.560669 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5deeb12-87e2-4a15-9776-b34db2374298" 
containerName="container-00" Dec 08 10:35:10 crc kubenswrapper[4776]: I1208 10:35:10.560682 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5deeb12-87e2-4a15-9776-b34db2374298" containerName="container-00" Dec 08 10:35:10 crc kubenswrapper[4776]: I1208 10:35:10.560951 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5deeb12-87e2-4a15-9776-b34db2374298" containerName="container-00" Dec 08 10:35:10 crc kubenswrapper[4776]: I1208 10:35:10.561812 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xxxxw/crc-debug-4ktgw" Dec 08 10:35:10 crc kubenswrapper[4776]: I1208 10:35:10.614424 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4vm9\" (UniqueName: \"kubernetes.io/projected/bf3587bd-3efc-4de5-9d37-ca0fdab10e13-kube-api-access-s4vm9\") pod \"crc-debug-4ktgw\" (UID: \"bf3587bd-3efc-4de5-9d37-ca0fdab10e13\") " pod="openshift-must-gather-xxxxw/crc-debug-4ktgw" Dec 08 10:35:10 crc kubenswrapper[4776]: I1208 10:35:10.615249 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bf3587bd-3efc-4de5-9d37-ca0fdab10e13-host\") pod \"crc-debug-4ktgw\" (UID: \"bf3587bd-3efc-4de5-9d37-ca0fdab10e13\") " pod="openshift-must-gather-xxxxw/crc-debug-4ktgw" Dec 08 10:35:10 crc kubenswrapper[4776]: I1208 10:35:10.718611 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bf3587bd-3efc-4de5-9d37-ca0fdab10e13-host\") pod \"crc-debug-4ktgw\" (UID: \"bf3587bd-3efc-4de5-9d37-ca0fdab10e13\") " pod="openshift-must-gather-xxxxw/crc-debug-4ktgw" Dec 08 10:35:10 crc kubenswrapper[4776]: I1208 10:35:10.718748 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bf3587bd-3efc-4de5-9d37-ca0fdab10e13-host\") pod 
\"crc-debug-4ktgw\" (UID: \"bf3587bd-3efc-4de5-9d37-ca0fdab10e13\") " pod="openshift-must-gather-xxxxw/crc-debug-4ktgw" Dec 08 10:35:10 crc kubenswrapper[4776]: I1208 10:35:10.718945 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4vm9\" (UniqueName: \"kubernetes.io/projected/bf3587bd-3efc-4de5-9d37-ca0fdab10e13-kube-api-access-s4vm9\") pod \"crc-debug-4ktgw\" (UID: \"bf3587bd-3efc-4de5-9d37-ca0fdab10e13\") " pod="openshift-must-gather-xxxxw/crc-debug-4ktgw" Dec 08 10:35:10 crc kubenswrapper[4776]: I1208 10:35:10.735798 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4vm9\" (UniqueName: \"kubernetes.io/projected/bf3587bd-3efc-4de5-9d37-ca0fdab10e13-kube-api-access-s4vm9\") pod \"crc-debug-4ktgw\" (UID: \"bf3587bd-3efc-4de5-9d37-ca0fdab10e13\") " pod="openshift-must-gather-xxxxw/crc-debug-4ktgw" Dec 08 10:35:10 crc kubenswrapper[4776]: I1208 10:35:10.881974 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xxxxw/crc-debug-4ktgw" Dec 08 10:35:10 crc kubenswrapper[4776]: W1208 10:35:10.912640 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf3587bd_3efc_4de5_9d37_ca0fdab10e13.slice/crio-ed3e44b9fb7dc3cd3577ed1608aa1c0ae6eacda1632494dcf5a7e534e8dd9467 WatchSource:0}: Error finding container ed3e44b9fb7dc3cd3577ed1608aa1c0ae6eacda1632494dcf5a7e534e8dd9467: Status 404 returned error can't find the container with id ed3e44b9fb7dc3cd3577ed1608aa1c0ae6eacda1632494dcf5a7e534e8dd9467 Dec 08 10:35:11 crc kubenswrapper[4776]: I1208 10:35:11.087999 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xxxxw/crc-debug-4ktgw" event={"ID":"bf3587bd-3efc-4de5-9d37-ca0fdab10e13","Type":"ContainerStarted","Data":"ed3e44b9fb7dc3cd3577ed1608aa1c0ae6eacda1632494dcf5a7e534e8dd9467"} Dec 08 10:35:11 crc kubenswrapper[4776]: I1208 10:35:11.343972 4776 scope.go:117] "RemoveContainer" containerID="8c1c73971d87140e44cc9241fea6de13f5f6f0f3ea06fa74b8d3e01c0b4278b7" Dec 08 10:35:11 crc kubenswrapper[4776]: E1208 10:35:11.344448 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:35:12 crc kubenswrapper[4776]: I1208 10:35:12.099980 4776 generic.go:334] "Generic (PLEG): container finished" podID="bf3587bd-3efc-4de5-9d37-ca0fdab10e13" containerID="f83e83330e111731d5950297c7e4cb6bd1edfd224d466b1a097596e98e40035b" exitCode=0 Dec 08 10:35:12 crc kubenswrapper[4776]: I1208 10:35:12.100094 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-xxxxw/crc-debug-4ktgw" event={"ID":"bf3587bd-3efc-4de5-9d37-ca0fdab10e13","Type":"ContainerDied","Data":"f83e83330e111731d5950297c7e4cb6bd1edfd224d466b1a097596e98e40035b"} Dec 08 10:35:12 crc kubenswrapper[4776]: I1208 10:35:12.142068 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-xxxxw/crc-debug-4ktgw"] Dec 08 10:35:12 crc kubenswrapper[4776]: I1208 10:35:12.152543 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-xxxxw/crc-debug-4ktgw"] Dec 08 10:35:13 crc kubenswrapper[4776]: I1208 10:35:13.268533 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xxxxw/crc-debug-4ktgw" Dec 08 10:35:13 crc kubenswrapper[4776]: I1208 10:35:13.390899 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bf3587bd-3efc-4de5-9d37-ca0fdab10e13-host\") pod \"bf3587bd-3efc-4de5-9d37-ca0fdab10e13\" (UID: \"bf3587bd-3efc-4de5-9d37-ca0fdab10e13\") " Dec 08 10:35:13 crc kubenswrapper[4776]: I1208 10:35:13.391048 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bf3587bd-3efc-4de5-9d37-ca0fdab10e13-host" (OuterVolumeSpecName: "host") pod "bf3587bd-3efc-4de5-9d37-ca0fdab10e13" (UID: "bf3587bd-3efc-4de5-9d37-ca0fdab10e13"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 10:35:13 crc kubenswrapper[4776]: I1208 10:35:13.391327 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4vm9\" (UniqueName: \"kubernetes.io/projected/bf3587bd-3efc-4de5-9d37-ca0fdab10e13-kube-api-access-s4vm9\") pod \"bf3587bd-3efc-4de5-9d37-ca0fdab10e13\" (UID: \"bf3587bd-3efc-4de5-9d37-ca0fdab10e13\") " Dec 08 10:35:13 crc kubenswrapper[4776]: I1208 10:35:13.392515 4776 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bf3587bd-3efc-4de5-9d37-ca0fdab10e13-host\") on node \"crc\" DevicePath \"\"" Dec 08 10:35:13 crc kubenswrapper[4776]: I1208 10:35:13.397682 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf3587bd-3efc-4de5-9d37-ca0fdab10e13-kube-api-access-s4vm9" (OuterVolumeSpecName: "kube-api-access-s4vm9") pod "bf3587bd-3efc-4de5-9d37-ca0fdab10e13" (UID: "bf3587bd-3efc-4de5-9d37-ca0fdab10e13"). InnerVolumeSpecName "kube-api-access-s4vm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 10:35:13 crc kubenswrapper[4776]: I1208 10:35:13.494617 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4vm9\" (UniqueName: \"kubernetes.io/projected/bf3587bd-3efc-4de5-9d37-ca0fdab10e13-kube-api-access-s4vm9\") on node \"crc\" DevicePath \"\"" Dec 08 10:35:14 crc kubenswrapper[4776]: I1208 10:35:14.132550 4776 scope.go:117] "RemoveContainer" containerID="f83e83330e111731d5950297c7e4cb6bd1edfd224d466b1a097596e98e40035b" Dec 08 10:35:14 crc kubenswrapper[4776]: I1208 10:35:14.132597 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xxxxw/crc-debug-4ktgw" Dec 08 10:35:14 crc kubenswrapper[4776]: I1208 10:35:14.357107 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf3587bd-3efc-4de5-9d37-ca0fdab10e13" path="/var/lib/kubelet/pods/bf3587bd-3efc-4de5-9d37-ca0fdab10e13/volumes" Dec 08 10:35:24 crc kubenswrapper[4776]: I1208 10:35:24.352534 4776 scope.go:117] "RemoveContainer" containerID="8c1c73971d87140e44cc9241fea6de13f5f6f0f3ea06fa74b8d3e01c0b4278b7" Dec 08 10:35:24 crc kubenswrapper[4776]: E1208 10:35:24.353701 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:35:37 crc kubenswrapper[4776]: I1208 10:35:37.343514 4776 scope.go:117] "RemoveContainer" containerID="8c1c73971d87140e44cc9241fea6de13f5f6f0f3ea06fa74b8d3e01c0b4278b7" Dec 08 10:35:37 crc kubenswrapper[4776]: E1208 10:35:37.344137 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:35:40 crc kubenswrapper[4776]: I1208 10:35:40.328667 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_a024b29b-cad1-489c-88ea-efc9558b2da0/aodh-api/0.log" Dec 08 10:35:40 crc kubenswrapper[4776]: I1208 10:35:40.415508 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_aodh-0_a024b29b-cad1-489c-88ea-efc9558b2da0/aodh-evaluator/0.log" Dec 08 10:35:40 crc kubenswrapper[4776]: I1208 10:35:40.774753 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5d7dd8bd8b-9z2p2_71c29885-fdf1-4500-bee7-2b4102fb2c7e/barbican-api/0.log" Dec 08 10:35:40 crc kubenswrapper[4776]: I1208 10:35:40.817704 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_a024b29b-cad1-489c-88ea-efc9558b2da0/aodh-listener/0.log" Dec 08 10:35:40 crc kubenswrapper[4776]: I1208 10:35:40.820925 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_a024b29b-cad1-489c-88ea-efc9558b2da0/aodh-notifier/0.log" Dec 08 10:35:41 crc kubenswrapper[4776]: I1208 10:35:41.061839 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5d7dd8bd8b-9z2p2_71c29885-fdf1-4500-bee7-2b4102fb2c7e/barbican-api-log/0.log" Dec 08 10:35:41 crc kubenswrapper[4776]: I1208 10:35:41.068660 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-78d78f694b-ck9wf_e501058f-25e0-456c-b23d-c7caafa729c3/barbican-keystone-listener/0.log" Dec 08 10:35:41 crc kubenswrapper[4776]: I1208 10:35:41.304511 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-78d78f694b-ck9wf_e501058f-25e0-456c-b23d-c7caafa729c3/barbican-keystone-listener-log/0.log" Dec 08 10:35:41 crc kubenswrapper[4776]: I1208 10:35:41.327984 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-664c575c59-ncvpr_d6258a3d-a50e-4cf4-af4d-e6f588d8744a/barbican-worker/0.log" Dec 08 10:35:41 crc kubenswrapper[4776]: I1208 10:35:41.366680 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-664c575c59-ncvpr_d6258a3d-a50e-4cf4-af4d-e6f588d8744a/barbican-worker-log/0.log" Dec 08 10:35:41 crc kubenswrapper[4776]: I1208 10:35:41.542274 4776 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-mncj4_2304e249-86bc-4b0a-a222-e8c2ba39a0bb/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 10:35:41 crc kubenswrapper[4776]: I1208 10:35:41.724264 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e7cf1c3e-6789-4ccd-894c-946f056f2d96/ceilometer-central-agent/0.log" Dec 08 10:35:41 crc kubenswrapper[4776]: I1208 10:35:41.820759 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e7cf1c3e-6789-4ccd-894c-946f056f2d96/ceilometer-notification-agent/0.log" Dec 08 10:35:41 crc kubenswrapper[4776]: I1208 10:35:41.890132 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e7cf1c3e-6789-4ccd-894c-946f056f2d96/proxy-httpd/0.log" Dec 08 10:35:41 crc kubenswrapper[4776]: I1208 10:35:41.959425 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e7cf1c3e-6789-4ccd-894c-946f056f2d96/sg-core/0.log" Dec 08 10:35:42 crc kubenswrapper[4776]: I1208 10:35:42.115265 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_dcb1d701-bc05-4d4b-8794-ebc4af6da8ba/cinder-api-log/0.log" Dec 08 10:35:42 crc kubenswrapper[4776]: I1208 10:35:42.135348 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_dcb1d701-bc05-4d4b-8794-ebc4af6da8ba/cinder-api/0.log" Dec 08 10:35:42 crc kubenswrapper[4776]: I1208 10:35:42.303118 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_289a9d84-e76a-42e5-9524-7e9b244b8743/cinder-scheduler/0.log" Dec 08 10:35:42 crc kubenswrapper[4776]: I1208 10:35:42.356762 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_289a9d84-e76a-42e5-9524-7e9b244b8743/probe/0.log" Dec 08 10:35:42 crc kubenswrapper[4776]: I1208 10:35:42.518335 4776 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-kzwn2_6367602c-669d-474f-bd56-97c1b58659b4/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 10:35:42 crc kubenswrapper[4776]: I1208 10:35:42.668471 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-dhvs7_c92123e3-056d-4e4f-83b1-3cf335342a70/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 10:35:42 crc kubenswrapper[4776]: I1208 10:35:42.749273 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-zwnwt_569f45d2-4634-4246-873e-939ec98a0baf/init/0.log" Dec 08 10:35:42 crc kubenswrapper[4776]: I1208 10:35:42.912898 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-zwnwt_569f45d2-4634-4246-873e-939ec98a0baf/init/0.log" Dec 08 10:35:42 crc kubenswrapper[4776]: I1208 10:35:42.980490 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-zwnwt_569f45d2-4634-4246-873e-939ec98a0baf/dnsmasq-dns/0.log" Dec 08 10:35:43 crc kubenswrapper[4776]: I1208 10:35:43.037674 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-75cmb_7af7dfaf-3db0-4c5d-b7fc-671893276afc/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 10:35:43 crc kubenswrapper[4776]: I1208 10:35:43.208602 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_8f9758b1-4ae1-47ae-8a45-14b0df4c8632/glance-httpd/0.log" Dec 08 10:35:43 crc kubenswrapper[4776]: I1208 10:35:43.246944 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_8f9758b1-4ae1-47ae-8a45-14b0df4c8632/glance-log/0.log" Dec 08 10:35:43 crc kubenswrapper[4776]: I1208 10:35:43.434813 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_28ffab6e-5596-4c63-b58a-4417489fc47b/glance-httpd/0.log" Dec 08 10:35:43 crc kubenswrapper[4776]: I1208 10:35:43.462010 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_28ffab6e-5596-4c63-b58a-4417489fc47b/glance-log/0.log" Dec 08 10:35:43 crc kubenswrapper[4776]: I1208 10:35:43.937602 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-589b85487f-7v8kk_5c891ff5-fbcf-46b6-bace-6ef62df3c0b9/heat-engine/0.log" Dec 08 10:35:44 crc kubenswrapper[4776]: I1208 10:35:44.178560 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml_7527bd54-54ba-42e5-9ec0-7037536864b9/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 10:35:44 crc kubenswrapper[4776]: I1208 10:35:44.222971 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-55fd4bf697-njsxk_b7f47153-4e65-48b9-816d-4c83b0b0d8a4/heat-api/0.log" Dec 08 10:35:44 crc kubenswrapper[4776]: I1208 10:35:44.240909 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-tknt7_f67c7d60-bc4d-4712-a8d9-acb48e097264/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 10:35:44 crc kubenswrapper[4776]: I1208 10:35:44.393004 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-5fdf94c698-qz6j8_5f5f6ae8-0ed6-49ee-afe5-ab6fd356b4e4/heat-cfnapi/0.log" Dec 08 10:35:44 crc kubenswrapper[4776]: I1208 10:35:44.504078 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29419801-s7tdk_53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e/keystone-cron/0.log" Dec 08 10:35:44 crc kubenswrapper[4776]: I1208 10:35:44.679264 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_kube-state-metrics-0_90e6d887-db3e-40c6-9411-0e2565e5994d/kube-state-metrics/0.log" Dec 08 10:35:44 crc kubenswrapper[4776]: I1208 10:35:44.836083 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-28blw_3933dc31-4df5-46ec-8fe0-62b9771c5515/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 10:35:44 crc kubenswrapper[4776]: I1208 10:35:44.902952 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-77496dd4f7-8gxmg_6207b5a9-d7b8-4302-876c-c2a84bb352a1/keystone-api/0.log" Dec 08 10:35:44 crc kubenswrapper[4776]: I1208 10:35:44.941798 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_logging-edpm-deployment-openstack-edpm-ipam-7kf4c_4e957285-89ac-4a08-a5f9-a3199e19b787/logging-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 10:35:45 crc kubenswrapper[4776]: I1208 10:35:45.393686 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mysqld-exporter-0_4cce6b19-9d40-4957-8154-b4d3a50fe2f7/mysqld-exporter/0.log" Dec 08 10:35:45 crc kubenswrapper[4776]: I1208 10:35:45.758094 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-d54mt_30f7ff02-8887-44e7-a223-335cd93255ef/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 10:35:45 crc kubenswrapper[4776]: I1208 10:35:45.803024 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5fd69d7-r446k_f84e2e46-bb9f-4b55-afd1-683f365c5417/neutron-httpd/0.log" Dec 08 10:35:45 crc kubenswrapper[4776]: I1208 10:35:45.866712 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5fd69d7-r446k_f84e2e46-bb9f-4b55-afd1-683f365c5417/neutron-api/0.log" Dec 08 10:35:46 crc kubenswrapper[4776]: I1208 10:35:46.360274 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell0-conductor-0_ffbcb3b3-c4d6-461f-bae8-c1ae2de20050/nova-cell0-conductor-conductor/0.log" Dec 08 10:35:46 crc kubenswrapper[4776]: I1208 10:35:46.623845 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_56c71de4-c00f-47d6-87d7-c5eb97b88eef/nova-api-log/0.log" Dec 08 10:35:46 crc kubenswrapper[4776]: I1208 10:35:46.677379 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_9791ac59-89ef-4429-b797-d89d7ce62024/nova-cell1-conductor-conductor/0.log" Dec 08 10:35:46 crc kubenswrapper[4776]: I1208 10:35:46.957718 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_f485895b-f2aa-427f-b592-811f09089a49/nova-cell1-novncproxy-novncproxy/0.log" Dec 08 10:35:47 crc kubenswrapper[4776]: I1208 10:35:47.011986 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-tnlnp_9cd841cc-611f-406b-b9d5-8c242c1321ba/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 10:35:47 crc kubenswrapper[4776]: I1208 10:35:47.062711 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_56c71de4-c00f-47d6-87d7-c5eb97b88eef/nova-api-api/0.log" Dec 08 10:35:47 crc kubenswrapper[4776]: I1208 10:35:47.279462 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_e6c2fb50-f70b-43cc-a493-b4ffa4292c64/nova-metadata-log/0.log" Dec 08 10:35:47 crc kubenswrapper[4776]: I1208 10:35:47.536082 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_e7651697-0db7-476f-8b50-1f04771b4ed2/nova-scheduler-scheduler/0.log" Dec 08 10:35:47 crc kubenswrapper[4776]: I1208 10:35:47.618054 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_425d947a-2a85-4a03-853f-a60f54515a57/mysql-bootstrap/0.log" Dec 08 10:35:47 crc kubenswrapper[4776]: I1208 
10:35:47.783462 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_425d947a-2a85-4a03-853f-a60f54515a57/mysql-bootstrap/0.log" Dec 08 10:35:47 crc kubenswrapper[4776]: I1208 10:35:47.866745 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_425d947a-2a85-4a03-853f-a60f54515a57/galera/0.log" Dec 08 10:35:48 crc kubenswrapper[4776]: I1208 10:35:48.008860 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7df4120e-0e93-4000-8b6a-7823f3e89dac/mysql-bootstrap/0.log" Dec 08 10:35:48 crc kubenswrapper[4776]: I1208 10:35:48.234518 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7df4120e-0e93-4000-8b6a-7823f3e89dac/mysql-bootstrap/0.log" Dec 08 10:35:48 crc kubenswrapper[4776]: I1208 10:35:48.242854 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7df4120e-0e93-4000-8b6a-7823f3e89dac/galera/0.log" Dec 08 10:35:48 crc kubenswrapper[4776]: I1208 10:35:48.441738 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_8606b034-7364-4dce-bea0-7c0e2067ee95/openstackclient/0.log" Dec 08 10:35:48 crc kubenswrapper[4776]: I1208 10:35:48.503137 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-zn4qk_e843ce72-b4b1-4603-8876-05dc121793ed/openstack-network-exporter/0.log" Dec 08 10:35:48 crc kubenswrapper[4776]: I1208 10:35:48.744839 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5tfbk_215a9444-a545-491d-9eb6-02d98baff784/ovsdb-server-init/0.log" Dec 08 10:35:48 crc kubenswrapper[4776]: I1208 10:35:48.872637 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5tfbk_215a9444-a545-491d-9eb6-02d98baff784/ovsdb-server-init/0.log" Dec 08 10:35:48 crc kubenswrapper[4776]: I1208 10:35:48.956541 4776 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5tfbk_215a9444-a545-491d-9eb6-02d98baff784/ovs-vswitchd/0.log" Dec 08 10:35:48 crc kubenswrapper[4776]: I1208 10:35:48.973494 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5tfbk_215a9444-a545-491d-9eb6-02d98baff784/ovsdb-server/0.log" Dec 08 10:35:49 crc kubenswrapper[4776]: I1208 10:35:49.246368 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-wpgmk_9a9a1b68-ec7e-4994-9bda-fd418747dbc5/ovn-controller/0.log" Dec 08 10:35:49 crc kubenswrapper[4776]: I1208 10:35:49.348879 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_e6c2fb50-f70b-43cc-a493-b4ffa4292c64/nova-metadata-metadata/0.log" Dec 08 10:35:49 crc kubenswrapper[4776]: I1208 10:35:49.419369 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-k4c4r_abe6fd93-f916-47f2-854e-fa4d908fa9ad/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 10:35:49 crc kubenswrapper[4776]: I1208 10:35:49.559403 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_96dd2435-6c8f-4ac2-9b72-43f82d2eeb52/ovn-northd/0.log" Dec 08 10:35:49 crc kubenswrapper[4776]: I1208 10:35:49.572300 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_96dd2435-6c8f-4ac2-9b72-43f82d2eeb52/openstack-network-exporter/0.log" Dec 08 10:35:49 crc kubenswrapper[4776]: I1208 10:35:49.797496 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_0e4de746-d269-470c-b934-117aa4c73834/ovsdbserver-nb/0.log" Dec 08 10:35:49 crc kubenswrapper[4776]: I1208 10:35:49.816727 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_0e4de746-d269-470c-b934-117aa4c73834/openstack-network-exporter/0.log" Dec 08 10:35:50 crc kubenswrapper[4776]: I1208 
10:35:50.162790 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_3d941bbc-2271-4ec4-853f-57feaf6ace36/openstack-network-exporter/0.log" Dec 08 10:35:50 crc kubenswrapper[4776]: I1208 10:35:50.282089 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_3d941bbc-2271-4ec4-853f-57feaf6ace36/ovsdbserver-sb/0.log" Dec 08 10:35:50 crc kubenswrapper[4776]: I1208 10:35:50.378770 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-75876fb99b-xnbd7_ae330c18-0140-4bc4-8503-cf6c3bbce3d8/placement-api/0.log" Dec 08 10:35:50 crc kubenswrapper[4776]: I1208 10:35:50.544520 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-75876fb99b-xnbd7_ae330c18-0140-4bc4-8503-cf6c3bbce3d8/placement-log/0.log" Dec 08 10:35:50 crc kubenswrapper[4776]: I1208 10:35:50.615447 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_95be142a-2a8f-4f5c-97e0-2e64e108fb8b/init-config-reloader/0.log" Dec 08 10:35:50 crc kubenswrapper[4776]: I1208 10:35:50.741659 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_95be142a-2a8f-4f5c-97e0-2e64e108fb8b/init-config-reloader/0.log" Dec 08 10:35:50 crc kubenswrapper[4776]: I1208 10:35:50.748123 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_95be142a-2a8f-4f5c-97e0-2e64e108fb8b/config-reloader/0.log" Dec 08 10:35:50 crc kubenswrapper[4776]: I1208 10:35:50.826499 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_95be142a-2a8f-4f5c-97e0-2e64e108fb8b/thanos-sidecar/0.log" Dec 08 10:35:50 crc kubenswrapper[4776]: I1208 10:35:50.890232 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_95be142a-2a8f-4f5c-97e0-2e64e108fb8b/prometheus/0.log" Dec 08 10:35:50 crc 
kubenswrapper[4776]: I1208 10:35:50.986890 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_87931091-7230-4451-9d94-20ac4b8458bc/setup-container/0.log" Dec 08 10:35:51 crc kubenswrapper[4776]: I1208 10:35:51.273439 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_87931091-7230-4451-9d94-20ac4b8458bc/setup-container/0.log" Dec 08 10:35:51 crc kubenswrapper[4776]: I1208 10:35:51.288664 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_87931091-7230-4451-9d94-20ac4b8458bc/rabbitmq/0.log" Dec 08 10:35:51 crc kubenswrapper[4776]: I1208 10:35:51.343842 4776 scope.go:117] "RemoveContainer" containerID="8c1c73971d87140e44cc9241fea6de13f5f6f0f3ea06fa74b8d3e01c0b4278b7" Dec 08 10:35:51 crc kubenswrapper[4776]: E1208 10:35:51.344261 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:35:51 crc kubenswrapper[4776]: I1208 10:35:51.682730 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ab6303ff-9104-40ed-babe-1445f4cd89e2/setup-container/0.log" Dec 08 10:35:51 crc kubenswrapper[4776]: I1208 10:35:51.841234 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ab6303ff-9104-40ed-babe-1445f4cd89e2/setup-container/0.log" Dec 08 10:35:51 crc kubenswrapper[4776]: I1208 10:35:51.877207 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ab6303ff-9104-40ed-babe-1445f4cd89e2/rabbitmq/0.log" Dec 08 10:35:52 crc kubenswrapper[4776]: I1208 
10:35:52.015029 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-mkk2n_26d6a987-fa87-4870-97f8-30aa5b38b753/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 10:35:52 crc kubenswrapper[4776]: I1208 10:35:52.042057 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-xbms7_9419c01b-956b-4781-a8bf-e2e1472ad2cf/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 10:35:52 crc kubenswrapper[4776]: I1208 10:35:52.251636 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-97xln_31f822a4-fa31-4cae-b24f-a1c1395caf05/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 10:35:52 crc kubenswrapper[4776]: I1208 10:35:52.269382 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-h9hfq_8b119f36-1ae0-4826-8043-4e038e4398a3/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 10:35:52 crc kubenswrapper[4776]: I1208 10:35:52.532072 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-cxgqb_60899add-1d95-4fad-8cee-852951046a90/ssh-known-hosts-edpm-deployment/0.log" Dec 08 10:35:52 crc kubenswrapper[4776]: I1208 10:35:52.722265 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5cfbc4c5f-hhnf9_aa5389fb-4ae8-45b1-baaf-18f2fea3f61c/proxy-httpd/0.log" Dec 08 10:35:52 crc kubenswrapper[4776]: I1208 10:35:52.751481 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5cfbc4c5f-hhnf9_aa5389fb-4ae8-45b1-baaf-18f2fea3f61c/proxy-server/0.log" Dec 08 10:35:52 crc kubenswrapper[4776]: I1208 10:35:52.753623 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-mmp8z_0436afba-d4b2-47d8-ac4d-c621e029333d/swift-ring-rebalance/0.log" Dec 08 10:35:52 
crc kubenswrapper[4776]: I1208 10:35:52.917340 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cb640491-a8e7-4f8d-b4bb-1d0124f5727f/account-auditor/0.log" Dec 08 10:35:52 crc kubenswrapper[4776]: I1208 10:35:52.979733 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cb640491-a8e7-4f8d-b4bb-1d0124f5727f/account-reaper/0.log" Dec 08 10:35:53 crc kubenswrapper[4776]: I1208 10:35:53.077279 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cb640491-a8e7-4f8d-b4bb-1d0124f5727f/account-replicator/0.log" Dec 08 10:35:53 crc kubenswrapper[4776]: I1208 10:35:53.152012 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cb640491-a8e7-4f8d-b4bb-1d0124f5727f/container-auditor/0.log" Dec 08 10:35:53 crc kubenswrapper[4776]: I1208 10:35:53.190282 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cb640491-a8e7-4f8d-b4bb-1d0124f5727f/account-server/0.log" Dec 08 10:35:53 crc kubenswrapper[4776]: I1208 10:35:53.303791 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cb640491-a8e7-4f8d-b4bb-1d0124f5727f/container-server/0.log" Dec 08 10:35:53 crc kubenswrapper[4776]: I1208 10:35:53.365443 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cb640491-a8e7-4f8d-b4bb-1d0124f5727f/container-replicator/0.log" Dec 08 10:35:53 crc kubenswrapper[4776]: I1208 10:35:53.413258 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cb640491-a8e7-4f8d-b4bb-1d0124f5727f/container-updater/0.log" Dec 08 10:35:53 crc kubenswrapper[4776]: I1208 10:35:53.473771 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cb640491-a8e7-4f8d-b4bb-1d0124f5727f/object-auditor/0.log" Dec 08 10:35:53 crc kubenswrapper[4776]: I1208 10:35:53.548270 4776 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_swift-storage-0_cb640491-a8e7-4f8d-b4bb-1d0124f5727f/object-expirer/0.log" Dec 08 10:35:53 crc kubenswrapper[4776]: I1208 10:35:53.652189 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cb640491-a8e7-4f8d-b4bb-1d0124f5727f/object-server/0.log" Dec 08 10:35:53 crc kubenswrapper[4776]: I1208 10:35:53.663146 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cb640491-a8e7-4f8d-b4bb-1d0124f5727f/object-replicator/0.log" Dec 08 10:35:53 crc kubenswrapper[4776]: I1208 10:35:53.718031 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cb640491-a8e7-4f8d-b4bb-1d0124f5727f/object-updater/0.log" Dec 08 10:35:53 crc kubenswrapper[4776]: I1208 10:35:53.786536 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cb640491-a8e7-4f8d-b4bb-1d0124f5727f/rsync/0.log" Dec 08 10:35:53 crc kubenswrapper[4776]: I1208 10:35:53.926363 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cb640491-a8e7-4f8d-b4bb-1d0124f5727f/swift-recon-cron/0.log" Dec 08 10:35:53 crc kubenswrapper[4776]: I1208 10:35:53.982323 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-89ghm_b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 10:35:54 crc kubenswrapper[4776]: I1208 10:35:54.216373 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp_18a0027c-b2f9-4c57-9f94-30b31659d298/telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 10:35:54 crc kubenswrapper[4776]: I1208 10:35:54.566143 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_dcfe5c37-ca0e-44d6-9051-bdf107f11cdb/test-operator-logs-container/0.log" Dec 08 10:35:54 crc kubenswrapper[4776]: I1208 10:35:54.831362 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-js6sf_d8284a3c-c72c-41f5-aefe-bbc881bf969b/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 10:35:55 crc kubenswrapper[4776]: I1208 10:35:55.095347 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_9c3d4f25-4353-4b82-8de9-ee14a2f05076/tempest-tests-tempest-tests-runner/0.log" Dec 08 10:35:56 crc kubenswrapper[4776]: I1208 10:35:56.438550 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gvx6k"] Dec 08 10:35:56 crc kubenswrapper[4776]: E1208 10:35:56.439369 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf3587bd-3efc-4de5-9d37-ca0fdab10e13" containerName="container-00" Dec 08 10:35:56 crc kubenswrapper[4776]: I1208 10:35:56.439382 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf3587bd-3efc-4de5-9d37-ca0fdab10e13" containerName="container-00" Dec 08 10:35:56 crc kubenswrapper[4776]: I1208 10:35:56.439659 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf3587bd-3efc-4de5-9d37-ca0fdab10e13" containerName="container-00" Dec 08 10:35:56 crc kubenswrapper[4776]: I1208 10:35:56.441428 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gvx6k" Dec 08 10:35:56 crc kubenswrapper[4776]: I1208 10:35:56.491636 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gvx6k"] Dec 08 10:35:56 crc kubenswrapper[4776]: I1208 10:35:56.540299 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/636c3ebf-3c4e-47c3-b02c-663079e58d09-catalog-content\") pod \"redhat-marketplace-gvx6k\" (UID: \"636c3ebf-3c4e-47c3-b02c-663079e58d09\") " pod="openshift-marketplace/redhat-marketplace-gvx6k" Dec 08 10:35:56 crc kubenswrapper[4776]: I1208 10:35:56.540342 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/636c3ebf-3c4e-47c3-b02c-663079e58d09-utilities\") pod \"redhat-marketplace-gvx6k\" (UID: \"636c3ebf-3c4e-47c3-b02c-663079e58d09\") " pod="openshift-marketplace/redhat-marketplace-gvx6k" Dec 08 10:35:56 crc kubenswrapper[4776]: I1208 10:35:56.540376 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t6qx\" (UniqueName: \"kubernetes.io/projected/636c3ebf-3c4e-47c3-b02c-663079e58d09-kube-api-access-2t6qx\") pod \"redhat-marketplace-gvx6k\" (UID: \"636c3ebf-3c4e-47c3-b02c-663079e58d09\") " pod="openshift-marketplace/redhat-marketplace-gvx6k" Dec 08 10:35:56 crc kubenswrapper[4776]: I1208 10:35:56.643004 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/636c3ebf-3c4e-47c3-b02c-663079e58d09-catalog-content\") pod \"redhat-marketplace-gvx6k\" (UID: \"636c3ebf-3c4e-47c3-b02c-663079e58d09\") " pod="openshift-marketplace/redhat-marketplace-gvx6k" Dec 08 10:35:56 crc kubenswrapper[4776]: I1208 10:35:56.643047 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/636c3ebf-3c4e-47c3-b02c-663079e58d09-utilities\") pod \"redhat-marketplace-gvx6k\" (UID: \"636c3ebf-3c4e-47c3-b02c-663079e58d09\") " pod="openshift-marketplace/redhat-marketplace-gvx6k" Dec 08 10:35:56 crc kubenswrapper[4776]: I1208 10:35:56.643081 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2t6qx\" (UniqueName: \"kubernetes.io/projected/636c3ebf-3c4e-47c3-b02c-663079e58d09-kube-api-access-2t6qx\") pod \"redhat-marketplace-gvx6k\" (UID: \"636c3ebf-3c4e-47c3-b02c-663079e58d09\") " pod="openshift-marketplace/redhat-marketplace-gvx6k" Dec 08 10:35:56 crc kubenswrapper[4776]: I1208 10:35:56.643488 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/636c3ebf-3c4e-47c3-b02c-663079e58d09-catalog-content\") pod \"redhat-marketplace-gvx6k\" (UID: \"636c3ebf-3c4e-47c3-b02c-663079e58d09\") " pod="openshift-marketplace/redhat-marketplace-gvx6k" Dec 08 10:35:56 crc kubenswrapper[4776]: I1208 10:35:56.643796 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/636c3ebf-3c4e-47c3-b02c-663079e58d09-utilities\") pod \"redhat-marketplace-gvx6k\" (UID: \"636c3ebf-3c4e-47c3-b02c-663079e58d09\") " pod="openshift-marketplace/redhat-marketplace-gvx6k" Dec 08 10:35:56 crc kubenswrapper[4776]: I1208 10:35:56.665510 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2t6qx\" (UniqueName: \"kubernetes.io/projected/636c3ebf-3c4e-47c3-b02c-663079e58d09-kube-api-access-2t6qx\") pod \"redhat-marketplace-gvx6k\" (UID: \"636c3ebf-3c4e-47c3-b02c-663079e58d09\") " pod="openshift-marketplace/redhat-marketplace-gvx6k" Dec 08 10:35:56 crc kubenswrapper[4776]: I1208 10:35:56.795314 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gvx6k" Dec 08 10:35:57 crc kubenswrapper[4776]: I1208 10:35:57.420562 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gvx6k"] Dec 08 10:35:57 crc kubenswrapper[4776]: I1208 10:35:57.593737 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gvx6k" event={"ID":"636c3ebf-3c4e-47c3-b02c-663079e58d09","Type":"ContainerStarted","Data":"461074fda99292380fea4d16aeff5a1b5f14683cb153483813c980487c49a28c"} Dec 08 10:35:57 crc kubenswrapper[4776]: I1208 10:35:57.902818 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_981d14af-244f-4679-975d-58e11df95718/memcached/0.log" Dec 08 10:35:58 crc kubenswrapper[4776]: I1208 10:35:58.604784 4776 generic.go:334] "Generic (PLEG): container finished" podID="636c3ebf-3c4e-47c3-b02c-663079e58d09" containerID="25972dc483146bb16ecc2db034455e7fbf74bdbd15f948a0be9df41839f9cdc1" exitCode=0 Dec 08 10:35:58 crc kubenswrapper[4776]: I1208 10:35:58.604868 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gvx6k" event={"ID":"636c3ebf-3c4e-47c3-b02c-663079e58d09","Type":"ContainerDied","Data":"25972dc483146bb16ecc2db034455e7fbf74bdbd15f948a0be9df41839f9cdc1"} Dec 08 10:35:58 crc kubenswrapper[4776]: I1208 10:35:58.607625 4776 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 08 10:35:59 crc kubenswrapper[4776]: I1208 10:35:59.626948 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gvx6k" event={"ID":"636c3ebf-3c4e-47c3-b02c-663079e58d09","Type":"ContainerStarted","Data":"810d6574e4dde082a00f93a1b4ee2e9e2a70609155d1543bd7f4e8b88df416f7"} Dec 08 10:36:00 crc kubenswrapper[4776]: I1208 10:36:00.640414 4776 generic.go:334] "Generic (PLEG): container finished" 
podID="636c3ebf-3c4e-47c3-b02c-663079e58d09" containerID="810d6574e4dde082a00f93a1b4ee2e9e2a70609155d1543bd7f4e8b88df416f7" exitCode=0 Dec 08 10:36:00 crc kubenswrapper[4776]: I1208 10:36:00.640486 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gvx6k" event={"ID":"636c3ebf-3c4e-47c3-b02c-663079e58d09","Type":"ContainerDied","Data":"810d6574e4dde082a00f93a1b4ee2e9e2a70609155d1543bd7f4e8b88df416f7"} Dec 08 10:36:01 crc kubenswrapper[4776]: I1208 10:36:01.659678 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gvx6k" event={"ID":"636c3ebf-3c4e-47c3-b02c-663079e58d09","Type":"ContainerStarted","Data":"fe4d3385f1a608e9bfd20b3bda697e9d589b6064555023437df68d01bb69df36"} Dec 08 10:36:01 crc kubenswrapper[4776]: I1208 10:36:01.683546 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gvx6k" podStartSLOduration=3.279168786 podStartE2EDuration="5.683525203s" podCreationTimestamp="2025-12-08 10:35:56 +0000 UTC" firstStartedPulling="2025-12-08 10:35:58.607332304 +0000 UTC m=+5834.870557326" lastFinishedPulling="2025-12-08 10:36:01.011688731 +0000 UTC m=+5837.274913743" observedRunningTime="2025-12-08 10:36:01.678001622 +0000 UTC m=+5837.941226634" watchObservedRunningTime="2025-12-08 10:36:01.683525203 +0000 UTC m=+5837.946750225" Dec 08 10:36:05 crc kubenswrapper[4776]: I1208 10:36:05.344379 4776 scope.go:117] "RemoveContainer" containerID="8c1c73971d87140e44cc9241fea6de13f5f6f0f3ea06fa74b8d3e01c0b4278b7" Dec 08 10:36:05 crc kubenswrapper[4776]: E1208 10:36:05.346363 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:36:06 crc kubenswrapper[4776]: I1208 10:36:06.799699 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gvx6k" Dec 08 10:36:06 crc kubenswrapper[4776]: I1208 10:36:06.800294 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gvx6k" Dec 08 10:36:06 crc kubenswrapper[4776]: I1208 10:36:06.862996 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gvx6k" Dec 08 10:36:07 crc kubenswrapper[4776]: I1208 10:36:07.774252 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gvx6k" Dec 08 10:36:07 crc kubenswrapper[4776]: I1208 10:36:07.846946 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gvx6k"] Dec 08 10:36:09 crc kubenswrapper[4776]: I1208 10:36:09.734690 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gvx6k" podUID="636c3ebf-3c4e-47c3-b02c-663079e58d09" containerName="registry-server" containerID="cri-o://fe4d3385f1a608e9bfd20b3bda697e9d589b6064555023437df68d01bb69df36" gracePeriod=2 Dec 08 10:36:10 crc kubenswrapper[4776]: I1208 10:36:10.314452 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gvx6k" Dec 08 10:36:10 crc kubenswrapper[4776]: I1208 10:36:10.373578 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/636c3ebf-3c4e-47c3-b02c-663079e58d09-catalog-content\") pod \"636c3ebf-3c4e-47c3-b02c-663079e58d09\" (UID: \"636c3ebf-3c4e-47c3-b02c-663079e58d09\") " Dec 08 10:36:10 crc kubenswrapper[4776]: I1208 10:36:10.374013 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/636c3ebf-3c4e-47c3-b02c-663079e58d09-utilities\") pod \"636c3ebf-3c4e-47c3-b02c-663079e58d09\" (UID: \"636c3ebf-3c4e-47c3-b02c-663079e58d09\") " Dec 08 10:36:10 crc kubenswrapper[4776]: I1208 10:36:10.374077 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2t6qx\" (UniqueName: \"kubernetes.io/projected/636c3ebf-3c4e-47c3-b02c-663079e58d09-kube-api-access-2t6qx\") pod \"636c3ebf-3c4e-47c3-b02c-663079e58d09\" (UID: \"636c3ebf-3c4e-47c3-b02c-663079e58d09\") " Dec 08 10:36:10 crc kubenswrapper[4776]: I1208 10:36:10.374837 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/636c3ebf-3c4e-47c3-b02c-663079e58d09-utilities" (OuterVolumeSpecName: "utilities") pod "636c3ebf-3c4e-47c3-b02c-663079e58d09" (UID: "636c3ebf-3c4e-47c3-b02c-663079e58d09"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 10:36:10 crc kubenswrapper[4776]: I1208 10:36:10.382119 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/636c3ebf-3c4e-47c3-b02c-663079e58d09-kube-api-access-2t6qx" (OuterVolumeSpecName: "kube-api-access-2t6qx") pod "636c3ebf-3c4e-47c3-b02c-663079e58d09" (UID: "636c3ebf-3c4e-47c3-b02c-663079e58d09"). InnerVolumeSpecName "kube-api-access-2t6qx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 10:36:10 crc kubenswrapper[4776]: I1208 10:36:10.399224 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/636c3ebf-3c4e-47c3-b02c-663079e58d09-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "636c3ebf-3c4e-47c3-b02c-663079e58d09" (UID: "636c3ebf-3c4e-47c3-b02c-663079e58d09"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 10:36:10 crc kubenswrapper[4776]: I1208 10:36:10.477420 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2t6qx\" (UniqueName: \"kubernetes.io/projected/636c3ebf-3c4e-47c3-b02c-663079e58d09-kube-api-access-2t6qx\") on node \"crc\" DevicePath \"\"" Dec 08 10:36:10 crc kubenswrapper[4776]: I1208 10:36:10.477454 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/636c3ebf-3c4e-47c3-b02c-663079e58d09-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 10:36:10 crc kubenswrapper[4776]: I1208 10:36:10.477463 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/636c3ebf-3c4e-47c3-b02c-663079e58d09-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 10:36:10 crc kubenswrapper[4776]: I1208 10:36:10.750297 4776 generic.go:334] "Generic (PLEG): container finished" podID="636c3ebf-3c4e-47c3-b02c-663079e58d09" containerID="fe4d3385f1a608e9bfd20b3bda697e9d589b6064555023437df68d01bb69df36" exitCode=0 Dec 08 10:36:10 crc kubenswrapper[4776]: I1208 10:36:10.750532 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gvx6k" event={"ID":"636c3ebf-3c4e-47c3-b02c-663079e58d09","Type":"ContainerDied","Data":"fe4d3385f1a608e9bfd20b3bda697e9d589b6064555023437df68d01bb69df36"} Dec 08 10:36:10 crc kubenswrapper[4776]: I1208 10:36:10.750574 4776 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-gvx6k" event={"ID":"636c3ebf-3c4e-47c3-b02c-663079e58d09","Type":"ContainerDied","Data":"461074fda99292380fea4d16aeff5a1b5f14683cb153483813c980487c49a28c"} Dec 08 10:36:10 crc kubenswrapper[4776]: I1208 10:36:10.750594 4776 scope.go:117] "RemoveContainer" containerID="fe4d3385f1a608e9bfd20b3bda697e9d589b6064555023437df68d01bb69df36" Dec 08 10:36:10 crc kubenswrapper[4776]: I1208 10:36:10.750507 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gvx6k" Dec 08 10:36:10 crc kubenswrapper[4776]: I1208 10:36:10.790287 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gvx6k"] Dec 08 10:36:10 crc kubenswrapper[4776]: I1208 10:36:10.790666 4776 scope.go:117] "RemoveContainer" containerID="810d6574e4dde082a00f93a1b4ee2e9e2a70609155d1543bd7f4e8b88df416f7" Dec 08 10:36:10 crc kubenswrapper[4776]: I1208 10:36:10.802147 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gvx6k"] Dec 08 10:36:10 crc kubenswrapper[4776]: I1208 10:36:10.815681 4776 scope.go:117] "RemoveContainer" containerID="25972dc483146bb16ecc2db034455e7fbf74bdbd15f948a0be9df41839f9cdc1" Dec 08 10:36:10 crc kubenswrapper[4776]: I1208 10:36:10.872943 4776 scope.go:117] "RemoveContainer" containerID="fe4d3385f1a608e9bfd20b3bda697e9d589b6064555023437df68d01bb69df36" Dec 08 10:36:10 crc kubenswrapper[4776]: E1208 10:36:10.873418 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe4d3385f1a608e9bfd20b3bda697e9d589b6064555023437df68d01bb69df36\": container with ID starting with fe4d3385f1a608e9bfd20b3bda697e9d589b6064555023437df68d01bb69df36 not found: ID does not exist" containerID="fe4d3385f1a608e9bfd20b3bda697e9d589b6064555023437df68d01bb69df36" Dec 08 10:36:10 crc kubenswrapper[4776]: I1208 10:36:10.873461 4776 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe4d3385f1a608e9bfd20b3bda697e9d589b6064555023437df68d01bb69df36"} err="failed to get container status \"fe4d3385f1a608e9bfd20b3bda697e9d589b6064555023437df68d01bb69df36\": rpc error: code = NotFound desc = could not find container \"fe4d3385f1a608e9bfd20b3bda697e9d589b6064555023437df68d01bb69df36\": container with ID starting with fe4d3385f1a608e9bfd20b3bda697e9d589b6064555023437df68d01bb69df36 not found: ID does not exist" Dec 08 10:36:10 crc kubenswrapper[4776]: I1208 10:36:10.873483 4776 scope.go:117] "RemoveContainer" containerID="810d6574e4dde082a00f93a1b4ee2e9e2a70609155d1543bd7f4e8b88df416f7" Dec 08 10:36:10 crc kubenswrapper[4776]: E1208 10:36:10.873745 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"810d6574e4dde082a00f93a1b4ee2e9e2a70609155d1543bd7f4e8b88df416f7\": container with ID starting with 810d6574e4dde082a00f93a1b4ee2e9e2a70609155d1543bd7f4e8b88df416f7 not found: ID does not exist" containerID="810d6574e4dde082a00f93a1b4ee2e9e2a70609155d1543bd7f4e8b88df416f7" Dec 08 10:36:10 crc kubenswrapper[4776]: I1208 10:36:10.873792 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"810d6574e4dde082a00f93a1b4ee2e9e2a70609155d1543bd7f4e8b88df416f7"} err="failed to get container status \"810d6574e4dde082a00f93a1b4ee2e9e2a70609155d1543bd7f4e8b88df416f7\": rpc error: code = NotFound desc = could not find container \"810d6574e4dde082a00f93a1b4ee2e9e2a70609155d1543bd7f4e8b88df416f7\": container with ID starting with 810d6574e4dde082a00f93a1b4ee2e9e2a70609155d1543bd7f4e8b88df416f7 not found: ID does not exist" Dec 08 10:36:10 crc kubenswrapper[4776]: I1208 10:36:10.873819 4776 scope.go:117] "RemoveContainer" containerID="25972dc483146bb16ecc2db034455e7fbf74bdbd15f948a0be9df41839f9cdc1" Dec 08 10:36:10 crc kubenswrapper[4776]: E1208 
10:36:10.874212 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25972dc483146bb16ecc2db034455e7fbf74bdbd15f948a0be9df41839f9cdc1\": container with ID starting with 25972dc483146bb16ecc2db034455e7fbf74bdbd15f948a0be9df41839f9cdc1 not found: ID does not exist" containerID="25972dc483146bb16ecc2db034455e7fbf74bdbd15f948a0be9df41839f9cdc1" Dec 08 10:36:10 crc kubenswrapper[4776]: I1208 10:36:10.874241 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25972dc483146bb16ecc2db034455e7fbf74bdbd15f948a0be9df41839f9cdc1"} err="failed to get container status \"25972dc483146bb16ecc2db034455e7fbf74bdbd15f948a0be9df41839f9cdc1\": rpc error: code = NotFound desc = could not find container \"25972dc483146bb16ecc2db034455e7fbf74bdbd15f948a0be9df41839f9cdc1\": container with ID starting with 25972dc483146bb16ecc2db034455e7fbf74bdbd15f948a0be9df41839f9cdc1 not found: ID does not exist" Dec 08 10:36:12 crc kubenswrapper[4776]: I1208 10:36:12.356583 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="636c3ebf-3c4e-47c3-b02c-663079e58d09" path="/var/lib/kubelet/pods/636c3ebf-3c4e-47c3-b02c-663079e58d09/volumes" Dec 08 10:36:19 crc kubenswrapper[4776]: I1208 10:36:19.343751 4776 scope.go:117] "RemoveContainer" containerID="8c1c73971d87140e44cc9241fea6de13f5f6f0f3ea06fa74b8d3e01c0b4278b7" Dec 08 10:36:19 crc kubenswrapper[4776]: E1208 10:36:19.344434 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:36:20 crc kubenswrapper[4776]: I1208 10:36:20.564077 
4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_16991a732252c99b57872cb3554920243a1d5011faf6df1b6d2249e50ajqd67_d003bbac-1fa9-4696-aded-39e4b8d211ff/util/0.log" Dec 08 10:36:20 crc kubenswrapper[4776]: I1208 10:36:20.785552 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_16991a732252c99b57872cb3554920243a1d5011faf6df1b6d2249e50ajqd67_d003bbac-1fa9-4696-aded-39e4b8d211ff/pull/0.log" Dec 08 10:36:20 crc kubenswrapper[4776]: I1208 10:36:20.810595 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_16991a732252c99b57872cb3554920243a1d5011faf6df1b6d2249e50ajqd67_d003bbac-1fa9-4696-aded-39e4b8d211ff/util/0.log" Dec 08 10:36:20 crc kubenswrapper[4776]: I1208 10:36:20.830736 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_16991a732252c99b57872cb3554920243a1d5011faf6df1b6d2249e50ajqd67_d003bbac-1fa9-4696-aded-39e4b8d211ff/pull/0.log" Dec 08 10:36:21 crc kubenswrapper[4776]: I1208 10:36:21.004272 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_16991a732252c99b57872cb3554920243a1d5011faf6df1b6d2249e50ajqd67_d003bbac-1fa9-4696-aded-39e4b8d211ff/pull/0.log" Dec 08 10:36:21 crc kubenswrapper[4776]: I1208 10:36:21.043539 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_16991a732252c99b57872cb3554920243a1d5011faf6df1b6d2249e50ajqd67_d003bbac-1fa9-4696-aded-39e4b8d211ff/util/0.log" Dec 08 10:36:21 crc kubenswrapper[4776]: I1208 10:36:21.051206 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_16991a732252c99b57872cb3554920243a1d5011faf6df1b6d2249e50ajqd67_d003bbac-1fa9-4696-aded-39e4b8d211ff/extract/0.log" Dec 08 10:36:21 crc kubenswrapper[4776]: I1208 10:36:21.218911 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-rgfzz_b39e8644-6fb7-4d7c-a623-c0eadac0e896/kube-rbac-proxy/0.log" Dec 08 10:36:21 crc kubenswrapper[4776]: I1208 10:36:21.266073 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6c677c69b-2g4ph_ad1d3b70-6eea-46a4-bdc1-82144fe12f4a/kube-rbac-proxy/0.log" Dec 08 10:36:21 crc kubenswrapper[4776]: I1208 10:36:21.281785 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-rgfzz_b39e8644-6fb7-4d7c-a623-c0eadac0e896/manager/0.log" Dec 08 10:36:21 crc kubenswrapper[4776]: I1208 10:36:21.424447 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6c677c69b-2g4ph_ad1d3b70-6eea-46a4-bdc1-82144fe12f4a/manager/0.log" Dec 08 10:36:21 crc kubenswrapper[4776]: I1208 10:36:21.470360 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-697fb699cf-ftb4x_316c9728-ccef-4981-9903-895ab86e6616/kube-rbac-proxy/0.log" Dec 08 10:36:21 crc kubenswrapper[4776]: I1208 10:36:21.490142 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-697fb699cf-ftb4x_316c9728-ccef-4981-9903-895ab86e6616/manager/0.log" Dec 08 10:36:21 crc kubenswrapper[4776]: I1208 10:36:21.634348 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5697bb5779-897nd_bb123983-a71d-4eca-84e8-6c116cc9b3b6/kube-rbac-proxy/0.log" Dec 08 10:36:21 crc kubenswrapper[4776]: I1208 10:36:21.703980 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5697bb5779-897nd_bb123983-a71d-4eca-84e8-6c116cc9b3b6/manager/0.log" Dec 08 10:36:21 crc kubenswrapper[4776]: I1208 
10:36:21.837904 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-f2bnk_f85d592d-d82d-4c08-aafb-e9a7e68ef386/kube-rbac-proxy/0.log" Dec 08 10:36:21 crc kubenswrapper[4776]: I1208 10:36:21.944649 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-f2bnk_f85d592d-d82d-4c08-aafb-e9a7e68ef386/manager/0.log" Dec 08 10:36:22 crc kubenswrapper[4776]: I1208 10:36:22.143585 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-4k8qf_beadb3ee-3cd9-4c83-ba1f-9f599cd24940/kube-rbac-proxy/0.log" Dec 08 10:36:22 crc kubenswrapper[4776]: I1208 10:36:22.256317 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-4k8qf_beadb3ee-3cd9-4c83-ba1f-9f599cd24940/manager/0.log" Dec 08 10:36:22 crc kubenswrapper[4776]: I1208 10:36:22.293587 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-78d48bff9d-87pfw_dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0/kube-rbac-proxy/0.log" Dec 08 10:36:22 crc kubenswrapper[4776]: I1208 10:36:22.507450 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-967d97867-4dj2x_422088d1-15c7-4791-b0c9-a12a2c5e2880/manager/0.log" Dec 08 10:36:22 crc kubenswrapper[4776]: I1208 10:36:22.530817 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-967d97867-4dj2x_422088d1-15c7-4791-b0c9-a12a2c5e2880/kube-rbac-proxy/0.log" Dec 08 10:36:22 crc kubenswrapper[4776]: I1208 10:36:22.583982 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-78d48bff9d-87pfw_dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0/manager/0.log" Dec 08 
10:36:22 crc kubenswrapper[4776]: I1208 10:36:22.766233 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-7smkr_0f590af7-17bd-46c4-8a25-ba3a368c6382/manager/0.log" Dec 08 10:36:22 crc kubenswrapper[4776]: I1208 10:36:22.777357 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-7smkr_0f590af7-17bd-46c4-8a25-ba3a368c6382/kube-rbac-proxy/0.log" Dec 08 10:36:22 crc kubenswrapper[4776]: I1208 10:36:22.960316 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5b5fd79c9c-g66m2_ff110975-7e1d-4d6d-bd10-b666cd8fe98b/manager/0.log" Dec 08 10:36:22 crc kubenswrapper[4776]: I1208 10:36:22.965514 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5b5fd79c9c-g66m2_ff110975-7e1d-4d6d-bd10-b666cd8fe98b/kube-rbac-proxy/0.log" Dec 08 10:36:23 crc kubenswrapper[4776]: I1208 10:36:23.020766 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-79c8c4686c-jgmdb_2a4ffe83-5f4d-4a7a-a2b6-64d12bd8f3f9/kube-rbac-proxy/0.log" Dec 08 10:36:23 crc kubenswrapper[4776]: I1208 10:36:23.153201 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-79c8c4686c-jgmdb_2a4ffe83-5f4d-4a7a-a2b6-64d12bd8f3f9/manager/0.log" Dec 08 10:36:23 crc kubenswrapper[4776]: I1208 10:36:23.171402 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-dqgnv_288a9127-92ed-4b19-8cc5-34b1f9b51201/kube-rbac-proxy/0.log" Dec 08 10:36:23 crc kubenswrapper[4776]: I1208 10:36:23.258590 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-dqgnv_288a9127-92ed-4b19-8cc5-34b1f9b51201/manager/0.log" Dec 08 10:36:23 crc kubenswrapper[4776]: I1208 10:36:23.367954 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-k928c_545c7a23-3539-4923-bd9e-8d64700070b5/kube-rbac-proxy/0.log" Dec 08 10:36:23 crc kubenswrapper[4776]: I1208 10:36:23.470902 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-k928c_545c7a23-3539-4923-bd9e-8d64700070b5/manager/0.log" Dec 08 10:36:23 crc kubenswrapper[4776]: I1208 10:36:23.571409 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-l979f_d07c95ca-1871-4ba3-81e5-c7b4d86bb0f4/kube-rbac-proxy/0.log" Dec 08 10:36:23 crc kubenswrapper[4776]: I1208 10:36:23.619816 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-l979f_d07c95ca-1871-4ba3-81e5-c7b4d86bb0f4/manager/0.log" Dec 08 10:36:23 crc kubenswrapper[4776]: I1208 10:36:23.650717 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-84b575879f7smn5_0cb0505b-eb0f-4801-841d-8a96fe29e608/kube-rbac-proxy/0.log" Dec 08 10:36:23 crc kubenswrapper[4776]: I1208 10:36:23.773778 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-84b575879f7smn5_0cb0505b-eb0f-4801-841d-8a96fe29e608/manager/0.log" Dec 08 10:36:24 crc kubenswrapper[4776]: I1208 10:36:24.134114 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-5546b8686f-m7kf9_90449ceb-bf22-41c3-a66a-3f01c6e46edc/operator/0.log" Dec 08 10:36:24 crc kubenswrapper[4776]: I1208 
10:36:24.154490 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-ndvrw_4bb0bbd1-4377-4f99-b0f3-e657e4c2a792/registry-server/0.log" Dec 08 10:36:24 crc kubenswrapper[4776]: I1208 10:36:24.375117 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-gk9xw_8cd2dc5d-1433-4660-9d65-bf49d398415f/kube-rbac-proxy/0.log" Dec 08 10:36:24 crc kubenswrapper[4776]: I1208 10:36:24.411918 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-gk9xw_8cd2dc5d-1433-4660-9d65-bf49d398415f/manager/0.log" Dec 08 10:36:24 crc kubenswrapper[4776]: I1208 10:36:24.645263 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-mdm5f_482e5641-8a00-4fc3-b7d3-6eb88dbee1e4/kube-rbac-proxy/0.log" Dec 08 10:36:24 crc kubenswrapper[4776]: I1208 10:36:24.727331 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-mdm5f_482e5641-8a00-4fc3-b7d3-6eb88dbee1e4/manager/0.log" Dec 08 10:36:24 crc kubenswrapper[4776]: I1208 10:36:24.813808 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-xxv7g_d8a1143b-5dc6-4a99-a6e4-f155585ebbcb/operator/0.log" Dec 08 10:36:24 crc kubenswrapper[4776]: I1208 10:36:24.980623 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9d58d64bc-ncfrf_6ea3ffdd-a922-487e-a738-da3091a1656e/kube-rbac-proxy/0.log" Dec 08 10:36:25 crc kubenswrapper[4776]: I1208 10:36:25.012827 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9d58d64bc-ncfrf_6ea3ffdd-a922-487e-a738-da3091a1656e/manager/0.log" Dec 08 10:36:25 crc 
kubenswrapper[4776]: I1208 10:36:25.203201 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-68f9cdc5f7-scgrq_7134ec23-7ec3-454d-b837-29fbe7094067/kube-rbac-proxy/0.log" Dec 08 10:36:25 crc kubenswrapper[4776]: I1208 10:36:25.234560 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-57686cd5df-zt7pj_e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23/manager/0.log" Dec 08 10:36:25 crc kubenswrapper[4776]: I1208 10:36:25.368208 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-xtf2f_c8f3f832-68f1-47a2-bb3d-5d67f54655ce/manager/0.log" Dec 08 10:36:25 crc kubenswrapper[4776]: I1208 10:36:25.408602 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-xtf2f_c8f3f832-68f1-47a2-bb3d-5d67f54655ce/kube-rbac-proxy/0.log" Dec 08 10:36:25 crc kubenswrapper[4776]: I1208 10:36:25.512845 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-68f9cdc5f7-scgrq_7134ec23-7ec3-454d-b837-29fbe7094067/manager/0.log" Dec 08 10:36:25 crc kubenswrapper[4776]: I1208 10:36:25.559118 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-667bd8d554-kfz2m_61424c2d-bdc7-431a-8f12-535e1e97ce4b/kube-rbac-proxy/0.log" Dec 08 10:36:25 crc kubenswrapper[4776]: I1208 10:36:25.567864 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-667bd8d554-kfz2m_61424c2d-bdc7-431a-8f12-535e1e97ce4b/manager/0.log" Dec 08 10:36:31 crc kubenswrapper[4776]: I1208 10:36:31.344334 4776 scope.go:117] "RemoveContainer" containerID="8c1c73971d87140e44cc9241fea6de13f5f6f0f3ea06fa74b8d3e01c0b4278b7" Dec 08 10:36:31 crc kubenswrapper[4776]: E1208 
10:36:31.345127 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:36:43 crc kubenswrapper[4776]: I1208 10:36:43.861952 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-njv2h_35911247-ad00-422c-9d30-586834a80f76/control-plane-machine-set-operator/0.log" Dec 08 10:36:44 crc kubenswrapper[4776]: I1208 10:36:44.092959 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-jxkd8_5ea3906e-d311-4b90-80be-7405507e135e/kube-rbac-proxy/0.log" Dec 08 10:36:44 crc kubenswrapper[4776]: I1208 10:36:44.100738 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-jxkd8_5ea3906e-d311-4b90-80be-7405507e135e/machine-api-operator/0.log" Dec 08 10:36:45 crc kubenswrapper[4776]: I1208 10:36:45.344258 4776 scope.go:117] "RemoveContainer" containerID="8c1c73971d87140e44cc9241fea6de13f5f6f0f3ea06fa74b8d3e01c0b4278b7" Dec 08 10:36:45 crc kubenswrapper[4776]: E1208 10:36:45.344774 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:36:57 crc kubenswrapper[4776]: I1208 10:36:57.979835 4776 log.go:25] "Finished parsing log 
file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-hsqbv_69b03c85-8503-44fc-9e71-0357ce0cc56e/cert-manager-controller/0.log" Dec 08 10:36:58 crc kubenswrapper[4776]: I1208 10:36:58.131258 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-6vnhc_8b23f8e2-638b-438a-8363-8daf30f656e6/cert-manager-cainjector/0.log" Dec 08 10:36:58 crc kubenswrapper[4776]: I1208 10:36:58.200391 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-6jjjz_4f663316-a0ef-44bd-a068-47f3e7d37a5c/cert-manager-webhook/0.log" Dec 08 10:36:59 crc kubenswrapper[4776]: I1208 10:36:59.344417 4776 scope.go:117] "RemoveContainer" containerID="8c1c73971d87140e44cc9241fea6de13f5f6f0f3ea06fa74b8d3e01c0b4278b7" Dec 08 10:36:59 crc kubenswrapper[4776]: E1208 10:36:59.344962 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:37:10 crc kubenswrapper[4776]: I1208 10:37:10.729764 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-8ds4l_c2ba126f-aa28-4cfa-8aed-a9221e094a58/nmstate-console-plugin/0.log" Dec 08 10:37:10 crc kubenswrapper[4776]: I1208 10:37:10.937006 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-b8ckr_5e0ef761-506d-4695-b58a-128a6f5f7957/nmstate-handler/0.log" Dec 08 10:37:10 crc kubenswrapper[4776]: I1208 10:37:10.980898 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-lpbj6_b26e1190-f68a-487b-a2de-e0116525ab64/kube-rbac-proxy/0.log" Dec 08 10:37:11 crc kubenswrapper[4776]: I1208 10:37:11.061553 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-lpbj6_b26e1190-f68a-487b-a2de-e0116525ab64/nmstate-metrics/0.log" Dec 08 10:37:11 crc kubenswrapper[4776]: I1208 10:37:11.168133 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-n9gb5_5322a22f-cb6b-45df-af5f-395b2180a64b/nmstate-operator/0.log" Dec 08 10:37:11 crc kubenswrapper[4776]: I1208 10:37:11.294531 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-69dpx_e59b99c1-c9b8-4127-93db-933dddb3ebab/nmstate-webhook/0.log" Dec 08 10:37:14 crc kubenswrapper[4776]: I1208 10:37:14.351406 4776 scope.go:117] "RemoveContainer" containerID="8c1c73971d87140e44cc9241fea6de13f5f6f0f3ea06fa74b8d3e01c0b4278b7" Dec 08 10:37:14 crc kubenswrapper[4776]: E1208 10:37:14.352226 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:37:23 crc kubenswrapper[4776]: I1208 10:37:23.501859 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-6bfc99889d-tq44h_9ba0d9e5-f1ab-40a6-9490-57ce8566843a/manager/0.log" Dec 08 10:37:23 crc kubenswrapper[4776]: I1208 10:37:23.508741 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-6bfc99889d-tq44h_9ba0d9e5-f1ab-40a6-9490-57ce8566843a/kube-rbac-proxy/0.log" Dec 08 10:37:29 crc kubenswrapper[4776]: I1208 10:37:29.344609 4776 scope.go:117] "RemoveContainer" containerID="8c1c73971d87140e44cc9241fea6de13f5f6f0f3ea06fa74b8d3e01c0b4278b7" Dec 08 10:37:29 crc kubenswrapper[4776]: E1208 10:37:29.345577 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:37:37 crc kubenswrapper[4776]: I1208 10:37:37.541683 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_cluster-logging-operator-ff9846bd-wwf9p_3e4917d5-3292-4cd8-b001-6d6bf5609def/cluster-logging-operator/0.log" Dec 08 10:37:37 crc kubenswrapper[4776]: I1208 10:37:37.697763 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_collector-2xfbc_847cd111-98c5-4c39-bc29-1ba2bcdf570c/collector/0.log" Dec 08 10:37:37 crc kubenswrapper[4776]: I1208 10:37:37.776793 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-compactor-0_62fa460d-4457-4db0-8be1-d7fa62fd7144/loki-compactor/0.log" Dec 08 10:37:38 crc kubenswrapper[4776]: I1208 10:37:38.095508 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-distributor-76cc67bf56-w6292_9edcc5bd-cefb-4c32-89e3-24ff105358b2/loki-distributor/0.log" Dec 08 10:37:38 crc kubenswrapper[4776]: I1208 10:37:38.150664 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-logging_logging-loki-gateway-54b997fdcc-9dpbh_11400c14-964a-494f-80da-d878c6d2a50d/gateway/0.log" Dec 08 10:37:38 crc kubenswrapper[4776]: I1208 10:37:38.258298 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-54b997fdcc-9dpbh_11400c14-964a-494f-80da-d878c6d2a50d/opa/0.log" Dec 08 10:37:38 crc kubenswrapper[4776]: I1208 10:37:38.306952 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-54b997fdcc-nmww6_d64b61be-4212-49da-9497-f567efa53a45/gateway/0.log" Dec 08 10:37:38 crc kubenswrapper[4776]: I1208 10:37:38.328584 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-54b997fdcc-nmww6_d64b61be-4212-49da-9497-f567efa53a45/opa/0.log" Dec 08 10:37:38 crc kubenswrapper[4776]: I1208 10:37:38.485481 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-index-gateway-0_1e9dd934-eb37-463c-890d-1021bbdc4e3f/loki-index-gateway/0.log" Dec 08 10:37:38 crc kubenswrapper[4776]: I1208 10:37:38.599609 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-ingester-0_c27c1242-5109-4547-8276-2dea60fad775/loki-ingester/0.log" Dec 08 10:37:38 crc kubenswrapper[4776]: I1208 10:37:38.729930 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-querier-5895d59bb8-hwrdk_71eade59-504f-4431-8cd8-531883c1eba7/loki-querier/0.log" Dec 08 10:37:38 crc kubenswrapper[4776]: I1208 10:37:38.789706 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-query-frontend-84558f7c9f-fcz8j_aed8f23a-7437-4eab-8dae-6ff17f9a5aa0/loki-query-frontend/0.log" Dec 08 10:37:41 crc kubenswrapper[4776]: I1208 10:37:41.344581 4776 scope.go:117] "RemoveContainer" containerID="8c1c73971d87140e44cc9241fea6de13f5f6f0f3ea06fa74b8d3e01c0b4278b7" Dec 08 10:37:41 crc 
kubenswrapper[4776]: E1208 10:37:41.345239 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:37:52 crc kubenswrapper[4776]: I1208 10:37:52.371034 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-gfgfc_f696bcbd-7230-43d3-b09e-645d489eacf3/kube-rbac-proxy/0.log" Dec 08 10:37:52 crc kubenswrapper[4776]: I1208 10:37:52.474051 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-gfgfc_f696bcbd-7230-43d3-b09e-645d489eacf3/controller/0.log" Dec 08 10:37:52 crc kubenswrapper[4776]: I1208 10:37:52.603144 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ph98m_12fb1453-ed3a-4c22-b33b-8c8e5402de93/cp-frr-files/0.log" Dec 08 10:37:52 crc kubenswrapper[4776]: I1208 10:37:52.763596 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ph98m_12fb1453-ed3a-4c22-b33b-8c8e5402de93/cp-reloader/0.log" Dec 08 10:37:52 crc kubenswrapper[4776]: I1208 10:37:52.791936 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ph98m_12fb1453-ed3a-4c22-b33b-8c8e5402de93/cp-frr-files/0.log" Dec 08 10:37:52 crc kubenswrapper[4776]: I1208 10:37:52.792629 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ph98m_12fb1453-ed3a-4c22-b33b-8c8e5402de93/cp-reloader/0.log" Dec 08 10:37:52 crc kubenswrapper[4776]: I1208 10:37:52.807906 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ph98m_12fb1453-ed3a-4c22-b33b-8c8e5402de93/cp-metrics/0.log" Dec 08 
10:37:53 crc kubenswrapper[4776]: I1208 10:37:53.005570 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ph98m_12fb1453-ed3a-4c22-b33b-8c8e5402de93/cp-reloader/0.log" Dec 08 10:37:53 crc kubenswrapper[4776]: I1208 10:37:53.009757 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ph98m_12fb1453-ed3a-4c22-b33b-8c8e5402de93/cp-frr-files/0.log" Dec 08 10:37:53 crc kubenswrapper[4776]: I1208 10:37:53.046204 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ph98m_12fb1453-ed3a-4c22-b33b-8c8e5402de93/cp-metrics/0.log" Dec 08 10:37:53 crc kubenswrapper[4776]: I1208 10:37:53.048242 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ph98m_12fb1453-ed3a-4c22-b33b-8c8e5402de93/cp-metrics/0.log" Dec 08 10:37:53 crc kubenswrapper[4776]: I1208 10:37:53.207251 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ph98m_12fb1453-ed3a-4c22-b33b-8c8e5402de93/cp-metrics/0.log" Dec 08 10:37:53 crc kubenswrapper[4776]: I1208 10:37:53.231998 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ph98m_12fb1453-ed3a-4c22-b33b-8c8e5402de93/cp-frr-files/0.log" Dec 08 10:37:53 crc kubenswrapper[4776]: I1208 10:37:53.252629 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ph98m_12fb1453-ed3a-4c22-b33b-8c8e5402de93/controller/0.log" Dec 08 10:37:53 crc kubenswrapper[4776]: I1208 10:37:53.254467 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ph98m_12fb1453-ed3a-4c22-b33b-8c8e5402de93/cp-reloader/0.log" Dec 08 10:37:53 crc kubenswrapper[4776]: I1208 10:37:53.344796 4776 scope.go:117] "RemoveContainer" containerID="8c1c73971d87140e44cc9241fea6de13f5f6f0f3ea06fa74b8d3e01c0b4278b7" Dec 08 10:37:53 crc kubenswrapper[4776]: E1208 10:37:53.345292 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:37:53 crc kubenswrapper[4776]: I1208 10:37:53.439213 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ph98m_12fb1453-ed3a-4c22-b33b-8c8e5402de93/frr-metrics/0.log" Dec 08 10:37:53 crc kubenswrapper[4776]: I1208 10:37:53.449735 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ph98m_12fb1453-ed3a-4c22-b33b-8c8e5402de93/kube-rbac-proxy/0.log" Dec 08 10:37:53 crc kubenswrapper[4776]: I1208 10:37:53.489925 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ph98m_12fb1453-ed3a-4c22-b33b-8c8e5402de93/kube-rbac-proxy-frr/0.log" Dec 08 10:37:53 crc kubenswrapper[4776]: I1208 10:37:53.713307 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ph98m_12fb1453-ed3a-4c22-b33b-8c8e5402de93/reloader/0.log" Dec 08 10:37:53 crc kubenswrapper[4776]: I1208 10:37:53.750605 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-c68cp_640e92fd-0236-408e-95ba-a5aacfe784d4/frr-k8s-webhook-server/0.log" Dec 08 10:37:53 crc kubenswrapper[4776]: I1208 10:37:53.981892 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7567df7f9b-ctl76_579d6f99-9917-455f-b0cf-350c24bae128/manager/0.log" Dec 08 10:37:54 crc kubenswrapper[4776]: I1208 10:37:54.214349 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7f595f4d5-92ttw_ea83e974-be12-4152-bd97-0d699c0e13b2/webhook-server/0.log" Dec 08 10:37:54 crc 
kubenswrapper[4776]: I1208 10:37:54.283309 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-gt7wq_4a15dc76-0fc2-4f38-ae64-d7efa5ee29a6/kube-rbac-proxy/0.log" Dec 08 10:37:55 crc kubenswrapper[4776]: I1208 10:37:55.108837 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-gt7wq_4a15dc76-0fc2-4f38-ae64-d7efa5ee29a6/speaker/0.log" Dec 08 10:37:55 crc kubenswrapper[4776]: I1208 10:37:55.423425 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ph98m_12fb1453-ed3a-4c22-b33b-8c8e5402de93/frr/0.log" Dec 08 10:38:06 crc kubenswrapper[4776]: I1208 10:38:06.346579 4776 scope.go:117] "RemoveContainer" containerID="8c1c73971d87140e44cc9241fea6de13f5f6f0f3ea06fa74b8d3e01c0b4278b7" Dec 08 10:38:06 crc kubenswrapper[4776]: E1208 10:38:06.347442 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:38:07 crc kubenswrapper[4776]: I1208 10:38:07.134794 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb85pvww_20cd1aea-6a8d-458a-8697-f9193cfa6058/util/0.log" Dec 08 10:38:07 crc kubenswrapper[4776]: I1208 10:38:07.336080 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb85pvww_20cd1aea-6a8d-458a-8697-f9193cfa6058/pull/0.log" Dec 08 10:38:07 crc kubenswrapper[4776]: I1208 10:38:07.341442 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb85pvww_20cd1aea-6a8d-458a-8697-f9193cfa6058/util/0.log" Dec 08 10:38:07 crc kubenswrapper[4776]: I1208 10:38:07.366674 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb85pvww_20cd1aea-6a8d-458a-8697-f9193cfa6058/pull/0.log" Dec 08 10:38:07 crc kubenswrapper[4776]: I1208 10:38:07.497842 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb85pvww_20cd1aea-6a8d-458a-8697-f9193cfa6058/util/0.log" Dec 08 10:38:07 crc kubenswrapper[4776]: I1208 10:38:07.516707 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb85pvww_20cd1aea-6a8d-458a-8697-f9193cfa6058/pull/0.log" Dec 08 10:38:07 crc kubenswrapper[4776]: I1208 10:38:07.520237 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb85pvww_20cd1aea-6a8d-458a-8697-f9193cfa6058/extract/0.log" Dec 08 10:38:07 crc kubenswrapper[4776]: I1208 10:38:07.649831 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9x89q_8567f1db-9f8a-49aa-8864-e18aef8b18e7/util/0.log" Dec 08 10:38:07 crc kubenswrapper[4776]: I1208 10:38:07.854974 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9x89q_8567f1db-9f8a-49aa-8864-e18aef8b18e7/util/0.log" Dec 08 10:38:07 crc kubenswrapper[4776]: I1208 10:38:07.870041 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9x89q_8567f1db-9f8a-49aa-8864-e18aef8b18e7/pull/0.log" Dec 08 
10:38:07 crc kubenswrapper[4776]: I1208 10:38:07.883690 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9x89q_8567f1db-9f8a-49aa-8864-e18aef8b18e7/pull/0.log" Dec 08 10:38:08 crc kubenswrapper[4776]: I1208 10:38:08.052818 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9x89q_8567f1db-9f8a-49aa-8864-e18aef8b18e7/util/0.log" Dec 08 10:38:08 crc kubenswrapper[4776]: I1208 10:38:08.069674 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9x89q_8567f1db-9f8a-49aa-8864-e18aef8b18e7/extract/0.log" Dec 08 10:38:08 crc kubenswrapper[4776]: I1208 10:38:08.083627 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9x89q_8567f1db-9f8a-49aa-8864-e18aef8b18e7/pull/0.log" Dec 08 10:38:08 crc kubenswrapper[4776]: I1208 10:38:08.231243 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210wkgz5_bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b/util/0.log" Dec 08 10:38:08 crc kubenswrapper[4776]: I1208 10:38:08.384386 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210wkgz5_bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b/util/0.log" Dec 08 10:38:08 crc kubenswrapper[4776]: I1208 10:38:08.385519 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210wkgz5_bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b/pull/0.log" Dec 08 10:38:08 crc kubenswrapper[4776]: I1208 10:38:08.393649 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210wkgz5_bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b/pull/0.log" Dec 08 10:38:08 crc kubenswrapper[4776]: I1208 10:38:08.562309 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210wkgz5_bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b/pull/0.log" Dec 08 10:38:08 crc kubenswrapper[4776]: I1208 10:38:08.583814 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210wkgz5_bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b/util/0.log" Dec 08 10:38:08 crc kubenswrapper[4776]: I1208 10:38:08.601365 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210wkgz5_bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b/extract/0.log" Dec 08 10:38:08 crc kubenswrapper[4776]: I1208 10:38:08.739463 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fr5w6k_0b36dfd9-c3b8-4858-b056-70d04434052a/util/0.log" Dec 08 10:38:08 crc kubenswrapper[4776]: I1208 10:38:08.905962 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fr5w6k_0b36dfd9-c3b8-4858-b056-70d04434052a/util/0.log" Dec 08 10:38:08 crc kubenswrapper[4776]: I1208 10:38:08.929811 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fr5w6k_0b36dfd9-c3b8-4858-b056-70d04434052a/pull/0.log" Dec 08 10:38:08 crc kubenswrapper[4776]: I1208 10:38:08.935912 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fr5w6k_0b36dfd9-c3b8-4858-b056-70d04434052a/pull/0.log" Dec 08 
10:38:09 crc kubenswrapper[4776]: I1208 10:38:09.111440 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fr5w6k_0b36dfd9-c3b8-4858-b056-70d04434052a/pull/0.log" Dec 08 10:38:09 crc kubenswrapper[4776]: I1208 10:38:09.113044 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fr5w6k_0b36dfd9-c3b8-4858-b056-70d04434052a/util/0.log" Dec 08 10:38:09 crc kubenswrapper[4776]: I1208 10:38:09.120398 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fr5w6k_0b36dfd9-c3b8-4858-b056-70d04434052a/extract/0.log" Dec 08 10:38:09 crc kubenswrapper[4776]: I1208 10:38:09.281982 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nx2vl_ecb04392-c8da-4ee9-ae5f-aa7212b963e9/util/0.log" Dec 08 10:38:09 crc kubenswrapper[4776]: I1208 10:38:09.460991 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nx2vl_ecb04392-c8da-4ee9-ae5f-aa7212b963e9/util/0.log" Dec 08 10:38:09 crc kubenswrapper[4776]: I1208 10:38:09.464060 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nx2vl_ecb04392-c8da-4ee9-ae5f-aa7212b963e9/pull/0.log" Dec 08 10:38:09 crc kubenswrapper[4776]: I1208 10:38:09.486040 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nx2vl_ecb04392-c8da-4ee9-ae5f-aa7212b963e9/pull/0.log" Dec 08 10:38:09 crc kubenswrapper[4776]: I1208 10:38:09.651609 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nx2vl_ecb04392-c8da-4ee9-ae5f-aa7212b963e9/util/0.log" Dec 08 10:38:09 crc kubenswrapper[4776]: I1208 10:38:09.700468 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nx2vl_ecb04392-c8da-4ee9-ae5f-aa7212b963e9/pull/0.log" Dec 08 10:38:09 crc kubenswrapper[4776]: I1208 10:38:09.709361 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nx2vl_ecb04392-c8da-4ee9-ae5f-aa7212b963e9/extract/0.log" Dec 08 10:38:09 crc kubenswrapper[4776]: I1208 10:38:09.831547 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dqj6b_03a186d8-ec36-41ef-b882-f0cba34a0913/extract-utilities/0.log" Dec 08 10:38:10 crc kubenswrapper[4776]: I1208 10:38:10.012339 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dqj6b_03a186d8-ec36-41ef-b882-f0cba34a0913/extract-content/0.log" Dec 08 10:38:10 crc kubenswrapper[4776]: I1208 10:38:10.016729 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dqj6b_03a186d8-ec36-41ef-b882-f0cba34a0913/extract-utilities/0.log" Dec 08 10:38:10 crc kubenswrapper[4776]: I1208 10:38:10.039024 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dqj6b_03a186d8-ec36-41ef-b882-f0cba34a0913/extract-content/0.log" Dec 08 10:38:10 crc kubenswrapper[4776]: I1208 10:38:10.216435 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dqj6b_03a186d8-ec36-41ef-b882-f0cba34a0913/extract-utilities/0.log" Dec 08 10:38:10 crc kubenswrapper[4776]: I1208 10:38:10.244114 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-dqj6b_03a186d8-ec36-41ef-b882-f0cba34a0913/extract-content/0.log" Dec 08 10:38:10 crc kubenswrapper[4776]: I1208 10:38:10.470130 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xbmh8_d01dc6cb-3ab5-494a-b89f-63e94c2e91ee/extract-utilities/0.log" Dec 08 10:38:10 crc kubenswrapper[4776]: I1208 10:38:10.695798 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xbmh8_d01dc6cb-3ab5-494a-b89f-63e94c2e91ee/extract-utilities/0.log" Dec 08 10:38:10 crc kubenswrapper[4776]: I1208 10:38:10.735083 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xbmh8_d01dc6cb-3ab5-494a-b89f-63e94c2e91ee/extract-content/0.log" Dec 08 10:38:10 crc kubenswrapper[4776]: I1208 10:38:10.738742 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xbmh8_d01dc6cb-3ab5-494a-b89f-63e94c2e91ee/extract-content/0.log" Dec 08 10:38:10 crc kubenswrapper[4776]: I1208 10:38:10.948598 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xbmh8_d01dc6cb-3ab5-494a-b89f-63e94c2e91ee/extract-utilities/0.log" Dec 08 10:38:10 crc kubenswrapper[4776]: I1208 10:38:10.977652 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xbmh8_d01dc6cb-3ab5-494a-b89f-63e94c2e91ee/extract-content/0.log" Dec 08 10:38:11 crc kubenswrapper[4776]: I1208 10:38:11.193747 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-8ctxn_b93e12f9-d5c1-4ee8-9786-85d352d62076/marketplace-operator/0.log" Dec 08 10:38:11 crc kubenswrapper[4776]: I1208 10:38:11.226594 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-dqj6b_03a186d8-ec36-41ef-b882-f0cba34a0913/registry-server/0.log" Dec 08 10:38:11 crc kubenswrapper[4776]: I1208 10:38:11.268271 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-txmws_aeec433a-2b07-4008-9829-d266f85b5cf1/extract-utilities/0.log" Dec 08 10:38:11 crc kubenswrapper[4776]: I1208 10:38:11.515790 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-txmws_aeec433a-2b07-4008-9829-d266f85b5cf1/extract-utilities/0.log" Dec 08 10:38:11 crc kubenswrapper[4776]: I1208 10:38:11.526641 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-txmws_aeec433a-2b07-4008-9829-d266f85b5cf1/extract-content/0.log" Dec 08 10:38:11 crc kubenswrapper[4776]: I1208 10:38:11.541474 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-txmws_aeec433a-2b07-4008-9829-d266f85b5cf1/extract-content/0.log" Dec 08 10:38:11 crc kubenswrapper[4776]: I1208 10:38:11.747205 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-txmws_aeec433a-2b07-4008-9829-d266f85b5cf1/extract-content/0.log" Dec 08 10:38:11 crc kubenswrapper[4776]: I1208 10:38:11.797677 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-txmws_aeec433a-2b07-4008-9829-d266f85b5cf1/extract-utilities/0.log" Dec 08 10:38:12 crc kubenswrapper[4776]: I1208 10:38:12.012061 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d5dnr_885bc336-6858-43f4-b63b-155ed1f06b60/extract-utilities/0.log" Dec 08 10:38:12 crc kubenswrapper[4776]: I1208 10:38:12.124533 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-txmws_aeec433a-2b07-4008-9829-d266f85b5cf1/registry-server/0.log" Dec 08 10:38:12 crc kubenswrapper[4776]: I1208 10:38:12.255827 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d5dnr_885bc336-6858-43f4-b63b-155ed1f06b60/extract-content/0.log" Dec 08 10:38:12 crc kubenswrapper[4776]: I1208 10:38:12.264526 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xbmh8_d01dc6cb-3ab5-494a-b89f-63e94c2e91ee/registry-server/0.log" Dec 08 10:38:12 crc kubenswrapper[4776]: I1208 10:38:12.292835 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d5dnr_885bc336-6858-43f4-b63b-155ed1f06b60/extract-content/0.log" Dec 08 10:38:12 crc kubenswrapper[4776]: I1208 10:38:12.312031 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d5dnr_885bc336-6858-43f4-b63b-155ed1f06b60/extract-utilities/0.log" Dec 08 10:38:12 crc kubenswrapper[4776]: I1208 10:38:12.615513 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d5dnr_885bc336-6858-43f4-b63b-155ed1f06b60/extract-content/0.log" Dec 08 10:38:12 crc kubenswrapper[4776]: I1208 10:38:12.699545 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d5dnr_885bc336-6858-43f4-b63b-155ed1f06b60/extract-utilities/0.log" Dec 08 10:38:12 crc kubenswrapper[4776]: I1208 10:38:12.788205 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d5dnr_885bc336-6858-43f4-b63b-155ed1f06b60/registry-server/0.log" Dec 08 10:38:17 crc kubenswrapper[4776]: I1208 10:38:17.344557 4776 scope.go:117] "RemoveContainer" containerID="8c1c73971d87140e44cc9241fea6de13f5f6f0f3ea06fa74b8d3e01c0b4278b7" Dec 08 10:38:17 crc kubenswrapper[4776]: E1208 10:38:17.345133 
4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:38:23 crc kubenswrapper[4776]: I1208 10:38:23.682917 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-4r72v_3ddae09e-bcfe-4e98-bdd1-9ac94218a6d8/prometheus-operator/0.log" Dec 08 10:38:23 crc kubenswrapper[4776]: I1208 10:38:23.891344 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7994656576-d6jvv_ff9db296-6f02-44bf-810c-48cfb090036e/prometheus-operator-admission-webhook/0.log" Dec 08 10:38:23 crc kubenswrapper[4776]: I1208 10:38:23.920773 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7994656576-gzq75_968acbdd-ab1d-4aa4-9db9-654170c5fa2d/prometheus-operator-admission-webhook/0.log" Dec 08 10:38:24 crc kubenswrapper[4776]: I1208 10:38:24.085231 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-7d5fb4cbfb-w69rl_251557fb-f870-4b8c-8725-648a8cd97fca/observability-ui-dashboards/0.log" Dec 08 10:38:24 crc kubenswrapper[4776]: I1208 10:38:24.092832 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-rkf5k_9108512a-718d-41db-b414-02665870be6b/operator/0.log" Dec 08 10:38:24 crc kubenswrapper[4776]: I1208 10:38:24.249016 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-bc5qm_5691addb-538a-4212-bb5b-bf797ba7172c/perses-operator/0.log" Dec 08 10:38:28 crc kubenswrapper[4776]: I1208 10:38:28.344054 4776 scope.go:117] "RemoveContainer" containerID="8c1c73971d87140e44cc9241fea6de13f5f6f0f3ea06fa74b8d3e01c0b4278b7" Dec 08 10:38:28 crc kubenswrapper[4776]: E1208 10:38:28.345121 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:38:36 crc kubenswrapper[4776]: I1208 10:38:36.003693 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-6bfc99889d-tq44h_9ba0d9e5-f1ab-40a6-9490-57ce8566843a/kube-rbac-proxy/0.log" Dec 08 10:38:36 crc kubenswrapper[4776]: I1208 10:38:36.040057 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-6bfc99889d-tq44h_9ba0d9e5-f1ab-40a6-9490-57ce8566843a/manager/0.log" Dec 08 10:38:43 crc kubenswrapper[4776]: I1208 10:38:43.345631 4776 scope.go:117] "RemoveContainer" containerID="8c1c73971d87140e44cc9241fea6de13f5f6f0f3ea06fa74b8d3e01c0b4278b7" Dec 08 10:38:43 crc kubenswrapper[4776]: E1208 10:38:43.346468 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" 
Dec 08 10:38:58 crc kubenswrapper[4776]: I1208 10:38:58.343847 4776 scope.go:117] "RemoveContainer" containerID="8c1c73971d87140e44cc9241fea6de13f5f6f0f3ea06fa74b8d3e01c0b4278b7" Dec 08 10:38:58 crc kubenswrapper[4776]: E1208 10:38:58.345594 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:39:10 crc kubenswrapper[4776]: I1208 10:39:10.352022 4776 scope.go:117] "RemoveContainer" containerID="8c1c73971d87140e44cc9241fea6de13f5f6f0f3ea06fa74b8d3e01c0b4278b7" Dec 08 10:39:10 crc kubenswrapper[4776]: E1208 10:39:10.352874 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:39:22 crc kubenswrapper[4776]: I1208 10:39:22.344661 4776 scope.go:117] "RemoveContainer" containerID="8c1c73971d87140e44cc9241fea6de13f5f6f0f3ea06fa74b8d3e01c0b4278b7" Dec 08 10:39:22 crc kubenswrapper[4776]: E1208 10:39:22.345603 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" 
podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:39:37 crc kubenswrapper[4776]: I1208 10:39:37.343908 4776 scope.go:117] "RemoveContainer" containerID="8c1c73971d87140e44cc9241fea6de13f5f6f0f3ea06fa74b8d3e01c0b4278b7" Dec 08 10:39:37 crc kubenswrapper[4776]: E1208 10:39:37.345565 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:39:51 crc kubenswrapper[4776]: I1208 10:39:51.343977 4776 scope.go:117] "RemoveContainer" containerID="8c1c73971d87140e44cc9241fea6de13f5f6f0f3ea06fa74b8d3e01c0b4278b7" Dec 08 10:39:52 crc kubenswrapper[4776]: I1208 10:39:52.217683 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" event={"ID":"c9788ab1-1031-4103-a769-a4b3177c7268","Type":"ContainerStarted","Data":"204440af754cab96c5a4e55db9a243723d836f694200606bc30e8bb3cce0cb54"} Dec 08 10:40:20 crc kubenswrapper[4776]: I1208 10:40:20.591562 4776 scope.go:117] "RemoveContainer" containerID="48cd89e8bd75188efea96c620bb708bd06549285c9f5dd30eba16d2448de7387" Dec 08 10:40:25 crc kubenswrapper[4776]: I1208 10:40:25.570018 4776 generic.go:334] "Generic (PLEG): container finished" podID="3d1b6d17-c87a-4518-ae10-0fd52d9a854e" containerID="582b79ac9f273c883790a68c4f07210c8d49a6babb467e19e18665a96da4a485" exitCode=0 Dec 08 10:40:25 crc kubenswrapper[4776]: I1208 10:40:25.570104 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xxxxw/must-gather-jk9pj" event={"ID":"3d1b6d17-c87a-4518-ae10-0fd52d9a854e","Type":"ContainerDied","Data":"582b79ac9f273c883790a68c4f07210c8d49a6babb467e19e18665a96da4a485"} Dec 08 
10:40:25 crc kubenswrapper[4776]: I1208 10:40:25.571232 4776 scope.go:117] "RemoveContainer" containerID="582b79ac9f273c883790a68c4f07210c8d49a6babb467e19e18665a96da4a485" Dec 08 10:40:26 crc kubenswrapper[4776]: I1208 10:40:26.484459 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xxxxw_must-gather-jk9pj_3d1b6d17-c87a-4518-ae10-0fd52d9a854e/gather/0.log" Dec 08 10:40:34 crc kubenswrapper[4776]: I1208 10:40:34.409394 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-xxxxw/must-gather-jk9pj"] Dec 08 10:40:34 crc kubenswrapper[4776]: I1208 10:40:34.410248 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-xxxxw/must-gather-jk9pj" podUID="3d1b6d17-c87a-4518-ae10-0fd52d9a854e" containerName="copy" containerID="cri-o://8fa246deb9b833b81b62fc7546dc53e78644ee5e15d0c66b001103c9c32e939b" gracePeriod=2 Dec 08 10:40:34 crc kubenswrapper[4776]: I1208 10:40:34.421725 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-xxxxw/must-gather-jk9pj"] Dec 08 10:40:34 crc kubenswrapper[4776]: I1208 10:40:34.672229 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xxxxw_must-gather-jk9pj_3d1b6d17-c87a-4518-ae10-0fd52d9a854e/copy/0.log" Dec 08 10:40:34 crc kubenswrapper[4776]: I1208 10:40:34.673543 4776 generic.go:334] "Generic (PLEG): container finished" podID="3d1b6d17-c87a-4518-ae10-0fd52d9a854e" containerID="8fa246deb9b833b81b62fc7546dc53e78644ee5e15d0c66b001103c9c32e939b" exitCode=143 Dec 08 10:40:34 crc kubenswrapper[4776]: I1208 10:40:34.893911 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xxxxw_must-gather-jk9pj_3d1b6d17-c87a-4518-ae10-0fd52d9a854e/copy/0.log" Dec 08 10:40:34 crc kubenswrapper[4776]: I1208 10:40:34.894477 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xxxxw/must-gather-jk9pj" Dec 08 10:40:35 crc kubenswrapper[4776]: I1208 10:40:35.020100 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3d1b6d17-c87a-4518-ae10-0fd52d9a854e-must-gather-output\") pod \"3d1b6d17-c87a-4518-ae10-0fd52d9a854e\" (UID: \"3d1b6d17-c87a-4518-ae10-0fd52d9a854e\") " Dec 08 10:40:35 crc kubenswrapper[4776]: I1208 10:40:35.020237 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdt4r\" (UniqueName: \"kubernetes.io/projected/3d1b6d17-c87a-4518-ae10-0fd52d9a854e-kube-api-access-sdt4r\") pod \"3d1b6d17-c87a-4518-ae10-0fd52d9a854e\" (UID: \"3d1b6d17-c87a-4518-ae10-0fd52d9a854e\") " Dec 08 10:40:35 crc kubenswrapper[4776]: I1208 10:40:35.028726 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d1b6d17-c87a-4518-ae10-0fd52d9a854e-kube-api-access-sdt4r" (OuterVolumeSpecName: "kube-api-access-sdt4r") pod "3d1b6d17-c87a-4518-ae10-0fd52d9a854e" (UID: "3d1b6d17-c87a-4518-ae10-0fd52d9a854e"). InnerVolumeSpecName "kube-api-access-sdt4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 10:40:35 crc kubenswrapper[4776]: I1208 10:40:35.122970 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdt4r\" (UniqueName: \"kubernetes.io/projected/3d1b6d17-c87a-4518-ae10-0fd52d9a854e-kube-api-access-sdt4r\") on node \"crc\" DevicePath \"\"" Dec 08 10:40:35 crc kubenswrapper[4776]: I1208 10:40:35.202227 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d1b6d17-c87a-4518-ae10-0fd52d9a854e-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "3d1b6d17-c87a-4518-ae10-0fd52d9a854e" (UID: "3d1b6d17-c87a-4518-ae10-0fd52d9a854e"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 10:40:35 crc kubenswrapper[4776]: I1208 10:40:35.225547 4776 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3d1b6d17-c87a-4518-ae10-0fd52d9a854e-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 08 10:40:35 crc kubenswrapper[4776]: I1208 10:40:35.686732 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xxxxw_must-gather-jk9pj_3d1b6d17-c87a-4518-ae10-0fd52d9a854e/copy/0.log" Dec 08 10:40:35 crc kubenswrapper[4776]: I1208 10:40:35.687475 4776 scope.go:117] "RemoveContainer" containerID="8fa246deb9b833b81b62fc7546dc53e78644ee5e15d0c66b001103c9c32e939b" Dec 08 10:40:35 crc kubenswrapper[4776]: I1208 10:40:35.687536 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xxxxw/must-gather-jk9pj" Dec 08 10:40:35 crc kubenswrapper[4776]: I1208 10:40:35.713582 4776 scope.go:117] "RemoveContainer" containerID="582b79ac9f273c883790a68c4f07210c8d49a6babb467e19e18665a96da4a485" Dec 08 10:40:35 crc kubenswrapper[4776]: I1208 10:40:35.900659 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hsqns"] Dec 08 10:40:35 crc kubenswrapper[4776]: E1208 10:40:35.901390 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d1b6d17-c87a-4518-ae10-0fd52d9a854e" containerName="copy" Dec 08 10:40:35 crc kubenswrapper[4776]: I1208 10:40:35.901527 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d1b6d17-c87a-4518-ae10-0fd52d9a854e" containerName="copy" Dec 08 10:40:35 crc kubenswrapper[4776]: E1208 10:40:35.901615 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="636c3ebf-3c4e-47c3-b02c-663079e58d09" containerName="registry-server" Dec 08 10:40:35 crc kubenswrapper[4776]: I1208 10:40:35.901690 4776 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="636c3ebf-3c4e-47c3-b02c-663079e58d09" containerName="registry-server" Dec 08 10:40:35 crc kubenswrapper[4776]: E1208 10:40:35.901893 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d1b6d17-c87a-4518-ae10-0fd52d9a854e" containerName="gather" Dec 08 10:40:35 crc kubenswrapper[4776]: I1208 10:40:35.901974 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d1b6d17-c87a-4518-ae10-0fd52d9a854e" containerName="gather" Dec 08 10:40:35 crc kubenswrapper[4776]: E1208 10:40:35.902221 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="636c3ebf-3c4e-47c3-b02c-663079e58d09" containerName="extract-utilities" Dec 08 10:40:35 crc kubenswrapper[4776]: I1208 10:40:35.902295 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="636c3ebf-3c4e-47c3-b02c-663079e58d09" containerName="extract-utilities" Dec 08 10:40:35 crc kubenswrapper[4776]: E1208 10:40:35.902435 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="636c3ebf-3c4e-47c3-b02c-663079e58d09" containerName="extract-content" Dec 08 10:40:35 crc kubenswrapper[4776]: I1208 10:40:35.902497 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="636c3ebf-3c4e-47c3-b02c-663079e58d09" containerName="extract-content" Dec 08 10:40:35 crc kubenswrapper[4776]: I1208 10:40:35.902792 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d1b6d17-c87a-4518-ae10-0fd52d9a854e" containerName="copy" Dec 08 10:40:35 crc kubenswrapper[4776]: I1208 10:40:35.902883 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d1b6d17-c87a-4518-ae10-0fd52d9a854e" containerName="gather" Dec 08 10:40:35 crc kubenswrapper[4776]: I1208 10:40:35.902940 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="636c3ebf-3c4e-47c3-b02c-663079e58d09" containerName="registry-server" Dec 08 10:40:35 crc kubenswrapper[4776]: I1208 10:40:35.904729 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hsqns" Dec 08 10:40:35 crc kubenswrapper[4776]: I1208 10:40:35.913504 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hsqns"] Dec 08 10:40:36 crc kubenswrapper[4776]: I1208 10:40:36.046972 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43bd2c0a-e68c-44ad-abb8-44b8c2108896-utilities\") pod \"community-operators-hsqns\" (UID: \"43bd2c0a-e68c-44ad-abb8-44b8c2108896\") " pod="openshift-marketplace/community-operators-hsqns" Dec 08 10:40:36 crc kubenswrapper[4776]: I1208 10:40:36.047110 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpdhg\" (UniqueName: \"kubernetes.io/projected/43bd2c0a-e68c-44ad-abb8-44b8c2108896-kube-api-access-zpdhg\") pod \"community-operators-hsqns\" (UID: \"43bd2c0a-e68c-44ad-abb8-44b8c2108896\") " pod="openshift-marketplace/community-operators-hsqns" Dec 08 10:40:36 crc kubenswrapper[4776]: I1208 10:40:36.047151 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43bd2c0a-e68c-44ad-abb8-44b8c2108896-catalog-content\") pod \"community-operators-hsqns\" (UID: \"43bd2c0a-e68c-44ad-abb8-44b8c2108896\") " pod="openshift-marketplace/community-operators-hsqns" Dec 08 10:40:36 crc kubenswrapper[4776]: I1208 10:40:36.151959 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43bd2c0a-e68c-44ad-abb8-44b8c2108896-utilities\") pod \"community-operators-hsqns\" (UID: \"43bd2c0a-e68c-44ad-abb8-44b8c2108896\") " pod="openshift-marketplace/community-operators-hsqns" Dec 08 10:40:36 crc kubenswrapper[4776]: I1208 10:40:36.152075 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zpdhg\" (UniqueName: \"kubernetes.io/projected/43bd2c0a-e68c-44ad-abb8-44b8c2108896-kube-api-access-zpdhg\") pod \"community-operators-hsqns\" (UID: \"43bd2c0a-e68c-44ad-abb8-44b8c2108896\") " pod="openshift-marketplace/community-operators-hsqns" Dec 08 10:40:36 crc kubenswrapper[4776]: I1208 10:40:36.152106 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43bd2c0a-e68c-44ad-abb8-44b8c2108896-catalog-content\") pod \"community-operators-hsqns\" (UID: \"43bd2c0a-e68c-44ad-abb8-44b8c2108896\") " pod="openshift-marketplace/community-operators-hsqns" Dec 08 10:40:36 crc kubenswrapper[4776]: I1208 10:40:36.152806 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43bd2c0a-e68c-44ad-abb8-44b8c2108896-catalog-content\") pod \"community-operators-hsqns\" (UID: \"43bd2c0a-e68c-44ad-abb8-44b8c2108896\") " pod="openshift-marketplace/community-operators-hsqns" Dec 08 10:40:36 crc kubenswrapper[4776]: I1208 10:40:36.153033 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43bd2c0a-e68c-44ad-abb8-44b8c2108896-utilities\") pod \"community-operators-hsqns\" (UID: \"43bd2c0a-e68c-44ad-abb8-44b8c2108896\") " pod="openshift-marketplace/community-operators-hsqns" Dec 08 10:40:36 crc kubenswrapper[4776]: I1208 10:40:36.174452 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpdhg\" (UniqueName: \"kubernetes.io/projected/43bd2c0a-e68c-44ad-abb8-44b8c2108896-kube-api-access-zpdhg\") pod \"community-operators-hsqns\" (UID: \"43bd2c0a-e68c-44ad-abb8-44b8c2108896\") " pod="openshift-marketplace/community-operators-hsqns" Dec 08 10:40:36 crc kubenswrapper[4776]: I1208 10:40:36.230912 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hsqns" Dec 08 10:40:36 crc kubenswrapper[4776]: I1208 10:40:36.362823 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d1b6d17-c87a-4518-ae10-0fd52d9a854e" path="/var/lib/kubelet/pods/3d1b6d17-c87a-4518-ae10-0fd52d9a854e/volumes" Dec 08 10:40:36 crc kubenswrapper[4776]: I1208 10:40:36.716045 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hsqns"] Dec 08 10:40:37 crc kubenswrapper[4776]: I1208 10:40:37.713607 4776 generic.go:334] "Generic (PLEG): container finished" podID="43bd2c0a-e68c-44ad-abb8-44b8c2108896" containerID="d444660e21f11bbd00691dee4bc340e5db5d8829b4fbef2bfbc66021f881baf9" exitCode=0 Dec 08 10:40:37 crc kubenswrapper[4776]: I1208 10:40:37.713685 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hsqns" event={"ID":"43bd2c0a-e68c-44ad-abb8-44b8c2108896","Type":"ContainerDied","Data":"d444660e21f11bbd00691dee4bc340e5db5d8829b4fbef2bfbc66021f881baf9"} Dec 08 10:40:37 crc kubenswrapper[4776]: I1208 10:40:37.713934 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hsqns" event={"ID":"43bd2c0a-e68c-44ad-abb8-44b8c2108896","Type":"ContainerStarted","Data":"d50d5aa6ec2777c99b142c1bc417cb2eb92c940f9af77666756a0826a0db914d"} Dec 08 10:40:39 crc kubenswrapper[4776]: I1208 10:40:39.748239 4776 generic.go:334] "Generic (PLEG): container finished" podID="43bd2c0a-e68c-44ad-abb8-44b8c2108896" containerID="2d5a1399402ae66a7ee66a5137d9ed09effcf3c2058df86a99b73495197f4ec9" exitCode=0 Dec 08 10:40:39 crc kubenswrapper[4776]: I1208 10:40:39.748300 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hsqns" event={"ID":"43bd2c0a-e68c-44ad-abb8-44b8c2108896","Type":"ContainerDied","Data":"2d5a1399402ae66a7ee66a5137d9ed09effcf3c2058df86a99b73495197f4ec9"} Dec 08 10:40:40 crc 
kubenswrapper[4776]: I1208 10:40:40.760070 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hsqns" event={"ID":"43bd2c0a-e68c-44ad-abb8-44b8c2108896","Type":"ContainerStarted","Data":"5c2040eb98f258ac73fb95751d4fc21d66bf1742915bec77c50a6bef7d3a520a"} Dec 08 10:40:40 crc kubenswrapper[4776]: I1208 10:40:40.786726 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hsqns" podStartSLOduration=3.349689342 podStartE2EDuration="5.786705236s" podCreationTimestamp="2025-12-08 10:40:35 +0000 UTC" firstStartedPulling="2025-12-08 10:40:37.717201843 +0000 UTC m=+6113.980426865" lastFinishedPulling="2025-12-08 10:40:40.154217737 +0000 UTC m=+6116.417442759" observedRunningTime="2025-12-08 10:40:40.778679508 +0000 UTC m=+6117.041904530" watchObservedRunningTime="2025-12-08 10:40:40.786705236 +0000 UTC m=+6117.049930258" Dec 08 10:40:46 crc kubenswrapper[4776]: I1208 10:40:46.231516 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hsqns" Dec 08 10:40:46 crc kubenswrapper[4776]: I1208 10:40:46.232085 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hsqns" Dec 08 10:40:46 crc kubenswrapper[4776]: I1208 10:40:46.283901 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hsqns" Dec 08 10:40:46 crc kubenswrapper[4776]: I1208 10:40:46.873608 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hsqns" Dec 08 10:40:46 crc kubenswrapper[4776]: I1208 10:40:46.921940 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hsqns"] Dec 08 10:40:48 crc kubenswrapper[4776]: I1208 10:40:48.835520 4776 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-marketplace/community-operators-hsqns" podUID="43bd2c0a-e68c-44ad-abb8-44b8c2108896" containerName="registry-server" containerID="cri-o://5c2040eb98f258ac73fb95751d4fc21d66bf1742915bec77c50a6bef7d3a520a" gracePeriod=2 Dec 08 10:40:49 crc kubenswrapper[4776]: I1208 10:40:49.337534 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hsqns" Dec 08 10:40:49 crc kubenswrapper[4776]: I1208 10:40:49.467163 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43bd2c0a-e68c-44ad-abb8-44b8c2108896-catalog-content\") pod \"43bd2c0a-e68c-44ad-abb8-44b8c2108896\" (UID: \"43bd2c0a-e68c-44ad-abb8-44b8c2108896\") " Dec 08 10:40:49 crc kubenswrapper[4776]: I1208 10:40:49.467276 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43bd2c0a-e68c-44ad-abb8-44b8c2108896-utilities\") pod \"43bd2c0a-e68c-44ad-abb8-44b8c2108896\" (UID: \"43bd2c0a-e68c-44ad-abb8-44b8c2108896\") " Dec 08 10:40:49 crc kubenswrapper[4776]: I1208 10:40:49.467458 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpdhg\" (UniqueName: \"kubernetes.io/projected/43bd2c0a-e68c-44ad-abb8-44b8c2108896-kube-api-access-zpdhg\") pod \"43bd2c0a-e68c-44ad-abb8-44b8c2108896\" (UID: \"43bd2c0a-e68c-44ad-abb8-44b8c2108896\") " Dec 08 10:40:49 crc kubenswrapper[4776]: I1208 10:40:49.468319 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43bd2c0a-e68c-44ad-abb8-44b8c2108896-utilities" (OuterVolumeSpecName: "utilities") pod "43bd2c0a-e68c-44ad-abb8-44b8c2108896" (UID: "43bd2c0a-e68c-44ad-abb8-44b8c2108896"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 10:40:49 crc kubenswrapper[4776]: I1208 10:40:49.478072 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43bd2c0a-e68c-44ad-abb8-44b8c2108896-kube-api-access-zpdhg" (OuterVolumeSpecName: "kube-api-access-zpdhg") pod "43bd2c0a-e68c-44ad-abb8-44b8c2108896" (UID: "43bd2c0a-e68c-44ad-abb8-44b8c2108896"). InnerVolumeSpecName "kube-api-access-zpdhg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 10:40:49 crc kubenswrapper[4776]: I1208 10:40:49.525421 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43bd2c0a-e68c-44ad-abb8-44b8c2108896-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "43bd2c0a-e68c-44ad-abb8-44b8c2108896" (UID: "43bd2c0a-e68c-44ad-abb8-44b8c2108896"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 10:40:49 crc kubenswrapper[4776]: I1208 10:40:49.571118 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpdhg\" (UniqueName: \"kubernetes.io/projected/43bd2c0a-e68c-44ad-abb8-44b8c2108896-kube-api-access-zpdhg\") on node \"crc\" DevicePath \"\"" Dec 08 10:40:49 crc kubenswrapper[4776]: I1208 10:40:49.571165 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43bd2c0a-e68c-44ad-abb8-44b8c2108896-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 10:40:49 crc kubenswrapper[4776]: I1208 10:40:49.571194 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43bd2c0a-e68c-44ad-abb8-44b8c2108896-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 10:40:49 crc kubenswrapper[4776]: I1208 10:40:49.847449 4776 generic.go:334] "Generic (PLEG): container finished" podID="43bd2c0a-e68c-44ad-abb8-44b8c2108896" 
containerID="5c2040eb98f258ac73fb95751d4fc21d66bf1742915bec77c50a6bef7d3a520a" exitCode=0 Dec 08 10:40:49 crc kubenswrapper[4776]: I1208 10:40:49.847499 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hsqns" event={"ID":"43bd2c0a-e68c-44ad-abb8-44b8c2108896","Type":"ContainerDied","Data":"5c2040eb98f258ac73fb95751d4fc21d66bf1742915bec77c50a6bef7d3a520a"} Dec 08 10:40:49 crc kubenswrapper[4776]: I1208 10:40:49.847507 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hsqns" Dec 08 10:40:49 crc kubenswrapper[4776]: I1208 10:40:49.847540 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hsqns" event={"ID":"43bd2c0a-e68c-44ad-abb8-44b8c2108896","Type":"ContainerDied","Data":"d50d5aa6ec2777c99b142c1bc417cb2eb92c940f9af77666756a0826a0db914d"} Dec 08 10:40:49 crc kubenswrapper[4776]: I1208 10:40:49.847577 4776 scope.go:117] "RemoveContainer" containerID="5c2040eb98f258ac73fb95751d4fc21d66bf1742915bec77c50a6bef7d3a520a" Dec 08 10:40:49 crc kubenswrapper[4776]: I1208 10:40:49.870520 4776 scope.go:117] "RemoveContainer" containerID="2d5a1399402ae66a7ee66a5137d9ed09effcf3c2058df86a99b73495197f4ec9" Dec 08 10:40:49 crc kubenswrapper[4776]: I1208 10:40:49.879979 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hsqns"] Dec 08 10:40:49 crc kubenswrapper[4776]: I1208 10:40:49.891876 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hsqns"] Dec 08 10:40:49 crc kubenswrapper[4776]: I1208 10:40:49.905570 4776 scope.go:117] "RemoveContainer" containerID="d444660e21f11bbd00691dee4bc340e5db5d8829b4fbef2bfbc66021f881baf9" Dec 08 10:40:49 crc kubenswrapper[4776]: I1208 10:40:49.958891 4776 scope.go:117] "RemoveContainer" containerID="5c2040eb98f258ac73fb95751d4fc21d66bf1742915bec77c50a6bef7d3a520a" Dec 08 
10:40:49 crc kubenswrapper[4776]: E1208 10:40:49.959386 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c2040eb98f258ac73fb95751d4fc21d66bf1742915bec77c50a6bef7d3a520a\": container with ID starting with 5c2040eb98f258ac73fb95751d4fc21d66bf1742915bec77c50a6bef7d3a520a not found: ID does not exist" containerID="5c2040eb98f258ac73fb95751d4fc21d66bf1742915bec77c50a6bef7d3a520a" Dec 08 10:40:49 crc kubenswrapper[4776]: I1208 10:40:49.959446 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c2040eb98f258ac73fb95751d4fc21d66bf1742915bec77c50a6bef7d3a520a"} err="failed to get container status \"5c2040eb98f258ac73fb95751d4fc21d66bf1742915bec77c50a6bef7d3a520a\": rpc error: code = NotFound desc = could not find container \"5c2040eb98f258ac73fb95751d4fc21d66bf1742915bec77c50a6bef7d3a520a\": container with ID starting with 5c2040eb98f258ac73fb95751d4fc21d66bf1742915bec77c50a6bef7d3a520a not found: ID does not exist" Dec 08 10:40:49 crc kubenswrapper[4776]: I1208 10:40:49.959478 4776 scope.go:117] "RemoveContainer" containerID="2d5a1399402ae66a7ee66a5137d9ed09effcf3c2058df86a99b73495197f4ec9" Dec 08 10:40:49 crc kubenswrapper[4776]: E1208 10:40:49.959982 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d5a1399402ae66a7ee66a5137d9ed09effcf3c2058df86a99b73495197f4ec9\": container with ID starting with 2d5a1399402ae66a7ee66a5137d9ed09effcf3c2058df86a99b73495197f4ec9 not found: ID does not exist" containerID="2d5a1399402ae66a7ee66a5137d9ed09effcf3c2058df86a99b73495197f4ec9" Dec 08 10:40:49 crc kubenswrapper[4776]: I1208 10:40:49.960015 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d5a1399402ae66a7ee66a5137d9ed09effcf3c2058df86a99b73495197f4ec9"} err="failed to get container status 
\"2d5a1399402ae66a7ee66a5137d9ed09effcf3c2058df86a99b73495197f4ec9\": rpc error: code = NotFound desc = could not find container \"2d5a1399402ae66a7ee66a5137d9ed09effcf3c2058df86a99b73495197f4ec9\": container with ID starting with 2d5a1399402ae66a7ee66a5137d9ed09effcf3c2058df86a99b73495197f4ec9 not found: ID does not exist" Dec 08 10:40:49 crc kubenswrapper[4776]: I1208 10:40:49.960039 4776 scope.go:117] "RemoveContainer" containerID="d444660e21f11bbd00691dee4bc340e5db5d8829b4fbef2bfbc66021f881baf9" Dec 08 10:40:49 crc kubenswrapper[4776]: E1208 10:40:49.960372 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d444660e21f11bbd00691dee4bc340e5db5d8829b4fbef2bfbc66021f881baf9\": container with ID starting with d444660e21f11bbd00691dee4bc340e5db5d8829b4fbef2bfbc66021f881baf9 not found: ID does not exist" containerID="d444660e21f11bbd00691dee4bc340e5db5d8829b4fbef2bfbc66021f881baf9" Dec 08 10:40:49 crc kubenswrapper[4776]: I1208 10:40:49.960393 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d444660e21f11bbd00691dee4bc340e5db5d8829b4fbef2bfbc66021f881baf9"} err="failed to get container status \"d444660e21f11bbd00691dee4bc340e5db5d8829b4fbef2bfbc66021f881baf9\": rpc error: code = NotFound desc = could not find container \"d444660e21f11bbd00691dee4bc340e5db5d8829b4fbef2bfbc66021f881baf9\": container with ID starting with d444660e21f11bbd00691dee4bc340e5db5d8829b4fbef2bfbc66021f881baf9 not found: ID does not exist" Dec 08 10:40:50 crc kubenswrapper[4776]: I1208 10:40:50.356847 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43bd2c0a-e68c-44ad-abb8-44b8c2108896" path="/var/lib/kubelet/pods/43bd2c0a-e68c-44ad-abb8-44b8c2108896/volumes" Dec 08 10:40:52 crc kubenswrapper[4776]: I1208 10:40:52.466486 4776 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler 
namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 08 10:40:52 crc kubenswrapper[4776]: I1208 10:40:52.466873 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 08 10:41:07 crc kubenswrapper[4776]: I1208 10:41:07.190360 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kq45h"] Dec 08 10:41:07 crc kubenswrapper[4776]: E1208 10:41:07.191510 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43bd2c0a-e68c-44ad-abb8-44b8c2108896" containerName="extract-content" Dec 08 10:41:07 crc kubenswrapper[4776]: I1208 10:41:07.191528 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="43bd2c0a-e68c-44ad-abb8-44b8c2108896" containerName="extract-content" Dec 08 10:41:07 crc kubenswrapper[4776]: E1208 10:41:07.191591 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43bd2c0a-e68c-44ad-abb8-44b8c2108896" containerName="registry-server" Dec 08 10:41:07 crc kubenswrapper[4776]: I1208 10:41:07.191600 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="43bd2c0a-e68c-44ad-abb8-44b8c2108896" containerName="registry-server" Dec 08 10:41:07 crc kubenswrapper[4776]: E1208 10:41:07.191622 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43bd2c0a-e68c-44ad-abb8-44b8c2108896" containerName="extract-utilities" Dec 08 10:41:07 crc kubenswrapper[4776]: I1208 10:41:07.191631 4776 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="43bd2c0a-e68c-44ad-abb8-44b8c2108896" containerName="extract-utilities" Dec 08 10:41:07 crc kubenswrapper[4776]: I1208 10:41:07.191893 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="43bd2c0a-e68c-44ad-abb8-44b8c2108896" containerName="registry-server" Dec 08 10:41:07 crc kubenswrapper[4776]: I1208 10:41:07.193952 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kq45h" Dec 08 10:41:07 crc kubenswrapper[4776]: I1208 10:41:07.206529 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kq45h"] Dec 08 10:41:07 crc kubenswrapper[4776]: I1208 10:41:07.291575 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2dd6c28-86de-46c3-8d21-6b395e730e60-catalog-content\") pod \"redhat-operators-kq45h\" (UID: \"e2dd6c28-86de-46c3-8d21-6b395e730e60\") " pod="openshift-marketplace/redhat-operators-kq45h" Dec 08 10:41:07 crc kubenswrapper[4776]: I1208 10:41:07.291640 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g89db\" (UniqueName: \"kubernetes.io/projected/e2dd6c28-86de-46c3-8d21-6b395e730e60-kube-api-access-g89db\") pod \"redhat-operators-kq45h\" (UID: \"e2dd6c28-86de-46c3-8d21-6b395e730e60\") " pod="openshift-marketplace/redhat-operators-kq45h" Dec 08 10:41:07 crc kubenswrapper[4776]: I1208 10:41:07.291673 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2dd6c28-86de-46c3-8d21-6b395e730e60-utilities\") pod \"redhat-operators-kq45h\" (UID: \"e2dd6c28-86de-46c3-8d21-6b395e730e60\") " pod="openshift-marketplace/redhat-operators-kq45h" Dec 08 10:41:07 crc kubenswrapper[4776]: I1208 10:41:07.394241 4776 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2dd6c28-86de-46c3-8d21-6b395e730e60-catalog-content\") pod \"redhat-operators-kq45h\" (UID: \"e2dd6c28-86de-46c3-8d21-6b395e730e60\") " pod="openshift-marketplace/redhat-operators-kq45h" Dec 08 10:41:07 crc kubenswrapper[4776]: I1208 10:41:07.394307 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g89db\" (UniqueName: \"kubernetes.io/projected/e2dd6c28-86de-46c3-8d21-6b395e730e60-kube-api-access-g89db\") pod \"redhat-operators-kq45h\" (UID: \"e2dd6c28-86de-46c3-8d21-6b395e730e60\") " pod="openshift-marketplace/redhat-operators-kq45h" Dec 08 10:41:07 crc kubenswrapper[4776]: I1208 10:41:07.394335 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2dd6c28-86de-46c3-8d21-6b395e730e60-utilities\") pod \"redhat-operators-kq45h\" (UID: \"e2dd6c28-86de-46c3-8d21-6b395e730e60\") " pod="openshift-marketplace/redhat-operators-kq45h" Dec 08 10:41:07 crc kubenswrapper[4776]: I1208 10:41:07.394770 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2dd6c28-86de-46c3-8d21-6b395e730e60-catalog-content\") pod \"redhat-operators-kq45h\" (UID: \"e2dd6c28-86de-46c3-8d21-6b395e730e60\") " pod="openshift-marketplace/redhat-operators-kq45h" Dec 08 10:41:07 crc kubenswrapper[4776]: I1208 10:41:07.394878 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2dd6c28-86de-46c3-8d21-6b395e730e60-utilities\") pod \"redhat-operators-kq45h\" (UID: \"e2dd6c28-86de-46c3-8d21-6b395e730e60\") " pod="openshift-marketplace/redhat-operators-kq45h" Dec 08 10:41:07 crc kubenswrapper[4776]: I1208 10:41:07.415187 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g89db\" (UniqueName: 
\"kubernetes.io/projected/e2dd6c28-86de-46c3-8d21-6b395e730e60-kube-api-access-g89db\") pod \"redhat-operators-kq45h\" (UID: \"e2dd6c28-86de-46c3-8d21-6b395e730e60\") " pod="openshift-marketplace/redhat-operators-kq45h" Dec 08 10:41:07 crc kubenswrapper[4776]: I1208 10:41:07.523141 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kq45h" Dec 08 10:41:07 crc kubenswrapper[4776]: I1208 10:41:07.986344 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kq45h"] Dec 08 10:41:08 crc kubenswrapper[4776]: I1208 10:41:08.047398 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kq45h" event={"ID":"e2dd6c28-86de-46c3-8d21-6b395e730e60","Type":"ContainerStarted","Data":"77697db823ee0a3f0a81ed90f4f1f101afbd8c41f8b97ad902a8cb8d104dddb6"} Dec 08 10:41:09 crc kubenswrapper[4776]: I1208 10:41:09.061426 4776 generic.go:334] "Generic (PLEG): container finished" podID="e2dd6c28-86de-46c3-8d21-6b395e730e60" containerID="7d2774bf0fb9679a0bf3f42524563e3fe9a9e9b2e1812550a934f97441e31a1f" exitCode=0 Dec 08 10:41:09 crc kubenswrapper[4776]: I1208 10:41:09.061490 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kq45h" event={"ID":"e2dd6c28-86de-46c3-8d21-6b395e730e60","Type":"ContainerDied","Data":"7d2774bf0fb9679a0bf3f42524563e3fe9a9e9b2e1812550a934f97441e31a1f"} Dec 08 10:41:09 crc kubenswrapper[4776]: I1208 10:41:09.064253 4776 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 08 10:41:10 crc kubenswrapper[4776]: I1208 10:41:10.075010 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kq45h" event={"ID":"e2dd6c28-86de-46c3-8d21-6b395e730e60","Type":"ContainerStarted","Data":"59a4dfda7e1f399495dc62c3533acbaa9faf3b6c1621083488be25270c2913b1"} Dec 08 10:41:14 crc 
kubenswrapper[4776]: I1208 10:41:14.137140 4776 generic.go:334] "Generic (PLEG): container finished" podID="e2dd6c28-86de-46c3-8d21-6b395e730e60" containerID="59a4dfda7e1f399495dc62c3533acbaa9faf3b6c1621083488be25270c2913b1" exitCode=0 Dec 08 10:41:14 crc kubenswrapper[4776]: I1208 10:41:14.137438 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kq45h" event={"ID":"e2dd6c28-86de-46c3-8d21-6b395e730e60","Type":"ContainerDied","Data":"59a4dfda7e1f399495dc62c3533acbaa9faf3b6c1621083488be25270c2913b1"} Dec 08 10:41:16 crc kubenswrapper[4776]: I1208 10:41:16.171534 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kq45h" event={"ID":"e2dd6c28-86de-46c3-8d21-6b395e730e60","Type":"ContainerStarted","Data":"d3cfb0f29b33f337efdcd3402628bfbb16d5c36a9eb685d56e6731a4cc969786"} Dec 08 10:41:16 crc kubenswrapper[4776]: I1208 10:41:16.203524 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kq45h" podStartSLOduration=2.974652134 podStartE2EDuration="9.203497319s" podCreationTimestamp="2025-12-08 10:41:07 +0000 UTC" firstStartedPulling="2025-12-08 10:41:09.06395027 +0000 UTC m=+6145.327175292" lastFinishedPulling="2025-12-08 10:41:15.292795455 +0000 UTC m=+6151.556020477" observedRunningTime="2025-12-08 10:41:16.192323094 +0000 UTC m=+6152.455548156" watchObservedRunningTime="2025-12-08 10:41:16.203497319 +0000 UTC m=+6152.466722341" Dec 08 10:41:17 crc kubenswrapper[4776]: I1208 10:41:17.523572 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kq45h" Dec 08 10:41:17 crc kubenswrapper[4776]: I1208 10:41:17.523881 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kq45h" Dec 08 10:41:18 crc kubenswrapper[4776]: I1208 10:41:18.574414 4776 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-kq45h" podUID="e2dd6c28-86de-46c3-8d21-6b395e730e60" containerName="registry-server" probeResult="failure" output=< Dec 08 10:41:18 crc kubenswrapper[4776]: timeout: failed to connect service ":50051" within 1s Dec 08 10:41:18 crc kubenswrapper[4776]: > Dec 08 10:41:20 crc kubenswrapper[4776]: I1208 10:41:20.675155 4776 scope.go:117] "RemoveContainer" containerID="d2eff13474c0fdf262fd4d564cf6f37cd07e6b2f5bce2184a44f615010e88e0f" Dec 08 10:41:27 crc kubenswrapper[4776]: I1208 10:41:27.575675 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kq45h" Dec 08 10:41:27 crc kubenswrapper[4776]: I1208 10:41:27.644219 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kq45h" Dec 08 10:41:27 crc kubenswrapper[4776]: I1208 10:41:27.815986 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kq45h"] Dec 08 10:41:29 crc kubenswrapper[4776]: I1208 10:41:29.315434 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kq45h" podUID="e2dd6c28-86de-46c3-8d21-6b395e730e60" containerName="registry-server" containerID="cri-o://d3cfb0f29b33f337efdcd3402628bfbb16d5c36a9eb685d56e6731a4cc969786" gracePeriod=2 Dec 08 10:41:29 crc kubenswrapper[4776]: I1208 10:41:29.808688 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kq45h" Dec 08 10:41:29 crc kubenswrapper[4776]: I1208 10:41:29.947973 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g89db\" (UniqueName: \"kubernetes.io/projected/e2dd6c28-86de-46c3-8d21-6b395e730e60-kube-api-access-g89db\") pod \"e2dd6c28-86de-46c3-8d21-6b395e730e60\" (UID: \"e2dd6c28-86de-46c3-8d21-6b395e730e60\") " Dec 08 10:41:29 crc kubenswrapper[4776]: I1208 10:41:29.948752 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2dd6c28-86de-46c3-8d21-6b395e730e60-catalog-content\") pod \"e2dd6c28-86de-46c3-8d21-6b395e730e60\" (UID: \"e2dd6c28-86de-46c3-8d21-6b395e730e60\") " Dec 08 10:41:29 crc kubenswrapper[4776]: I1208 10:41:29.948945 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2dd6c28-86de-46c3-8d21-6b395e730e60-utilities\") pod \"e2dd6c28-86de-46c3-8d21-6b395e730e60\" (UID: \"e2dd6c28-86de-46c3-8d21-6b395e730e60\") " Dec 08 10:41:29 crc kubenswrapper[4776]: I1208 10:41:29.949576 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2dd6c28-86de-46c3-8d21-6b395e730e60-utilities" (OuterVolumeSpecName: "utilities") pod "e2dd6c28-86de-46c3-8d21-6b395e730e60" (UID: "e2dd6c28-86de-46c3-8d21-6b395e730e60"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 10:41:29 crc kubenswrapper[4776]: I1208 10:41:29.949761 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2dd6c28-86de-46c3-8d21-6b395e730e60-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 10:41:29 crc kubenswrapper[4776]: I1208 10:41:29.954447 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2dd6c28-86de-46c3-8d21-6b395e730e60-kube-api-access-g89db" (OuterVolumeSpecName: "kube-api-access-g89db") pod "e2dd6c28-86de-46c3-8d21-6b395e730e60" (UID: "e2dd6c28-86de-46c3-8d21-6b395e730e60"). InnerVolumeSpecName "kube-api-access-g89db". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 10:41:30 crc kubenswrapper[4776]: I1208 10:41:30.051688 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g89db\" (UniqueName: \"kubernetes.io/projected/e2dd6c28-86de-46c3-8d21-6b395e730e60-kube-api-access-g89db\") on node \"crc\" DevicePath \"\"" Dec 08 10:41:30 crc kubenswrapper[4776]: I1208 10:41:30.052981 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2dd6c28-86de-46c3-8d21-6b395e730e60-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e2dd6c28-86de-46c3-8d21-6b395e730e60" (UID: "e2dd6c28-86de-46c3-8d21-6b395e730e60"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 10:41:30 crc kubenswrapper[4776]: I1208 10:41:30.153899 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2dd6c28-86de-46c3-8d21-6b395e730e60-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 10:41:30 crc kubenswrapper[4776]: I1208 10:41:30.325848 4776 generic.go:334] "Generic (PLEG): container finished" podID="e2dd6c28-86de-46c3-8d21-6b395e730e60" containerID="d3cfb0f29b33f337efdcd3402628bfbb16d5c36a9eb685d56e6731a4cc969786" exitCode=0 Dec 08 10:41:30 crc kubenswrapper[4776]: I1208 10:41:30.325890 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kq45h" event={"ID":"e2dd6c28-86de-46c3-8d21-6b395e730e60","Type":"ContainerDied","Data":"d3cfb0f29b33f337efdcd3402628bfbb16d5c36a9eb685d56e6731a4cc969786"} Dec 08 10:41:30 crc kubenswrapper[4776]: I1208 10:41:30.325916 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kq45h" event={"ID":"e2dd6c28-86de-46c3-8d21-6b395e730e60","Type":"ContainerDied","Data":"77697db823ee0a3f0a81ed90f4f1f101afbd8c41f8b97ad902a8cb8d104dddb6"} Dec 08 10:41:30 crc kubenswrapper[4776]: I1208 10:41:30.325918 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kq45h" Dec 08 10:41:30 crc kubenswrapper[4776]: I1208 10:41:30.325939 4776 scope.go:117] "RemoveContainer" containerID="d3cfb0f29b33f337efdcd3402628bfbb16d5c36a9eb685d56e6731a4cc969786" Dec 08 10:41:30 crc kubenswrapper[4776]: I1208 10:41:30.346641 4776 scope.go:117] "RemoveContainer" containerID="59a4dfda7e1f399495dc62c3533acbaa9faf3b6c1621083488be25270c2913b1" Dec 08 10:41:30 crc kubenswrapper[4776]: I1208 10:41:30.371414 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kq45h"] Dec 08 10:41:30 crc kubenswrapper[4776]: I1208 10:41:30.379643 4776 scope.go:117] "RemoveContainer" containerID="7d2774bf0fb9679a0bf3f42524563e3fe9a9e9b2e1812550a934f97441e31a1f" Dec 08 10:41:30 crc kubenswrapper[4776]: I1208 10:41:30.387564 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kq45h"] Dec 08 10:41:30 crc kubenswrapper[4776]: I1208 10:41:30.437937 4776 scope.go:117] "RemoveContainer" containerID="d3cfb0f29b33f337efdcd3402628bfbb16d5c36a9eb685d56e6731a4cc969786" Dec 08 10:41:30 crc kubenswrapper[4776]: E1208 10:41:30.438489 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3cfb0f29b33f337efdcd3402628bfbb16d5c36a9eb685d56e6731a4cc969786\": container with ID starting with d3cfb0f29b33f337efdcd3402628bfbb16d5c36a9eb685d56e6731a4cc969786 not found: ID does not exist" containerID="d3cfb0f29b33f337efdcd3402628bfbb16d5c36a9eb685d56e6731a4cc969786" Dec 08 10:41:30 crc kubenswrapper[4776]: I1208 10:41:30.438545 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3cfb0f29b33f337efdcd3402628bfbb16d5c36a9eb685d56e6731a4cc969786"} err="failed to get container status \"d3cfb0f29b33f337efdcd3402628bfbb16d5c36a9eb685d56e6731a4cc969786\": rpc error: code = NotFound desc = could not find container 
\"d3cfb0f29b33f337efdcd3402628bfbb16d5c36a9eb685d56e6731a4cc969786\": container with ID starting with d3cfb0f29b33f337efdcd3402628bfbb16d5c36a9eb685d56e6731a4cc969786 not found: ID does not exist" Dec 08 10:41:30 crc kubenswrapper[4776]: I1208 10:41:30.438577 4776 scope.go:117] "RemoveContainer" containerID="59a4dfda7e1f399495dc62c3533acbaa9faf3b6c1621083488be25270c2913b1" Dec 08 10:41:30 crc kubenswrapper[4776]: E1208 10:41:30.438862 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59a4dfda7e1f399495dc62c3533acbaa9faf3b6c1621083488be25270c2913b1\": container with ID starting with 59a4dfda7e1f399495dc62c3533acbaa9faf3b6c1621083488be25270c2913b1 not found: ID does not exist" containerID="59a4dfda7e1f399495dc62c3533acbaa9faf3b6c1621083488be25270c2913b1" Dec 08 10:41:30 crc kubenswrapper[4776]: I1208 10:41:30.438902 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59a4dfda7e1f399495dc62c3533acbaa9faf3b6c1621083488be25270c2913b1"} err="failed to get container status \"59a4dfda7e1f399495dc62c3533acbaa9faf3b6c1621083488be25270c2913b1\": rpc error: code = NotFound desc = could not find container \"59a4dfda7e1f399495dc62c3533acbaa9faf3b6c1621083488be25270c2913b1\": container with ID starting with 59a4dfda7e1f399495dc62c3533acbaa9faf3b6c1621083488be25270c2913b1 not found: ID does not exist" Dec 08 10:41:30 crc kubenswrapper[4776]: I1208 10:41:30.438924 4776 scope.go:117] "RemoveContainer" containerID="7d2774bf0fb9679a0bf3f42524563e3fe9a9e9b2e1812550a934f97441e31a1f" Dec 08 10:41:30 crc kubenswrapper[4776]: E1208 10:41:30.439474 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d2774bf0fb9679a0bf3f42524563e3fe9a9e9b2e1812550a934f97441e31a1f\": container with ID starting with 7d2774bf0fb9679a0bf3f42524563e3fe9a9e9b2e1812550a934f97441e31a1f not found: ID does not exist" 
containerID="7d2774bf0fb9679a0bf3f42524563e3fe9a9e9b2e1812550a934f97441e31a1f" Dec 08 10:41:30 crc kubenswrapper[4776]: I1208 10:41:30.439516 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d2774bf0fb9679a0bf3f42524563e3fe9a9e9b2e1812550a934f97441e31a1f"} err="failed to get container status \"7d2774bf0fb9679a0bf3f42524563e3fe9a9e9b2e1812550a934f97441e31a1f\": rpc error: code = NotFound desc = could not find container \"7d2774bf0fb9679a0bf3f42524563e3fe9a9e9b2e1812550a934f97441e31a1f\": container with ID starting with 7d2774bf0fb9679a0bf3f42524563e3fe9a9e9b2e1812550a934f97441e31a1f not found: ID does not exist" Dec 08 10:41:30 crc kubenswrapper[4776]: E1208 10:41:30.500856 4776 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2dd6c28_86de_46c3_8d21_6b395e730e60.slice/crio-77697db823ee0a3f0a81ed90f4f1f101afbd8c41f8b97ad902a8cb8d104dddb6\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2dd6c28_86de_46c3_8d21_6b395e730e60.slice\": RecentStats: unable to find data in memory cache]" Dec 08 10:41:32 crc kubenswrapper[4776]: I1208 10:41:32.356467 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2dd6c28-86de-46c3-8d21-6b395e730e60" path="/var/lib/kubelet/pods/e2dd6c28-86de-46c3-8d21-6b395e730e60/volumes" Dec 08 10:41:38 crc kubenswrapper[4776]: I1208 10:41:38.113209 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-l4cmt"] Dec 08 10:41:38 crc kubenswrapper[4776]: E1208 10:41:38.114228 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2dd6c28-86de-46c3-8d21-6b395e730e60" containerName="extract-utilities" Dec 08 10:41:38 crc kubenswrapper[4776]: I1208 10:41:38.114241 4776 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e2dd6c28-86de-46c3-8d21-6b395e730e60" containerName="extract-utilities" Dec 08 10:41:38 crc kubenswrapper[4776]: E1208 10:41:38.114275 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2dd6c28-86de-46c3-8d21-6b395e730e60" containerName="registry-server" Dec 08 10:41:38 crc kubenswrapper[4776]: I1208 10:41:38.114281 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2dd6c28-86de-46c3-8d21-6b395e730e60" containerName="registry-server" Dec 08 10:41:38 crc kubenswrapper[4776]: E1208 10:41:38.114293 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2dd6c28-86de-46c3-8d21-6b395e730e60" containerName="extract-content" Dec 08 10:41:38 crc kubenswrapper[4776]: I1208 10:41:38.114299 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2dd6c28-86de-46c3-8d21-6b395e730e60" containerName="extract-content" Dec 08 10:41:38 crc kubenswrapper[4776]: I1208 10:41:38.114532 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2dd6c28-86de-46c3-8d21-6b395e730e60" containerName="registry-server" Dec 08 10:41:38 crc kubenswrapper[4776]: I1208 10:41:38.116322 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l4cmt" Dec 08 10:41:38 crc kubenswrapper[4776]: I1208 10:41:38.129457 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l4cmt"] Dec 08 10:41:38 crc kubenswrapper[4776]: I1208 10:41:38.244511 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bda4600c-09ba-497d-a190-097fe0fa4a23-catalog-content\") pod \"certified-operators-l4cmt\" (UID: \"bda4600c-09ba-497d-a190-097fe0fa4a23\") " pod="openshift-marketplace/certified-operators-l4cmt" Dec 08 10:41:38 crc kubenswrapper[4776]: I1208 10:41:38.245060 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bda4600c-09ba-497d-a190-097fe0fa4a23-utilities\") pod \"certified-operators-l4cmt\" (UID: \"bda4600c-09ba-497d-a190-097fe0fa4a23\") " pod="openshift-marketplace/certified-operators-l4cmt" Dec 08 10:41:38 crc kubenswrapper[4776]: I1208 10:41:38.245119 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgk5v\" (UniqueName: \"kubernetes.io/projected/bda4600c-09ba-497d-a190-097fe0fa4a23-kube-api-access-qgk5v\") pod \"certified-operators-l4cmt\" (UID: \"bda4600c-09ba-497d-a190-097fe0fa4a23\") " pod="openshift-marketplace/certified-operators-l4cmt" Dec 08 10:41:38 crc kubenswrapper[4776]: I1208 10:41:38.347629 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bda4600c-09ba-497d-a190-097fe0fa4a23-utilities\") pod \"certified-operators-l4cmt\" (UID: \"bda4600c-09ba-497d-a190-097fe0fa4a23\") " pod="openshift-marketplace/certified-operators-l4cmt" Dec 08 10:41:38 crc kubenswrapper[4776]: I1208 10:41:38.347729 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qgk5v\" (UniqueName: \"kubernetes.io/projected/bda4600c-09ba-497d-a190-097fe0fa4a23-kube-api-access-qgk5v\") pod \"certified-operators-l4cmt\" (UID: \"bda4600c-09ba-497d-a190-097fe0fa4a23\") " pod="openshift-marketplace/certified-operators-l4cmt" Dec 08 10:41:38 crc kubenswrapper[4776]: I1208 10:41:38.347859 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bda4600c-09ba-497d-a190-097fe0fa4a23-catalog-content\") pod \"certified-operators-l4cmt\" (UID: \"bda4600c-09ba-497d-a190-097fe0fa4a23\") " pod="openshift-marketplace/certified-operators-l4cmt" Dec 08 10:41:38 crc kubenswrapper[4776]: I1208 10:41:38.348029 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bda4600c-09ba-497d-a190-097fe0fa4a23-utilities\") pod \"certified-operators-l4cmt\" (UID: \"bda4600c-09ba-497d-a190-097fe0fa4a23\") " pod="openshift-marketplace/certified-operators-l4cmt" Dec 08 10:41:38 crc kubenswrapper[4776]: I1208 10:41:38.348307 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bda4600c-09ba-497d-a190-097fe0fa4a23-catalog-content\") pod \"certified-operators-l4cmt\" (UID: \"bda4600c-09ba-497d-a190-097fe0fa4a23\") " pod="openshift-marketplace/certified-operators-l4cmt" Dec 08 10:41:38 crc kubenswrapper[4776]: I1208 10:41:38.371099 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgk5v\" (UniqueName: \"kubernetes.io/projected/bda4600c-09ba-497d-a190-097fe0fa4a23-kube-api-access-qgk5v\") pod \"certified-operators-l4cmt\" (UID: \"bda4600c-09ba-497d-a190-097fe0fa4a23\") " pod="openshift-marketplace/certified-operators-l4cmt" Dec 08 10:41:38 crc kubenswrapper[4776]: I1208 10:41:38.438710 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l4cmt" Dec 08 10:41:38 crc kubenswrapper[4776]: I1208 10:41:38.918374 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l4cmt"] Dec 08 10:41:39 crc kubenswrapper[4776]: I1208 10:41:39.420798 4776 generic.go:334] "Generic (PLEG): container finished" podID="bda4600c-09ba-497d-a190-097fe0fa4a23" containerID="a91251ff936db856264aee8f1d7eb08ae86c13c4aa64e484961964cfbfa90e77" exitCode=0 Dec 08 10:41:39 crc kubenswrapper[4776]: I1208 10:41:39.420909 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l4cmt" event={"ID":"bda4600c-09ba-497d-a190-097fe0fa4a23","Type":"ContainerDied","Data":"a91251ff936db856264aee8f1d7eb08ae86c13c4aa64e484961964cfbfa90e77"} Dec 08 10:41:39 crc kubenswrapper[4776]: I1208 10:41:39.421955 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l4cmt" event={"ID":"bda4600c-09ba-497d-a190-097fe0fa4a23","Type":"ContainerStarted","Data":"269fd674b104ee008454c1f24abdc8beaba78895fbb2d5de4fab0b5aafe3fbc5"} Dec 08 10:41:40 crc kubenswrapper[4776]: I1208 10:41:40.434831 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l4cmt" event={"ID":"bda4600c-09ba-497d-a190-097fe0fa4a23","Type":"ContainerStarted","Data":"cd36b16f252cf3fac134f86049d2a7a369d4c5556afee497a0a96c1563d302f0"} Dec 08 10:41:41 crc kubenswrapper[4776]: E1208 10:41:41.259418 4776 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbda4600c_09ba_497d_a190_097fe0fa4a23.slice/crio-cd36b16f252cf3fac134f86049d2a7a369d4c5556afee497a0a96c1563d302f0.scope\": RecentStats: unable to find data in memory cache]" Dec 08 10:41:41 crc kubenswrapper[4776]: I1208 10:41:41.444283 4776 generic.go:334] "Generic (PLEG): 
container finished" podID="bda4600c-09ba-497d-a190-097fe0fa4a23" containerID="cd36b16f252cf3fac134f86049d2a7a369d4c5556afee497a0a96c1563d302f0" exitCode=0 Dec 08 10:41:41 crc kubenswrapper[4776]: I1208 10:41:41.444325 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l4cmt" event={"ID":"bda4600c-09ba-497d-a190-097fe0fa4a23","Type":"ContainerDied","Data":"cd36b16f252cf3fac134f86049d2a7a369d4c5556afee497a0a96c1563d302f0"} Dec 08 10:41:42 crc kubenswrapper[4776]: I1208 10:41:42.456633 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l4cmt" event={"ID":"bda4600c-09ba-497d-a190-097fe0fa4a23","Type":"ContainerStarted","Data":"c1edc80befd191477a3bf2a07089822992d92333019fb19895d98331305e7891"} Dec 08 10:41:42 crc kubenswrapper[4776]: I1208 10:41:42.479797 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-l4cmt" podStartSLOduration=2.065116788 podStartE2EDuration="4.479776653s" podCreationTimestamp="2025-12-08 10:41:38 +0000 UTC" firstStartedPulling="2025-12-08 10:41:39.422567739 +0000 UTC m=+6175.685792761" lastFinishedPulling="2025-12-08 10:41:41.837227604 +0000 UTC m=+6178.100452626" observedRunningTime="2025-12-08 10:41:42.472034442 +0000 UTC m=+6178.735259464" watchObservedRunningTime="2025-12-08 10:41:42.479776653 +0000 UTC m=+6178.743001675" Dec 08 10:41:48 crc kubenswrapper[4776]: I1208 10:41:48.440062 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-l4cmt" Dec 08 10:41:48 crc kubenswrapper[4776]: I1208 10:41:48.440640 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-l4cmt" Dec 08 10:41:48 crc kubenswrapper[4776]: I1208 10:41:48.486062 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-l4cmt" Dec 08 10:41:48 crc kubenswrapper[4776]: I1208 10:41:48.560262 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-l4cmt" Dec 08 10:41:52 crc kubenswrapper[4776]: I1208 10:41:52.120251 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l4cmt"] Dec 08 10:41:52 crc kubenswrapper[4776]: I1208 10:41:52.121547 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-l4cmt" podUID="bda4600c-09ba-497d-a190-097fe0fa4a23" containerName="registry-server" containerID="cri-o://c1edc80befd191477a3bf2a07089822992d92333019fb19895d98331305e7891" gracePeriod=2 Dec 08 10:41:52 crc kubenswrapper[4776]: I1208 10:41:52.564723 4776 generic.go:334] "Generic (PLEG): container finished" podID="bda4600c-09ba-497d-a190-097fe0fa4a23" containerID="c1edc80befd191477a3bf2a07089822992d92333019fb19895d98331305e7891" exitCode=0 Dec 08 10:41:52 crc kubenswrapper[4776]: I1208 10:41:52.564764 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l4cmt" event={"ID":"bda4600c-09ba-497d-a190-097fe0fa4a23","Type":"ContainerDied","Data":"c1edc80befd191477a3bf2a07089822992d92333019fb19895d98331305e7891"} Dec 08 10:41:52 crc kubenswrapper[4776]: I1208 10:41:52.763156 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l4cmt" Dec 08 10:41:52 crc kubenswrapper[4776]: I1208 10:41:52.866596 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgk5v\" (UniqueName: \"kubernetes.io/projected/bda4600c-09ba-497d-a190-097fe0fa4a23-kube-api-access-qgk5v\") pod \"bda4600c-09ba-497d-a190-097fe0fa4a23\" (UID: \"bda4600c-09ba-497d-a190-097fe0fa4a23\") " Dec 08 10:41:52 crc kubenswrapper[4776]: I1208 10:41:52.866662 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bda4600c-09ba-497d-a190-097fe0fa4a23-utilities\") pod \"bda4600c-09ba-497d-a190-097fe0fa4a23\" (UID: \"bda4600c-09ba-497d-a190-097fe0fa4a23\") " Dec 08 10:41:52 crc kubenswrapper[4776]: I1208 10:41:52.866750 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bda4600c-09ba-497d-a190-097fe0fa4a23-catalog-content\") pod \"bda4600c-09ba-497d-a190-097fe0fa4a23\" (UID: \"bda4600c-09ba-497d-a190-097fe0fa4a23\") " Dec 08 10:41:52 crc kubenswrapper[4776]: I1208 10:41:52.867774 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bda4600c-09ba-497d-a190-097fe0fa4a23-utilities" (OuterVolumeSpecName: "utilities") pod "bda4600c-09ba-497d-a190-097fe0fa4a23" (UID: "bda4600c-09ba-497d-a190-097fe0fa4a23"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 10:41:52 crc kubenswrapper[4776]: I1208 10:41:52.872258 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bda4600c-09ba-497d-a190-097fe0fa4a23-kube-api-access-qgk5v" (OuterVolumeSpecName: "kube-api-access-qgk5v") pod "bda4600c-09ba-497d-a190-097fe0fa4a23" (UID: "bda4600c-09ba-497d-a190-097fe0fa4a23"). InnerVolumeSpecName "kube-api-access-qgk5v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 10:41:52 crc kubenswrapper[4776]: I1208 10:41:52.916344 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bda4600c-09ba-497d-a190-097fe0fa4a23-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bda4600c-09ba-497d-a190-097fe0fa4a23" (UID: "bda4600c-09ba-497d-a190-097fe0fa4a23"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 10:41:52 crc kubenswrapper[4776]: I1208 10:41:52.970508 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgk5v\" (UniqueName: \"kubernetes.io/projected/bda4600c-09ba-497d-a190-097fe0fa4a23-kube-api-access-qgk5v\") on node \"crc\" DevicePath \"\"" Dec 08 10:41:52 crc kubenswrapper[4776]: I1208 10:41:52.970545 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bda4600c-09ba-497d-a190-097fe0fa4a23-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 10:41:52 crc kubenswrapper[4776]: I1208 10:41:52.970558 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bda4600c-09ba-497d-a190-097fe0fa4a23-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 10:41:53 crc kubenswrapper[4776]: I1208 10:41:53.576915 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l4cmt" event={"ID":"bda4600c-09ba-497d-a190-097fe0fa4a23","Type":"ContainerDied","Data":"269fd674b104ee008454c1f24abdc8beaba78895fbb2d5de4fab0b5aafe3fbc5"} Dec 08 10:41:53 crc kubenswrapper[4776]: I1208 10:41:53.577287 4776 scope.go:117] "RemoveContainer" containerID="c1edc80befd191477a3bf2a07089822992d92333019fb19895d98331305e7891" Dec 08 10:41:53 crc kubenswrapper[4776]: I1208 10:41:53.576981 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l4cmt" Dec 08 10:41:53 crc kubenswrapper[4776]: I1208 10:41:53.607010 4776 scope.go:117] "RemoveContainer" containerID="cd36b16f252cf3fac134f86049d2a7a369d4c5556afee497a0a96c1563d302f0" Dec 08 10:41:53 crc kubenswrapper[4776]: I1208 10:41:53.614818 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l4cmt"] Dec 08 10:41:53 crc kubenswrapper[4776]: I1208 10:41:53.627368 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-l4cmt"] Dec 08 10:41:53 crc kubenswrapper[4776]: I1208 10:41:53.637812 4776 scope.go:117] "RemoveContainer" containerID="a91251ff936db856264aee8f1d7eb08ae86c13c4aa64e484961964cfbfa90e77" Dec 08 10:41:54 crc kubenswrapper[4776]: I1208 10:41:54.358803 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bda4600c-09ba-497d-a190-097fe0fa4a23" path="/var/lib/kubelet/pods/bda4600c-09ba-497d-a190-097fe0fa4a23/volumes" Dec 08 10:42:11 crc kubenswrapper[4776]: I1208 10:42:11.399417 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 10:42:11 crc kubenswrapper[4776]: I1208 10:42:11.399940 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 10:42:41 crc kubenswrapper[4776]: I1208 10:42:41.399497 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 10:42:41 crc kubenswrapper[4776]: I1208 10:42:41.400011 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 10:43:11 crc kubenswrapper[4776]: I1208 10:43:11.398974 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 10:43:11 crc kubenswrapper[4776]: I1208 10:43:11.399529 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 10:43:11 crc kubenswrapper[4776]: I1208 10:43:11.399568 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" Dec 08 10:43:11 crc kubenswrapper[4776]: I1208 10:43:11.400060 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"204440af754cab96c5a4e55db9a243723d836f694200606bc30e8bb3cce0cb54"} pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 08 10:43:11 crc kubenswrapper[4776]: I1208 10:43:11.400107 4776 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" containerID="cri-o://204440af754cab96c5a4e55db9a243723d836f694200606bc30e8bb3cce0cb54" gracePeriod=600 Dec 08 10:43:12 crc kubenswrapper[4776]: I1208 10:43:12.442637 4776 generic.go:334] "Generic (PLEG): container finished" podID="c9788ab1-1031-4103-a769-a4b3177c7268" containerID="204440af754cab96c5a4e55db9a243723d836f694200606bc30e8bb3cce0cb54" exitCode=0 Dec 08 10:43:12 crc kubenswrapper[4776]: I1208 10:43:12.442713 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" event={"ID":"c9788ab1-1031-4103-a769-a4b3177c7268","Type":"ContainerDied","Data":"204440af754cab96c5a4e55db9a243723d836f694200606bc30e8bb3cce0cb54"} Dec 08 10:43:12 crc kubenswrapper[4776]: I1208 10:43:12.443306 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" event={"ID":"c9788ab1-1031-4103-a769-a4b3177c7268","Type":"ContainerStarted","Data":"e77f61c54751c78fce5e444bed924b95086a9bf8d6036029988def8a32ebeb2b"} Dec 08 10:43:12 crc kubenswrapper[4776]: I1208 10:43:12.443335 4776 scope.go:117] "RemoveContainer" containerID="8c1c73971d87140e44cc9241fea6de13f5f6f0f3ea06fa74b8d3e01c0b4278b7" Dec 08 10:43:50 crc kubenswrapper[4776]: I1208 10:43:50.106961 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-k6x5r/must-gather-cgltv"] Dec 08 10:43:50 crc kubenswrapper[4776]: E1208 10:43:50.108262 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bda4600c-09ba-497d-a190-097fe0fa4a23" containerName="registry-server" Dec 08 10:43:50 crc kubenswrapper[4776]: I1208 10:43:50.108280 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="bda4600c-09ba-497d-a190-097fe0fa4a23" containerName="registry-server" 
Dec 08 10:43:50 crc kubenswrapper[4776]: E1208 10:43:50.108308 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bda4600c-09ba-497d-a190-097fe0fa4a23" containerName="extract-utilities" Dec 08 10:43:50 crc kubenswrapper[4776]: I1208 10:43:50.108318 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="bda4600c-09ba-497d-a190-097fe0fa4a23" containerName="extract-utilities" Dec 08 10:43:50 crc kubenswrapper[4776]: E1208 10:43:50.108340 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bda4600c-09ba-497d-a190-097fe0fa4a23" containerName="extract-content" Dec 08 10:43:50 crc kubenswrapper[4776]: I1208 10:43:50.108349 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="bda4600c-09ba-497d-a190-097fe0fa4a23" containerName="extract-content" Dec 08 10:43:50 crc kubenswrapper[4776]: I1208 10:43:50.115722 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="bda4600c-09ba-497d-a190-097fe0fa4a23" containerName="registry-server" Dec 08 10:43:50 crc kubenswrapper[4776]: I1208 10:43:50.118322 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k6x5r/must-gather-cgltv" Dec 08 10:43:50 crc kubenswrapper[4776]: I1208 10:43:50.125828 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-k6x5r"/"kube-root-ca.crt" Dec 08 10:43:50 crc kubenswrapper[4776]: I1208 10:43:50.126070 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-k6x5r"/"default-dockercfg-x4fwp" Dec 08 10:43:50 crc kubenswrapper[4776]: I1208 10:43:50.130537 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-k6x5r"/"openshift-service-ca.crt" Dec 08 10:43:50 crc kubenswrapper[4776]: I1208 10:43:50.135946 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-k6x5r/must-gather-cgltv"] Dec 08 10:43:50 crc kubenswrapper[4776]: I1208 10:43:50.304804 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlqxn\" (UniqueName: \"kubernetes.io/projected/69237141-2b8c-4aa8-9aeb-dfb1ad0b2187-kube-api-access-rlqxn\") pod \"must-gather-cgltv\" (UID: \"69237141-2b8c-4aa8-9aeb-dfb1ad0b2187\") " pod="openshift-must-gather-k6x5r/must-gather-cgltv" Dec 08 10:43:50 crc kubenswrapper[4776]: I1208 10:43:50.305369 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/69237141-2b8c-4aa8-9aeb-dfb1ad0b2187-must-gather-output\") pod \"must-gather-cgltv\" (UID: \"69237141-2b8c-4aa8-9aeb-dfb1ad0b2187\") " pod="openshift-must-gather-k6x5r/must-gather-cgltv" Dec 08 10:43:50 crc kubenswrapper[4776]: I1208 10:43:50.407881 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/69237141-2b8c-4aa8-9aeb-dfb1ad0b2187-must-gather-output\") pod \"must-gather-cgltv\" (UID: \"69237141-2b8c-4aa8-9aeb-dfb1ad0b2187\") " 
pod="openshift-must-gather-k6x5r/must-gather-cgltv" Dec 08 10:43:50 crc kubenswrapper[4776]: I1208 10:43:50.408001 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlqxn\" (UniqueName: \"kubernetes.io/projected/69237141-2b8c-4aa8-9aeb-dfb1ad0b2187-kube-api-access-rlqxn\") pod \"must-gather-cgltv\" (UID: \"69237141-2b8c-4aa8-9aeb-dfb1ad0b2187\") " pod="openshift-must-gather-k6x5r/must-gather-cgltv" Dec 08 10:43:50 crc kubenswrapper[4776]: I1208 10:43:50.408491 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/69237141-2b8c-4aa8-9aeb-dfb1ad0b2187-must-gather-output\") pod \"must-gather-cgltv\" (UID: \"69237141-2b8c-4aa8-9aeb-dfb1ad0b2187\") " pod="openshift-must-gather-k6x5r/must-gather-cgltv" Dec 08 10:43:50 crc kubenswrapper[4776]: I1208 10:43:50.428584 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlqxn\" (UniqueName: \"kubernetes.io/projected/69237141-2b8c-4aa8-9aeb-dfb1ad0b2187-kube-api-access-rlqxn\") pod \"must-gather-cgltv\" (UID: \"69237141-2b8c-4aa8-9aeb-dfb1ad0b2187\") " pod="openshift-must-gather-k6x5r/must-gather-cgltv" Dec 08 10:43:50 crc kubenswrapper[4776]: I1208 10:43:50.450920 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k6x5r/must-gather-cgltv" Dec 08 10:43:50 crc kubenswrapper[4776]: I1208 10:43:50.933811 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-k6x5r/must-gather-cgltv"] Dec 08 10:43:51 crc kubenswrapper[4776]: I1208 10:43:51.894438 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k6x5r/must-gather-cgltv" event={"ID":"69237141-2b8c-4aa8-9aeb-dfb1ad0b2187","Type":"ContainerStarted","Data":"bb63c7c1c2dd86d0afb119fe94fa1d13246a09d3fc76cc3b49384a07a84b66ab"} Dec 08 10:43:51 crc kubenswrapper[4776]: I1208 10:43:51.894757 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k6x5r/must-gather-cgltv" event={"ID":"69237141-2b8c-4aa8-9aeb-dfb1ad0b2187","Type":"ContainerStarted","Data":"2abd8eb829d15f8361966c4b64085261560bbe6dd18bc78fbaa27eebef9fd5ff"} Dec 08 10:43:52 crc kubenswrapper[4776]: I1208 10:43:52.908521 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k6x5r/must-gather-cgltv" event={"ID":"69237141-2b8c-4aa8-9aeb-dfb1ad0b2187","Type":"ContainerStarted","Data":"670f2f07675a75150a4cf46b89a53b32fd045ac9d91ea1777da47ab1b1482509"} Dec 08 10:43:52 crc kubenswrapper[4776]: I1208 10:43:52.932657 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-k6x5r/must-gather-cgltv" podStartSLOduration=2.932635226 podStartE2EDuration="2.932635226s" podCreationTimestamp="2025-12-08 10:43:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 10:43:52.927889105 +0000 UTC m=+6309.191114147" watchObservedRunningTime="2025-12-08 10:43:52.932635226 +0000 UTC m=+6309.195860258" Dec 08 10:43:55 crc kubenswrapper[4776]: I1208 10:43:55.260122 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-k6x5r/crc-debug-pvm8x"] Dec 08 10:43:55 crc kubenswrapper[4776]: 
I1208 10:43:55.262814 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k6x5r/crc-debug-pvm8x" Dec 08 10:43:55 crc kubenswrapper[4776]: I1208 10:43:55.447579 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkcbw\" (UniqueName: \"kubernetes.io/projected/38c46efa-2948-424b-b318-8aa91d600a54-kube-api-access-qkcbw\") pod \"crc-debug-pvm8x\" (UID: \"38c46efa-2948-424b-b318-8aa91d600a54\") " pod="openshift-must-gather-k6x5r/crc-debug-pvm8x" Dec 08 10:43:55 crc kubenswrapper[4776]: I1208 10:43:55.448990 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/38c46efa-2948-424b-b318-8aa91d600a54-host\") pod \"crc-debug-pvm8x\" (UID: \"38c46efa-2948-424b-b318-8aa91d600a54\") " pod="openshift-must-gather-k6x5r/crc-debug-pvm8x" Dec 08 10:43:55 crc kubenswrapper[4776]: I1208 10:43:55.552826 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkcbw\" (UniqueName: \"kubernetes.io/projected/38c46efa-2948-424b-b318-8aa91d600a54-kube-api-access-qkcbw\") pod \"crc-debug-pvm8x\" (UID: \"38c46efa-2948-424b-b318-8aa91d600a54\") " pod="openshift-must-gather-k6x5r/crc-debug-pvm8x" Dec 08 10:43:55 crc kubenswrapper[4776]: I1208 10:43:55.553674 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/38c46efa-2948-424b-b318-8aa91d600a54-host\") pod \"crc-debug-pvm8x\" (UID: \"38c46efa-2948-424b-b318-8aa91d600a54\") " pod="openshift-must-gather-k6x5r/crc-debug-pvm8x" Dec 08 10:43:55 crc kubenswrapper[4776]: I1208 10:43:55.553774 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/38c46efa-2948-424b-b318-8aa91d600a54-host\") pod \"crc-debug-pvm8x\" (UID: \"38c46efa-2948-424b-b318-8aa91d600a54\") 
" pod="openshift-must-gather-k6x5r/crc-debug-pvm8x" Dec 08 10:43:55 crc kubenswrapper[4776]: I1208 10:43:55.575291 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkcbw\" (UniqueName: \"kubernetes.io/projected/38c46efa-2948-424b-b318-8aa91d600a54-kube-api-access-qkcbw\") pod \"crc-debug-pvm8x\" (UID: \"38c46efa-2948-424b-b318-8aa91d600a54\") " pod="openshift-must-gather-k6x5r/crc-debug-pvm8x" Dec 08 10:43:55 crc kubenswrapper[4776]: I1208 10:43:55.586040 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k6x5r/crc-debug-pvm8x" Dec 08 10:43:55 crc kubenswrapper[4776]: W1208 10:43:55.618972 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod38c46efa_2948_424b_b318_8aa91d600a54.slice/crio-1776b092d9b9a92bf9e9b80b222ce097f7a281926c339c1604cfe9cd6349fdaa WatchSource:0}: Error finding container 1776b092d9b9a92bf9e9b80b222ce097f7a281926c339c1604cfe9cd6349fdaa: Status 404 returned error can't find the container with id 1776b092d9b9a92bf9e9b80b222ce097f7a281926c339c1604cfe9cd6349fdaa Dec 08 10:43:55 crc kubenswrapper[4776]: I1208 10:43:55.944186 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k6x5r/crc-debug-pvm8x" event={"ID":"38c46efa-2948-424b-b318-8aa91d600a54","Type":"ContainerStarted","Data":"ccbf3c82de929a3357733dbe96de6d16da54bacf22b970a046fcf8d5e928112d"} Dec 08 10:43:55 crc kubenswrapper[4776]: I1208 10:43:55.944573 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k6x5r/crc-debug-pvm8x" event={"ID":"38c46efa-2948-424b-b318-8aa91d600a54","Type":"ContainerStarted","Data":"1776b092d9b9a92bf9e9b80b222ce097f7a281926c339c1604cfe9cd6349fdaa"} Dec 08 10:43:55 crc kubenswrapper[4776]: I1208 10:43:55.962697 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-k6x5r/crc-debug-pvm8x" 
podStartSLOduration=0.962678485 podStartE2EDuration="962.678485ms" podCreationTimestamp="2025-12-08 10:43:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 10:43:55.960804844 +0000 UTC m=+6312.224029866" watchObservedRunningTime="2025-12-08 10:43:55.962678485 +0000 UTC m=+6312.225903507" Dec 08 10:44:39 crc kubenswrapper[4776]: I1208 10:44:39.416952 4776 generic.go:334] "Generic (PLEG): container finished" podID="38c46efa-2948-424b-b318-8aa91d600a54" containerID="ccbf3c82de929a3357733dbe96de6d16da54bacf22b970a046fcf8d5e928112d" exitCode=0 Dec 08 10:44:39 crc kubenswrapper[4776]: I1208 10:44:39.417119 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k6x5r/crc-debug-pvm8x" event={"ID":"38c46efa-2948-424b-b318-8aa91d600a54","Type":"ContainerDied","Data":"ccbf3c82de929a3357733dbe96de6d16da54bacf22b970a046fcf8d5e928112d"} Dec 08 10:44:40 crc kubenswrapper[4776]: I1208 10:44:40.540463 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k6x5r/crc-debug-pvm8x" Dec 08 10:44:40 crc kubenswrapper[4776]: I1208 10:44:40.570747 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkcbw\" (UniqueName: \"kubernetes.io/projected/38c46efa-2948-424b-b318-8aa91d600a54-kube-api-access-qkcbw\") pod \"38c46efa-2948-424b-b318-8aa91d600a54\" (UID: \"38c46efa-2948-424b-b318-8aa91d600a54\") " Dec 08 10:44:40 crc kubenswrapper[4776]: I1208 10:44:40.570870 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/38c46efa-2948-424b-b318-8aa91d600a54-host\") pod \"38c46efa-2948-424b-b318-8aa91d600a54\" (UID: \"38c46efa-2948-424b-b318-8aa91d600a54\") " Dec 08 10:44:40 crc kubenswrapper[4776]: I1208 10:44:40.570990 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/38c46efa-2948-424b-b318-8aa91d600a54-host" (OuterVolumeSpecName: "host") pod "38c46efa-2948-424b-b318-8aa91d600a54" (UID: "38c46efa-2948-424b-b318-8aa91d600a54"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 10:44:40 crc kubenswrapper[4776]: I1208 10:44:40.571601 4776 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/38c46efa-2948-424b-b318-8aa91d600a54-host\") on node \"crc\" DevicePath \"\"" Dec 08 10:44:40 crc kubenswrapper[4776]: I1208 10:44:40.577262 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38c46efa-2948-424b-b318-8aa91d600a54-kube-api-access-qkcbw" (OuterVolumeSpecName: "kube-api-access-qkcbw") pod "38c46efa-2948-424b-b318-8aa91d600a54" (UID: "38c46efa-2948-424b-b318-8aa91d600a54"). InnerVolumeSpecName "kube-api-access-qkcbw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 10:44:40 crc kubenswrapper[4776]: I1208 10:44:40.577790 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-k6x5r/crc-debug-pvm8x"] Dec 08 10:44:40 crc kubenswrapper[4776]: I1208 10:44:40.587732 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-k6x5r/crc-debug-pvm8x"] Dec 08 10:44:40 crc kubenswrapper[4776]: I1208 10:44:40.674008 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkcbw\" (UniqueName: \"kubernetes.io/projected/38c46efa-2948-424b-b318-8aa91d600a54-kube-api-access-qkcbw\") on node \"crc\" DevicePath \"\"" Dec 08 10:44:41 crc kubenswrapper[4776]: I1208 10:44:41.438746 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1776b092d9b9a92bf9e9b80b222ce097f7a281926c339c1604cfe9cd6349fdaa" Dec 08 10:44:41 crc kubenswrapper[4776]: I1208 10:44:41.438913 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k6x5r/crc-debug-pvm8x" Dec 08 10:44:41 crc kubenswrapper[4776]: I1208 10:44:41.764664 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-k6x5r/crc-debug-8k2vp"] Dec 08 10:44:41 crc kubenswrapper[4776]: E1208 10:44:41.765126 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38c46efa-2948-424b-b318-8aa91d600a54" containerName="container-00" Dec 08 10:44:41 crc kubenswrapper[4776]: I1208 10:44:41.765138 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="38c46efa-2948-424b-b318-8aa91d600a54" containerName="container-00" Dec 08 10:44:41 crc kubenswrapper[4776]: I1208 10:44:41.769692 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="38c46efa-2948-424b-b318-8aa91d600a54" containerName="container-00" Dec 08 10:44:41 crc kubenswrapper[4776]: I1208 10:44:41.770513 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k6x5r/crc-debug-8k2vp" Dec 08 10:44:41 crc kubenswrapper[4776]: I1208 10:44:41.797239 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a2f8250b-db39-4cba-984f-e1299c08b022-host\") pod \"crc-debug-8k2vp\" (UID: \"a2f8250b-db39-4cba-984f-e1299c08b022\") " pod="openshift-must-gather-k6x5r/crc-debug-8k2vp" Dec 08 10:44:41 crc kubenswrapper[4776]: I1208 10:44:41.797284 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drr7v\" (UniqueName: \"kubernetes.io/projected/a2f8250b-db39-4cba-984f-e1299c08b022-kube-api-access-drr7v\") pod \"crc-debug-8k2vp\" (UID: \"a2f8250b-db39-4cba-984f-e1299c08b022\") " pod="openshift-must-gather-k6x5r/crc-debug-8k2vp" Dec 08 10:44:41 crc kubenswrapper[4776]: I1208 10:44:41.899464 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a2f8250b-db39-4cba-984f-e1299c08b022-host\") pod \"crc-debug-8k2vp\" (UID: \"a2f8250b-db39-4cba-984f-e1299c08b022\") " pod="openshift-must-gather-k6x5r/crc-debug-8k2vp" Dec 08 10:44:41 crc kubenswrapper[4776]: I1208 10:44:41.899503 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drr7v\" (UniqueName: \"kubernetes.io/projected/a2f8250b-db39-4cba-984f-e1299c08b022-kube-api-access-drr7v\") pod \"crc-debug-8k2vp\" (UID: \"a2f8250b-db39-4cba-984f-e1299c08b022\") " pod="openshift-must-gather-k6x5r/crc-debug-8k2vp" Dec 08 10:44:41 crc kubenswrapper[4776]: I1208 10:44:41.899591 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a2f8250b-db39-4cba-984f-e1299c08b022-host\") pod \"crc-debug-8k2vp\" (UID: \"a2f8250b-db39-4cba-984f-e1299c08b022\") " pod="openshift-must-gather-k6x5r/crc-debug-8k2vp" Dec 08 10:44:41 crc 
kubenswrapper[4776]: I1208 10:44:41.924224 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drr7v\" (UniqueName: \"kubernetes.io/projected/a2f8250b-db39-4cba-984f-e1299c08b022-kube-api-access-drr7v\") pod \"crc-debug-8k2vp\" (UID: \"a2f8250b-db39-4cba-984f-e1299c08b022\") " pod="openshift-must-gather-k6x5r/crc-debug-8k2vp" Dec 08 10:44:42 crc kubenswrapper[4776]: I1208 10:44:42.087652 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k6x5r/crc-debug-8k2vp" Dec 08 10:44:42 crc kubenswrapper[4776]: I1208 10:44:42.359506 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38c46efa-2948-424b-b318-8aa91d600a54" path="/var/lib/kubelet/pods/38c46efa-2948-424b-b318-8aa91d600a54/volumes" Dec 08 10:44:42 crc kubenswrapper[4776]: I1208 10:44:42.450530 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k6x5r/crc-debug-8k2vp" event={"ID":"a2f8250b-db39-4cba-984f-e1299c08b022","Type":"ContainerStarted","Data":"f8bd2f26b2d7f29e22240541c13505dbde6cc141a12009d4da7c61734a098b0b"} Dec 08 10:44:42 crc kubenswrapper[4776]: I1208 10:44:42.450583 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k6x5r/crc-debug-8k2vp" event={"ID":"a2f8250b-db39-4cba-984f-e1299c08b022","Type":"ContainerStarted","Data":"7bab71c6327ebffcf35aa3f26234bd729eb39da3558a8e0e286ad25153678fd5"} Dec 08 10:44:42 crc kubenswrapper[4776]: I1208 10:44:42.474498 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-k6x5r/crc-debug-8k2vp" podStartSLOduration=1.474476277 podStartE2EDuration="1.474476277s" podCreationTimestamp="2025-12-08 10:44:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 10:44:42.466596162 +0000 UTC m=+6358.729821194" watchObservedRunningTime="2025-12-08 10:44:42.474476277 +0000 
UTC m=+6358.737701299" Dec 08 10:44:43 crc kubenswrapper[4776]: I1208 10:44:43.461857 4776 generic.go:334] "Generic (PLEG): container finished" podID="a2f8250b-db39-4cba-984f-e1299c08b022" containerID="f8bd2f26b2d7f29e22240541c13505dbde6cc141a12009d4da7c61734a098b0b" exitCode=0 Dec 08 10:44:43 crc kubenswrapper[4776]: I1208 10:44:43.462214 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k6x5r/crc-debug-8k2vp" event={"ID":"a2f8250b-db39-4cba-984f-e1299c08b022","Type":"ContainerDied","Data":"f8bd2f26b2d7f29e22240541c13505dbde6cc141a12009d4da7c61734a098b0b"} Dec 08 10:44:44 crc kubenswrapper[4776]: I1208 10:44:44.605137 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k6x5r/crc-debug-8k2vp" Dec 08 10:44:44 crc kubenswrapper[4776]: I1208 10:44:44.660908 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drr7v\" (UniqueName: \"kubernetes.io/projected/a2f8250b-db39-4cba-984f-e1299c08b022-kube-api-access-drr7v\") pod \"a2f8250b-db39-4cba-984f-e1299c08b022\" (UID: \"a2f8250b-db39-4cba-984f-e1299c08b022\") " Dec 08 10:44:44 crc kubenswrapper[4776]: I1208 10:44:44.661144 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a2f8250b-db39-4cba-984f-e1299c08b022-host\") pod \"a2f8250b-db39-4cba-984f-e1299c08b022\" (UID: \"a2f8250b-db39-4cba-984f-e1299c08b022\") " Dec 08 10:44:44 crc kubenswrapper[4776]: I1208 10:44:44.661377 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a2f8250b-db39-4cba-984f-e1299c08b022-host" (OuterVolumeSpecName: "host") pod "a2f8250b-db39-4cba-984f-e1299c08b022" (UID: "a2f8250b-db39-4cba-984f-e1299c08b022"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 10:44:44 crc kubenswrapper[4776]: I1208 10:44:44.661992 4776 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a2f8250b-db39-4cba-984f-e1299c08b022-host\") on node \"crc\" DevicePath \"\"" Dec 08 10:44:44 crc kubenswrapper[4776]: I1208 10:44:44.668201 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2f8250b-db39-4cba-984f-e1299c08b022-kube-api-access-drr7v" (OuterVolumeSpecName: "kube-api-access-drr7v") pod "a2f8250b-db39-4cba-984f-e1299c08b022" (UID: "a2f8250b-db39-4cba-984f-e1299c08b022"). InnerVolumeSpecName "kube-api-access-drr7v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 10:44:44 crc kubenswrapper[4776]: I1208 10:44:44.764877 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drr7v\" (UniqueName: \"kubernetes.io/projected/a2f8250b-db39-4cba-984f-e1299c08b022-kube-api-access-drr7v\") on node \"crc\" DevicePath \"\"" Dec 08 10:44:45 crc kubenswrapper[4776]: I1208 10:44:45.009027 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-k6x5r/crc-debug-8k2vp"] Dec 08 10:44:45 crc kubenswrapper[4776]: I1208 10:44:45.019907 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-k6x5r/crc-debug-8k2vp"] Dec 08 10:44:45 crc kubenswrapper[4776]: I1208 10:44:45.483230 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7bab71c6327ebffcf35aa3f26234bd729eb39da3558a8e0e286ad25153678fd5" Dec 08 10:44:45 crc kubenswrapper[4776]: I1208 10:44:45.483299 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k6x5r/crc-debug-8k2vp" Dec 08 10:44:46 crc kubenswrapper[4776]: I1208 10:44:46.233803 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-k6x5r/crc-debug-rm65m"] Dec 08 10:44:46 crc kubenswrapper[4776]: E1208 10:44:46.234586 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2f8250b-db39-4cba-984f-e1299c08b022" containerName="container-00" Dec 08 10:44:46 crc kubenswrapper[4776]: I1208 10:44:46.234598 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2f8250b-db39-4cba-984f-e1299c08b022" containerName="container-00" Dec 08 10:44:46 crc kubenswrapper[4776]: I1208 10:44:46.234836 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2f8250b-db39-4cba-984f-e1299c08b022" containerName="container-00" Dec 08 10:44:46 crc kubenswrapper[4776]: I1208 10:44:46.235833 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k6x5r/crc-debug-rm65m" Dec 08 10:44:46 crc kubenswrapper[4776]: I1208 10:44:46.304188 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7e62891b-2782-4af4-9e02-cbcdccbb182f-host\") pod \"crc-debug-rm65m\" (UID: \"7e62891b-2782-4af4-9e02-cbcdccbb182f\") " pod="openshift-must-gather-k6x5r/crc-debug-rm65m" Dec 08 10:44:46 crc kubenswrapper[4776]: I1208 10:44:46.304364 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d55v\" (UniqueName: \"kubernetes.io/projected/7e62891b-2782-4af4-9e02-cbcdccbb182f-kube-api-access-4d55v\") pod \"crc-debug-rm65m\" (UID: \"7e62891b-2782-4af4-9e02-cbcdccbb182f\") " pod="openshift-must-gather-k6x5r/crc-debug-rm65m" Dec 08 10:44:46 crc kubenswrapper[4776]: I1208 10:44:46.355543 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2f8250b-db39-4cba-984f-e1299c08b022" 
path="/var/lib/kubelet/pods/a2f8250b-db39-4cba-984f-e1299c08b022/volumes" Dec 08 10:44:46 crc kubenswrapper[4776]: I1208 10:44:46.406101 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7e62891b-2782-4af4-9e02-cbcdccbb182f-host\") pod \"crc-debug-rm65m\" (UID: \"7e62891b-2782-4af4-9e02-cbcdccbb182f\") " pod="openshift-must-gather-k6x5r/crc-debug-rm65m" Dec 08 10:44:46 crc kubenswrapper[4776]: I1208 10:44:46.406257 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d55v\" (UniqueName: \"kubernetes.io/projected/7e62891b-2782-4af4-9e02-cbcdccbb182f-kube-api-access-4d55v\") pod \"crc-debug-rm65m\" (UID: \"7e62891b-2782-4af4-9e02-cbcdccbb182f\") " pod="openshift-must-gather-k6x5r/crc-debug-rm65m" Dec 08 10:44:46 crc kubenswrapper[4776]: I1208 10:44:46.406941 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7e62891b-2782-4af4-9e02-cbcdccbb182f-host\") pod \"crc-debug-rm65m\" (UID: \"7e62891b-2782-4af4-9e02-cbcdccbb182f\") " pod="openshift-must-gather-k6x5r/crc-debug-rm65m" Dec 08 10:44:46 crc kubenswrapper[4776]: I1208 10:44:46.427936 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d55v\" (UniqueName: \"kubernetes.io/projected/7e62891b-2782-4af4-9e02-cbcdccbb182f-kube-api-access-4d55v\") pod \"crc-debug-rm65m\" (UID: \"7e62891b-2782-4af4-9e02-cbcdccbb182f\") " pod="openshift-must-gather-k6x5r/crc-debug-rm65m" Dec 08 10:44:46 crc kubenswrapper[4776]: I1208 10:44:46.553605 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k6x5r/crc-debug-rm65m" Dec 08 10:44:47 crc kubenswrapper[4776]: I1208 10:44:47.505683 4776 generic.go:334] "Generic (PLEG): container finished" podID="7e62891b-2782-4af4-9e02-cbcdccbb182f" containerID="8312f59515f38a5f342bd9cec85eda307ce7b3042d1c10aa43e44839f705bbf3" exitCode=0 Dec 08 10:44:47 crc kubenswrapper[4776]: I1208 10:44:47.505844 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k6x5r/crc-debug-rm65m" event={"ID":"7e62891b-2782-4af4-9e02-cbcdccbb182f","Type":"ContainerDied","Data":"8312f59515f38a5f342bd9cec85eda307ce7b3042d1c10aa43e44839f705bbf3"} Dec 08 10:44:47 crc kubenswrapper[4776]: I1208 10:44:47.506310 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k6x5r/crc-debug-rm65m" event={"ID":"7e62891b-2782-4af4-9e02-cbcdccbb182f","Type":"ContainerStarted","Data":"6119b22da0720cc9d9f13d31c268af50127c50b05562630584953dd9276596e5"} Dec 08 10:44:47 crc kubenswrapper[4776]: I1208 10:44:47.547848 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-k6x5r/crc-debug-rm65m"] Dec 08 10:44:47 crc kubenswrapper[4776]: I1208 10:44:47.562290 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-k6x5r/crc-debug-rm65m"] Dec 08 10:44:48 crc kubenswrapper[4776]: I1208 10:44:48.656858 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k6x5r/crc-debug-rm65m" Dec 08 10:44:48 crc kubenswrapper[4776]: I1208 10:44:48.752712 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7e62891b-2782-4af4-9e02-cbcdccbb182f-host\") pod \"7e62891b-2782-4af4-9e02-cbcdccbb182f\" (UID: \"7e62891b-2782-4af4-9e02-cbcdccbb182f\") " Dec 08 10:44:48 crc kubenswrapper[4776]: I1208 10:44:48.752844 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7e62891b-2782-4af4-9e02-cbcdccbb182f-host" (OuterVolumeSpecName: "host") pod "7e62891b-2782-4af4-9e02-cbcdccbb182f" (UID: "7e62891b-2782-4af4-9e02-cbcdccbb182f"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 10:44:48 crc kubenswrapper[4776]: I1208 10:44:48.752867 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d55v\" (UniqueName: \"kubernetes.io/projected/7e62891b-2782-4af4-9e02-cbcdccbb182f-kube-api-access-4d55v\") pod \"7e62891b-2782-4af4-9e02-cbcdccbb182f\" (UID: \"7e62891b-2782-4af4-9e02-cbcdccbb182f\") " Dec 08 10:44:48 crc kubenswrapper[4776]: I1208 10:44:48.753859 4776 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7e62891b-2782-4af4-9e02-cbcdccbb182f-host\") on node \"crc\" DevicePath \"\"" Dec 08 10:44:48 crc kubenswrapper[4776]: I1208 10:44:48.761512 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e62891b-2782-4af4-9e02-cbcdccbb182f-kube-api-access-4d55v" (OuterVolumeSpecName: "kube-api-access-4d55v") pod "7e62891b-2782-4af4-9e02-cbcdccbb182f" (UID: "7e62891b-2782-4af4-9e02-cbcdccbb182f"). InnerVolumeSpecName "kube-api-access-4d55v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 10:44:48 crc kubenswrapper[4776]: I1208 10:44:48.855795 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d55v\" (UniqueName: \"kubernetes.io/projected/7e62891b-2782-4af4-9e02-cbcdccbb182f-kube-api-access-4d55v\") on node \"crc\" DevicePath \"\"" Dec 08 10:44:49 crc kubenswrapper[4776]: I1208 10:44:49.529432 4776 scope.go:117] "RemoveContainer" containerID="8312f59515f38a5f342bd9cec85eda307ce7b3042d1c10aa43e44839f705bbf3" Dec 08 10:44:49 crc kubenswrapper[4776]: I1208 10:44:49.529456 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k6x5r/crc-debug-rm65m" Dec 08 10:44:50 crc kubenswrapper[4776]: I1208 10:44:50.358355 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e62891b-2782-4af4-9e02-cbcdccbb182f" path="/var/lib/kubelet/pods/7e62891b-2782-4af4-9e02-cbcdccbb182f/volumes" Dec 08 10:45:00 crc kubenswrapper[4776]: I1208 10:45:00.157629 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29419845-7hswj"] Dec 08 10:45:00 crc kubenswrapper[4776]: E1208 10:45:00.159055 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e62891b-2782-4af4-9e02-cbcdccbb182f" containerName="container-00" Dec 08 10:45:00 crc kubenswrapper[4776]: I1208 10:45:00.159073 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e62891b-2782-4af4-9e02-cbcdccbb182f" containerName="container-00" Dec 08 10:45:00 crc kubenswrapper[4776]: I1208 10:45:00.159447 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e62891b-2782-4af4-9e02-cbcdccbb182f" containerName="container-00" Dec 08 10:45:00 crc kubenswrapper[4776]: I1208 10:45:00.160592 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29419845-7hswj" Dec 08 10:45:00 crc kubenswrapper[4776]: I1208 10:45:00.164642 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 08 10:45:00 crc kubenswrapper[4776]: I1208 10:45:00.164860 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 08 10:45:00 crc kubenswrapper[4776]: I1208 10:45:00.169461 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29419845-7hswj"] Dec 08 10:45:00 crc kubenswrapper[4776]: I1208 10:45:00.209018 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2ec2fdc2-b1f0-41c6-b462-263f05f3143c-config-volume\") pod \"collect-profiles-29419845-7hswj\" (UID: \"2ec2fdc2-b1f0-41c6-b462-263f05f3143c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419845-7hswj" Dec 08 10:45:00 crc kubenswrapper[4776]: I1208 10:45:00.209641 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85mvl\" (UniqueName: \"kubernetes.io/projected/2ec2fdc2-b1f0-41c6-b462-263f05f3143c-kube-api-access-85mvl\") pod \"collect-profiles-29419845-7hswj\" (UID: \"2ec2fdc2-b1f0-41c6-b462-263f05f3143c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419845-7hswj" Dec 08 10:45:00 crc kubenswrapper[4776]: I1208 10:45:00.210120 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2ec2fdc2-b1f0-41c6-b462-263f05f3143c-secret-volume\") pod \"collect-profiles-29419845-7hswj\" (UID: \"2ec2fdc2-b1f0-41c6-b462-263f05f3143c\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29419845-7hswj" Dec 08 10:45:00 crc kubenswrapper[4776]: I1208 10:45:00.312768 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2ec2fdc2-b1f0-41c6-b462-263f05f3143c-secret-volume\") pod \"collect-profiles-29419845-7hswj\" (UID: \"2ec2fdc2-b1f0-41c6-b462-263f05f3143c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419845-7hswj" Dec 08 10:45:00 crc kubenswrapper[4776]: I1208 10:45:00.312897 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2ec2fdc2-b1f0-41c6-b462-263f05f3143c-config-volume\") pod \"collect-profiles-29419845-7hswj\" (UID: \"2ec2fdc2-b1f0-41c6-b462-263f05f3143c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419845-7hswj" Dec 08 10:45:00 crc kubenswrapper[4776]: I1208 10:45:00.312988 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85mvl\" (UniqueName: \"kubernetes.io/projected/2ec2fdc2-b1f0-41c6-b462-263f05f3143c-kube-api-access-85mvl\") pod \"collect-profiles-29419845-7hswj\" (UID: \"2ec2fdc2-b1f0-41c6-b462-263f05f3143c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419845-7hswj" Dec 08 10:45:00 crc kubenswrapper[4776]: I1208 10:45:00.313752 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2ec2fdc2-b1f0-41c6-b462-263f05f3143c-config-volume\") pod \"collect-profiles-29419845-7hswj\" (UID: \"2ec2fdc2-b1f0-41c6-b462-263f05f3143c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419845-7hswj" Dec 08 10:45:00 crc kubenswrapper[4776]: I1208 10:45:00.319044 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/2ec2fdc2-b1f0-41c6-b462-263f05f3143c-secret-volume\") pod \"collect-profiles-29419845-7hswj\" (UID: \"2ec2fdc2-b1f0-41c6-b462-263f05f3143c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419845-7hswj" Dec 08 10:45:00 crc kubenswrapper[4776]: I1208 10:45:00.328591 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85mvl\" (UniqueName: \"kubernetes.io/projected/2ec2fdc2-b1f0-41c6-b462-263f05f3143c-kube-api-access-85mvl\") pod \"collect-profiles-29419845-7hswj\" (UID: \"2ec2fdc2-b1f0-41c6-b462-263f05f3143c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29419845-7hswj" Dec 08 10:45:00 crc kubenswrapper[4776]: I1208 10:45:00.485073 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29419845-7hswj" Dec 08 10:45:00 crc kubenswrapper[4776]: I1208 10:45:00.945225 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29419845-7hswj"] Dec 08 10:45:01 crc kubenswrapper[4776]: I1208 10:45:01.707539 4776 generic.go:334] "Generic (PLEG): container finished" podID="2ec2fdc2-b1f0-41c6-b462-263f05f3143c" containerID="76b8eef0e3d3b39e8cdd50714b7a0bcc7d90e6a4d7a53b6be37b4ff83a42feb5" exitCode=0 Dec 08 10:45:01 crc kubenswrapper[4776]: I1208 10:45:01.707594 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29419845-7hswj" event={"ID":"2ec2fdc2-b1f0-41c6-b462-263f05f3143c","Type":"ContainerDied","Data":"76b8eef0e3d3b39e8cdd50714b7a0bcc7d90e6a4d7a53b6be37b4ff83a42feb5"} Dec 08 10:45:01 crc kubenswrapper[4776]: I1208 10:45:01.707812 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29419845-7hswj" 
event={"ID":"2ec2fdc2-b1f0-41c6-b462-263f05f3143c","Type":"ContainerStarted","Data":"af55b3a477d0a4b6612a338e198d0ef0da033a6416b971ede86856eb23af3d85"} Dec 08 10:45:03 crc kubenswrapper[4776]: I1208 10:45:03.165119 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29419845-7hswj" Dec 08 10:45:03 crc kubenswrapper[4776]: I1208 10:45:03.278987 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2ec2fdc2-b1f0-41c6-b462-263f05f3143c-secret-volume\") pod \"2ec2fdc2-b1f0-41c6-b462-263f05f3143c\" (UID: \"2ec2fdc2-b1f0-41c6-b462-263f05f3143c\") " Dec 08 10:45:03 crc kubenswrapper[4776]: I1208 10:45:03.279060 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2ec2fdc2-b1f0-41c6-b462-263f05f3143c-config-volume\") pod \"2ec2fdc2-b1f0-41c6-b462-263f05f3143c\" (UID: \"2ec2fdc2-b1f0-41c6-b462-263f05f3143c\") " Dec 08 10:45:03 crc kubenswrapper[4776]: I1208 10:45:03.279119 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85mvl\" (UniqueName: \"kubernetes.io/projected/2ec2fdc2-b1f0-41c6-b462-263f05f3143c-kube-api-access-85mvl\") pod \"2ec2fdc2-b1f0-41c6-b462-263f05f3143c\" (UID: \"2ec2fdc2-b1f0-41c6-b462-263f05f3143c\") " Dec 08 10:45:03 crc kubenswrapper[4776]: I1208 10:45:03.279862 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ec2fdc2-b1f0-41c6-b462-263f05f3143c-config-volume" (OuterVolumeSpecName: "config-volume") pod "2ec2fdc2-b1f0-41c6-b462-263f05f3143c" (UID: "2ec2fdc2-b1f0-41c6-b462-263f05f3143c"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 10:45:03 crc kubenswrapper[4776]: I1208 10:45:03.285351 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ec2fdc2-b1f0-41c6-b462-263f05f3143c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2ec2fdc2-b1f0-41c6-b462-263f05f3143c" (UID: "2ec2fdc2-b1f0-41c6-b462-263f05f3143c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 10:45:03 crc kubenswrapper[4776]: I1208 10:45:03.287403 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ec2fdc2-b1f0-41c6-b462-263f05f3143c-kube-api-access-85mvl" (OuterVolumeSpecName: "kube-api-access-85mvl") pod "2ec2fdc2-b1f0-41c6-b462-263f05f3143c" (UID: "2ec2fdc2-b1f0-41c6-b462-263f05f3143c"). InnerVolumeSpecName "kube-api-access-85mvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 10:45:03 crc kubenswrapper[4776]: I1208 10:45:03.381685 4776 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2ec2fdc2-b1f0-41c6-b462-263f05f3143c-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 08 10:45:03 crc kubenswrapper[4776]: I1208 10:45:03.381718 4776 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2ec2fdc2-b1f0-41c6-b462-263f05f3143c-config-volume\") on node \"crc\" DevicePath \"\"" Dec 08 10:45:03 crc kubenswrapper[4776]: I1208 10:45:03.381729 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85mvl\" (UniqueName: \"kubernetes.io/projected/2ec2fdc2-b1f0-41c6-b462-263f05f3143c-kube-api-access-85mvl\") on node \"crc\" DevicePath \"\"" Dec 08 10:45:03 crc kubenswrapper[4776]: I1208 10:45:03.730364 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29419845-7hswj" 
event={"ID":"2ec2fdc2-b1f0-41c6-b462-263f05f3143c","Type":"ContainerDied","Data":"af55b3a477d0a4b6612a338e198d0ef0da033a6416b971ede86856eb23af3d85"} Dec 08 10:45:03 crc kubenswrapper[4776]: I1208 10:45:03.730406 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af55b3a477d0a4b6612a338e198d0ef0da033a6416b971ede86856eb23af3d85" Dec 08 10:45:03 crc kubenswrapper[4776]: I1208 10:45:03.730414 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29419845-7hswj" Dec 08 10:45:04 crc kubenswrapper[4776]: I1208 10:45:04.244574 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29419800-767t8"] Dec 08 10:45:04 crc kubenswrapper[4776]: I1208 10:45:04.254759 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29419800-767t8"] Dec 08 10:45:04 crc kubenswrapper[4776]: I1208 10:45:04.356267 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e82e989b-1b77-4703-a7e5-3e1eb29825e4" path="/var/lib/kubelet/pods/e82e989b-1b77-4703-a7e5-3e1eb29825e4/volumes" Dec 08 10:45:11 crc kubenswrapper[4776]: I1208 10:45:11.398888 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 10:45:11 crc kubenswrapper[4776]: I1208 10:45:11.399497 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 10:45:19 crc 
kubenswrapper[4776]: I1208 10:45:19.895055 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_a024b29b-cad1-489c-88ea-efc9558b2da0/aodh-api/0.log" Dec 08 10:45:20 crc kubenswrapper[4776]: I1208 10:45:20.093868 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_a024b29b-cad1-489c-88ea-efc9558b2da0/aodh-evaluator/0.log" Dec 08 10:45:20 crc kubenswrapper[4776]: I1208 10:45:20.102124 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_a024b29b-cad1-489c-88ea-efc9558b2da0/aodh-listener/0.log" Dec 08 10:45:20 crc kubenswrapper[4776]: I1208 10:45:20.160762 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_a024b29b-cad1-489c-88ea-efc9558b2da0/aodh-notifier/0.log" Dec 08 10:45:20 crc kubenswrapper[4776]: I1208 10:45:20.284960 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5d7dd8bd8b-9z2p2_71c29885-fdf1-4500-bee7-2b4102fb2c7e/barbican-api/0.log" Dec 08 10:45:20 crc kubenswrapper[4776]: I1208 10:45:20.286949 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5d7dd8bd8b-9z2p2_71c29885-fdf1-4500-bee7-2b4102fb2c7e/barbican-api-log/0.log" Dec 08 10:45:20 crc kubenswrapper[4776]: I1208 10:45:20.529122 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-78d78f694b-ck9wf_e501058f-25e0-456c-b23d-c7caafa729c3/barbican-keystone-listener/0.log" Dec 08 10:45:20 crc kubenswrapper[4776]: I1208 10:45:20.582274 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-664c575c59-ncvpr_d6258a3d-a50e-4cf4-af4d-e6f588d8744a/barbican-worker/0.log" Dec 08 10:45:20 crc kubenswrapper[4776]: I1208 10:45:20.590973 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-78d78f694b-ck9wf_e501058f-25e0-456c-b23d-c7caafa729c3/barbican-keystone-listener-log/0.log" Dec 08 10:45:20 crc 
kubenswrapper[4776]: I1208 10:45:20.728455 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-664c575c59-ncvpr_d6258a3d-a50e-4cf4-af4d-e6f588d8744a/barbican-worker-log/0.log" Dec 08 10:45:20 crc kubenswrapper[4776]: I1208 10:45:20.780590 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-mncj4_2304e249-86bc-4b0a-a222-e8c2ba39a0bb/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 10:45:20 crc kubenswrapper[4776]: I1208 10:45:20.896437 4776 scope.go:117] "RemoveContainer" containerID="af1975d8e1236c6cc084e3f3a5afa67b939d1e6444f7e62cd400b5be35dfe1a9" Dec 08 10:45:21 crc kubenswrapper[4776]: I1208 10:45:21.034077 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e7cf1c3e-6789-4ccd-894c-946f056f2d96/ceilometer-notification-agent/0.log" Dec 08 10:45:21 crc kubenswrapper[4776]: I1208 10:45:21.083877 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e7cf1c3e-6789-4ccd-894c-946f056f2d96/ceilometer-central-agent/0.log" Dec 08 10:45:21 crc kubenswrapper[4776]: I1208 10:45:21.105707 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e7cf1c3e-6789-4ccd-894c-946f056f2d96/proxy-httpd/0.log" Dec 08 10:45:21 crc kubenswrapper[4776]: I1208 10:45:21.174383 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e7cf1c3e-6789-4ccd-894c-946f056f2d96/sg-core/0.log" Dec 08 10:45:21 crc kubenswrapper[4776]: I1208 10:45:21.314351 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_dcb1d701-bc05-4d4b-8794-ebc4af6da8ba/cinder-api-log/0.log" Dec 08 10:45:21 crc kubenswrapper[4776]: I1208 10:45:21.380074 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_dcb1d701-bc05-4d4b-8794-ebc4af6da8ba/cinder-api/0.log" Dec 08 10:45:21 crc kubenswrapper[4776]: I1208 10:45:21.552968 
4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_289a9d84-e76a-42e5-9524-7e9b244b8743/cinder-scheduler/0.log" Dec 08 10:45:21 crc kubenswrapper[4776]: I1208 10:45:21.576839 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_289a9d84-e76a-42e5-9524-7e9b244b8743/probe/0.log" Dec 08 10:45:21 crc kubenswrapper[4776]: I1208 10:45:21.683705 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-kzwn2_6367602c-669d-474f-bd56-97c1b58659b4/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 10:45:21 crc kubenswrapper[4776]: I1208 10:45:21.818765 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-dhvs7_c92123e3-056d-4e4f-83b1-3cf335342a70/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 10:45:21 crc kubenswrapper[4776]: I1208 10:45:21.936303 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-zwnwt_569f45d2-4634-4246-873e-939ec98a0baf/init/0.log" Dec 08 10:45:22 crc kubenswrapper[4776]: I1208 10:45:22.109436 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-zwnwt_569f45d2-4634-4246-873e-939ec98a0baf/init/0.log" Dec 08 10:45:22 crc kubenswrapper[4776]: I1208 10:45:22.184733 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-75cmb_7af7dfaf-3db0-4c5d-b7fc-671893276afc/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 10:45:22 crc kubenswrapper[4776]: I1208 10:45:22.219799 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-zwnwt_569f45d2-4634-4246-873e-939ec98a0baf/dnsmasq-dns/0.log" Dec 08 10:45:22 crc kubenswrapper[4776]: I1208 10:45:22.431479 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-0_8f9758b1-4ae1-47ae-8a45-14b0df4c8632/glance-log/0.log" Dec 08 10:45:22 crc kubenswrapper[4776]: I1208 10:45:22.439685 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_8f9758b1-4ae1-47ae-8a45-14b0df4c8632/glance-httpd/0.log" Dec 08 10:45:22 crc kubenswrapper[4776]: I1208 10:45:22.628658 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_28ffab6e-5596-4c63-b58a-4417489fc47b/glance-httpd/0.log" Dec 08 10:45:22 crc kubenswrapper[4776]: I1208 10:45:22.663902 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_28ffab6e-5596-4c63-b58a-4417489fc47b/glance-log/0.log" Dec 08 10:45:23 crc kubenswrapper[4776]: I1208 10:45:23.481637 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-589b85487f-7v8kk_5c891ff5-fbcf-46b6-bace-6ef62df3c0b9/heat-engine/0.log" Dec 08 10:45:23 crc kubenswrapper[4776]: I1208 10:45:23.718072 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-55fd4bf697-njsxk_b7f47153-4e65-48b9-816d-4c83b0b0d8a4/heat-api/0.log" Dec 08 10:45:23 crc kubenswrapper[4776]: I1208 10:45:23.725209 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-sr4ml_7527bd54-54ba-42e5-9ec0-7037536864b9/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 10:45:23 crc kubenswrapper[4776]: I1208 10:45:23.906477 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-5fdf94c698-qz6j8_5f5f6ae8-0ed6-49ee-afe5-ab6fd356b4e4/heat-cfnapi/0.log" Dec 08 10:45:23 crc kubenswrapper[4776]: I1208 10:45:23.933358 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-tknt7_f67c7d60-bc4d-4712-a8d9-acb48e097264/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 10:45:24 crc kubenswrapper[4776]: I1208 10:45:24.177545 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29419801-s7tdk_53d2eccd-b6c4-4870-9ca0-f43dc8f0ce8e/keystone-cron/0.log" Dec 08 10:45:24 crc kubenswrapper[4776]: I1208 10:45:24.302096 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_90e6d887-db3e-40c6-9411-0e2565e5994d/kube-state-metrics/0.log" Dec 08 10:45:24 crc kubenswrapper[4776]: I1208 10:45:24.370487 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-77496dd4f7-8gxmg_6207b5a9-d7b8-4302-876c-c2a84bb352a1/keystone-api/0.log" Dec 08 10:45:24 crc kubenswrapper[4776]: I1208 10:45:24.450304 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-28blw_3933dc31-4df5-46ec-8fe0-62b9771c5515/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 10:45:24 crc kubenswrapper[4776]: I1208 10:45:24.519883 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_logging-edpm-deployment-openstack-edpm-ipam-7kf4c_4e957285-89ac-4a08-a5f9-a3199e19b787/logging-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 10:45:24 crc kubenswrapper[4776]: I1208 10:45:24.727125 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mysqld-exporter-0_4cce6b19-9d40-4957-8154-b4d3a50fe2f7/mysqld-exporter/0.log" Dec 08 10:45:25 crc kubenswrapper[4776]: I1208 10:45:25.031087 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-d54mt_30f7ff02-8887-44e7-a223-335cd93255ef/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 10:45:25 crc kubenswrapper[4776]: I1208 10:45:25.128490 4776 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_neutron-5fd69d7-r446k_f84e2e46-bb9f-4b55-afd1-683f365c5417/neutron-httpd/0.log" Dec 08 10:45:25 crc kubenswrapper[4776]: I1208 10:45:25.156994 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5fd69d7-r446k_f84e2e46-bb9f-4b55-afd1-683f365c5417/neutron-api/0.log" Dec 08 10:45:25 crc kubenswrapper[4776]: I1208 10:45:25.797532 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_ffbcb3b3-c4d6-461f-bae8-c1ae2de20050/nova-cell0-conductor-conductor/0.log" Dec 08 10:45:26 crc kubenswrapper[4776]: I1208 10:45:26.091816 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_56c71de4-c00f-47d6-87d7-c5eb97b88eef/nova-api-log/0.log" Dec 08 10:45:26 crc kubenswrapper[4776]: I1208 10:45:26.167522 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_9791ac59-89ef-4429-b797-d89d7ce62024/nova-cell1-conductor-conductor/0.log" Dec 08 10:45:26 crc kubenswrapper[4776]: I1208 10:45:26.406871 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_f485895b-f2aa-427f-b592-811f09089a49/nova-cell1-novncproxy-novncproxy/0.log" Dec 08 10:45:26 crc kubenswrapper[4776]: I1208 10:45:26.426158 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-tnlnp_9cd841cc-611f-406b-b9d5-8c242c1321ba/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 10:45:26 crc kubenswrapper[4776]: I1208 10:45:26.663610 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_56c71de4-c00f-47d6-87d7-c5eb97b88eef/nova-api-api/0.log" Dec 08 10:45:26 crc kubenswrapper[4776]: I1208 10:45:26.729474 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_e6c2fb50-f70b-43cc-a493-b4ffa4292c64/nova-metadata-log/0.log" Dec 08 10:45:27 crc kubenswrapper[4776]: I1208 10:45:27.081939 
4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_425d947a-2a85-4a03-853f-a60f54515a57/mysql-bootstrap/0.log" Dec 08 10:45:27 crc kubenswrapper[4776]: I1208 10:45:27.136192 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_e7651697-0db7-476f-8b50-1f04771b4ed2/nova-scheduler-scheduler/0.log" Dec 08 10:45:27 crc kubenswrapper[4776]: I1208 10:45:27.286705 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_425d947a-2a85-4a03-853f-a60f54515a57/galera/0.log" Dec 08 10:45:27 crc kubenswrapper[4776]: I1208 10:45:27.311563 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_425d947a-2a85-4a03-853f-a60f54515a57/mysql-bootstrap/0.log" Dec 08 10:45:27 crc kubenswrapper[4776]: I1208 10:45:27.614338 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7df4120e-0e93-4000-8b6a-7823f3e89dac/mysql-bootstrap/0.log" Dec 08 10:45:27 crc kubenswrapper[4776]: I1208 10:45:27.907062 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7df4120e-0e93-4000-8b6a-7823f3e89dac/mysql-bootstrap/0.log" Dec 08 10:45:27 crc kubenswrapper[4776]: I1208 10:45:27.965323 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7df4120e-0e93-4000-8b6a-7823f3e89dac/galera/0.log" Dec 08 10:45:28 crc kubenswrapper[4776]: I1208 10:45:28.135986 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_8606b034-7364-4dce-bea0-7c0e2067ee95/openstackclient/0.log" Dec 08 10:45:28 crc kubenswrapper[4776]: I1208 10:45:28.208358 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-zn4qk_e843ce72-b4b1-4603-8876-05dc121793ed/openstack-network-exporter/0.log" Dec 08 10:45:28 crc kubenswrapper[4776]: I1208 10:45:28.431162 4776 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5tfbk_215a9444-a545-491d-9eb6-02d98baff784/ovsdb-server-init/0.log" Dec 08 10:45:28 crc kubenswrapper[4776]: I1208 10:45:28.585454 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5tfbk_215a9444-a545-491d-9eb6-02d98baff784/ovsdb-server/0.log" Dec 08 10:45:28 crc kubenswrapper[4776]: I1208 10:45:28.656372 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5tfbk_215a9444-a545-491d-9eb6-02d98baff784/ovs-vswitchd/0.log" Dec 08 10:45:28 crc kubenswrapper[4776]: I1208 10:45:28.679310 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5tfbk_215a9444-a545-491d-9eb6-02d98baff784/ovsdb-server-init/0.log" Dec 08 10:45:28 crc kubenswrapper[4776]: I1208 10:45:28.909390 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-wpgmk_9a9a1b68-ec7e-4994-9bda-fd418747dbc5/ovn-controller/0.log" Dec 08 10:45:29 crc kubenswrapper[4776]: I1208 10:45:29.115545 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-k4c4r_abe6fd93-f916-47f2-854e-fa4d908fa9ad/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 10:45:29 crc kubenswrapper[4776]: I1208 10:45:29.119368 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_e6c2fb50-f70b-43cc-a493-b4ffa4292c64/nova-metadata-metadata/0.log" Dec 08 10:45:29 crc kubenswrapper[4776]: I1208 10:45:29.234220 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_96dd2435-6c8f-4ac2-9b72-43f82d2eeb52/openstack-network-exporter/0.log" Dec 08 10:45:29 crc kubenswrapper[4776]: I1208 10:45:29.350426 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_96dd2435-6c8f-4ac2-9b72-43f82d2eeb52/ovn-northd/0.log" Dec 08 10:45:29 crc kubenswrapper[4776]: I1208 10:45:29.368755 4776 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_0e4de746-d269-470c-b934-117aa4c73834/openstack-network-exporter/0.log" Dec 08 10:45:29 crc kubenswrapper[4776]: I1208 10:45:29.433283 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_0e4de746-d269-470c-b934-117aa4c73834/ovsdbserver-nb/0.log" Dec 08 10:45:29 crc kubenswrapper[4776]: I1208 10:45:29.569141 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_3d941bbc-2271-4ec4-853f-57feaf6ace36/openstack-network-exporter/0.log" Dec 08 10:45:29 crc kubenswrapper[4776]: I1208 10:45:29.598637 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_3d941bbc-2271-4ec4-853f-57feaf6ace36/ovsdbserver-sb/0.log" Dec 08 10:45:30 crc kubenswrapper[4776]: I1208 10:45:30.010839 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-75876fb99b-xnbd7_ae330c18-0140-4bc4-8503-cf6c3bbce3d8/placement-api/0.log" Dec 08 10:45:30 crc kubenswrapper[4776]: I1208 10:45:30.042507 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-75876fb99b-xnbd7_ae330c18-0140-4bc4-8503-cf6c3bbce3d8/placement-log/0.log" Dec 08 10:45:30 crc kubenswrapper[4776]: I1208 10:45:30.062736 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_95be142a-2a8f-4f5c-97e0-2e64e108fb8b/init-config-reloader/0.log" Dec 08 10:45:30 crc kubenswrapper[4776]: I1208 10:45:30.309063 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_95be142a-2a8f-4f5c-97e0-2e64e108fb8b/config-reloader/0.log" Dec 08 10:45:30 crc kubenswrapper[4776]: I1208 10:45:30.312862 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_95be142a-2a8f-4f5c-97e0-2e64e108fb8b/init-config-reloader/0.log" Dec 08 10:45:30 crc kubenswrapper[4776]: I1208 10:45:30.331397 
4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_95be142a-2a8f-4f5c-97e0-2e64e108fb8b/thanos-sidecar/0.log" Dec 08 10:45:30 crc kubenswrapper[4776]: I1208 10:45:30.368615 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_95be142a-2a8f-4f5c-97e0-2e64e108fb8b/prometheus/0.log" Dec 08 10:45:30 crc kubenswrapper[4776]: I1208 10:45:30.543908 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_87931091-7230-4451-9d94-20ac4b8458bc/setup-container/0.log" Dec 08 10:45:30 crc kubenswrapper[4776]: I1208 10:45:30.745725 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_87931091-7230-4451-9d94-20ac4b8458bc/setup-container/0.log" Dec 08 10:45:30 crc kubenswrapper[4776]: I1208 10:45:30.800765 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_87931091-7230-4451-9d94-20ac4b8458bc/rabbitmq/0.log" Dec 08 10:45:30 crc kubenswrapper[4776]: I1208 10:45:30.887559 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ab6303ff-9104-40ed-babe-1445f4cd89e2/setup-container/0.log" Dec 08 10:45:31 crc kubenswrapper[4776]: I1208 10:45:31.080401 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ab6303ff-9104-40ed-babe-1445f4cd89e2/setup-container/0.log" Dec 08 10:45:31 crc kubenswrapper[4776]: I1208 10:45:31.113054 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-mkk2n_26d6a987-fa87-4870-97f8-30aa5b38b753/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 10:45:31 crc kubenswrapper[4776]: I1208 10:45:31.182958 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ab6303ff-9104-40ed-babe-1445f4cd89e2/rabbitmq/0.log" Dec 08 10:45:31 crc kubenswrapper[4776]: 
I1208 10:45:31.321327 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-xbms7_9419c01b-956b-4781-a8bf-e2e1472ad2cf/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 10:45:31 crc kubenswrapper[4776]: I1208 10:45:31.412861 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-h9hfq_8b119f36-1ae0-4826-8043-4e038e4398a3/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 10:45:31 crc kubenswrapper[4776]: I1208 10:45:31.584552 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-97xln_31f822a4-fa31-4cae-b24f-a1c1395caf05/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 10:45:31 crc kubenswrapper[4776]: I1208 10:45:31.706441 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-cxgqb_60899add-1d95-4fad-8cee-852951046a90/ssh-known-hosts-edpm-deployment/0.log" Dec 08 10:45:32 crc kubenswrapper[4776]: I1208 10:45:32.164747 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5cfbc4c5f-hhnf9_aa5389fb-4ae8-45b1-baaf-18f2fea3f61c/proxy-server/0.log" Dec 08 10:45:32 crc kubenswrapper[4776]: I1208 10:45:32.278403 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5cfbc4c5f-hhnf9_aa5389fb-4ae8-45b1-baaf-18f2fea3f61c/proxy-httpd/0.log" Dec 08 10:45:32 crc kubenswrapper[4776]: I1208 10:45:32.296371 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-mmp8z_0436afba-d4b2-47d8-ac4d-c621e029333d/swift-ring-rebalance/0.log" Dec 08 10:45:32 crc kubenswrapper[4776]: I1208 10:45:32.434359 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cb640491-a8e7-4f8d-b4bb-1d0124f5727f/account-auditor/0.log" Dec 08 10:45:32 crc kubenswrapper[4776]: I1208 10:45:32.531192 4776 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cb640491-a8e7-4f8d-b4bb-1d0124f5727f/account-reaper/0.log" Dec 08 10:45:32 crc kubenswrapper[4776]: I1208 10:45:32.580630 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cb640491-a8e7-4f8d-b4bb-1d0124f5727f/account-replicator/0.log" Dec 08 10:45:32 crc kubenswrapper[4776]: I1208 10:45:32.681514 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cb640491-a8e7-4f8d-b4bb-1d0124f5727f/container-auditor/0.log" Dec 08 10:45:32 crc kubenswrapper[4776]: I1208 10:45:32.691442 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cb640491-a8e7-4f8d-b4bb-1d0124f5727f/account-server/0.log" Dec 08 10:45:32 crc kubenswrapper[4776]: I1208 10:45:32.785751 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cb640491-a8e7-4f8d-b4bb-1d0124f5727f/container-replicator/0.log" Dec 08 10:45:32 crc kubenswrapper[4776]: I1208 10:45:32.799853 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cb640491-a8e7-4f8d-b4bb-1d0124f5727f/container-server/0.log" Dec 08 10:45:32 crc kubenswrapper[4776]: I1208 10:45:32.934317 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cb640491-a8e7-4f8d-b4bb-1d0124f5727f/container-updater/0.log" Dec 08 10:45:32 crc kubenswrapper[4776]: I1208 10:45:32.957507 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cb640491-a8e7-4f8d-b4bb-1d0124f5727f/object-auditor/0.log" Dec 08 10:45:32 crc kubenswrapper[4776]: I1208 10:45:32.970551 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cb640491-a8e7-4f8d-b4bb-1d0124f5727f/object-expirer/0.log" Dec 08 10:45:33 crc kubenswrapper[4776]: I1208 10:45:33.079790 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_cb640491-a8e7-4f8d-b4bb-1d0124f5727f/object-replicator/0.log" Dec 08 10:45:33 crc kubenswrapper[4776]: I1208 10:45:33.160047 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cb640491-a8e7-4f8d-b4bb-1d0124f5727f/object-server/0.log" Dec 08 10:45:33 crc kubenswrapper[4776]: I1208 10:45:33.191425 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cb640491-a8e7-4f8d-b4bb-1d0124f5727f/rsync/0.log" Dec 08 10:45:33 crc kubenswrapper[4776]: I1208 10:45:33.229951 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cb640491-a8e7-4f8d-b4bb-1d0124f5727f/object-updater/0.log" Dec 08 10:45:33 crc kubenswrapper[4776]: I1208 10:45:33.286196 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_cb640491-a8e7-4f8d-b4bb-1d0124f5727f/swift-recon-cron/0.log" Dec 08 10:45:33 crc kubenswrapper[4776]: I1208 10:45:33.474447 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-89ghm_b0b1960a-6fc8-4fd1-adb6-9e7b5fe42f0e/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 10:45:33 crc kubenswrapper[4776]: I1208 10:45:33.555962 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-power-monitoring-edpm-deployment-openstack-edpm-zrvjp_18a0027c-b2f9-4c57-9f94-30b31659d298/telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 10:45:33 crc kubenswrapper[4776]: I1208 10:45:33.761236 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_dcfe5c37-ca0e-44d6-9051-bdf107f11cdb/test-operator-logs-container/0.log" Dec 08 10:45:33 crc kubenswrapper[4776]: I1208 10:45:33.957452 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-js6sf_d8284a3c-c72c-41f5-aefe-bbc881bf969b/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 10:45:34 crc kubenswrapper[4776]: I1208 10:45:34.325843 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_9c3d4f25-4353-4b82-8de9-ee14a2f05076/tempest-tests-tempest-tests-runner/0.log" Dec 08 10:45:38 crc kubenswrapper[4776]: I1208 10:45:38.984999 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_981d14af-244f-4679-975d-58e11df95718/memcached/0.log" Dec 08 10:45:41 crc kubenswrapper[4776]: I1208 10:45:41.398669 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 10:45:41 crc kubenswrapper[4776]: I1208 10:45:41.399205 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 10:45:59 crc kubenswrapper[4776]: I1208 10:45:59.374255 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_16991a732252c99b57872cb3554920243a1d5011faf6df1b6d2249e50ajqd67_d003bbac-1fa9-4696-aded-39e4b8d211ff/util/0.log" Dec 08 10:45:59 crc kubenswrapper[4776]: I1208 10:45:59.574127 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_16991a732252c99b57872cb3554920243a1d5011faf6df1b6d2249e50ajqd67_d003bbac-1fa9-4696-aded-39e4b8d211ff/util/0.log" Dec 08 10:45:59 crc kubenswrapper[4776]: I1208 10:45:59.579024 4776 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack-operators_16991a732252c99b57872cb3554920243a1d5011faf6df1b6d2249e50ajqd67_d003bbac-1fa9-4696-aded-39e4b8d211ff/pull/0.log" Dec 08 10:45:59 crc kubenswrapper[4776]: I1208 10:45:59.583456 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_16991a732252c99b57872cb3554920243a1d5011faf6df1b6d2249e50ajqd67_d003bbac-1fa9-4696-aded-39e4b8d211ff/pull/0.log" Dec 08 10:45:59 crc kubenswrapper[4776]: I1208 10:45:59.819935 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_16991a732252c99b57872cb3554920243a1d5011faf6df1b6d2249e50ajqd67_d003bbac-1fa9-4696-aded-39e4b8d211ff/extract/0.log" Dec 08 10:45:59 crc kubenswrapper[4776]: I1208 10:45:59.836408 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_16991a732252c99b57872cb3554920243a1d5011faf6df1b6d2249e50ajqd67_d003bbac-1fa9-4696-aded-39e4b8d211ff/pull/0.log" Dec 08 10:45:59 crc kubenswrapper[4776]: I1208 10:45:59.843607 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_16991a732252c99b57872cb3554920243a1d5011faf6df1b6d2249e50ajqd67_d003bbac-1fa9-4696-aded-39e4b8d211ff/util/0.log" Dec 08 10:45:59 crc kubenswrapper[4776]: I1208 10:45:59.990543 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-rgfzz_b39e8644-6fb7-4d7c-a623-c0eadac0e896/kube-rbac-proxy/0.log" Dec 08 10:46:00 crc kubenswrapper[4776]: I1208 10:46:00.027575 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6c677c69b-2g4ph_ad1d3b70-6eea-46a4-bdc1-82144fe12f4a/kube-rbac-proxy/0.log" Dec 08 10:46:00 crc kubenswrapper[4776]: I1208 10:46:00.113549 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-rgfzz_b39e8644-6fb7-4d7c-a623-c0eadac0e896/manager/0.log" Dec 08 10:46:00 crc 
kubenswrapper[4776]: I1208 10:46:00.190142 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6c677c69b-2g4ph_ad1d3b70-6eea-46a4-bdc1-82144fe12f4a/manager/0.log" Dec 08 10:46:00 crc kubenswrapper[4776]: I1208 10:46:00.279498 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-697fb699cf-ftb4x_316c9728-ccef-4981-9903-895ab86e6616/kube-rbac-proxy/0.log" Dec 08 10:46:00 crc kubenswrapper[4776]: I1208 10:46:00.344024 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-697fb699cf-ftb4x_316c9728-ccef-4981-9903-895ab86e6616/manager/0.log" Dec 08 10:46:00 crc kubenswrapper[4776]: I1208 10:46:00.466031 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5697bb5779-897nd_bb123983-a71d-4eca-84e8-6c116cc9b3b6/kube-rbac-proxy/0.log" Dec 08 10:46:00 crc kubenswrapper[4776]: I1208 10:46:00.555155 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5697bb5779-897nd_bb123983-a71d-4eca-84e8-6c116cc9b3b6/manager/0.log" Dec 08 10:46:00 crc kubenswrapper[4776]: I1208 10:46:00.573738 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-f2bnk_f85d592d-d82d-4c08-aafb-e9a7e68ef386/kube-rbac-proxy/0.log" Dec 08 10:46:00 crc kubenswrapper[4776]: I1208 10:46:00.748080 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-f2bnk_f85d592d-d82d-4c08-aafb-e9a7e68ef386/manager/0.log" Dec 08 10:46:00 crc kubenswrapper[4776]: I1208 10:46:00.791220 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-4k8qf_beadb3ee-3cd9-4c83-ba1f-9f599cd24940/kube-rbac-proxy/0.log" Dec 08 10:46:00 crc kubenswrapper[4776]: I1208 10:46:00.825973 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-4k8qf_beadb3ee-3cd9-4c83-ba1f-9f599cd24940/manager/0.log" Dec 08 10:46:00 crc kubenswrapper[4776]: I1208 10:46:00.952793 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-78d48bff9d-87pfw_dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0/kube-rbac-proxy/0.log" Dec 08 10:46:01 crc kubenswrapper[4776]: I1208 10:46:01.199637 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-967d97867-4dj2x_422088d1-15c7-4791-b0c9-a12a2c5e2880/kube-rbac-proxy/0.log" Dec 08 10:46:01 crc kubenswrapper[4776]: I1208 10:46:01.240096 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-78d48bff9d-87pfw_dbaeeb10-4e2f-43f7-85a8-8f9ec668e3f0/manager/0.log" Dec 08 10:46:01 crc kubenswrapper[4776]: I1208 10:46:01.242521 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-967d97867-4dj2x_422088d1-15c7-4791-b0c9-a12a2c5e2880/manager/0.log" Dec 08 10:46:01 crc kubenswrapper[4776]: I1208 10:46:01.417873 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-7smkr_0f590af7-17bd-46c4-8a25-ba3a368c6382/kube-rbac-proxy/0.log" Dec 08 10:46:01 crc kubenswrapper[4776]: I1208 10:46:01.498479 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-7smkr_0f590af7-17bd-46c4-8a25-ba3a368c6382/manager/0.log" Dec 08 10:46:01 crc kubenswrapper[4776]: I1208 10:46:01.634617 
4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5b5fd79c9c-g66m2_ff110975-7e1d-4d6d-bd10-b666cd8fe98b/kube-rbac-proxy/0.log" Dec 08 10:46:01 crc kubenswrapper[4776]: I1208 10:46:01.667527 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5b5fd79c9c-g66m2_ff110975-7e1d-4d6d-bd10-b666cd8fe98b/manager/0.log" Dec 08 10:46:01 crc kubenswrapper[4776]: I1208 10:46:01.715536 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-79c8c4686c-jgmdb_2a4ffe83-5f4d-4a7a-a2b6-64d12bd8f3f9/kube-rbac-proxy/0.log" Dec 08 10:46:01 crc kubenswrapper[4776]: I1208 10:46:01.846898 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-79c8c4686c-jgmdb_2a4ffe83-5f4d-4a7a-a2b6-64d12bd8f3f9/manager/0.log" Dec 08 10:46:01 crc kubenswrapper[4776]: I1208 10:46:01.907288 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-dqgnv_288a9127-92ed-4b19-8cc5-34b1f9b51201/kube-rbac-proxy/0.log" Dec 08 10:46:01 crc kubenswrapper[4776]: I1208 10:46:01.979491 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-dqgnv_288a9127-92ed-4b19-8cc5-34b1f9b51201/manager/0.log" Dec 08 10:46:02 crc kubenswrapper[4776]: I1208 10:46:02.087038 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-k928c_545c7a23-3539-4923-bd9e-8d64700070b5/kube-rbac-proxy/0.log" Dec 08 10:46:02 crc kubenswrapper[4776]: I1208 10:46:02.183102 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-k928c_545c7a23-3539-4923-bd9e-8d64700070b5/manager/0.log" Dec 08 10:46:02 crc 
kubenswrapper[4776]: I1208 10:46:02.311066 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-l979f_d07c95ca-1871-4ba3-81e5-c7b4d86bb0f4/manager/0.log" Dec 08 10:46:02 crc kubenswrapper[4776]: I1208 10:46:02.314452 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-l979f_d07c95ca-1871-4ba3-81e5-c7b4d86bb0f4/kube-rbac-proxy/0.log" Dec 08 10:46:02 crc kubenswrapper[4776]: I1208 10:46:02.432301 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-84b575879f7smn5_0cb0505b-eb0f-4801-841d-8a96fe29e608/kube-rbac-proxy/0.log" Dec 08 10:46:02 crc kubenswrapper[4776]: I1208 10:46:02.571810 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-84b575879f7smn5_0cb0505b-eb0f-4801-841d-8a96fe29e608/manager/0.log" Dec 08 10:46:02 crc kubenswrapper[4776]: I1208 10:46:02.993550 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-ndvrw_4bb0bbd1-4377-4f99-b0f3-e657e4c2a792/registry-server/0.log" Dec 08 10:46:03 crc kubenswrapper[4776]: I1208 10:46:03.040334 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-5546b8686f-m7kf9_90449ceb-bf22-41c3-a66a-3f01c6e46edc/operator/0.log" Dec 08 10:46:03 crc kubenswrapper[4776]: I1208 10:46:03.215089 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-gk9xw_8cd2dc5d-1433-4660-9d65-bf49d398415f/kube-rbac-proxy/0.log" Dec 08 10:46:03 crc kubenswrapper[4776]: I1208 10:46:03.321808 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-gk9xw_8cd2dc5d-1433-4660-9d65-bf49d398415f/manager/0.log" Dec 08 10:46:03 crc kubenswrapper[4776]: I1208 10:46:03.436475 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-mdm5f_482e5641-8a00-4fc3-b7d3-6eb88dbee1e4/kube-rbac-proxy/0.log" Dec 08 10:46:03 crc kubenswrapper[4776]: I1208 10:46:03.458906 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-mdm5f_482e5641-8a00-4fc3-b7d3-6eb88dbee1e4/manager/0.log" Dec 08 10:46:03 crc kubenswrapper[4776]: I1208 10:46:03.609644 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-xxv7g_d8a1143b-5dc6-4a99-a6e4-f155585ebbcb/operator/0.log" Dec 08 10:46:03 crc kubenswrapper[4776]: I1208 10:46:03.746786 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9d58d64bc-ncfrf_6ea3ffdd-a922-487e-a738-da3091a1656e/kube-rbac-proxy/0.log" Dec 08 10:46:03 crc kubenswrapper[4776]: I1208 10:46:03.846320 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9d58d64bc-ncfrf_6ea3ffdd-a922-487e-a738-da3091a1656e/manager/0.log" Dec 08 10:46:03 crc kubenswrapper[4776]: I1208 10:46:03.984753 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-68f9cdc5f7-scgrq_7134ec23-7ec3-454d-b837-29fbe7094067/kube-rbac-proxy/0.log" Dec 08 10:46:04 crc kubenswrapper[4776]: I1208 10:46:04.009118 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-57686cd5df-zt7pj_e7f1ff45-22cc-41bd-a0a3-0b5ed3d66a23/manager/0.log" Dec 08 10:46:04 crc kubenswrapper[4776]: I1208 10:46:04.211978 4776 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-xtf2f_c8f3f832-68f1-47a2-bb3d-5d67f54655ce/kube-rbac-proxy/0.log" Dec 08 10:46:04 crc kubenswrapper[4776]: I1208 10:46:04.219914 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-68f9cdc5f7-scgrq_7134ec23-7ec3-454d-b837-29fbe7094067/manager/0.log" Dec 08 10:46:04 crc kubenswrapper[4776]: I1208 10:46:04.249683 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-xtf2f_c8f3f832-68f1-47a2-bb3d-5d67f54655ce/manager/0.log" Dec 08 10:46:04 crc kubenswrapper[4776]: I1208 10:46:04.252489 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kkm6t"] Dec 08 10:46:04 crc kubenswrapper[4776]: E1208 10:46:04.253117 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ec2fdc2-b1f0-41c6-b462-263f05f3143c" containerName="collect-profiles" Dec 08 10:46:04 crc kubenswrapper[4776]: I1208 10:46:04.253139 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ec2fdc2-b1f0-41c6-b462-263f05f3143c" containerName="collect-profiles" Dec 08 10:46:04 crc kubenswrapper[4776]: I1208 10:46:04.253421 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ec2fdc2-b1f0-41c6-b462-263f05f3143c" containerName="collect-profiles" Dec 08 10:46:04 crc kubenswrapper[4776]: I1208 10:46:04.255550 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kkm6t" Dec 08 10:46:04 crc kubenswrapper[4776]: I1208 10:46:04.262764 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kkm6t"] Dec 08 10:46:04 crc kubenswrapper[4776]: I1208 10:46:04.403046 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-667bd8d554-kfz2m_61424c2d-bdc7-431a-8f12-535e1e97ce4b/kube-rbac-proxy/0.log" Dec 08 10:46:04 crc kubenswrapper[4776]: I1208 10:46:04.425389 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86p98\" (UniqueName: \"kubernetes.io/projected/ba6b02b2-693e-4508-8248-0e5b0a38e313-kube-api-access-86p98\") pod \"redhat-marketplace-kkm6t\" (UID: \"ba6b02b2-693e-4508-8248-0e5b0a38e313\") " pod="openshift-marketplace/redhat-marketplace-kkm6t" Dec 08 10:46:04 crc kubenswrapper[4776]: I1208 10:46:04.425581 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba6b02b2-693e-4508-8248-0e5b0a38e313-catalog-content\") pod \"redhat-marketplace-kkm6t\" (UID: \"ba6b02b2-693e-4508-8248-0e5b0a38e313\") " pod="openshift-marketplace/redhat-marketplace-kkm6t" Dec 08 10:46:04 crc kubenswrapper[4776]: I1208 10:46:04.425712 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba6b02b2-693e-4508-8248-0e5b0a38e313-utilities\") pod \"redhat-marketplace-kkm6t\" (UID: \"ba6b02b2-693e-4508-8248-0e5b0a38e313\") " pod="openshift-marketplace/redhat-marketplace-kkm6t" Dec 08 10:46:04 crc kubenswrapper[4776]: I1208 10:46:04.462480 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-667bd8d554-kfz2m_61424c2d-bdc7-431a-8f12-535e1e97ce4b/manager/0.log" Dec 08 10:46:04 crc kubenswrapper[4776]: I1208 10:46:04.528424 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba6b02b2-693e-4508-8248-0e5b0a38e313-catalog-content\") pod \"redhat-marketplace-kkm6t\" (UID: \"ba6b02b2-693e-4508-8248-0e5b0a38e313\") " pod="openshift-marketplace/redhat-marketplace-kkm6t" Dec 08 10:46:04 crc kubenswrapper[4776]: I1208 10:46:04.528510 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba6b02b2-693e-4508-8248-0e5b0a38e313-utilities\") pod \"redhat-marketplace-kkm6t\" (UID: \"ba6b02b2-693e-4508-8248-0e5b0a38e313\") " pod="openshift-marketplace/redhat-marketplace-kkm6t" Dec 08 10:46:04 crc kubenswrapper[4776]: I1208 10:46:04.528801 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86p98\" (UniqueName: \"kubernetes.io/projected/ba6b02b2-693e-4508-8248-0e5b0a38e313-kube-api-access-86p98\") pod \"redhat-marketplace-kkm6t\" (UID: \"ba6b02b2-693e-4508-8248-0e5b0a38e313\") " pod="openshift-marketplace/redhat-marketplace-kkm6t" Dec 08 10:46:04 crc kubenswrapper[4776]: I1208 10:46:04.530570 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba6b02b2-693e-4508-8248-0e5b0a38e313-utilities\") pod \"redhat-marketplace-kkm6t\" (UID: \"ba6b02b2-693e-4508-8248-0e5b0a38e313\") " pod="openshift-marketplace/redhat-marketplace-kkm6t" Dec 08 10:46:04 crc kubenswrapper[4776]: I1208 10:46:04.530826 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba6b02b2-693e-4508-8248-0e5b0a38e313-catalog-content\") pod \"redhat-marketplace-kkm6t\" (UID: 
\"ba6b02b2-693e-4508-8248-0e5b0a38e313\") " pod="openshift-marketplace/redhat-marketplace-kkm6t" Dec 08 10:46:04 crc kubenswrapper[4776]: I1208 10:46:04.558590 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86p98\" (UniqueName: \"kubernetes.io/projected/ba6b02b2-693e-4508-8248-0e5b0a38e313-kube-api-access-86p98\") pod \"redhat-marketplace-kkm6t\" (UID: \"ba6b02b2-693e-4508-8248-0e5b0a38e313\") " pod="openshift-marketplace/redhat-marketplace-kkm6t" Dec 08 10:46:04 crc kubenswrapper[4776]: I1208 10:46:04.582396 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kkm6t" Dec 08 10:46:05 crc kubenswrapper[4776]: I1208 10:46:05.374243 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kkm6t"] Dec 08 10:46:06 crc kubenswrapper[4776]: I1208 10:46:06.362611 4776 generic.go:334] "Generic (PLEG): container finished" podID="ba6b02b2-693e-4508-8248-0e5b0a38e313" containerID="ba39a2b161bccb0d0f0e9eb5eed680c3bc50e784f56baef0b93210d79d996f4b" exitCode=0 Dec 08 10:46:06 crc kubenswrapper[4776]: I1208 10:46:06.362717 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kkm6t" event={"ID":"ba6b02b2-693e-4508-8248-0e5b0a38e313","Type":"ContainerDied","Data":"ba39a2b161bccb0d0f0e9eb5eed680c3bc50e784f56baef0b93210d79d996f4b"} Dec 08 10:46:06 crc kubenswrapper[4776]: I1208 10:46:06.362925 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kkm6t" event={"ID":"ba6b02b2-693e-4508-8248-0e5b0a38e313","Type":"ContainerStarted","Data":"e732aa41c91c47e18a5ddd93dd4f4f5a445caa992c8359e3534c72b3a849eb7c"} Dec 08 10:46:07 crc kubenswrapper[4776]: I1208 10:46:07.374137 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kkm6t" 
event={"ID":"ba6b02b2-693e-4508-8248-0e5b0a38e313","Type":"ContainerStarted","Data":"765b496cea0f550f4cfc62f9e8811e103e466f76804c1459f11d08082197aaf4"} Dec 08 10:46:08 crc kubenswrapper[4776]: I1208 10:46:08.384183 4776 generic.go:334] "Generic (PLEG): container finished" podID="ba6b02b2-693e-4508-8248-0e5b0a38e313" containerID="765b496cea0f550f4cfc62f9e8811e103e466f76804c1459f11d08082197aaf4" exitCode=0 Dec 08 10:46:08 crc kubenswrapper[4776]: I1208 10:46:08.384227 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kkm6t" event={"ID":"ba6b02b2-693e-4508-8248-0e5b0a38e313","Type":"ContainerDied","Data":"765b496cea0f550f4cfc62f9e8811e103e466f76804c1459f11d08082197aaf4"} Dec 08 10:46:09 crc kubenswrapper[4776]: I1208 10:46:09.395845 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kkm6t" event={"ID":"ba6b02b2-693e-4508-8248-0e5b0a38e313","Type":"ContainerStarted","Data":"f4bf2afa4d09de466bcdd163612fecec7006312bbabd06eb93d57521f01511ad"} Dec 08 10:46:09 crc kubenswrapper[4776]: I1208 10:46:09.437627 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kkm6t" podStartSLOduration=3.003156971 podStartE2EDuration="5.437555231s" podCreationTimestamp="2025-12-08 10:46:04 +0000 UTC" firstStartedPulling="2025-12-08 10:46:06.364767128 +0000 UTC m=+6442.627992150" lastFinishedPulling="2025-12-08 10:46:08.799165388 +0000 UTC m=+6445.062390410" observedRunningTime="2025-12-08 10:46:09.425320485 +0000 UTC m=+6445.688545507" watchObservedRunningTime="2025-12-08 10:46:09.437555231 +0000 UTC m=+6445.700780253" Dec 08 10:46:11 crc kubenswrapper[4776]: I1208 10:46:11.399204 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Dec 08 10:46:11 crc kubenswrapper[4776]: I1208 10:46:11.399284 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 10:46:11 crc kubenswrapper[4776]: I1208 10:46:11.399336 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" Dec 08 10:46:11 crc kubenswrapper[4776]: I1208 10:46:11.400364 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e77f61c54751c78fce5e444bed924b95086a9bf8d6036029988def8a32ebeb2b"} pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 08 10:46:11 crc kubenswrapper[4776]: I1208 10:46:11.400420 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" containerID="cri-o://e77f61c54751c78fce5e444bed924b95086a9bf8d6036029988def8a32ebeb2b" gracePeriod=600 Dec 08 10:46:11 crc kubenswrapper[4776]: E1208 10:46:11.520784 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:46:12 crc 
kubenswrapper[4776]: I1208 10:46:12.428748 4776 generic.go:334] "Generic (PLEG): container finished" podID="c9788ab1-1031-4103-a769-a4b3177c7268" containerID="e77f61c54751c78fce5e444bed924b95086a9bf8d6036029988def8a32ebeb2b" exitCode=0 Dec 08 10:46:12 crc kubenswrapper[4776]: I1208 10:46:12.428784 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" event={"ID":"c9788ab1-1031-4103-a769-a4b3177c7268","Type":"ContainerDied","Data":"e77f61c54751c78fce5e444bed924b95086a9bf8d6036029988def8a32ebeb2b"} Dec 08 10:46:12 crc kubenswrapper[4776]: I1208 10:46:12.428851 4776 scope.go:117] "RemoveContainer" containerID="204440af754cab96c5a4e55db9a243723d836f694200606bc30e8bb3cce0cb54" Dec 08 10:46:12 crc kubenswrapper[4776]: I1208 10:46:12.429516 4776 scope.go:117] "RemoveContainer" containerID="e77f61c54751c78fce5e444bed924b95086a9bf8d6036029988def8a32ebeb2b" Dec 08 10:46:12 crc kubenswrapper[4776]: E1208 10:46:12.429854 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:46:14 crc kubenswrapper[4776]: I1208 10:46:14.583719 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kkm6t" Dec 08 10:46:14 crc kubenswrapper[4776]: I1208 10:46:14.584247 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kkm6t" Dec 08 10:46:14 crc kubenswrapper[4776]: I1208 10:46:14.640857 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kkm6t" Dec 08 
10:46:15 crc kubenswrapper[4776]: I1208 10:46:15.525267 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kkm6t" Dec 08 10:46:15 crc kubenswrapper[4776]: I1208 10:46:15.575785 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kkm6t"] Dec 08 10:46:17 crc kubenswrapper[4776]: I1208 10:46:17.481493 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kkm6t" podUID="ba6b02b2-693e-4508-8248-0e5b0a38e313" containerName="registry-server" containerID="cri-o://f4bf2afa4d09de466bcdd163612fecec7006312bbabd06eb93d57521f01511ad" gracePeriod=2 Dec 08 10:46:18 crc kubenswrapper[4776]: I1208 10:46:18.035254 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kkm6t" Dec 08 10:46:18 crc kubenswrapper[4776]: I1208 10:46:18.153251 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba6b02b2-693e-4508-8248-0e5b0a38e313-utilities\") pod \"ba6b02b2-693e-4508-8248-0e5b0a38e313\" (UID: \"ba6b02b2-693e-4508-8248-0e5b0a38e313\") " Dec 08 10:46:18 crc kubenswrapper[4776]: I1208 10:46:18.153311 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86p98\" (UniqueName: \"kubernetes.io/projected/ba6b02b2-693e-4508-8248-0e5b0a38e313-kube-api-access-86p98\") pod \"ba6b02b2-693e-4508-8248-0e5b0a38e313\" (UID: \"ba6b02b2-693e-4508-8248-0e5b0a38e313\") " Dec 08 10:46:18 crc kubenswrapper[4776]: I1208 10:46:18.153392 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba6b02b2-693e-4508-8248-0e5b0a38e313-catalog-content\") pod \"ba6b02b2-693e-4508-8248-0e5b0a38e313\" (UID: \"ba6b02b2-693e-4508-8248-0e5b0a38e313\") " Dec 08 
10:46:18 crc kubenswrapper[4776]: I1208 10:46:18.154165 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba6b02b2-693e-4508-8248-0e5b0a38e313-utilities" (OuterVolumeSpecName: "utilities") pod "ba6b02b2-693e-4508-8248-0e5b0a38e313" (UID: "ba6b02b2-693e-4508-8248-0e5b0a38e313"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 10:46:18 crc kubenswrapper[4776]: I1208 10:46:18.163261 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba6b02b2-693e-4508-8248-0e5b0a38e313-kube-api-access-86p98" (OuterVolumeSpecName: "kube-api-access-86p98") pod "ba6b02b2-693e-4508-8248-0e5b0a38e313" (UID: "ba6b02b2-693e-4508-8248-0e5b0a38e313"). InnerVolumeSpecName "kube-api-access-86p98". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 10:46:18 crc kubenswrapper[4776]: I1208 10:46:18.183796 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba6b02b2-693e-4508-8248-0e5b0a38e313-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ba6b02b2-693e-4508-8248-0e5b0a38e313" (UID: "ba6b02b2-693e-4508-8248-0e5b0a38e313"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 10:46:18 crc kubenswrapper[4776]: I1208 10:46:18.257135 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba6b02b2-693e-4508-8248-0e5b0a38e313-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 10:46:18 crc kubenswrapper[4776]: I1208 10:46:18.257508 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86p98\" (UniqueName: \"kubernetes.io/projected/ba6b02b2-693e-4508-8248-0e5b0a38e313-kube-api-access-86p98\") on node \"crc\" DevicePath \"\"" Dec 08 10:46:18 crc kubenswrapper[4776]: I1208 10:46:18.257612 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba6b02b2-693e-4508-8248-0e5b0a38e313-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 10:46:18 crc kubenswrapper[4776]: I1208 10:46:18.494139 4776 generic.go:334] "Generic (PLEG): container finished" podID="ba6b02b2-693e-4508-8248-0e5b0a38e313" containerID="f4bf2afa4d09de466bcdd163612fecec7006312bbabd06eb93d57521f01511ad" exitCode=0 Dec 08 10:46:18 crc kubenswrapper[4776]: I1208 10:46:18.494193 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kkm6t" event={"ID":"ba6b02b2-693e-4508-8248-0e5b0a38e313","Type":"ContainerDied","Data":"f4bf2afa4d09de466bcdd163612fecec7006312bbabd06eb93d57521f01511ad"} Dec 08 10:46:18 crc kubenswrapper[4776]: I1208 10:46:18.494229 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kkm6t" event={"ID":"ba6b02b2-693e-4508-8248-0e5b0a38e313","Type":"ContainerDied","Data":"e732aa41c91c47e18a5ddd93dd4f4f5a445caa992c8359e3534c72b3a849eb7c"} Dec 08 10:46:18 crc kubenswrapper[4776]: I1208 10:46:18.494248 4776 scope.go:117] "RemoveContainer" containerID="f4bf2afa4d09de466bcdd163612fecec7006312bbabd06eb93d57521f01511ad" Dec 08 10:46:18 crc kubenswrapper[4776]: I1208 
10:46:18.494279 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kkm6t" Dec 08 10:46:18 crc kubenswrapper[4776]: I1208 10:46:18.526918 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kkm6t"] Dec 08 10:46:18 crc kubenswrapper[4776]: I1208 10:46:18.531641 4776 scope.go:117] "RemoveContainer" containerID="765b496cea0f550f4cfc62f9e8811e103e466f76804c1459f11d08082197aaf4" Dec 08 10:46:18 crc kubenswrapper[4776]: I1208 10:46:18.540608 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kkm6t"] Dec 08 10:46:18 crc kubenswrapper[4776]: I1208 10:46:18.556141 4776 scope.go:117] "RemoveContainer" containerID="ba39a2b161bccb0d0f0e9eb5eed680c3bc50e784f56baef0b93210d79d996f4b" Dec 08 10:46:18 crc kubenswrapper[4776]: I1208 10:46:18.626148 4776 scope.go:117] "RemoveContainer" containerID="f4bf2afa4d09de466bcdd163612fecec7006312bbabd06eb93d57521f01511ad" Dec 08 10:46:18 crc kubenswrapper[4776]: E1208 10:46:18.626706 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4bf2afa4d09de466bcdd163612fecec7006312bbabd06eb93d57521f01511ad\": container with ID starting with f4bf2afa4d09de466bcdd163612fecec7006312bbabd06eb93d57521f01511ad not found: ID does not exist" containerID="f4bf2afa4d09de466bcdd163612fecec7006312bbabd06eb93d57521f01511ad" Dec 08 10:46:18 crc kubenswrapper[4776]: I1208 10:46:18.626765 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4bf2afa4d09de466bcdd163612fecec7006312bbabd06eb93d57521f01511ad"} err="failed to get container status \"f4bf2afa4d09de466bcdd163612fecec7006312bbabd06eb93d57521f01511ad\": rpc error: code = NotFound desc = could not find container \"f4bf2afa4d09de466bcdd163612fecec7006312bbabd06eb93d57521f01511ad\": container with ID starting with 
f4bf2afa4d09de466bcdd163612fecec7006312bbabd06eb93d57521f01511ad not found: ID does not exist" Dec 08 10:46:18 crc kubenswrapper[4776]: I1208 10:46:18.626799 4776 scope.go:117] "RemoveContainer" containerID="765b496cea0f550f4cfc62f9e8811e103e466f76804c1459f11d08082197aaf4" Dec 08 10:46:18 crc kubenswrapper[4776]: E1208 10:46:18.627320 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"765b496cea0f550f4cfc62f9e8811e103e466f76804c1459f11d08082197aaf4\": container with ID starting with 765b496cea0f550f4cfc62f9e8811e103e466f76804c1459f11d08082197aaf4 not found: ID does not exist" containerID="765b496cea0f550f4cfc62f9e8811e103e466f76804c1459f11d08082197aaf4" Dec 08 10:46:18 crc kubenswrapper[4776]: I1208 10:46:18.627372 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"765b496cea0f550f4cfc62f9e8811e103e466f76804c1459f11d08082197aaf4"} err="failed to get container status \"765b496cea0f550f4cfc62f9e8811e103e466f76804c1459f11d08082197aaf4\": rpc error: code = NotFound desc = could not find container \"765b496cea0f550f4cfc62f9e8811e103e466f76804c1459f11d08082197aaf4\": container with ID starting with 765b496cea0f550f4cfc62f9e8811e103e466f76804c1459f11d08082197aaf4 not found: ID does not exist" Dec 08 10:46:18 crc kubenswrapper[4776]: I1208 10:46:18.627401 4776 scope.go:117] "RemoveContainer" containerID="ba39a2b161bccb0d0f0e9eb5eed680c3bc50e784f56baef0b93210d79d996f4b" Dec 08 10:46:18 crc kubenswrapper[4776]: E1208 10:46:18.627819 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba39a2b161bccb0d0f0e9eb5eed680c3bc50e784f56baef0b93210d79d996f4b\": container with ID starting with ba39a2b161bccb0d0f0e9eb5eed680c3bc50e784f56baef0b93210d79d996f4b not found: ID does not exist" containerID="ba39a2b161bccb0d0f0e9eb5eed680c3bc50e784f56baef0b93210d79d996f4b" Dec 08 10:46:18 crc 
kubenswrapper[4776]: I1208 10:46:18.627844 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba39a2b161bccb0d0f0e9eb5eed680c3bc50e784f56baef0b93210d79d996f4b"} err="failed to get container status \"ba39a2b161bccb0d0f0e9eb5eed680c3bc50e784f56baef0b93210d79d996f4b\": rpc error: code = NotFound desc = could not find container \"ba39a2b161bccb0d0f0e9eb5eed680c3bc50e784f56baef0b93210d79d996f4b\": container with ID starting with ba39a2b161bccb0d0f0e9eb5eed680c3bc50e784f56baef0b93210d79d996f4b not found: ID does not exist" Dec 08 10:46:20 crc kubenswrapper[4776]: I1208 10:46:20.357003 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba6b02b2-693e-4508-8248-0e5b0a38e313" path="/var/lib/kubelet/pods/ba6b02b2-693e-4508-8248-0e5b0a38e313/volumes" Dec 08 10:46:23 crc kubenswrapper[4776]: I1208 10:46:23.164519 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-njv2h_35911247-ad00-422c-9d30-586834a80f76/control-plane-machine-set-operator/0.log" Dec 08 10:46:23 crc kubenswrapper[4776]: I1208 10:46:23.319934 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-jxkd8_5ea3906e-d311-4b90-80be-7405507e135e/kube-rbac-proxy/0.log" Dec 08 10:46:23 crc kubenswrapper[4776]: I1208 10:46:23.336806 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-jxkd8_5ea3906e-d311-4b90-80be-7405507e135e/machine-api-operator/0.log" Dec 08 10:46:26 crc kubenswrapper[4776]: I1208 10:46:26.343938 4776 scope.go:117] "RemoveContainer" containerID="e77f61c54751c78fce5e444bed924b95086a9bf8d6036029988def8a32ebeb2b" Dec 08 10:46:26 crc kubenswrapper[4776]: E1208 10:46:26.344712 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:46:35 crc kubenswrapper[4776]: I1208 10:46:35.456456 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-hsqbv_69b03c85-8503-44fc-9e71-0357ce0cc56e/cert-manager-controller/0.log" Dec 08 10:46:35 crc kubenswrapper[4776]: I1208 10:46:35.554929 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-6vnhc_8b23f8e2-638b-438a-8363-8daf30f656e6/cert-manager-cainjector/0.log" Dec 08 10:46:35 crc kubenswrapper[4776]: I1208 10:46:35.694351 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-6jjjz_4f663316-a0ef-44bd-a068-47f3e7d37a5c/cert-manager-webhook/0.log" Dec 08 10:46:37 crc kubenswrapper[4776]: I1208 10:46:37.344237 4776 scope.go:117] "RemoveContainer" containerID="e77f61c54751c78fce5e444bed924b95086a9bf8d6036029988def8a32ebeb2b" Dec 08 10:46:37 crc kubenswrapper[4776]: E1208 10:46:37.344928 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:46:47 crc kubenswrapper[4776]: I1208 10:46:47.608051 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-8ds4l_c2ba126f-aa28-4cfa-8aed-a9221e094a58/nmstate-console-plugin/0.log" Dec 08 10:46:47 crc kubenswrapper[4776]: I1208 10:46:47.785657 4776 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-b8ckr_5e0ef761-506d-4695-b58a-128a6f5f7957/nmstate-handler/0.log" Dec 08 10:46:47 crc kubenswrapper[4776]: I1208 10:46:47.815650 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-lpbj6_b26e1190-f68a-487b-a2de-e0116525ab64/kube-rbac-proxy/0.log" Dec 08 10:46:47 crc kubenswrapper[4776]: I1208 10:46:47.857389 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-lpbj6_b26e1190-f68a-487b-a2de-e0116525ab64/nmstate-metrics/0.log" Dec 08 10:46:48 crc kubenswrapper[4776]: I1208 10:46:48.002474 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-n9gb5_5322a22f-cb6b-45df-af5f-395b2180a64b/nmstate-operator/0.log" Dec 08 10:46:48 crc kubenswrapper[4776]: I1208 10:46:48.097650 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-69dpx_e59b99c1-c9b8-4127-93db-933dddb3ebab/nmstate-webhook/0.log" Dec 08 10:46:49 crc kubenswrapper[4776]: I1208 10:46:49.343509 4776 scope.go:117] "RemoveContainer" containerID="e77f61c54751c78fce5e444bed924b95086a9bf8d6036029988def8a32ebeb2b" Dec 08 10:46:49 crc kubenswrapper[4776]: E1208 10:46:49.344949 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:46:59 crc kubenswrapper[4776]: I1208 10:46:59.721864 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-6bfc99889d-tq44h_9ba0d9e5-f1ab-40a6-9490-57ce8566843a/kube-rbac-proxy/0.log" Dec 08 10:46:59 crc kubenswrapper[4776]: I1208 10:46:59.763047 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-6bfc99889d-tq44h_9ba0d9e5-f1ab-40a6-9490-57ce8566843a/manager/0.log" Dec 08 10:47:03 crc kubenswrapper[4776]: I1208 10:47:03.344129 4776 scope.go:117] "RemoveContainer" containerID="e77f61c54751c78fce5e444bed924b95086a9bf8d6036029988def8a32ebeb2b" Dec 08 10:47:03 crc kubenswrapper[4776]: E1208 10:47:03.344843 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:47:13 crc kubenswrapper[4776]: I1208 10:47:13.956674 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_cluster-logging-operator-ff9846bd-wwf9p_3e4917d5-3292-4cd8-b001-6d6bf5609def/cluster-logging-operator/0.log" Dec 08 10:47:14 crc kubenswrapper[4776]: I1208 10:47:14.103248 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_collector-2xfbc_847cd111-98c5-4c39-bc29-1ba2bcdf570c/collector/0.log" Dec 08 10:47:14 crc kubenswrapper[4776]: I1208 10:47:14.176710 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-compactor-0_62fa460d-4457-4db0-8be1-d7fa62fd7144/loki-compactor/0.log" Dec 08 10:47:14 crc kubenswrapper[4776]: I1208 10:47:14.283552 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-logging_logging-loki-distributor-76cc67bf56-w6292_9edcc5bd-cefb-4c32-89e3-24ff105358b2/loki-distributor/0.log" Dec 08 10:47:14 crc kubenswrapper[4776]: I1208 10:47:14.516048 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-54b997fdcc-9dpbh_11400c14-964a-494f-80da-d878c6d2a50d/gateway/0.log" Dec 08 10:47:14 crc kubenswrapper[4776]: I1208 10:47:14.582183 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-54b997fdcc-9dpbh_11400c14-964a-494f-80da-d878c6d2a50d/opa/0.log" Dec 08 10:47:14 crc kubenswrapper[4776]: I1208 10:47:14.759126 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-54b997fdcc-nmww6_d64b61be-4212-49da-9497-f567efa53a45/gateway/0.log" Dec 08 10:47:14 crc kubenswrapper[4776]: I1208 10:47:14.767054 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-54b997fdcc-nmww6_d64b61be-4212-49da-9497-f567efa53a45/opa/0.log" Dec 08 10:47:14 crc kubenswrapper[4776]: I1208 10:47:14.912647 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-index-gateway-0_1e9dd934-eb37-463c-890d-1021bbdc4e3f/loki-index-gateway/0.log" Dec 08 10:47:15 crc kubenswrapper[4776]: I1208 10:47:15.026420 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-ingester-0_c27c1242-5109-4547-8276-2dea60fad775/loki-ingester/0.log" Dec 08 10:47:15 crc kubenswrapper[4776]: I1208 10:47:15.208267 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-querier-5895d59bb8-hwrdk_71eade59-504f-4431-8cd8-531883c1eba7/loki-querier/0.log" Dec 08 10:47:15 crc kubenswrapper[4776]: I1208 10:47:15.307058 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-logging_logging-loki-query-frontend-84558f7c9f-fcz8j_aed8f23a-7437-4eab-8dae-6ff17f9a5aa0/loki-query-frontend/0.log" Dec 08 10:47:16 crc kubenswrapper[4776]: I1208 10:47:16.344136 4776 scope.go:117] "RemoveContainer" containerID="e77f61c54751c78fce5e444bed924b95086a9bf8d6036029988def8a32ebeb2b" Dec 08 10:47:16 crc kubenswrapper[4776]: E1208 10:47:16.344713 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:47:29 crc kubenswrapper[4776]: I1208 10:47:29.471469 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-gfgfc_f696bcbd-7230-43d3-b09e-645d489eacf3/kube-rbac-proxy/0.log" Dec 08 10:47:29 crc kubenswrapper[4776]: I1208 10:47:29.557054 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-gfgfc_f696bcbd-7230-43d3-b09e-645d489eacf3/controller/0.log" Dec 08 10:47:29 crc kubenswrapper[4776]: I1208 10:47:29.670091 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ph98m_12fb1453-ed3a-4c22-b33b-8c8e5402de93/cp-frr-files/0.log" Dec 08 10:47:29 crc kubenswrapper[4776]: I1208 10:47:29.827137 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ph98m_12fb1453-ed3a-4c22-b33b-8c8e5402de93/cp-frr-files/0.log" Dec 08 10:47:29 crc kubenswrapper[4776]: I1208 10:47:29.865744 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ph98m_12fb1453-ed3a-4c22-b33b-8c8e5402de93/cp-reloader/0.log" Dec 08 10:47:29 crc kubenswrapper[4776]: I1208 10:47:29.902861 4776 log.go:25] "Finished 
parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ph98m_12fb1453-ed3a-4c22-b33b-8c8e5402de93/cp-metrics/0.log" Dec 08 10:47:29 crc kubenswrapper[4776]: I1208 10:47:29.914111 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ph98m_12fb1453-ed3a-4c22-b33b-8c8e5402de93/cp-reloader/0.log" Dec 08 10:47:30 crc kubenswrapper[4776]: I1208 10:47:30.067930 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ph98m_12fb1453-ed3a-4c22-b33b-8c8e5402de93/cp-frr-files/0.log" Dec 08 10:47:30 crc kubenswrapper[4776]: I1208 10:47:30.078976 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ph98m_12fb1453-ed3a-4c22-b33b-8c8e5402de93/cp-metrics/0.log" Dec 08 10:47:30 crc kubenswrapper[4776]: I1208 10:47:30.123557 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ph98m_12fb1453-ed3a-4c22-b33b-8c8e5402de93/cp-metrics/0.log" Dec 08 10:47:30 crc kubenswrapper[4776]: I1208 10:47:30.141125 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ph98m_12fb1453-ed3a-4c22-b33b-8c8e5402de93/cp-reloader/0.log" Dec 08 10:47:30 crc kubenswrapper[4776]: I1208 10:47:30.364526 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ph98m_12fb1453-ed3a-4c22-b33b-8c8e5402de93/cp-reloader/0.log" Dec 08 10:47:30 crc kubenswrapper[4776]: I1208 10:47:30.366638 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ph98m_12fb1453-ed3a-4c22-b33b-8c8e5402de93/cp-frr-files/0.log" Dec 08 10:47:30 crc kubenswrapper[4776]: I1208 10:47:30.408658 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ph98m_12fb1453-ed3a-4c22-b33b-8c8e5402de93/controller/0.log" Dec 08 10:47:30 crc kubenswrapper[4776]: I1208 10:47:30.410830 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-ph98m_12fb1453-ed3a-4c22-b33b-8c8e5402de93/cp-metrics/0.log" Dec 08 10:47:30 crc kubenswrapper[4776]: I1208 10:47:30.578807 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ph98m_12fb1453-ed3a-4c22-b33b-8c8e5402de93/frr-metrics/0.log" Dec 08 10:47:30 crc kubenswrapper[4776]: I1208 10:47:30.604041 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ph98m_12fb1453-ed3a-4c22-b33b-8c8e5402de93/kube-rbac-proxy/0.log" Dec 08 10:47:30 crc kubenswrapper[4776]: I1208 10:47:30.659326 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ph98m_12fb1453-ed3a-4c22-b33b-8c8e5402de93/kube-rbac-proxy-frr/0.log" Dec 08 10:47:30 crc kubenswrapper[4776]: I1208 10:47:30.776097 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ph98m_12fb1453-ed3a-4c22-b33b-8c8e5402de93/reloader/0.log" Dec 08 10:47:30 crc kubenswrapper[4776]: I1208 10:47:30.913204 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-c68cp_640e92fd-0236-408e-95ba-a5aacfe784d4/frr-k8s-webhook-server/0.log" Dec 08 10:47:31 crc kubenswrapper[4776]: I1208 10:47:31.123784 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7567df7f9b-ctl76_579d6f99-9917-455f-b0cf-350c24bae128/manager/0.log" Dec 08 10:47:31 crc kubenswrapper[4776]: I1208 10:47:31.230138 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7f595f4d5-92ttw_ea83e974-be12-4152-bd97-0d699c0e13b2/webhook-server/0.log" Dec 08 10:47:31 crc kubenswrapper[4776]: I1208 10:47:31.344424 4776 scope.go:117] "RemoveContainer" containerID="e77f61c54751c78fce5e444bed924b95086a9bf8d6036029988def8a32ebeb2b" Dec 08 10:47:31 crc kubenswrapper[4776]: E1208 10:47:31.344929 4776 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:47:31 crc kubenswrapper[4776]: I1208 10:47:31.490838 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-gt7wq_4a15dc76-0fc2-4f38-ae64-d7efa5ee29a6/kube-rbac-proxy/0.log" Dec 08 10:47:32 crc kubenswrapper[4776]: I1208 10:47:32.087226 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-gt7wq_4a15dc76-0fc2-4f38-ae64-d7efa5ee29a6/speaker/0.log" Dec 08 10:47:32 crc kubenswrapper[4776]: I1208 10:47:32.541033 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ph98m_12fb1453-ed3a-4c22-b33b-8c8e5402de93/frr/0.log" Dec 08 10:47:44 crc kubenswrapper[4776]: I1208 10:47:44.356280 4776 scope.go:117] "RemoveContainer" containerID="e77f61c54751c78fce5e444bed924b95086a9bf8d6036029988def8a32ebeb2b" Dec 08 10:47:44 crc kubenswrapper[4776]: E1208 10:47:44.357167 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:47:44 crc kubenswrapper[4776]: I1208 10:47:44.682006 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb85pvww_20cd1aea-6a8d-458a-8697-f9193cfa6058/util/0.log" Dec 08 10:47:44 crc kubenswrapper[4776]: I1208 
10:47:44.873526 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb85pvww_20cd1aea-6a8d-458a-8697-f9193cfa6058/pull/0.log" Dec 08 10:47:44 crc kubenswrapper[4776]: I1208 10:47:44.877687 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb85pvww_20cd1aea-6a8d-458a-8697-f9193cfa6058/pull/0.log" Dec 08 10:47:44 crc kubenswrapper[4776]: I1208 10:47:44.887611 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb85pvww_20cd1aea-6a8d-458a-8697-f9193cfa6058/util/0.log" Dec 08 10:47:45 crc kubenswrapper[4776]: I1208 10:47:45.106017 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb85pvww_20cd1aea-6a8d-458a-8697-f9193cfa6058/util/0.log" Dec 08 10:47:45 crc kubenswrapper[4776]: I1208 10:47:45.108783 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb85pvww_20cd1aea-6a8d-458a-8697-f9193cfa6058/pull/0.log" Dec 08 10:47:45 crc kubenswrapper[4776]: I1208 10:47:45.162210 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb85pvww_20cd1aea-6a8d-458a-8697-f9193cfa6058/extract/0.log" Dec 08 10:47:45 crc kubenswrapper[4776]: I1208 10:47:45.297575 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9x89q_8567f1db-9f8a-49aa-8864-e18aef8b18e7/util/0.log" Dec 08 10:47:45 crc kubenswrapper[4776]: I1208 10:47:45.518325 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9x89q_8567f1db-9f8a-49aa-8864-e18aef8b18e7/util/0.log" Dec 08 10:47:45 crc kubenswrapper[4776]: I1208 10:47:45.529109 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9x89q_8567f1db-9f8a-49aa-8864-e18aef8b18e7/pull/0.log" Dec 08 10:47:45 crc kubenswrapper[4776]: I1208 10:47:45.571826 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9x89q_8567f1db-9f8a-49aa-8864-e18aef8b18e7/pull/0.log" Dec 08 10:47:45 crc kubenswrapper[4776]: I1208 10:47:45.717776 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9x89q_8567f1db-9f8a-49aa-8864-e18aef8b18e7/pull/0.log" Dec 08 10:47:45 crc kubenswrapper[4776]: I1208 10:47:45.722207 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9x89q_8567f1db-9f8a-49aa-8864-e18aef8b18e7/extract/0.log" Dec 08 10:47:45 crc kubenswrapper[4776]: I1208 10:47:45.733470 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9x89q_8567f1db-9f8a-49aa-8864-e18aef8b18e7/util/0.log" Dec 08 10:47:45 crc kubenswrapper[4776]: I1208 10:47:45.919157 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210wkgz5_bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b/util/0.log" Dec 08 10:47:46 crc kubenswrapper[4776]: I1208 10:47:46.097593 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210wkgz5_bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b/pull/0.log" Dec 08 
10:47:46 crc kubenswrapper[4776]: I1208 10:47:46.110931 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210wkgz5_bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b/pull/0.log" Dec 08 10:47:46 crc kubenswrapper[4776]: I1208 10:47:46.134143 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210wkgz5_bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b/util/0.log" Dec 08 10:47:46 crc kubenswrapper[4776]: I1208 10:47:46.321346 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210wkgz5_bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b/extract/0.log" Dec 08 10:47:46 crc kubenswrapper[4776]: I1208 10:47:46.332577 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210wkgz5_bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b/util/0.log" Dec 08 10:47:46 crc kubenswrapper[4776]: I1208 10:47:46.365155 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210wkgz5_bcf88dc0-ae1d-4475-aee4-d2a3fdb1254b/pull/0.log" Dec 08 10:47:46 crc kubenswrapper[4776]: I1208 10:47:46.491762 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fr5w6k_0b36dfd9-c3b8-4858-b056-70d04434052a/util/0.log" Dec 08 10:47:46 crc kubenswrapper[4776]: I1208 10:47:46.718929 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fr5w6k_0b36dfd9-c3b8-4858-b056-70d04434052a/util/0.log" Dec 08 10:47:46 crc kubenswrapper[4776]: I1208 10:47:46.740323 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fr5w6k_0b36dfd9-c3b8-4858-b056-70d04434052a/pull/0.log" Dec 08 10:47:46 crc kubenswrapper[4776]: I1208 10:47:46.742336 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fr5w6k_0b36dfd9-c3b8-4858-b056-70d04434052a/pull/0.log" Dec 08 10:47:47 crc kubenswrapper[4776]: I1208 10:47:47.178887 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fr5w6k_0b36dfd9-c3b8-4858-b056-70d04434052a/extract/0.log" Dec 08 10:47:47 crc kubenswrapper[4776]: I1208 10:47:47.180632 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fr5w6k_0b36dfd9-c3b8-4858-b056-70d04434052a/pull/0.log" Dec 08 10:47:47 crc kubenswrapper[4776]: I1208 10:47:47.187285 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fr5w6k_0b36dfd9-c3b8-4858-b056-70d04434052a/util/0.log" Dec 08 10:47:47 crc kubenswrapper[4776]: I1208 10:47:47.346233 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nx2vl_ecb04392-c8da-4ee9-ae5f-aa7212b963e9/util/0.log" Dec 08 10:47:47 crc kubenswrapper[4776]: I1208 10:47:47.509614 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nx2vl_ecb04392-c8da-4ee9-ae5f-aa7212b963e9/util/0.log" Dec 08 10:47:47 crc kubenswrapper[4776]: I1208 10:47:47.546757 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nx2vl_ecb04392-c8da-4ee9-ae5f-aa7212b963e9/pull/0.log" Dec 08 
10:47:47 crc kubenswrapper[4776]: I1208 10:47:47.555361 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nx2vl_ecb04392-c8da-4ee9-ae5f-aa7212b963e9/pull/0.log" Dec 08 10:47:47 crc kubenswrapper[4776]: I1208 10:47:47.751379 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nx2vl_ecb04392-c8da-4ee9-ae5f-aa7212b963e9/pull/0.log" Dec 08 10:47:47 crc kubenswrapper[4776]: I1208 10:47:47.774224 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nx2vl_ecb04392-c8da-4ee9-ae5f-aa7212b963e9/extract/0.log" Dec 08 10:47:47 crc kubenswrapper[4776]: I1208 10:47:47.775629 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nx2vl_ecb04392-c8da-4ee9-ae5f-aa7212b963e9/util/0.log" Dec 08 10:47:47 crc kubenswrapper[4776]: I1208 10:47:47.933811 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dqj6b_03a186d8-ec36-41ef-b882-f0cba34a0913/extract-utilities/0.log" Dec 08 10:47:48 crc kubenswrapper[4776]: I1208 10:47:48.079338 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dqj6b_03a186d8-ec36-41ef-b882-f0cba34a0913/extract-utilities/0.log" Dec 08 10:47:48 crc kubenswrapper[4776]: I1208 10:47:48.118029 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dqj6b_03a186d8-ec36-41ef-b882-f0cba34a0913/extract-content/0.log" Dec 08 10:47:48 crc kubenswrapper[4776]: I1208 10:47:48.145716 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dqj6b_03a186d8-ec36-41ef-b882-f0cba34a0913/extract-content/0.log" Dec 08 
10:47:48 crc kubenswrapper[4776]: I1208 10:47:48.388914 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dqj6b_03a186d8-ec36-41ef-b882-f0cba34a0913/extract-content/0.log" Dec 08 10:47:48 crc kubenswrapper[4776]: I1208 10:47:48.408655 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dqj6b_03a186d8-ec36-41ef-b882-f0cba34a0913/extract-utilities/0.log" Dec 08 10:47:48 crc kubenswrapper[4776]: I1208 10:47:48.664247 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xbmh8_d01dc6cb-3ab5-494a-b89f-63e94c2e91ee/extract-utilities/0.log" Dec 08 10:47:48 crc kubenswrapper[4776]: I1208 10:47:48.876598 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xbmh8_d01dc6cb-3ab5-494a-b89f-63e94c2e91ee/extract-content/0.log" Dec 08 10:47:48 crc kubenswrapper[4776]: I1208 10:47:48.877941 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xbmh8_d01dc6cb-3ab5-494a-b89f-63e94c2e91ee/extract-utilities/0.log" Dec 08 10:47:48 crc kubenswrapper[4776]: I1208 10:47:48.984910 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xbmh8_d01dc6cb-3ab5-494a-b89f-63e94c2e91ee/extract-content/0.log" Dec 08 10:47:49 crc kubenswrapper[4776]: I1208 10:47:49.237705 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xbmh8_d01dc6cb-3ab5-494a-b89f-63e94c2e91ee/extract-content/0.log" Dec 08 10:47:49 crc kubenswrapper[4776]: I1208 10:47:49.299246 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dqj6b_03a186d8-ec36-41ef-b882-f0cba34a0913/registry-server/0.log" Dec 08 10:47:49 crc kubenswrapper[4776]: I1208 10:47:49.382721 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-xbmh8_d01dc6cb-3ab5-494a-b89f-63e94c2e91ee/extract-utilities/0.log" Dec 08 10:47:49 crc kubenswrapper[4776]: I1208 10:47:49.693130 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-txmws_aeec433a-2b07-4008-9829-d266f85b5cf1/extract-utilities/0.log" Dec 08 10:47:49 crc kubenswrapper[4776]: I1208 10:47:49.721131 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-8ctxn_b93e12f9-d5c1-4ee8-9786-85d352d62076/marketplace-operator/0.log" Dec 08 10:47:49 crc kubenswrapper[4776]: I1208 10:47:49.912150 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-txmws_aeec433a-2b07-4008-9829-d266f85b5cf1/extract-content/0.log" Dec 08 10:47:49 crc kubenswrapper[4776]: I1208 10:47:49.929542 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-txmws_aeec433a-2b07-4008-9829-d266f85b5cf1/extract-utilities/0.log" Dec 08 10:47:49 crc kubenswrapper[4776]: I1208 10:47:49.983785 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-txmws_aeec433a-2b07-4008-9829-d266f85b5cf1/extract-content/0.log" Dec 08 10:47:50 crc kubenswrapper[4776]: I1208 10:47:50.242456 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-txmws_aeec433a-2b07-4008-9829-d266f85b5cf1/extract-utilities/0.log" Dec 08 10:47:50 crc kubenswrapper[4776]: I1208 10:47:50.243678 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-txmws_aeec433a-2b07-4008-9829-d266f85b5cf1/extract-content/0.log" Dec 08 10:47:50 crc kubenswrapper[4776]: I1208 10:47:50.452575 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-xbmh8_d01dc6cb-3ab5-494a-b89f-63e94c2e91ee/registry-server/0.log" Dec 08 10:47:50 crc kubenswrapper[4776]: I1208 10:47:50.549767 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d5dnr_885bc336-6858-43f4-b63b-155ed1f06b60/extract-utilities/0.log" Dec 08 10:47:50 crc kubenswrapper[4776]: I1208 10:47:50.558410 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-txmws_aeec433a-2b07-4008-9829-d266f85b5cf1/registry-server/0.log" Dec 08 10:47:50 crc kubenswrapper[4776]: I1208 10:47:50.676117 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d5dnr_885bc336-6858-43f4-b63b-155ed1f06b60/extract-content/0.log" Dec 08 10:47:50 crc kubenswrapper[4776]: I1208 10:47:50.687986 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d5dnr_885bc336-6858-43f4-b63b-155ed1f06b60/extract-content/0.log" Dec 08 10:47:50 crc kubenswrapper[4776]: I1208 10:47:50.690199 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d5dnr_885bc336-6858-43f4-b63b-155ed1f06b60/extract-utilities/0.log" Dec 08 10:47:50 crc kubenswrapper[4776]: I1208 10:47:50.904590 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d5dnr_885bc336-6858-43f4-b63b-155ed1f06b60/extract-utilities/0.log" Dec 08 10:47:50 crc kubenswrapper[4776]: I1208 10:47:50.934744 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d5dnr_885bc336-6858-43f4-b63b-155ed1f06b60/extract-content/0.log" Dec 08 10:47:51 crc kubenswrapper[4776]: I1208 10:47:51.048726 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d5dnr_885bc336-6858-43f4-b63b-155ed1f06b60/registry-server/0.log" Dec 08 
10:47:58 crc kubenswrapper[4776]: I1208 10:47:58.343944 4776 scope.go:117] "RemoveContainer" containerID="e77f61c54751c78fce5e444bed924b95086a9bf8d6036029988def8a32ebeb2b" Dec 08 10:47:58 crc kubenswrapper[4776]: E1208 10:47:58.344873 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:48:03 crc kubenswrapper[4776]: I1208 10:48:03.326113 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-4r72v_3ddae09e-bcfe-4e98-bdd1-9ac94218a6d8/prometheus-operator/0.log" Dec 08 10:48:03 crc kubenswrapper[4776]: I1208 10:48:03.570402 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7994656576-d6jvv_ff9db296-6f02-44bf-810c-48cfb090036e/prometheus-operator-admission-webhook/0.log" Dec 08 10:48:03 crc kubenswrapper[4776]: I1208 10:48:03.649046 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7994656576-gzq75_968acbdd-ab1d-4aa4-9db9-654170c5fa2d/prometheus-operator-admission-webhook/0.log" Dec 08 10:48:03 crc kubenswrapper[4776]: I1208 10:48:03.816328 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-rkf5k_9108512a-718d-41db-b414-02665870be6b/operator/0.log" Dec 08 10:48:03 crc kubenswrapper[4776]: I1208 10:48:03.848876 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-7d5fb4cbfb-w69rl_251557fb-f870-4b8c-8725-648a8cd97fca/observability-ui-dashboards/0.log" Dec 
08 10:48:04 crc kubenswrapper[4776]: I1208 10:48:04.040020 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-bc5qm_5691addb-538a-4212-bb5b-bf797ba7172c/perses-operator/0.log" Dec 08 10:48:12 crc kubenswrapper[4776]: I1208 10:48:12.344612 4776 scope.go:117] "RemoveContainer" containerID="e77f61c54751c78fce5e444bed924b95086a9bf8d6036029988def8a32ebeb2b" Dec 08 10:48:12 crc kubenswrapper[4776]: E1208 10:48:12.345419 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:48:17 crc kubenswrapper[4776]: I1208 10:48:17.040816 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-6bfc99889d-tq44h_9ba0d9e5-f1ab-40a6-9490-57ce8566843a/kube-rbac-proxy/0.log" Dec 08 10:48:17 crc kubenswrapper[4776]: I1208 10:48:17.123386 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-6bfc99889d-tq44h_9ba0d9e5-f1ab-40a6-9490-57ce8566843a/manager/0.log" Dec 08 10:48:25 crc kubenswrapper[4776]: I1208 10:48:25.343896 4776 scope.go:117] "RemoveContainer" containerID="e77f61c54751c78fce5e444bed924b95086a9bf8d6036029988def8a32ebeb2b" Dec 08 10:48:25 crc kubenswrapper[4776]: E1208 10:48:25.344681 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:48:40 crc kubenswrapper[4776]: I1208 10:48:40.343653 4776 scope.go:117] "RemoveContainer" containerID="e77f61c54751c78fce5e444bed924b95086a9bf8d6036029988def8a32ebeb2b" Dec 08 10:48:40 crc kubenswrapper[4776]: E1208 10:48:40.344433 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:48:52 crc kubenswrapper[4776]: I1208 10:48:52.343875 4776 scope.go:117] "RemoveContainer" containerID="e77f61c54751c78fce5e444bed924b95086a9bf8d6036029988def8a32ebeb2b" Dec 08 10:48:52 crc kubenswrapper[4776]: E1208 10:48:52.344937 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:49:07 crc kubenswrapper[4776]: I1208 10:49:07.344620 4776 scope.go:117] "RemoveContainer" containerID="e77f61c54751c78fce5e444bed924b95086a9bf8d6036029988def8a32ebeb2b" Dec 08 10:49:07 crc kubenswrapper[4776]: E1208 10:49:07.345407 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:49:22 crc kubenswrapper[4776]: I1208 10:49:22.344385 4776 scope.go:117] "RemoveContainer" containerID="e77f61c54751c78fce5e444bed924b95086a9bf8d6036029988def8a32ebeb2b" Dec 08 10:49:22 crc kubenswrapper[4776]: E1208 10:49:22.345054 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:49:37 crc kubenswrapper[4776]: I1208 10:49:37.345686 4776 scope.go:117] "RemoveContainer" containerID="e77f61c54751c78fce5e444bed924b95086a9bf8d6036029988def8a32ebeb2b" Dec 08 10:49:37 crc kubenswrapper[4776]: E1208 10:49:37.346413 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:49:50 crc kubenswrapper[4776]: I1208 10:49:50.343721 4776 scope.go:117] "RemoveContainer" containerID="e77f61c54751c78fce5e444bed924b95086a9bf8d6036029988def8a32ebeb2b" Dec 08 10:49:50 crc kubenswrapper[4776]: E1208 10:49:50.344606 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:49:53 crc kubenswrapper[4776]: I1208 10:49:53.078882 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-txmws" podUID="aeec433a-2b07-4008-9829-d266f85b5cf1" containerName="registry-server" probeResult="failure" output=< Dec 08 10:49:53 crc kubenswrapper[4776]: timeout: failed to connect service ":50051" within 1s Dec 08 10:49:53 crc kubenswrapper[4776]: > Dec 08 10:49:53 crc kubenswrapper[4776]: I1208 10:49:53.080015 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-marketplace-txmws" podUID="aeec433a-2b07-4008-9829-d266f85b5cf1" containerName="registry-server" probeResult="failure" output=< Dec 08 10:49:53 crc kubenswrapper[4776]: timeout: failed to connect service ":50051" within 1s Dec 08 10:49:53 crc kubenswrapper[4776]: > Dec 08 10:50:02 crc kubenswrapper[4776]: I1208 10:50:02.343600 4776 scope.go:117] "RemoveContainer" containerID="e77f61c54751c78fce5e444bed924b95086a9bf8d6036029988def8a32ebeb2b" Dec 08 10:50:02 crc kubenswrapper[4776]: E1208 10:50:02.344452 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:50:08 crc kubenswrapper[4776]: I1208 10:50:08.012114 4776 generic.go:334] "Generic (PLEG): container finished" podID="69237141-2b8c-4aa8-9aeb-dfb1ad0b2187" 
containerID="bb63c7c1c2dd86d0afb119fe94fa1d13246a09d3fc76cc3b49384a07a84b66ab" exitCode=0 Dec 08 10:50:08 crc kubenswrapper[4776]: I1208 10:50:08.012239 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k6x5r/must-gather-cgltv" event={"ID":"69237141-2b8c-4aa8-9aeb-dfb1ad0b2187","Type":"ContainerDied","Data":"bb63c7c1c2dd86d0afb119fe94fa1d13246a09d3fc76cc3b49384a07a84b66ab"} Dec 08 10:50:08 crc kubenswrapper[4776]: I1208 10:50:08.013602 4776 scope.go:117] "RemoveContainer" containerID="bb63c7c1c2dd86d0afb119fe94fa1d13246a09d3fc76cc3b49384a07a84b66ab" Dec 08 10:50:08 crc kubenswrapper[4776]: I1208 10:50:08.498658 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-k6x5r_must-gather-cgltv_69237141-2b8c-4aa8-9aeb-dfb1ad0b2187/gather/0.log" Dec 08 10:50:17 crc kubenswrapper[4776]: I1208 10:50:17.344118 4776 scope.go:117] "RemoveContainer" containerID="e77f61c54751c78fce5e444bed924b95086a9bf8d6036029988def8a32ebeb2b" Dec 08 10:50:17 crc kubenswrapper[4776]: E1208 10:50:17.345774 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:50:20 crc kubenswrapper[4776]: I1208 10:50:20.673205 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-k6x5r/must-gather-cgltv"] Dec 08 10:50:20 crc kubenswrapper[4776]: I1208 10:50:20.674636 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-k6x5r/must-gather-cgltv" podUID="69237141-2b8c-4aa8-9aeb-dfb1ad0b2187" containerName="copy" containerID="cri-o://670f2f07675a75150a4cf46b89a53b32fd045ac9d91ea1777da47ab1b1482509" 
gracePeriod=2 Dec 08 10:50:20 crc kubenswrapper[4776]: I1208 10:50:20.686221 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-k6x5r/must-gather-cgltv"] Dec 08 10:50:21 crc kubenswrapper[4776]: I1208 10:50:21.115126 4776 scope.go:117] "RemoveContainer" containerID="ccbf3c82de929a3357733dbe96de6d16da54bacf22b970a046fcf8d5e928112d" Dec 08 10:50:21 crc kubenswrapper[4776]: I1208 10:50:21.163305 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-k6x5r_must-gather-cgltv_69237141-2b8c-4aa8-9aeb-dfb1ad0b2187/copy/0.log" Dec 08 10:50:21 crc kubenswrapper[4776]: I1208 10:50:21.164350 4776 generic.go:334] "Generic (PLEG): container finished" podID="69237141-2b8c-4aa8-9aeb-dfb1ad0b2187" containerID="670f2f07675a75150a4cf46b89a53b32fd045ac9d91ea1777da47ab1b1482509" exitCode=143 Dec 08 10:50:21 crc kubenswrapper[4776]: I1208 10:50:21.164431 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2abd8eb829d15f8361966c4b64085261560bbe6dd18bc78fbaa27eebef9fd5ff" Dec 08 10:50:21 crc kubenswrapper[4776]: I1208 10:50:21.229674 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-k6x5r_must-gather-cgltv_69237141-2b8c-4aa8-9aeb-dfb1ad0b2187/copy/0.log" Dec 08 10:50:21 crc kubenswrapper[4776]: I1208 10:50:21.230357 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k6x5r/must-gather-cgltv" Dec 08 10:50:21 crc kubenswrapper[4776]: I1208 10:50:21.338903 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/69237141-2b8c-4aa8-9aeb-dfb1ad0b2187-must-gather-output\") pod \"69237141-2b8c-4aa8-9aeb-dfb1ad0b2187\" (UID: \"69237141-2b8c-4aa8-9aeb-dfb1ad0b2187\") " Dec 08 10:50:21 crc kubenswrapper[4776]: I1208 10:50:21.339133 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlqxn\" (UniqueName: \"kubernetes.io/projected/69237141-2b8c-4aa8-9aeb-dfb1ad0b2187-kube-api-access-rlqxn\") pod \"69237141-2b8c-4aa8-9aeb-dfb1ad0b2187\" (UID: \"69237141-2b8c-4aa8-9aeb-dfb1ad0b2187\") " Dec 08 10:50:21 crc kubenswrapper[4776]: I1208 10:50:21.348213 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69237141-2b8c-4aa8-9aeb-dfb1ad0b2187-kube-api-access-rlqxn" (OuterVolumeSpecName: "kube-api-access-rlqxn") pod "69237141-2b8c-4aa8-9aeb-dfb1ad0b2187" (UID: "69237141-2b8c-4aa8-9aeb-dfb1ad0b2187"). InnerVolumeSpecName "kube-api-access-rlqxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 10:50:21 crc kubenswrapper[4776]: I1208 10:50:21.443065 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlqxn\" (UniqueName: \"kubernetes.io/projected/69237141-2b8c-4aa8-9aeb-dfb1ad0b2187-kube-api-access-rlqxn\") on node \"crc\" DevicePath \"\"" Dec 08 10:50:21 crc kubenswrapper[4776]: I1208 10:50:21.526978 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69237141-2b8c-4aa8-9aeb-dfb1ad0b2187-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "69237141-2b8c-4aa8-9aeb-dfb1ad0b2187" (UID: "69237141-2b8c-4aa8-9aeb-dfb1ad0b2187"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 10:50:21 crc kubenswrapper[4776]: I1208 10:50:21.547196 4776 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/69237141-2b8c-4aa8-9aeb-dfb1ad0b2187-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 08 10:50:22 crc kubenswrapper[4776]: I1208 10:50:22.174812 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k6x5r/must-gather-cgltv" Dec 08 10:50:22 crc kubenswrapper[4776]: I1208 10:50:22.358115 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69237141-2b8c-4aa8-9aeb-dfb1ad0b2187" path="/var/lib/kubelet/pods/69237141-2b8c-4aa8-9aeb-dfb1ad0b2187/volumes" Dec 08 10:50:32 crc kubenswrapper[4776]: I1208 10:50:32.343913 4776 scope.go:117] "RemoveContainer" containerID="e77f61c54751c78fce5e444bed924b95086a9bf8d6036029988def8a32ebeb2b" Dec 08 10:50:32 crc kubenswrapper[4776]: E1208 10:50:32.344934 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:50:43 crc kubenswrapper[4776]: I1208 10:50:43.343843 4776 scope.go:117] "RemoveContainer" containerID="e77f61c54751c78fce5e444bed924b95086a9bf8d6036029988def8a32ebeb2b" Dec 08 10:50:43 crc kubenswrapper[4776]: E1208 10:50:43.344698 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:50:57 crc kubenswrapper[4776]: I1208 10:50:57.344076 4776 scope.go:117] "RemoveContainer" containerID="e77f61c54751c78fce5e444bed924b95086a9bf8d6036029988def8a32ebeb2b" Dec 08 10:50:57 crc kubenswrapper[4776]: E1208 10:50:57.345248 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:51:11 crc kubenswrapper[4776]: I1208 10:51:11.344961 4776 scope.go:117] "RemoveContainer" containerID="e77f61c54751c78fce5e444bed924b95086a9bf8d6036029988def8a32ebeb2b" Dec 08 10:51:11 crc kubenswrapper[4776]: E1208 10:51:11.346005 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jkmbn_openshift-machine-config-operator(c9788ab1-1031-4103-a769-a4b3177c7268)\"" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" Dec 08 10:51:19 crc kubenswrapper[4776]: I1208 10:51:19.553493 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gqrd8"] Dec 08 10:51:19 crc kubenswrapper[4776]: E1208 10:51:19.554721 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69237141-2b8c-4aa8-9aeb-dfb1ad0b2187" containerName="copy" Dec 08 10:51:19 crc kubenswrapper[4776]: I1208 10:51:19.554739 4776 
state_mem.go:107] "Deleted CPUSet assignment" podUID="69237141-2b8c-4aa8-9aeb-dfb1ad0b2187" containerName="copy" Dec 08 10:51:19 crc kubenswrapper[4776]: E1208 10:51:19.554757 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba6b02b2-693e-4508-8248-0e5b0a38e313" containerName="extract-utilities" Dec 08 10:51:19 crc kubenswrapper[4776]: I1208 10:51:19.554764 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba6b02b2-693e-4508-8248-0e5b0a38e313" containerName="extract-utilities" Dec 08 10:51:19 crc kubenswrapper[4776]: E1208 10:51:19.554792 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba6b02b2-693e-4508-8248-0e5b0a38e313" containerName="registry-server" Dec 08 10:51:19 crc kubenswrapper[4776]: I1208 10:51:19.554800 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba6b02b2-693e-4508-8248-0e5b0a38e313" containerName="registry-server" Dec 08 10:51:19 crc kubenswrapper[4776]: E1208 10:51:19.554813 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba6b02b2-693e-4508-8248-0e5b0a38e313" containerName="extract-content" Dec 08 10:51:19 crc kubenswrapper[4776]: I1208 10:51:19.554820 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba6b02b2-693e-4508-8248-0e5b0a38e313" containerName="extract-content" Dec 08 10:51:19 crc kubenswrapper[4776]: E1208 10:51:19.554870 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69237141-2b8c-4aa8-9aeb-dfb1ad0b2187" containerName="gather" Dec 08 10:51:19 crc kubenswrapper[4776]: I1208 10:51:19.554877 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="69237141-2b8c-4aa8-9aeb-dfb1ad0b2187" containerName="gather" Dec 08 10:51:19 crc kubenswrapper[4776]: I1208 10:51:19.555217 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="69237141-2b8c-4aa8-9aeb-dfb1ad0b2187" containerName="copy" Dec 08 10:51:19 crc kubenswrapper[4776]: I1208 10:51:19.555250 4776 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="ba6b02b2-693e-4508-8248-0e5b0a38e313" containerName="registry-server" Dec 08 10:51:19 crc kubenswrapper[4776]: I1208 10:51:19.555272 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="69237141-2b8c-4aa8-9aeb-dfb1ad0b2187" containerName="gather" Dec 08 10:51:19 crc kubenswrapper[4776]: I1208 10:51:19.557675 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gqrd8" Dec 08 10:51:19 crc kubenswrapper[4776]: I1208 10:51:19.578053 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gqrd8"] Dec 08 10:51:19 crc kubenswrapper[4776]: I1208 10:51:19.641140 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlj7r\" (UniqueName: \"kubernetes.io/projected/63e84fe3-9e7c-42d6-869d-6e741603da2a-kube-api-access-dlj7r\") pod \"community-operators-gqrd8\" (UID: \"63e84fe3-9e7c-42d6-869d-6e741603da2a\") " pod="openshift-marketplace/community-operators-gqrd8" Dec 08 10:51:19 crc kubenswrapper[4776]: I1208 10:51:19.641512 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63e84fe3-9e7c-42d6-869d-6e741603da2a-catalog-content\") pod \"community-operators-gqrd8\" (UID: \"63e84fe3-9e7c-42d6-869d-6e741603da2a\") " pod="openshift-marketplace/community-operators-gqrd8" Dec 08 10:51:19 crc kubenswrapper[4776]: I1208 10:51:19.641602 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63e84fe3-9e7c-42d6-869d-6e741603da2a-utilities\") pod \"community-operators-gqrd8\" (UID: \"63e84fe3-9e7c-42d6-869d-6e741603da2a\") " pod="openshift-marketplace/community-operators-gqrd8" Dec 08 10:51:19 crc kubenswrapper[4776]: I1208 10:51:19.743627 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dlj7r\" (UniqueName: \"kubernetes.io/projected/63e84fe3-9e7c-42d6-869d-6e741603da2a-kube-api-access-dlj7r\") pod \"community-operators-gqrd8\" (UID: \"63e84fe3-9e7c-42d6-869d-6e741603da2a\") " pod="openshift-marketplace/community-operators-gqrd8" Dec 08 10:51:19 crc kubenswrapper[4776]: I1208 10:51:19.743689 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63e84fe3-9e7c-42d6-869d-6e741603da2a-catalog-content\") pod \"community-operators-gqrd8\" (UID: \"63e84fe3-9e7c-42d6-869d-6e741603da2a\") " pod="openshift-marketplace/community-operators-gqrd8" Dec 08 10:51:19 crc kubenswrapper[4776]: I1208 10:51:19.743770 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63e84fe3-9e7c-42d6-869d-6e741603da2a-utilities\") pod \"community-operators-gqrd8\" (UID: \"63e84fe3-9e7c-42d6-869d-6e741603da2a\") " pod="openshift-marketplace/community-operators-gqrd8" Dec 08 10:51:19 crc kubenswrapper[4776]: I1208 10:51:19.744336 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63e84fe3-9e7c-42d6-869d-6e741603da2a-utilities\") pod \"community-operators-gqrd8\" (UID: \"63e84fe3-9e7c-42d6-869d-6e741603da2a\") " pod="openshift-marketplace/community-operators-gqrd8" Dec 08 10:51:19 crc kubenswrapper[4776]: I1208 10:51:19.744400 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63e84fe3-9e7c-42d6-869d-6e741603da2a-catalog-content\") pod \"community-operators-gqrd8\" (UID: \"63e84fe3-9e7c-42d6-869d-6e741603da2a\") " pod="openshift-marketplace/community-operators-gqrd8" Dec 08 10:51:19 crc kubenswrapper[4776]: I1208 10:51:19.763050 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-dlj7r\" (UniqueName: \"kubernetes.io/projected/63e84fe3-9e7c-42d6-869d-6e741603da2a-kube-api-access-dlj7r\") pod \"community-operators-gqrd8\" (UID: \"63e84fe3-9e7c-42d6-869d-6e741603da2a\") " pod="openshift-marketplace/community-operators-gqrd8" Dec 08 10:51:19 crc kubenswrapper[4776]: I1208 10:51:19.889067 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gqrd8" Dec 08 10:51:20 crc kubenswrapper[4776]: I1208 10:51:20.917972 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gqrd8"] Dec 08 10:51:21 crc kubenswrapper[4776]: I1208 10:51:21.218013 4776 scope.go:117] "RemoveContainer" containerID="f8bd2f26b2d7f29e22240541c13505dbde6cc141a12009d4da7c61734a098b0b" Dec 08 10:51:21 crc kubenswrapper[4776]: I1208 10:51:21.237714 4776 scope.go:117] "RemoveContainer" containerID="bb63c7c1c2dd86d0afb119fe94fa1d13246a09d3fc76cc3b49384a07a84b66ab" Dec 08 10:51:21 crc kubenswrapper[4776]: I1208 10:51:21.311831 4776 scope.go:117] "RemoveContainer" containerID="670f2f07675a75150a4cf46b89a53b32fd045ac9d91ea1777da47ab1b1482509" Dec 08 10:51:21 crc kubenswrapper[4776]: I1208 10:51:21.582877 4776 generic.go:334] "Generic (PLEG): container finished" podID="63e84fe3-9e7c-42d6-869d-6e741603da2a" containerID="26d6937b6c4f4ecd1a9800fe6db05701160d062755204c6f46a10bd0f2efa694" exitCode=0 Dec 08 10:51:21 crc kubenswrapper[4776]: I1208 10:51:21.582925 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gqrd8" event={"ID":"63e84fe3-9e7c-42d6-869d-6e741603da2a","Type":"ContainerDied","Data":"26d6937b6c4f4ecd1a9800fe6db05701160d062755204c6f46a10bd0f2efa694"} Dec 08 10:51:21 crc kubenswrapper[4776]: I1208 10:51:21.582972 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gqrd8" 
event={"ID":"63e84fe3-9e7c-42d6-869d-6e741603da2a","Type":"ContainerStarted","Data":"edf426511bcc7b4b5f1a83e732abba30399fe9456d96dcb7ec2d4de070d40d81"} Dec 08 10:51:21 crc kubenswrapper[4776]: I1208 10:51:21.585647 4776 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 08 10:51:22 crc kubenswrapper[4776]: I1208 10:51:22.344405 4776 scope.go:117] "RemoveContainer" containerID="e77f61c54751c78fce5e444bed924b95086a9bf8d6036029988def8a32ebeb2b" Dec 08 10:51:22 crc kubenswrapper[4776]: I1208 10:51:22.598859 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" event={"ID":"c9788ab1-1031-4103-a769-a4b3177c7268","Type":"ContainerStarted","Data":"93ebb27b6099ca8dd614c6d7fc49afab77828f4f3e539bdfabec4090da89320d"} Dec 08 10:51:22 crc kubenswrapper[4776]: I1208 10:51:22.601119 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gqrd8" event={"ID":"63e84fe3-9e7c-42d6-869d-6e741603da2a","Type":"ContainerStarted","Data":"7559ab1a81d2f171ad5c516484a5880836504ee779ccd0da132f6c91a3f22290"} Dec 08 10:51:23 crc kubenswrapper[4776]: I1208 10:51:23.614363 4776 generic.go:334] "Generic (PLEG): container finished" podID="63e84fe3-9e7c-42d6-869d-6e741603da2a" containerID="7559ab1a81d2f171ad5c516484a5880836504ee779ccd0da132f6c91a3f22290" exitCode=0 Dec 08 10:51:23 crc kubenswrapper[4776]: I1208 10:51:23.614602 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gqrd8" event={"ID":"63e84fe3-9e7c-42d6-869d-6e741603da2a","Type":"ContainerDied","Data":"7559ab1a81d2f171ad5c516484a5880836504ee779ccd0da132f6c91a3f22290"} Dec 08 10:51:24 crc kubenswrapper[4776]: I1208 10:51:24.634730 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gqrd8" 
event={"ID":"63e84fe3-9e7c-42d6-869d-6e741603da2a","Type":"ContainerStarted","Data":"2b6f617b0846247f1a1709a03e5c162e2f28e2725a458331644b2154f8a46faf"} Dec 08 10:51:24 crc kubenswrapper[4776]: I1208 10:51:24.653564 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gqrd8" podStartSLOduration=3.196556993 podStartE2EDuration="5.653545596s" podCreationTimestamp="2025-12-08 10:51:19 +0000 UTC" firstStartedPulling="2025-12-08 10:51:21.584978298 +0000 UTC m=+6757.848203320" lastFinishedPulling="2025-12-08 10:51:24.041966901 +0000 UTC m=+6760.305191923" observedRunningTime="2025-12-08 10:51:24.653453283 +0000 UTC m=+6760.916678315" watchObservedRunningTime="2025-12-08 10:51:24.653545596 +0000 UTC m=+6760.916770618" Dec 08 10:51:29 crc kubenswrapper[4776]: I1208 10:51:29.890057 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gqrd8" Dec 08 10:51:29 crc kubenswrapper[4776]: I1208 10:51:29.890717 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gqrd8" Dec 08 10:51:29 crc kubenswrapper[4776]: I1208 10:51:29.941522 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gqrd8" Dec 08 10:51:30 crc kubenswrapper[4776]: I1208 10:51:30.783769 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gqrd8" Dec 08 10:51:30 crc kubenswrapper[4776]: I1208 10:51:30.863077 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gqrd8"] Dec 08 10:51:32 crc kubenswrapper[4776]: I1208 10:51:32.720112 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gqrd8" podUID="63e84fe3-9e7c-42d6-869d-6e741603da2a" containerName="registry-server" 
containerID="cri-o://2b6f617b0846247f1a1709a03e5c162e2f28e2725a458331644b2154f8a46faf" gracePeriod=2 Dec 08 10:51:33 crc kubenswrapper[4776]: I1208 10:51:33.235003 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gqrd8" Dec 08 10:51:33 crc kubenswrapper[4776]: I1208 10:51:33.370000 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlj7r\" (UniqueName: \"kubernetes.io/projected/63e84fe3-9e7c-42d6-869d-6e741603da2a-kube-api-access-dlj7r\") pod \"63e84fe3-9e7c-42d6-869d-6e741603da2a\" (UID: \"63e84fe3-9e7c-42d6-869d-6e741603da2a\") " Dec 08 10:51:33 crc kubenswrapper[4776]: I1208 10:51:33.370167 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63e84fe3-9e7c-42d6-869d-6e741603da2a-utilities\") pod \"63e84fe3-9e7c-42d6-869d-6e741603da2a\" (UID: \"63e84fe3-9e7c-42d6-869d-6e741603da2a\") " Dec 08 10:51:33 crc kubenswrapper[4776]: I1208 10:51:33.370244 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63e84fe3-9e7c-42d6-869d-6e741603da2a-catalog-content\") pod \"63e84fe3-9e7c-42d6-869d-6e741603da2a\" (UID: \"63e84fe3-9e7c-42d6-869d-6e741603da2a\") " Dec 08 10:51:33 crc kubenswrapper[4776]: I1208 10:51:33.371257 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63e84fe3-9e7c-42d6-869d-6e741603da2a-utilities" (OuterVolumeSpecName: "utilities") pod "63e84fe3-9e7c-42d6-869d-6e741603da2a" (UID: "63e84fe3-9e7c-42d6-869d-6e741603da2a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 10:51:33 crc kubenswrapper[4776]: I1208 10:51:33.378835 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63e84fe3-9e7c-42d6-869d-6e741603da2a-kube-api-access-dlj7r" (OuterVolumeSpecName: "kube-api-access-dlj7r") pod "63e84fe3-9e7c-42d6-869d-6e741603da2a" (UID: "63e84fe3-9e7c-42d6-869d-6e741603da2a"). InnerVolumeSpecName "kube-api-access-dlj7r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 10:51:33 crc kubenswrapper[4776]: I1208 10:51:33.425081 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63e84fe3-9e7c-42d6-869d-6e741603da2a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "63e84fe3-9e7c-42d6-869d-6e741603da2a" (UID: "63e84fe3-9e7c-42d6-869d-6e741603da2a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 10:51:33 crc kubenswrapper[4776]: I1208 10:51:33.473675 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63e84fe3-9e7c-42d6-869d-6e741603da2a-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 10:51:33 crc kubenswrapper[4776]: I1208 10:51:33.473923 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63e84fe3-9e7c-42d6-869d-6e741603da2a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 10:51:33 crc kubenswrapper[4776]: I1208 10:51:33.473997 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlj7r\" (UniqueName: \"kubernetes.io/projected/63e84fe3-9e7c-42d6-869d-6e741603da2a-kube-api-access-dlj7r\") on node \"crc\" DevicePath \"\"" Dec 08 10:51:33 crc kubenswrapper[4776]: I1208 10:51:33.737901 4776 generic.go:334] "Generic (PLEG): container finished" podID="63e84fe3-9e7c-42d6-869d-6e741603da2a" 
containerID="2b6f617b0846247f1a1709a03e5c162e2f28e2725a458331644b2154f8a46faf" exitCode=0 Dec 08 10:51:33 crc kubenswrapper[4776]: I1208 10:51:33.737959 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gqrd8" event={"ID":"63e84fe3-9e7c-42d6-869d-6e741603da2a","Type":"ContainerDied","Data":"2b6f617b0846247f1a1709a03e5c162e2f28e2725a458331644b2154f8a46faf"} Dec 08 10:51:33 crc kubenswrapper[4776]: I1208 10:51:33.738051 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gqrd8" event={"ID":"63e84fe3-9e7c-42d6-869d-6e741603da2a","Type":"ContainerDied","Data":"edf426511bcc7b4b5f1a83e732abba30399fe9456d96dcb7ec2d4de070d40d81"} Dec 08 10:51:33 crc kubenswrapper[4776]: I1208 10:51:33.738091 4776 scope.go:117] "RemoveContainer" containerID="2b6f617b0846247f1a1709a03e5c162e2f28e2725a458331644b2154f8a46faf" Dec 08 10:51:33 crc kubenswrapper[4776]: I1208 10:51:33.739240 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gqrd8" Dec 08 10:51:33 crc kubenswrapper[4776]: I1208 10:51:33.775712 4776 scope.go:117] "RemoveContainer" containerID="7559ab1a81d2f171ad5c516484a5880836504ee779ccd0da132f6c91a3f22290" Dec 08 10:51:33 crc kubenswrapper[4776]: I1208 10:51:33.792615 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gqrd8"] Dec 08 10:51:33 crc kubenswrapper[4776]: I1208 10:51:33.806331 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gqrd8"] Dec 08 10:51:33 crc kubenswrapper[4776]: I1208 10:51:33.830913 4776 scope.go:117] "RemoveContainer" containerID="26d6937b6c4f4ecd1a9800fe6db05701160d062755204c6f46a10bd0f2efa694" Dec 08 10:51:33 crc kubenswrapper[4776]: I1208 10:51:33.873911 4776 scope.go:117] "RemoveContainer" containerID="2b6f617b0846247f1a1709a03e5c162e2f28e2725a458331644b2154f8a46faf" Dec 08 10:51:33 crc kubenswrapper[4776]: E1208 10:51:33.874605 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b6f617b0846247f1a1709a03e5c162e2f28e2725a458331644b2154f8a46faf\": container with ID starting with 2b6f617b0846247f1a1709a03e5c162e2f28e2725a458331644b2154f8a46faf not found: ID does not exist" containerID="2b6f617b0846247f1a1709a03e5c162e2f28e2725a458331644b2154f8a46faf" Dec 08 10:51:33 crc kubenswrapper[4776]: I1208 10:51:33.874665 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b6f617b0846247f1a1709a03e5c162e2f28e2725a458331644b2154f8a46faf"} err="failed to get container status \"2b6f617b0846247f1a1709a03e5c162e2f28e2725a458331644b2154f8a46faf\": rpc error: code = NotFound desc = could not find container \"2b6f617b0846247f1a1709a03e5c162e2f28e2725a458331644b2154f8a46faf\": container with ID starting with 2b6f617b0846247f1a1709a03e5c162e2f28e2725a458331644b2154f8a46faf not 
found: ID does not exist" Dec 08 10:51:33 crc kubenswrapper[4776]: I1208 10:51:33.874704 4776 scope.go:117] "RemoveContainer" containerID="7559ab1a81d2f171ad5c516484a5880836504ee779ccd0da132f6c91a3f22290" Dec 08 10:51:33 crc kubenswrapper[4776]: E1208 10:51:33.875320 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7559ab1a81d2f171ad5c516484a5880836504ee779ccd0da132f6c91a3f22290\": container with ID starting with 7559ab1a81d2f171ad5c516484a5880836504ee779ccd0da132f6c91a3f22290 not found: ID does not exist" containerID="7559ab1a81d2f171ad5c516484a5880836504ee779ccd0da132f6c91a3f22290" Dec 08 10:51:33 crc kubenswrapper[4776]: I1208 10:51:33.875376 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7559ab1a81d2f171ad5c516484a5880836504ee779ccd0da132f6c91a3f22290"} err="failed to get container status \"7559ab1a81d2f171ad5c516484a5880836504ee779ccd0da132f6c91a3f22290\": rpc error: code = NotFound desc = could not find container \"7559ab1a81d2f171ad5c516484a5880836504ee779ccd0da132f6c91a3f22290\": container with ID starting with 7559ab1a81d2f171ad5c516484a5880836504ee779ccd0da132f6c91a3f22290 not found: ID does not exist" Dec 08 10:51:33 crc kubenswrapper[4776]: I1208 10:51:33.875413 4776 scope.go:117] "RemoveContainer" containerID="26d6937b6c4f4ecd1a9800fe6db05701160d062755204c6f46a10bd0f2efa694" Dec 08 10:51:33 crc kubenswrapper[4776]: E1208 10:51:33.875827 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26d6937b6c4f4ecd1a9800fe6db05701160d062755204c6f46a10bd0f2efa694\": container with ID starting with 26d6937b6c4f4ecd1a9800fe6db05701160d062755204c6f46a10bd0f2efa694 not found: ID does not exist" containerID="26d6937b6c4f4ecd1a9800fe6db05701160d062755204c6f46a10bd0f2efa694" Dec 08 10:51:33 crc kubenswrapper[4776]: I1208 10:51:33.875861 4776 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26d6937b6c4f4ecd1a9800fe6db05701160d062755204c6f46a10bd0f2efa694"} err="failed to get container status \"26d6937b6c4f4ecd1a9800fe6db05701160d062755204c6f46a10bd0f2efa694\": rpc error: code = NotFound desc = could not find container \"26d6937b6c4f4ecd1a9800fe6db05701160d062755204c6f46a10bd0f2efa694\": container with ID starting with 26d6937b6c4f4ecd1a9800fe6db05701160d062755204c6f46a10bd0f2efa694 not found: ID does not exist" Dec 08 10:51:34 crc kubenswrapper[4776]: I1208 10:51:34.364659 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63e84fe3-9e7c-42d6-869d-6e741603da2a" path="/var/lib/kubelet/pods/63e84fe3-9e7c-42d6-869d-6e741603da2a/volumes" Dec 08 10:52:01 crc kubenswrapper[4776]: I1208 10:52:01.414983 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-m2dg7"] Dec 08 10:52:01 crc kubenswrapper[4776]: E1208 10:52:01.416103 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63e84fe3-9e7c-42d6-869d-6e741603da2a" containerName="extract-content" Dec 08 10:52:01 crc kubenswrapper[4776]: I1208 10:52:01.416116 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="63e84fe3-9e7c-42d6-869d-6e741603da2a" containerName="extract-content" Dec 08 10:52:01 crc kubenswrapper[4776]: E1208 10:52:01.416162 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63e84fe3-9e7c-42d6-869d-6e741603da2a" containerName="registry-server" Dec 08 10:52:01 crc kubenswrapper[4776]: I1208 10:52:01.416182 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="63e84fe3-9e7c-42d6-869d-6e741603da2a" containerName="registry-server" Dec 08 10:52:01 crc kubenswrapper[4776]: E1208 10:52:01.416197 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63e84fe3-9e7c-42d6-869d-6e741603da2a" containerName="extract-utilities" Dec 08 10:52:01 crc kubenswrapper[4776]: I1208 
10:52:01.416203 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="63e84fe3-9e7c-42d6-869d-6e741603da2a" containerName="extract-utilities" Dec 08 10:52:01 crc kubenswrapper[4776]: I1208 10:52:01.416427 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="63e84fe3-9e7c-42d6-869d-6e741603da2a" containerName="registry-server" Dec 08 10:52:01 crc kubenswrapper[4776]: I1208 10:52:01.418127 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m2dg7" Dec 08 10:52:01 crc kubenswrapper[4776]: I1208 10:52:01.431382 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m2dg7"] Dec 08 10:52:01 crc kubenswrapper[4776]: I1208 10:52:01.543264 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a17929b-796a-4c03-8d86-debb138cb856-utilities\") pod \"certified-operators-m2dg7\" (UID: \"4a17929b-796a-4c03-8d86-debb138cb856\") " pod="openshift-marketplace/certified-operators-m2dg7" Dec 08 10:52:01 crc kubenswrapper[4776]: I1208 10:52:01.543393 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wllbz\" (UniqueName: \"kubernetes.io/projected/4a17929b-796a-4c03-8d86-debb138cb856-kube-api-access-wllbz\") pod \"certified-operators-m2dg7\" (UID: \"4a17929b-796a-4c03-8d86-debb138cb856\") " pod="openshift-marketplace/certified-operators-m2dg7" Dec 08 10:52:01 crc kubenswrapper[4776]: I1208 10:52:01.543513 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a17929b-796a-4c03-8d86-debb138cb856-catalog-content\") pod \"certified-operators-m2dg7\" (UID: \"4a17929b-796a-4c03-8d86-debb138cb856\") " pod="openshift-marketplace/certified-operators-m2dg7" Dec 08 10:52:01 crc 
kubenswrapper[4776]: I1208 10:52:01.646019 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a17929b-796a-4c03-8d86-debb138cb856-catalog-content\") pod \"certified-operators-m2dg7\" (UID: \"4a17929b-796a-4c03-8d86-debb138cb856\") " pod="openshift-marketplace/certified-operators-m2dg7" Dec 08 10:52:01 crc kubenswrapper[4776]: I1208 10:52:01.646234 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a17929b-796a-4c03-8d86-debb138cb856-utilities\") pod \"certified-operators-m2dg7\" (UID: \"4a17929b-796a-4c03-8d86-debb138cb856\") " pod="openshift-marketplace/certified-operators-m2dg7" Dec 08 10:52:01 crc kubenswrapper[4776]: I1208 10:52:01.646366 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wllbz\" (UniqueName: \"kubernetes.io/projected/4a17929b-796a-4c03-8d86-debb138cb856-kube-api-access-wllbz\") pod \"certified-operators-m2dg7\" (UID: \"4a17929b-796a-4c03-8d86-debb138cb856\") " pod="openshift-marketplace/certified-operators-m2dg7" Dec 08 10:52:01 crc kubenswrapper[4776]: I1208 10:52:01.646557 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a17929b-796a-4c03-8d86-debb138cb856-catalog-content\") pod \"certified-operators-m2dg7\" (UID: \"4a17929b-796a-4c03-8d86-debb138cb856\") " pod="openshift-marketplace/certified-operators-m2dg7" Dec 08 10:52:01 crc kubenswrapper[4776]: I1208 10:52:01.646665 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a17929b-796a-4c03-8d86-debb138cb856-utilities\") pod \"certified-operators-m2dg7\" (UID: \"4a17929b-796a-4c03-8d86-debb138cb856\") " pod="openshift-marketplace/certified-operators-m2dg7" Dec 08 10:52:01 crc kubenswrapper[4776]: I1208 10:52:01.669909 
4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wllbz\" (UniqueName: \"kubernetes.io/projected/4a17929b-796a-4c03-8d86-debb138cb856-kube-api-access-wllbz\") pod \"certified-operators-m2dg7\" (UID: \"4a17929b-796a-4c03-8d86-debb138cb856\") " pod="openshift-marketplace/certified-operators-m2dg7" Dec 08 10:52:01 crc kubenswrapper[4776]: I1208 10:52:01.742507 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m2dg7" Dec 08 10:52:02 crc kubenswrapper[4776]: I1208 10:52:02.223770 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m2dg7"] Dec 08 10:52:03 crc kubenswrapper[4776]: I1208 10:52:03.045327 4776 generic.go:334] "Generic (PLEG): container finished" podID="4a17929b-796a-4c03-8d86-debb138cb856" containerID="a623beca03c8d66c1226393e1eef6a0f8a3a073224d28f23a3d1cfec199e754c" exitCode=0 Dec 08 10:52:03 crc kubenswrapper[4776]: I1208 10:52:03.045569 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m2dg7" event={"ID":"4a17929b-796a-4c03-8d86-debb138cb856","Type":"ContainerDied","Data":"a623beca03c8d66c1226393e1eef6a0f8a3a073224d28f23a3d1cfec199e754c"} Dec 08 10:52:03 crc kubenswrapper[4776]: I1208 10:52:03.045594 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m2dg7" event={"ID":"4a17929b-796a-4c03-8d86-debb138cb856","Type":"ContainerStarted","Data":"2bc2ba7d538cea04502b64b2c41130cce56941a3dda5a0edcbb5472bfd800f22"} Dec 08 10:52:04 crc kubenswrapper[4776]: I1208 10:52:04.067978 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m2dg7" event={"ID":"4a17929b-796a-4c03-8d86-debb138cb856","Type":"ContainerStarted","Data":"c1d972715d94dea6597d0906000593e88f5b3056e7b74ef2551dfbb18f82973d"} Dec 08 10:52:04 crc kubenswrapper[4776]: I1208 10:52:04.614700 4776 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5hqfp"] Dec 08 10:52:04 crc kubenswrapper[4776]: I1208 10:52:04.617556 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5hqfp" Dec 08 10:52:04 crc kubenswrapper[4776]: I1208 10:52:04.628346 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5hqfp"] Dec 08 10:52:04 crc kubenswrapper[4776]: I1208 10:52:04.707250 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c650e4f-f976-4aeb-8eb0-7624e89adc4c-catalog-content\") pod \"redhat-operators-5hqfp\" (UID: \"0c650e4f-f976-4aeb-8eb0-7624e89adc4c\") " pod="openshift-marketplace/redhat-operators-5hqfp" Dec 08 10:52:04 crc kubenswrapper[4776]: I1208 10:52:04.707349 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c650e4f-f976-4aeb-8eb0-7624e89adc4c-utilities\") pod \"redhat-operators-5hqfp\" (UID: \"0c650e4f-f976-4aeb-8eb0-7624e89adc4c\") " pod="openshift-marketplace/redhat-operators-5hqfp" Dec 08 10:52:04 crc kubenswrapper[4776]: I1208 10:52:04.707444 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chs86\" (UniqueName: \"kubernetes.io/projected/0c650e4f-f976-4aeb-8eb0-7624e89adc4c-kube-api-access-chs86\") pod \"redhat-operators-5hqfp\" (UID: \"0c650e4f-f976-4aeb-8eb0-7624e89adc4c\") " pod="openshift-marketplace/redhat-operators-5hqfp" Dec 08 10:52:04 crc kubenswrapper[4776]: I1208 10:52:04.810012 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c650e4f-f976-4aeb-8eb0-7624e89adc4c-catalog-content\") pod \"redhat-operators-5hqfp\" (UID: 
\"0c650e4f-f976-4aeb-8eb0-7624e89adc4c\") " pod="openshift-marketplace/redhat-operators-5hqfp" Dec 08 10:52:04 crc kubenswrapper[4776]: I1208 10:52:04.810106 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c650e4f-f976-4aeb-8eb0-7624e89adc4c-utilities\") pod \"redhat-operators-5hqfp\" (UID: \"0c650e4f-f976-4aeb-8eb0-7624e89adc4c\") " pod="openshift-marketplace/redhat-operators-5hqfp" Dec 08 10:52:04 crc kubenswrapper[4776]: I1208 10:52:04.810212 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chs86\" (UniqueName: \"kubernetes.io/projected/0c650e4f-f976-4aeb-8eb0-7624e89adc4c-kube-api-access-chs86\") pod \"redhat-operators-5hqfp\" (UID: \"0c650e4f-f976-4aeb-8eb0-7624e89adc4c\") " pod="openshift-marketplace/redhat-operators-5hqfp" Dec 08 10:52:04 crc kubenswrapper[4776]: I1208 10:52:04.810678 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c650e4f-f976-4aeb-8eb0-7624e89adc4c-utilities\") pod \"redhat-operators-5hqfp\" (UID: \"0c650e4f-f976-4aeb-8eb0-7624e89adc4c\") " pod="openshift-marketplace/redhat-operators-5hqfp" Dec 08 10:52:04 crc kubenswrapper[4776]: I1208 10:52:04.811432 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c650e4f-f976-4aeb-8eb0-7624e89adc4c-catalog-content\") pod \"redhat-operators-5hqfp\" (UID: \"0c650e4f-f976-4aeb-8eb0-7624e89adc4c\") " pod="openshift-marketplace/redhat-operators-5hqfp" Dec 08 10:52:04 crc kubenswrapper[4776]: I1208 10:52:04.832681 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chs86\" (UniqueName: \"kubernetes.io/projected/0c650e4f-f976-4aeb-8eb0-7624e89adc4c-kube-api-access-chs86\") pod \"redhat-operators-5hqfp\" (UID: \"0c650e4f-f976-4aeb-8eb0-7624e89adc4c\") " 
pod="openshift-marketplace/redhat-operators-5hqfp" Dec 08 10:52:04 crc kubenswrapper[4776]: I1208 10:52:04.940731 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5hqfp" Dec 08 10:52:05 crc kubenswrapper[4776]: I1208 10:52:05.460804 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5hqfp"] Dec 08 10:52:06 crc kubenswrapper[4776]: I1208 10:52:06.096008 4776 generic.go:334] "Generic (PLEG): container finished" podID="4a17929b-796a-4c03-8d86-debb138cb856" containerID="c1d972715d94dea6597d0906000593e88f5b3056e7b74ef2551dfbb18f82973d" exitCode=0 Dec 08 10:52:06 crc kubenswrapper[4776]: I1208 10:52:06.096097 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m2dg7" event={"ID":"4a17929b-796a-4c03-8d86-debb138cb856","Type":"ContainerDied","Data":"c1d972715d94dea6597d0906000593e88f5b3056e7b74ef2551dfbb18f82973d"} Dec 08 10:52:06 crc kubenswrapper[4776]: I1208 10:52:06.100560 4776 generic.go:334] "Generic (PLEG): container finished" podID="0c650e4f-f976-4aeb-8eb0-7624e89adc4c" containerID="6b2cced9f828dc8619a426319c2632a847e7bdc8ad04237f572a87b94a00e8d1" exitCode=0 Dec 08 10:52:06 crc kubenswrapper[4776]: I1208 10:52:06.100603 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5hqfp" event={"ID":"0c650e4f-f976-4aeb-8eb0-7624e89adc4c","Type":"ContainerDied","Data":"6b2cced9f828dc8619a426319c2632a847e7bdc8ad04237f572a87b94a00e8d1"} Dec 08 10:52:06 crc kubenswrapper[4776]: I1208 10:52:06.100630 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5hqfp" event={"ID":"0c650e4f-f976-4aeb-8eb0-7624e89adc4c","Type":"ContainerStarted","Data":"b0943536be243753bb7d4d0ff03152919d6e74155b3d0730c6f7b8b70f1e5641"} Dec 08 10:52:07 crc kubenswrapper[4776]: I1208 10:52:07.113969 4776 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-5hqfp" event={"ID":"0c650e4f-f976-4aeb-8eb0-7624e89adc4c","Type":"ContainerStarted","Data":"a9d2bb3aaf2eff2604a2fc06a4d48219452913735b64007ae061cb43fed70915"}
Dec 08 10:52:07 crc kubenswrapper[4776]: I1208 10:52:07.116688 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m2dg7" event={"ID":"4a17929b-796a-4c03-8d86-debb138cb856","Type":"ContainerStarted","Data":"3624e4a7f25bc4a8bb6cc2aa871e60f93579abfb0c712cb7fc4491dced8a36cf"}
Dec 08 10:52:07 crc kubenswrapper[4776]: I1208 10:52:07.169147 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-m2dg7" podStartSLOduration=2.717550276 podStartE2EDuration="6.169129193s" podCreationTimestamp="2025-12-08 10:52:01 +0000 UTC" firstStartedPulling="2025-12-08 10:52:03.050115633 +0000 UTC m=+6799.313340655" lastFinishedPulling="2025-12-08 10:52:06.50169455 +0000 UTC m=+6802.764919572" observedRunningTime="2025-12-08 10:52:07.164716742 +0000 UTC m=+6803.427941754" watchObservedRunningTime="2025-12-08 10:52:07.169129193 +0000 UTC m=+6803.432354215"
Dec 08 10:52:10 crc kubenswrapper[4776]: I1208 10:52:10.152265 4776 generic.go:334] "Generic (PLEG): container finished" podID="0c650e4f-f976-4aeb-8eb0-7624e89adc4c" containerID="a9d2bb3aaf2eff2604a2fc06a4d48219452913735b64007ae061cb43fed70915" exitCode=0
Dec 08 10:52:10 crc kubenswrapper[4776]: I1208 10:52:10.152340 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5hqfp" event={"ID":"0c650e4f-f976-4aeb-8eb0-7624e89adc4c","Type":"ContainerDied","Data":"a9d2bb3aaf2eff2604a2fc06a4d48219452913735b64007ae061cb43fed70915"}
Dec 08 10:52:11 crc kubenswrapper[4776]: I1208 10:52:11.165622 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5hqfp" event={"ID":"0c650e4f-f976-4aeb-8eb0-7624e89adc4c","Type":"ContainerStarted","Data":"a43b65568dba5d5e5eb671edd83b1d1a0afec09ee738021374caf7acf3fb2b31"}
Dec 08 10:52:11 crc kubenswrapper[4776]: I1208 10:52:11.193428 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5hqfp" podStartSLOduration=2.636541514 podStartE2EDuration="7.193406624s" podCreationTimestamp="2025-12-08 10:52:04 +0000 UTC" firstStartedPulling="2025-12-08 10:52:06.105212559 +0000 UTC m=+6802.368437581" lastFinishedPulling="2025-12-08 10:52:10.662077669 +0000 UTC m=+6806.925302691" observedRunningTime="2025-12-08 10:52:11.182899795 +0000 UTC m=+6807.446124817" watchObservedRunningTime="2025-12-08 10:52:11.193406624 +0000 UTC m=+6807.456631646"
Dec 08 10:52:11 crc kubenswrapper[4776]: I1208 10:52:11.743328 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-m2dg7"
Dec 08 10:52:11 crc kubenswrapper[4776]: I1208 10:52:11.743588 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-m2dg7"
Dec 08 10:52:12 crc kubenswrapper[4776]: I1208 10:52:12.791987 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-m2dg7" podUID="4a17929b-796a-4c03-8d86-debb138cb856" containerName="registry-server" probeResult="failure" output=<
Dec 08 10:52:12 crc kubenswrapper[4776]: timeout: failed to connect service ":50051" within 1s
Dec 08 10:52:12 crc kubenswrapper[4776]: >
Dec 08 10:52:14 crc kubenswrapper[4776]: I1208 10:52:14.941395 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5hqfp"
Dec 08 10:52:14 crc kubenswrapper[4776]: I1208 10:52:14.941807 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5hqfp"
Dec 08 10:52:15 crc kubenswrapper[4776]: I1208 10:52:15.989125 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5hqfp" podUID="0c650e4f-f976-4aeb-8eb0-7624e89adc4c" containerName="registry-server" probeResult="failure" output=<
Dec 08 10:52:15 crc kubenswrapper[4776]: timeout: failed to connect service ":50051" within 1s
Dec 08 10:52:15 crc kubenswrapper[4776]: >
Dec 08 10:52:21 crc kubenswrapper[4776]: I1208 10:52:21.791872 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-m2dg7"
Dec 08 10:52:21 crc kubenswrapper[4776]: I1208 10:52:21.842189 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-m2dg7"
Dec 08 10:52:22 crc kubenswrapper[4776]: I1208 10:52:22.031795 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m2dg7"]
Dec 08 10:52:23 crc kubenswrapper[4776]: I1208 10:52:23.289271 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-m2dg7" podUID="4a17929b-796a-4c03-8d86-debb138cb856" containerName="registry-server" containerID="cri-o://3624e4a7f25bc4a8bb6cc2aa871e60f93579abfb0c712cb7fc4491dced8a36cf" gracePeriod=2
Dec 08 10:52:23 crc kubenswrapper[4776]: I1208 10:52:23.816727 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m2dg7"
Dec 08 10:52:23 crc kubenswrapper[4776]: I1208 10:52:23.858943 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a17929b-796a-4c03-8d86-debb138cb856-utilities\") pod \"4a17929b-796a-4c03-8d86-debb138cb856\" (UID: \"4a17929b-796a-4c03-8d86-debb138cb856\") "
Dec 08 10:52:23 crc kubenswrapper[4776]: I1208 10:52:23.859304 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a17929b-796a-4c03-8d86-debb138cb856-catalog-content\") pod \"4a17929b-796a-4c03-8d86-debb138cb856\" (UID: \"4a17929b-796a-4c03-8d86-debb138cb856\") "
Dec 08 10:52:23 crc kubenswrapper[4776]: I1208 10:52:23.859385 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wllbz\" (UniqueName: \"kubernetes.io/projected/4a17929b-796a-4c03-8d86-debb138cb856-kube-api-access-wllbz\") pod \"4a17929b-796a-4c03-8d86-debb138cb856\" (UID: \"4a17929b-796a-4c03-8d86-debb138cb856\") "
Dec 08 10:52:23 crc kubenswrapper[4776]: I1208 10:52:23.860472 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a17929b-796a-4c03-8d86-debb138cb856-utilities" (OuterVolumeSpecName: "utilities") pod "4a17929b-796a-4c03-8d86-debb138cb856" (UID: "4a17929b-796a-4c03-8d86-debb138cb856"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 08 10:52:23 crc kubenswrapper[4776]: I1208 10:52:23.866927 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a17929b-796a-4c03-8d86-debb138cb856-kube-api-access-wllbz" (OuterVolumeSpecName: "kube-api-access-wllbz") pod "4a17929b-796a-4c03-8d86-debb138cb856" (UID: "4a17929b-796a-4c03-8d86-debb138cb856"). InnerVolumeSpecName "kube-api-access-wllbz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 08 10:52:23 crc kubenswrapper[4776]: I1208 10:52:23.910009 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a17929b-796a-4c03-8d86-debb138cb856-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4a17929b-796a-4c03-8d86-debb138cb856" (UID: "4a17929b-796a-4c03-8d86-debb138cb856"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 08 10:52:23 crc kubenswrapper[4776]: I1208 10:52:23.963006 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wllbz\" (UniqueName: \"kubernetes.io/projected/4a17929b-796a-4c03-8d86-debb138cb856-kube-api-access-wllbz\") on node \"crc\" DevicePath \"\""
Dec 08 10:52:23 crc kubenswrapper[4776]: I1208 10:52:23.963043 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a17929b-796a-4c03-8d86-debb138cb856-utilities\") on node \"crc\" DevicePath \"\""
Dec 08 10:52:23 crc kubenswrapper[4776]: I1208 10:52:23.963066 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a17929b-796a-4c03-8d86-debb138cb856-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 08 10:52:24 crc kubenswrapper[4776]: I1208 10:52:24.302554 4776 generic.go:334] "Generic (PLEG): container finished" podID="4a17929b-796a-4c03-8d86-debb138cb856" containerID="3624e4a7f25bc4a8bb6cc2aa871e60f93579abfb0c712cb7fc4491dced8a36cf" exitCode=0
Dec 08 10:52:24 crc kubenswrapper[4776]: I1208 10:52:24.302625 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m2dg7" event={"ID":"4a17929b-796a-4c03-8d86-debb138cb856","Type":"ContainerDied","Data":"3624e4a7f25bc4a8bb6cc2aa871e60f93579abfb0c712cb7fc4491dced8a36cf"}
Dec 08 10:52:24 crc kubenswrapper[4776]: I1208 10:52:24.302660 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m2dg7"
Dec 08 10:52:24 crc kubenswrapper[4776]: I1208 10:52:24.302685 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m2dg7" event={"ID":"4a17929b-796a-4c03-8d86-debb138cb856","Type":"ContainerDied","Data":"2bc2ba7d538cea04502b64b2c41130cce56941a3dda5a0edcbb5472bfd800f22"}
Dec 08 10:52:24 crc kubenswrapper[4776]: I1208 10:52:24.302708 4776 scope.go:117] "RemoveContainer" containerID="3624e4a7f25bc4a8bb6cc2aa871e60f93579abfb0c712cb7fc4491dced8a36cf"
Dec 08 10:52:24 crc kubenswrapper[4776]: I1208 10:52:24.333080 4776 scope.go:117] "RemoveContainer" containerID="c1d972715d94dea6597d0906000593e88f5b3056e7b74ef2551dfbb18f82973d"
Dec 08 10:52:24 crc kubenswrapper[4776]: I1208 10:52:24.379413 4776 scope.go:117] "RemoveContainer" containerID="a623beca03c8d66c1226393e1eef6a0f8a3a073224d28f23a3d1cfec199e754c"
Dec 08 10:52:24 crc kubenswrapper[4776]: I1208 10:52:24.386116 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m2dg7"]
Dec 08 10:52:24 crc kubenswrapper[4776]: I1208 10:52:24.414674 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-m2dg7"]
Dec 08 10:52:24 crc kubenswrapper[4776]: I1208 10:52:24.417887 4776 scope.go:117] "RemoveContainer" containerID="3624e4a7f25bc4a8bb6cc2aa871e60f93579abfb0c712cb7fc4491dced8a36cf"
Dec 08 10:52:24 crc kubenswrapper[4776]: E1208 10:52:24.418321 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3624e4a7f25bc4a8bb6cc2aa871e60f93579abfb0c712cb7fc4491dced8a36cf\": container with ID starting with 3624e4a7f25bc4a8bb6cc2aa871e60f93579abfb0c712cb7fc4491dced8a36cf not found: ID does not exist" containerID="3624e4a7f25bc4a8bb6cc2aa871e60f93579abfb0c712cb7fc4491dced8a36cf"
Dec 08 10:52:24 crc kubenswrapper[4776]: I1208 10:52:24.418366 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3624e4a7f25bc4a8bb6cc2aa871e60f93579abfb0c712cb7fc4491dced8a36cf"} err="failed to get container status \"3624e4a7f25bc4a8bb6cc2aa871e60f93579abfb0c712cb7fc4491dced8a36cf\": rpc error: code = NotFound desc = could not find container \"3624e4a7f25bc4a8bb6cc2aa871e60f93579abfb0c712cb7fc4491dced8a36cf\": container with ID starting with 3624e4a7f25bc4a8bb6cc2aa871e60f93579abfb0c712cb7fc4491dced8a36cf not found: ID does not exist"
Dec 08 10:52:24 crc kubenswrapper[4776]: I1208 10:52:24.418392 4776 scope.go:117] "RemoveContainer" containerID="c1d972715d94dea6597d0906000593e88f5b3056e7b74ef2551dfbb18f82973d"
Dec 08 10:52:24 crc kubenswrapper[4776]: E1208 10:52:24.418661 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1d972715d94dea6597d0906000593e88f5b3056e7b74ef2551dfbb18f82973d\": container with ID starting with c1d972715d94dea6597d0906000593e88f5b3056e7b74ef2551dfbb18f82973d not found: ID does not exist" containerID="c1d972715d94dea6597d0906000593e88f5b3056e7b74ef2551dfbb18f82973d"
Dec 08 10:52:24 crc kubenswrapper[4776]: I1208 10:52:24.418708 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1d972715d94dea6597d0906000593e88f5b3056e7b74ef2551dfbb18f82973d"} err="failed to get container status \"c1d972715d94dea6597d0906000593e88f5b3056e7b74ef2551dfbb18f82973d\": rpc error: code = NotFound desc = could not find container \"c1d972715d94dea6597d0906000593e88f5b3056e7b74ef2551dfbb18f82973d\": container with ID starting with c1d972715d94dea6597d0906000593e88f5b3056e7b74ef2551dfbb18f82973d not found: ID does not exist"
Dec 08 10:52:24 crc kubenswrapper[4776]: I1208 10:52:24.418738 4776 scope.go:117] "RemoveContainer" containerID="a623beca03c8d66c1226393e1eef6a0f8a3a073224d28f23a3d1cfec199e754c"
Dec 08 10:52:24 crc kubenswrapper[4776]: E1208 10:52:24.419115 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a623beca03c8d66c1226393e1eef6a0f8a3a073224d28f23a3d1cfec199e754c\": container with ID starting with a623beca03c8d66c1226393e1eef6a0f8a3a073224d28f23a3d1cfec199e754c not found: ID does not exist" containerID="a623beca03c8d66c1226393e1eef6a0f8a3a073224d28f23a3d1cfec199e754c"
Dec 08 10:52:24 crc kubenswrapper[4776]: I1208 10:52:24.419140 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a623beca03c8d66c1226393e1eef6a0f8a3a073224d28f23a3d1cfec199e754c"} err="failed to get container status \"a623beca03c8d66c1226393e1eef6a0f8a3a073224d28f23a3d1cfec199e754c\": rpc error: code = NotFound desc = could not find container \"a623beca03c8d66c1226393e1eef6a0f8a3a073224d28f23a3d1cfec199e754c\": container with ID starting with a623beca03c8d66c1226393e1eef6a0f8a3a073224d28f23a3d1cfec199e754c not found: ID does not exist"
Dec 08 10:52:24 crc kubenswrapper[4776]: I1208 10:52:24.988728 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5hqfp"
Dec 08 10:52:25 crc kubenswrapper[4776]: I1208 10:52:25.041228 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5hqfp"
Dec 08 10:52:26 crc kubenswrapper[4776]: I1208 10:52:26.364666 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a17929b-796a-4c03-8d86-debb138cb856" path="/var/lib/kubelet/pods/4a17929b-796a-4c03-8d86-debb138cb856/volumes"
Dec 08 10:52:27 crc kubenswrapper[4776]: I1208 10:52:27.435024 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5hqfp"]
Dec 08 10:52:27 crc kubenswrapper[4776]: I1208 10:52:27.435504 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5hqfp" podUID="0c650e4f-f976-4aeb-8eb0-7624e89adc4c" containerName="registry-server" containerID="cri-o://a43b65568dba5d5e5eb671edd83b1d1a0afec09ee738021374caf7acf3fb2b31" gracePeriod=2
Dec 08 10:52:27 crc kubenswrapper[4776]: I1208 10:52:27.990655 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5hqfp"
Dec 08 10:52:28 crc kubenswrapper[4776]: I1208 10:52:28.161986 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c650e4f-f976-4aeb-8eb0-7624e89adc4c-catalog-content\") pod \"0c650e4f-f976-4aeb-8eb0-7624e89adc4c\" (UID: \"0c650e4f-f976-4aeb-8eb0-7624e89adc4c\") "
Dec 08 10:52:28 crc kubenswrapper[4776]: I1208 10:52:28.162053 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c650e4f-f976-4aeb-8eb0-7624e89adc4c-utilities\") pod \"0c650e4f-f976-4aeb-8eb0-7624e89adc4c\" (UID: \"0c650e4f-f976-4aeb-8eb0-7624e89adc4c\") "
Dec 08 10:52:28 crc kubenswrapper[4776]: I1208 10:52:28.162183 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chs86\" (UniqueName: \"kubernetes.io/projected/0c650e4f-f976-4aeb-8eb0-7624e89adc4c-kube-api-access-chs86\") pod \"0c650e4f-f976-4aeb-8eb0-7624e89adc4c\" (UID: \"0c650e4f-f976-4aeb-8eb0-7624e89adc4c\") "
Dec 08 10:52:28 crc kubenswrapper[4776]: I1208 10:52:28.162858 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c650e4f-f976-4aeb-8eb0-7624e89adc4c-utilities" (OuterVolumeSpecName: "utilities") pod "0c650e4f-f976-4aeb-8eb0-7624e89adc4c" (UID: "0c650e4f-f976-4aeb-8eb0-7624e89adc4c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 08 10:52:28 crc kubenswrapper[4776]: I1208 10:52:28.171428 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c650e4f-f976-4aeb-8eb0-7624e89adc4c-kube-api-access-chs86" (OuterVolumeSpecName: "kube-api-access-chs86") pod "0c650e4f-f976-4aeb-8eb0-7624e89adc4c" (UID: "0c650e4f-f976-4aeb-8eb0-7624e89adc4c"). InnerVolumeSpecName "kube-api-access-chs86". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 08 10:52:28 crc kubenswrapper[4776]: I1208 10:52:28.262427 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c650e4f-f976-4aeb-8eb0-7624e89adc4c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0c650e4f-f976-4aeb-8eb0-7624e89adc4c" (UID: "0c650e4f-f976-4aeb-8eb0-7624e89adc4c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 08 10:52:28 crc kubenswrapper[4776]: I1208 10:52:28.265680 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c650e4f-f976-4aeb-8eb0-7624e89adc4c-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 08 10:52:28 crc kubenswrapper[4776]: I1208 10:52:28.265964 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c650e4f-f976-4aeb-8eb0-7624e89adc4c-utilities\") on node \"crc\" DevicePath \"\""
Dec 08 10:52:28 crc kubenswrapper[4776]: I1208 10:52:28.266051 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chs86\" (UniqueName: \"kubernetes.io/projected/0c650e4f-f976-4aeb-8eb0-7624e89adc4c-kube-api-access-chs86\") on node \"crc\" DevicePath \"\""
Dec 08 10:52:28 crc kubenswrapper[4776]: I1208 10:52:28.347637 4776 generic.go:334] "Generic (PLEG): container finished" podID="0c650e4f-f976-4aeb-8eb0-7624e89adc4c" containerID="a43b65568dba5d5e5eb671edd83b1d1a0afec09ee738021374caf7acf3fb2b31" exitCode=0
Dec 08 10:52:28 crc kubenswrapper[4776]: I1208 10:52:28.347741 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5hqfp"
Dec 08 10:52:28 crc kubenswrapper[4776]: I1208 10:52:28.361631 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5hqfp" event={"ID":"0c650e4f-f976-4aeb-8eb0-7624e89adc4c","Type":"ContainerDied","Data":"a43b65568dba5d5e5eb671edd83b1d1a0afec09ee738021374caf7acf3fb2b31"}
Dec 08 10:52:28 crc kubenswrapper[4776]: I1208 10:52:28.361678 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5hqfp" event={"ID":"0c650e4f-f976-4aeb-8eb0-7624e89adc4c","Type":"ContainerDied","Data":"b0943536be243753bb7d4d0ff03152919d6e74155b3d0730c6f7b8b70f1e5641"}
Dec 08 10:52:28 crc kubenswrapper[4776]: I1208 10:52:28.361708 4776 scope.go:117] "RemoveContainer" containerID="a43b65568dba5d5e5eb671edd83b1d1a0afec09ee738021374caf7acf3fb2b31"
Dec 08 10:52:28 crc kubenswrapper[4776]: I1208 10:52:28.396462 4776 scope.go:117] "RemoveContainer" containerID="a9d2bb3aaf2eff2604a2fc06a4d48219452913735b64007ae061cb43fed70915"
Dec 08 10:52:28 crc kubenswrapper[4776]: I1208 10:52:28.400026 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5hqfp"]
Dec 08 10:52:28 crc kubenswrapper[4776]: I1208 10:52:28.412569 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5hqfp"]
Dec 08 10:52:28 crc kubenswrapper[4776]: I1208 10:52:28.418814 4776 scope.go:117] "RemoveContainer" containerID="6b2cced9f828dc8619a426319c2632a847e7bdc8ad04237f572a87b94a00e8d1"
Dec 08 10:52:28 crc kubenswrapper[4776]: I1208 10:52:28.473149 4776 scope.go:117] "RemoveContainer" containerID="a43b65568dba5d5e5eb671edd83b1d1a0afec09ee738021374caf7acf3fb2b31"
Dec 08 10:52:28 crc kubenswrapper[4776]: E1208 10:52:28.474051 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a43b65568dba5d5e5eb671edd83b1d1a0afec09ee738021374caf7acf3fb2b31\": container with ID starting with a43b65568dba5d5e5eb671edd83b1d1a0afec09ee738021374caf7acf3fb2b31 not found: ID does not exist" containerID="a43b65568dba5d5e5eb671edd83b1d1a0afec09ee738021374caf7acf3fb2b31"
Dec 08 10:52:28 crc kubenswrapper[4776]: I1208 10:52:28.474208 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a43b65568dba5d5e5eb671edd83b1d1a0afec09ee738021374caf7acf3fb2b31"} err="failed to get container status \"a43b65568dba5d5e5eb671edd83b1d1a0afec09ee738021374caf7acf3fb2b31\": rpc error: code = NotFound desc = could not find container \"a43b65568dba5d5e5eb671edd83b1d1a0afec09ee738021374caf7acf3fb2b31\": container with ID starting with a43b65568dba5d5e5eb671edd83b1d1a0afec09ee738021374caf7acf3fb2b31 not found: ID does not exist"
Dec 08 10:52:28 crc kubenswrapper[4776]: I1208 10:52:28.474334 4776 scope.go:117] "RemoveContainer" containerID="a9d2bb3aaf2eff2604a2fc06a4d48219452913735b64007ae061cb43fed70915"
Dec 08 10:52:28 crc kubenswrapper[4776]: E1208 10:52:28.474930 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9d2bb3aaf2eff2604a2fc06a4d48219452913735b64007ae061cb43fed70915\": container with ID starting with a9d2bb3aaf2eff2604a2fc06a4d48219452913735b64007ae061cb43fed70915 not found: ID does not exist" containerID="a9d2bb3aaf2eff2604a2fc06a4d48219452913735b64007ae061cb43fed70915"
Dec 08 10:52:28 crc kubenswrapper[4776]: I1208 10:52:28.474976 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9d2bb3aaf2eff2604a2fc06a4d48219452913735b64007ae061cb43fed70915"} err="failed to get container status \"a9d2bb3aaf2eff2604a2fc06a4d48219452913735b64007ae061cb43fed70915\": rpc error: code = NotFound desc = could not find container \"a9d2bb3aaf2eff2604a2fc06a4d48219452913735b64007ae061cb43fed70915\": container with ID starting with a9d2bb3aaf2eff2604a2fc06a4d48219452913735b64007ae061cb43fed70915 not found: ID does not exist"
Dec 08 10:52:28 crc kubenswrapper[4776]: I1208 10:52:28.475004 4776 scope.go:117] "RemoveContainer" containerID="6b2cced9f828dc8619a426319c2632a847e7bdc8ad04237f572a87b94a00e8d1"
Dec 08 10:52:28 crc kubenswrapper[4776]: E1208 10:52:28.475572 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b2cced9f828dc8619a426319c2632a847e7bdc8ad04237f572a87b94a00e8d1\": container with ID starting with 6b2cced9f828dc8619a426319c2632a847e7bdc8ad04237f572a87b94a00e8d1 not found: ID does not exist" containerID="6b2cced9f828dc8619a426319c2632a847e7bdc8ad04237f572a87b94a00e8d1"
Dec 08 10:52:28 crc kubenswrapper[4776]: I1208 10:52:28.475652 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b2cced9f828dc8619a426319c2632a847e7bdc8ad04237f572a87b94a00e8d1"} err="failed to get container status \"6b2cced9f828dc8619a426319c2632a847e7bdc8ad04237f572a87b94a00e8d1\": rpc error: code = NotFound desc = could not find container \"6b2cced9f828dc8619a426319c2632a847e7bdc8ad04237f572a87b94a00e8d1\": container with ID starting with 6b2cced9f828dc8619a426319c2632a847e7bdc8ad04237f572a87b94a00e8d1 not found: ID does not exist"
Dec 08 10:52:30 crc kubenswrapper[4776]: I1208 10:52:30.360925 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c650e4f-f976-4aeb-8eb0-7624e89adc4c" path="/var/lib/kubelet/pods/0c650e4f-f976-4aeb-8eb0-7624e89adc4c/volumes"
Dec 08 10:53:41 crc kubenswrapper[4776]: I1208 10:53:41.399602 4776 patch_prober.go:28] interesting pod/machine-config-daemon-jkmbn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 08 10:53:41 crc kubenswrapper[4776]: I1208 10:53:41.400164 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jkmbn" podUID="c9788ab1-1031-4103-a769-a4b3177c7268" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"